
AWS Big Data Architect Resume


NC

SUMMARY

  • Over 18 years of experience in the IT industry working on technologies such as Big Data, Hadoop, Scala, BI and database consulting, and cloud technologies such as virtual machines (OVM) with private cloud setup, E-Business Suite, WebLogic Server, OBIEE, SOA and AWS, with professional certifications.
  • Excellent working experience with global multidisciplinary teams at all levels, including technical and non-technical staff, managers and business team members.
  • Extensive experience in data, business, application and technology architecture per TOGAF.
  • Expertise in big data distributions, including AWS big data services, Big Data Insights, Hadoop and Hortonworks.
  • Experience in distributed computing architectures and NoSQL databases such as DynamoDB, with knowledge of Cassandra, MongoDB and HBase, to solve big data requirements.
  • Experienced with relational databases such as Oracle, MySQL, MS SQL Server and Teradata, designing data warehouse models and integrating them fully with Big Data, the ODS and the data lake.
  • Experience in Hadoop and in analytics databases such as Hive, Phoenix, Impala and Aster.
  • Strong ETL background, working with ETL tools such as SQL Server Integration Services, Informatica, AWS Glue and Spark on AWS, plus in-house job controls using CloudWatch and CloudTrail.
  • Relational data normalization and segregation due to corporate mergers and acquisitions.
  • Architected and implemented virtual machines with a 3-node RAC cluster setup, disaster recovery and cloud infrastructure.
  • Installed and configured hypervisors on Dell and HP servers; discovered, set up and configured virtual machines on the hypervisors to host databases and distribute compute load.
  • Worked on visualization tools like AWS QuickSight, Chart.js, D3.js and integration of data from S3 buckets.
  • Worked on programming languages such as Linux shell scripting, C, C++ and Java.
  • Knowledge of and working experience with cloud-based analytics and machine learning tools such as Amazon Machine Learning, SageMaker and Cloud Machine Learning Engine, plus experience in Supply Chain Management.
  • Hands-on experience with normalization (1NF, 2NF, 3NF and BCNF) and denormalization techniques for effective and optimal performance in OLTP and OLAP environments.
  • Experienced in deploying database platforms, IaaS, PaaS and SaaS, and end-to-end solutions for databases and applications.
  • Designed and developed POCs using web services, ETL, storage, virtualization, application servers, data security, databases, BI and Kinesis Analytics tools, and tested all components in UT, SIT and UAT.
  • Expertise in integrating data in formats such as spreadsheets, text files, JSON, XML, sequential files and logs, whether structured, semi-structured or unstructured, from sources including applications, core systems, external feeds, media and RDBMS.
  • Solid knowledge of databases, data marts, operational data stores (ODS) and dimensional data modeling (star schema and snowflake modeling for fact and dimension tables), including the design, development and implementation of data models for enterprise-level applications and systems.
  • Hands-on experience using AWS EMR as a big data platform and applying ML algorithms for prediction, including but not limited to regression, classification and neural networks, on data stored in AWS S3.
  • Used Python to build machine learning models, made predictions on data, and persisted trained models so they could be reused against different data sets (a minimal persistence sketch follows this summary).
  • Expertise in data architecture, data modeling, metadata, data migration, data mining, data science, data profiling, data governance, data cleansing, transformation, integration and data import/migration; experience in architecting, loading and analyzing datasets with the AWS big data framework (MapReduce, HDFS, Pig, Hive, Flume, Sqoop, Spark, NiFi, Scala, Solr, Storm) and NoSQL databases such as DynamoDB.
  • Previous experience in roles including Application Architect, Data Architect, System Architect, Operations Analyst, Operations Lead, Business Analyst, ETL Developer and deputy to the Sr. Manager.
  • Experienced and PMP-certified in project management methodologies: Scrum/Agile, Waterfall, SDLC and the DevOps model.
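
The following is a minimal sketch of the Python model-persistence pattern mentioned above, assuming scikit-learn and joblib; the sample data and file name are illustrative, not values from any engagement.

```python
# Minimal sketch: train a model, persist it, and reload it to score a new data set.
# scikit-learn, joblib, the sample data and the file name are illustrative assumptions.
import joblib
import numpy as np
from sklearn.linear_model import LogisticRegression

# Train on one data set.
X_train = np.array([[0.2, 1.1], [1.5, 0.3], [0.1, 0.9], [1.8, 0.2]])
y_train = np.array([0, 1, 0, 1])
model = LogisticRegression().fit(X_train, y_train)

# Persist the trained model so it can be reused later.
joblib.dump(model, "classifier.joblib")

# Reload and predict on a different data set.
reloaded = joblib.load("classifier.joblib")
print(reloaded.predict(np.array([[0.3, 1.0], [1.6, 0.1]])))
```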

TECHNICAL SKILLS

ERP: Oracle E-Business Suite with data flow from HR to Finance modules.

Cloud: AWS and Oracle cloud infrastructure, AWS ML, AWS EMR, S3.

Big Data Courses: AWS Big Data, Hadoop, Python.

Big Data tools: Kafka, Scala, HDFS, Redshift, DynamoDB, Kinesis.

Database/Oracle technologies: Oracle databases from 9i through 12c and 18c; cloud setup and migration; OBIEE; Oracle APEX; ERP system installation, configuration and maintenance; MS SQL; MySQL.

Operating systems/Scripting: HP Unix, Sun OS 5.x, AIX Unix, Oracle Enterprise Linux, Windows NT 4.0, 2000.

Manager negotiation and communication: Motivating employees, building your influence as a leader, interpersonal communication, the role of the line manager, communication and leadership.

Project Management: PMP certified project manager.

PROFESSIONAL EXPERIENCE

Confidential, NC

AWS Big Data Architect

Responsibilities:

  • Create an IoT thing, certificate, policies and topic, and subscribe to the topic. Configure an EC2 instance as an IoT device by installing the necessary AWS SDKs, setting up a Python virtual environment, and installing and importing the required packages, methods and functions.
  • Create a Kinesis Firehose delivery stream and configure the IoT thing to deliver data to the stream, saving it in an S3 bucket via the IoT rule engine. Create Kinesis data streams and feed the data into Kinesis Analytics, using the SQL editor to query and analyze it.
  • Database migration from on-premises MySQL to MySQL in the cloud (homogeneous), creating the source and target endpoints and replication tasks with the replication-only option.
  • Create Amazon Aurora databases and migrate existing databases into Aurora. Write complex SQL queries for data extraction and performance tuning.
  • Create DynamoDB tables, define RCUs and WCUs with local secondary indexes, insert data into the tables, and query them from the console and with Python scripts.
  • Create DynamoDB Streams and a Lambda function, set a trigger on the function, and configure the trigger function to insert the streamed data into a MySQL database (see the Lambda sketch after this list).
  • Create an EMR cluster with Spark and Ganglia on m4.large instances. Configure an S3 bucket and load data. Spin up Spark by connecting over SSH to the cluster's master node and import required packages such as csv and json. Load data from the S3 bucket into the Spark cluster (see the PySpark sketch after this list). Connect using Hue, load data into the cluster from Hue, and query it with Hive SQL.
  • Read the CSV from the Python virtual environment, convert it to JSON, and send the messages to the IoT topic as an IoT thing would (see the IoT publishing sketch after this list).
  • Create EC2 instances secured with public/private key pairs for authentication and enable access over SSH.
  • Create a VPC for network security, place EC2 instances in a private subnet, and enable bastion host access to the database nodes.
  • Use AWS DMS to migrate databases from heterogeneous RDBMS instances into Aurora, creating IAM roles with the required policies attached.
  • Configure and use RDS database log files to monitor the health of the Aurora databases, check operating system metrics with Enhanced Monitoring, and set up metrics with CloudWatch Logs.
  • Create EMR, Redshift and RDS clusters for high availability, accessing the cluster endpoints over SSH and through other AWS services such as S3, Lambda and Elasticsearch.
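
IoT publishing sketch: a minimal example of replaying CSV rows as JSON messages to an AWS IoT topic, the way a simulated IoT thing would. The region, file name and topic are illustrative assumptions; delivery from the topic to Kinesis Firehose and on to S3 would be handled separately by an IoT rule.

```python
# Minimal sketch: publish each row of a CSV file as a JSON message to an IoT topic.
# Region, file name and topic are illustrative assumptions, not project values.
import csv
import json

import boto3

iot = boto3.client("iot-data", region_name="us-east-1")   # assumed region

with open("sensor_readings.csv", newline="") as f:         # assumed file name
    for row in csv.DictReader(f):
        iot.publish(
            topic="devices/sensor1/telemetry",              # assumed topic
            qos=1,
            payload=json.dumps(row),                        # one JSON message per CSV row
        )
```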
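
Lambda sketch: a minimal example of the DynamoDB Streams trigger described above, deserializing stream records and inserting them into MySQL. The table, columns, environment variables and the use of pymysql are illustrative assumptions.

```python
# Minimal sketch of a Lambda handler that reads DynamoDB stream records and
# inserts them into MySQL. Schema, column names and connection settings are
# illustrative assumptions; pymysql is assumed to be in the deployment package.
import os

import pymysql
from boto3.dynamodb.types import TypeDeserializer

deserializer = TypeDeserializer()

conn = pymysql.connect(
    host=os.environ["MYSQL_HOST"],            # assumed environment variables
    user=os.environ["MYSQL_USER"],
    password=os.environ["MYSQL_PASSWORD"],
    database="telemetry",                      # assumed schema
    autocommit=True,
)

def lambda_handler(event, context):
    inserted = 0
    with conn.cursor() as cur:
        for record in event.get("Records", []):
            if record["eventName"] != "INSERT":
                continue
            # Convert the DynamoDB-typed image ({"S": "..."}) into plain Python values.
            image = {k: deserializer.deserialize(v)
                     for k, v in record["dynamodb"]["NewImage"].items()}
            cur.execute(
                "INSERT INTO readings (device_id, ts, payload) VALUES (%s, %s, %s)",
                (image.get("device_id"), image.get("ts"), str(image)),
            )
            inserted += 1
    return {"inserted": inserted}
```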
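
PySpark sketch: a minimal example of the EMR step above, loading CSV data from S3 into Spark and registering it as a Hive table so it can be queried from Hue with Hive SQL. The bucket, path and table name are illustrative assumptions.

```python
# Minimal PySpark sketch: load CSV data from S3 on an EMR cluster and expose it to Hive.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3-csv-load").enableHiveSupport().getOrCreate()

df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("s3://example-bucket/telemetry/"))             # assumed S3 location

df.write.mode("overwrite").saveAsTable("telemetry_raw")   # queryable from Hive/Hue

spark.sql(
    "SELECT device_id, COUNT(*) AS readings FROM telemetry_raw GROUP BY device_id"
).show()
```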

Confidential, Auburn Hills, MI

Sr. Big Data Engineer

Responsibilities:

  • The objective of this project was to migrate all services from in-house systems to the cloud (AWS), including building a data lake in AWS on Amazon S3 to serve as the single source of truth.
  • Provided meaningful and valuable information for better decision-making.
  • Migrated various data types, including streaming, structured and unstructured data from multiple sources, as well as legacy data.
  • Utilized AWS services with a focus on big data analytics, enterprise data warehouse and business intelligence solutions to ensure optimal architecture, scalability and flexibility.
  • Designed the AWS architecture and cloud migration, covering AWS EMR, DynamoDB, Redshift and event processing using Lambda functions.
  • Built a NoSQL solution for unstructured data using AWS DynamoDB.
  • Built data warehousing solutions for analytics and reporting using AWS Redshift.
  • Developed Python programs to consume data from APIs as part of several data extraction processes and store the data in AWS S3 (see the extraction sketch after this list).
  • Implemented performance optimizations on Presto SQL queries to improve query retrieval times.
  • Used query execution plans in Presto to tune the queries that serve as data sources for Tableau dashboards.
  • Designed the Redshift data model and worked on Redshift performance improvements that speed up query retrieval and also improve the dependent reporting/analytics layers.
  • Developed data transition (ETL) programs from DynamoDB to AWS Redshift using AWS Lambda, creating Python functions triggered on specific events based on use cases (see the Lambda-to-Redshift sketch after this list).
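
Extraction sketch: a minimal example of the API-to-S3 pattern referenced above, pulling JSON from a REST endpoint and landing it in S3 for downstream processing. The endpoint, bucket and key layout are illustrative assumptions, not project values.

```python
# Minimal sketch: pull JSON from a REST API and land it in S3 as a timestamped object.
import datetime
import json

import boto3
import requests

s3 = boto3.client("s3")

def extract_to_s3(endpoint="https://api.example.com/v1/orders",   # assumed endpoint
                  bucket="example-datalake-raw",                   # assumed bucket
                  prefix="orders/"):
    response = requests.get(endpoint, timeout=30)
    response.raise_for_status()
    key = f"{prefix}{datetime.datetime.utcnow():%Y/%m/%d/%H%M%S}.json"
    s3.put_object(Bucket=bucket, Key=key, Body=json.dumps(response.json()))
    return key
```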
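
Lambda-to-Redshift sketch: a minimal example of the DynamoDB-to-Redshift transition mentioned above, assuming the common pattern of staging stream records to S3 as JSON lines and issuing a Redshift COPY through the Redshift Data API. The bucket, cluster, IAM role and target table are illustrative assumptions.

```python
# Minimal sketch: stage DynamoDB stream inserts to S3, then COPY them into Redshift.
import json
import uuid

import boto3
from boto3.dynamodb.types import TypeDeserializer

s3 = boto3.client("s3")
redshift = boto3.client("redshift-data")
deserializer = TypeDeserializer()

BUCKET = "example-etl-staging"                                        # assumed bucket
COPY_ROLE = "arn:aws:iam::123456789012:role/example-redshift-copy"    # assumed role

def lambda_handler(event, context):
    rows = []
    for record in event.get("Records", []):
        if record["eventName"] != "INSERT":
            continue
        image = {k: deserializer.deserialize(v)
                 for k, v in record["dynamodb"]["NewImage"].items()}
        rows.append(json.dumps(image, default=str))
    if not rows:
        return {"copied": 0}

    # Write newline-delimited JSON to S3, then load it into an assumed "events" table.
    key = f"staging/{uuid.uuid4()}.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body="\n".join(rows))

    redshift.execute_statement(
        ClusterIdentifier="example-cluster",                          # assumed cluster
        Database="analytics",
        DbUser="etl_user",
        Sql=f"COPY events FROM 's3://{BUCKET}/{key}' "
            f"IAM_ROLE '{COPY_ROLE}' FORMAT AS JSON 'auto';",
    )
    return {"copied": len(rows)}
```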

Confidential

Data Architect & DBA

Responsibilities:

  • Set up a private cloud using Oracle Virtual Machine (OVM), with Xen hypervisor installation through the iLO .NET Integrated Remote Console on an HP DL360 super cluster.
  • Infrastructure build using Oracle virtual machine configurations with VLAN setups.
  • Installation and configuration of OVM Manager; discovery of virtual servers and creation of server pools and repositories with physical disks for storage.
  • Configuration of two data centers as primary and DR sites, with the manager on a separate network for backup and failover using the UUID from the primary data center.
  • Creation of multiple network bonds on the servers for the various virtual machines.
  • Creation of virtual machines/servers and installation of Linux operating systems and WebLogic servers with the required configurations. Worked on a wide range of Oracle database technologies such as RAC, RMAN, SSO/OID, Portal, Data Guard and Oracle Streams. Projects included 24x7 offshore/onsite support of Oracle 11i Applications with 10g AS technologies such as SSO/OID and Portal. Oracle Financials applications were integrated with external systems for data-processing transactions, which required careful analysis and maintenance; all databases and applications ran on UNIX flavors.
  • Data normalization and denormalization for company acquisitions and mergers.
  • Upgrades of Oracle databases and E-Business Suite components.
  • Architecture and implementation of APEX on Windows servers.
  • Platform migration from IBM AIX to Oracle Exadata.
  • SME in maintenance and troubleshooting of clustered databases on Linux machines.
  • Performance tuning of PL/SQL queries and maintenance using AWR, explain plans, OEM, the gather-stats package and concurrent managers.
  • Implementation of WNA/SPNEGO SSO for ECM, Portals and WebCenter Spaces on WebLogic servers.
  • Proactive monitoring and 24x7 on-call technical support for mission-critical, business-critical and normal databases and applications.
  • Implementation and configuration of 10g SSO for Oracle E-Business Suite R12.
  • Installation, upgrade and configuration of Oracle databases from 9i to 12c.
  • Documentation of upgrade projects, on-call support issues, troubleshooting steps and long-term fixes; analysis, study and documentation of existing systems for upgrade and enhancement projects.
  • Installation and configuration of Oracle Enterprise Manager 12c with E-Business Suite and OBIEE.
  • Installation, configuration, domain creation and troubleshooting of Oracle WebLogic Server.
  • Data Guard build for 3-node RAC standby databases.
  • Cloning of the R12 E-Business Suite as part of implementation, along with AutoConfig customizations and patching on RAC databases.
