
Lead Data Engineer Resume


SUMMARY:

  • Motivated Senior/Lead Data Engineer with over 15 years in Software Analysis, Design, Development, Testing, Implementation and Production Support for various Client/Server and Web-Based Applications.
  • Strong background in Big Data, Hive, Pig, AWS Cloud, Docker, ETL, data modeling, data warehousing, machine learning, and the Inmon and Kimball methodologies. Solid experience presenting results to both technical and non-technical stakeholders; self-motivated and enthusiastic collaborator.
  • Extremely well versed in reliability theory, with hands-on experience performing reliability engineering for complex systems; I always strive for excellence, and being a trusted advisor is key for me.
  • Informatica Administration and Informatica Cloud Services
  • Building ETL data pipelines on Hadoop/Teradata using Pig/Hive/UDFs
  • Utilized Docker as the runtime environment for the CI/CD system to build, test, and deploy
  • Data modeling and Dimensional modeling using Kimball methodologies
  • Full lifecycle (SDLC) of Data Warehouse projects, including Dimensional Data Modeling
  • Data warehousing concepts: Star schema, Snowflake schema, Data Lake, SQL (see the sketch after this list)
  • Wrote technical and business documentation for large proposals, project management, and on-time deliverables
  • Comprehensive knowledge of system engineering, reliability life-cycle management, and reliability modeling
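
As a small, hypothetical illustration of the dimensional modeling and star-schema items above (not a specific project), a Spark SQL sketch defining one dimension and one fact table on Hive might look like this:

```python
# Minimal sketch of a star schema (one dimension, one fact) defined through
# Spark SQL on Hive. Table and column names are hypothetical examples.
from pyspark.sql import SparkSession

spark = (SparkSession.builder.appName("star-schema-demo")
                             .enableHiveSupport().getOrCreate())

# Dimension table: one row per customer, with a surrogate key.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_sk   BIGINT,
        customer_id   STRING,
        customer_name STRING,
        region        STRING
    ) STORED AS PARQUET
""")

# Fact table: one row per order line, foreign-keyed to the dimension
# and partitioned by a date key so queries can prune partitions.
spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_sales (
        customer_sk  BIGINT,
        product_id   STRING,
        quantity     INT,
        sale_amount  DOUBLE
    )
    PARTITIONED BY (date_key INT)
    STORED AS PARQUET
""")
```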

PROFESSIONAL EXPERIENCE:

Confidential

Lead Data Engineer

Responsibilities:

  • Managed a team of 5 Data Engineers
  • Responsible for building and supporting a Hadoop-based ecosystem designed for enterprise-wide analysis of structured, semi-structured, and unstructured data
  • Export/Import data from Teradata to Hive/HDFS using Sqoop and the Hortonworks Connector for Teradata
  • Implemented Continuous Integration and Continuous Delivery processes using Git, Docker images, and Python and shell scripts to automate routine jobs, including synchronizing installers, configuration modules, packages, and requirements for the applications
  • Used the Presto CLI, configured and connected to the Hive Metastore, for analytical queries
  • Hands-on development of ETL data pipelines using PySpark on AWS EMR (see the PySpark sketch after this list)
  • Migrated an existing on-premises application to AWS. Used AWS services such as EC2 and S3 for small data set processing and storage; experienced in maintaining the Hadoop cluster on AWS EMR
  • Used Spark and Scala for interactive queries, processing of streaming data, and integration with a NoSQL database for huge volumes of data
  • Hands-on with the Django framework using PyCharm; hands-on with Airflow workflow management
  • Wrote AWS Lambda code in Python to convert, compare, and sort nested JSON files (see the Lambda sketch after this list)
  • Configured OAuth2 authentication on a Cognito user pool for AWS API Gateway (see the OAuth2 sketch after this list)
  • Automated spinning up EMR clusters and autoscaling; configured queues within EMR for parallel processing (capacity-scheduler, Fair, FIFO)
  • Used Presto, connected to Hive, to analyze large volumes of data (see the Presto sketch after this list)
  • Hadoop integration with Hive connectors, using Presto for querying data; hands-on experience developing ETLs using Informatica Cloud Services
  • Jenkins CI/CD using Blue Ocean pipelines for complex multi-branch workflows and distributed deployments across different environments and applications
  • API creation, optimization of API queries and exception handling, and performance testing for latency and response times
  • Hands-on experience with Informatica PowerCenter and PowerExchange, integrating with different applications and relational databases
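
The sketches below are rough, hypothetical illustrations of the bullets above, not the actual project code. First, a minimal PySpark job of the kind run on AWS EMR, reading raw JSON from S3 and writing partitioned Parquet back (bucket names, paths, and columns are assumed examples):

```python
# Minimal PySpark ETL sketch for AWS EMR: S3 in, partitioned Parquet out.
# Bucket names, paths, and column names are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read raw, semi-structured JSON landed in S3 by an upstream feed.
raw = spark.read.json("s3://example-raw-zone/orders/2020/01/")

# Light cleanup: type the amount column, derive a partition date, dedupe.
cleaned = (
    raw.withColumn("order_amount", F.col("order_amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)

# Write curated, partitioned Parquet for downstream Hive/Presto queries.
(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated-zone/orders/"))

spark.stop()
```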
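
Second, a minimal AWS Lambda handler in Python for nested JSON, flattening and sorting records (the event shape and field names are assumed):

```python
# Sketch of an AWS Lambda handler that flattens and sorts nested JSON records.
# The event shape and field names are hypothetical examples.
import json

def flatten(record, parent_key="", sep="."):
    """Flatten nested dicts into dotted keys, e.g. {"a": {"b": 1}} -> {"a.b": 1}."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

def lambda_handler(event, context):
    # Expect a batch of nested JSON records in the event body.
    records = json.loads(event["body"])["records"]

    flat = [flatten(r) for r in records]
    # Sort on a business key so downstream comparisons are stable.
    flat.sort(key=lambda r: r.get("order.id", ""))

    return {"statusCode": 200, "body": json.dumps(flat)}
```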
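
Third, for the Cognito/API Gateway item: the OAuth2 configuration itself lives in the user pool and the API Gateway authorizer, so this sketch only shows a client obtaining a token via the client-credentials grant and calling the protected API (domain, client credentials, scope, and URL are assumed):

```python
# Sketch: obtain an OAuth2 token from a Cognito user-pool domain
# (client-credentials grant) and call an API Gateway endpoint with it.
# Domain, client credentials, scope, and API URL are hypothetical.
import requests

TOKEN_URL = "https://example-app.auth.us-east-1.amazoncognito.com/oauth2/token"
CLIENT_ID = "example-client-id"
CLIENT_SECRET = "example-client-secret"

token_resp = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials", "scope": "orders/read"},
    auth=(CLIENT_ID, CLIENT_SECRET),   # HTTP Basic auth with the app client
)
access_token = token_resp.json()["access_token"]

# Call the API Gateway stage protected by the Cognito authorizer.
api_resp = requests.get(
    "https://abc123.execute-api.us-east-1.amazonaws.com/prod/orders",
    headers={"Authorization": f"Bearer {access_token}"},
)
print(api_resp.status_code, api_resp.json())
```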
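
Finally, a minimal query against Hive through Presto, assuming the presto-python-client package (coordinator host, schema, and table are assumed):

```python
# Sketch: query Hive tables through a Presto coordinator from Python.
# Assumes the presto-python-client package; host/schema/table are hypothetical.
import prestodb

conn = prestodb.dbapi.connect(
    host="presto-coordinator.example.internal",
    port=8080,
    user="analyst",
    catalog="hive",
    schema="curated",
)
cur = conn.cursor()

# Aggregate a large partitioned table without pulling raw rows to the client.
cur.execute("""
    SELECT order_date, count(*) AS orders, sum(order_amount) AS revenue
    FROM   orders
    WHERE  order_date >= date '2020-01-01'
    GROUP  BY order_date
    ORDER  BY order_date
""")
for row in cur.fetchall():
    print(row)
```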

Confidential

Sr. Consultant

Responsibilities:

  • Developed various ETL flows using Informatica PowerCenter and PowerExchange: Salesforce to Oracle (continuous data capture), Oracle to Salesforce using external IDs, ODS to DB2, etc. (see the upsert sketch after this list)
  • As an SFDC consultant, used Informatica for cross-application integration requirements, mapping complex data structures and setting up understandable, reliable business rules
  • Gathered functional requirements from operational and business users and translated them into technical requirements and specifications
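
Informatica mappings are built in the Designer/Cloud UI, so there is no mapping code to show directly; as a rough stand-in for the external-ID upsert pattern mentioned above, here is a minimal Python sketch using the simple-salesforce package (object, field, and credential values are assumed):

```python
# Minimal sketch of a Salesforce upsert keyed on an external ID.
# Assumes the simple-salesforce package; object/field/credential values
# are hypothetical.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="integration.user@example.com",
    password="********",
    security_token="********",
)

# Row pulled from the Oracle ODS side of the flow (hypothetical shape).
ods_row = {"AccountName": "Acme Corp", "OracleId": "ORA-1001"}

# Upsert on a custom external-ID field so Oracle rows map to Salesforce
# records without storing Salesforce IDs back in the source system.
sf.Account.upsert(
    f"Oracle_Id__c/{ods_row['OracleId']}",
    {"Name": ods_row["AccountName"]},
)
```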

Confidential

Senior Engineer

Responsibilities:

  • Responsible for business requirement designs from clients and worked with Architecture/RAD teams to shape the design of ETL task lists
  • Hands-on experience developing ETLs using Informatica Cloud Services (ICS)
  • Hands-on experience with Informatica PowerCenter and PowerExchange, integrating with different applications and relational databases
  • Created an AWS CodePipeline, a service that builds, tests, and deploys code every time there is a code change, based on the release process models
  • Created a pipeline that uses AWS CodeDeploy to deploy applications from an Amazon S3 bucket and an AWS CodeCommit repository to Amazon EC2 instances running Amazon Linux (see the sketch after this list)
  • Conducted data analysis and maintenance of production data
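
The CodePipeline/CodeDeploy setup above is largely configured through the AWS console or templates; as a rough sketch of the deployment step, the following Python/boto3 call triggers a CodeDeploy deployment from an S3 revision (application, deployment group, bucket, and key are assumed examples):

```python
# Sketch: trigger an AWS CodeDeploy deployment from an S3 revision.
# Names below (application, deployment group, bucket, key) are hypothetical.
import boto3

codedeploy = boto3.client("codedeploy", region_name="us-east-1")

response = codedeploy.create_deployment(
    applicationName="web-app",                 # hypothetical application
    deploymentGroupName="web-app-prod",        # hypothetical deployment group
    revision={
        "revisionType": "S3",
        "s3Location": {
            "bucket": "my-release-artifacts",  # hypothetical bucket
            "key": "web-app/build-123.zip",
            "bundleType": "zip",
        },
    },
    description="Deploy build 123 to the EC2 fleet",
)
print(response["deploymentId"])
```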

Confidential

Senior Engineer

Responsibilities:

  • High-Level Design (HLD), comprising Data Modeling, Design, and Data Validation anomalies
  • Experience with Performance Tuning for Oracle RDBMS using Explain Plan and hints (see the sketch after this list)
  • Logical/Physical Design of the databases
  • Resolved issues such as Loops, Fan Traps & Chasm Traps by creating Aliases and Contexts as required to remove the cyclic dependencies, and tested them to ensure correctness
  • Tuned queries to improve performance of existing reports for various functional areas
  • Tested aggregate awareness to ensure queries pull the correct level of aggregation
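
As a rough illustration of the Explain Plan and hint work above (connection details, table, and index names are assumed), a hinted query's plan can be generated and read like this with the python-oracledb driver:

```python
# Sketch: generate and read an Oracle execution plan for a hinted query.
# Assumes the python-oracledb driver; connection details and the table/index
# names in the hint are hypothetical.
import oracledb

conn = oracledb.connect(user="report_user", password="********",
                        dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# Hinted query: force an index access path, then capture its plan.
cur.execute("""
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(o orders_cust_idx) */ o.order_id, o.total_amt
    FROM   orders o
    WHERE  o.customer_id = 42
""")

# DBMS_XPLAN formats the plan captured in PLAN_TABLE.
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur:
    print(line)
```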

Confidential

Senior Engineer

Responsibilities:

  • Developed framework for ETL workflows
  • Created necessary RDBMS model and table structures for sourcing customer profile information based on geography, nature, asset classes and credit ratings
  • Conducted data analysis and maintenance of production data
  • Designed ETL strategies for load balancing, exception handling, and processes that can handle high data volumes (see the sketch after this list). Set up new servers, and strategized and created a successful migration plan
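
As a generic sketch of the exception-handling and retry scaffolding such an ETL framework might provide (the step functions and retry policy are assumed, not the actual framework):

```python
# Minimal sketch of an ETL step runner with retries and exception handling.
# The step functions and retry policy are hypothetical illustrations.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def run_step(name, func, retries=3, backoff_seconds=30):
    """Run one ETL step, retrying transient failures before giving up."""
    for attempt in range(1, retries + 1):
        try:
            log.info("starting step %s (attempt %d)", name, attempt)
            func()
            log.info("step %s finished", name)
            return
        except Exception:
            log.exception("step %s failed on attempt %d", name, attempt)
            if attempt == retries:
                raise                      # surface the failure to the scheduler
            time.sleep(backoff_seconds * attempt)

def extract_customer_profiles():
    pass  # e.g. pull customer profile rows from the source system

def load_customer_profiles():
    pass  # e.g. bulk-load the staged rows into the warehouse tables

if __name__ == "__main__":
    run_step("extract_customer_profiles", extract_customer_profiles)
    run_step("load_customer_profiles", load_customer_profiles)
```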

Confidential

Consultant

Responsibilities:

  • Identified and debugged technical issues that had not previously been identified.
  • Developed complex database objects such as Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL (see the sketch after this list)
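
As a minimal sketch of the PL/SQL pattern above, created and invoked from Python with the python-oracledb driver (schema, table, and procedure names are assumed):

```python
# Sketch: create and call a simple PL/SQL stored procedure from Python.
# Uses the python-oracledb driver; schema, table, and procedure names are
# hypothetical and only illustrate the general pattern.
import datetime
import oracledb

conn = oracledb.connect(user="app_user", password="********",
                        dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# PL/SQL procedure that archives closed orders older than a cutoff date.
cur.execute("""
    CREATE OR REPLACE PROCEDURE archive_closed_orders (p_cutoff IN DATE) AS
    BEGIN
        INSERT INTO orders_archive
        SELECT * FROM orders
        WHERE  status = 'CLOSED' AND close_date < p_cutoff;

        DELETE FROM orders
        WHERE  status = 'CLOSED' AND close_date < p_cutoff;

        COMMIT;
    END;
""")

# Invoke the procedure with a bind value.
cur.callproc("archive_closed_orders", [datetime.date(2015, 1, 1)])
```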
