
Lead Data Engineer Resume


EXPERIENCE SUMMARY:

  • Strategic, focused and business-oriented leader with a successful 12-year career in information technology and business process improvement.
  • Extensive experience in developing and performing ETL for clients across the Health Care, Banking, Retail and Telecommunications industries.
  • Highly skilled in Informatica Power Centre, Power Exchange and Big Data Edition; worked extensively with Informatica versions 7.x, 8.x and 9.x.
  • Strong experience in Talend, Alteryx, SSIS and DataStage ETL tools.
  • Experienced in reporting tools such as Power BI, BusinessObjects, SAS, Cognos and Tableau.
  • Experienced in data modeling, analyzing complex data and providing solutions.
  • Experienced in Hadoop ecosystem and technologies.
  • Highly skilled in Hive, Pig, Spark, Sqoop and Oozie.
  • Highly skilled in the Google Cloud Platform (GCP).
  • Experienced in Agile methodology for handling changing requirements.
  • Highly skilled in Oracle, DB2 and SQL Server databases; achieved OCA certification from Oracle in 2008.
  • Highly skilled in scheduling tools like Appworx, TIDAL, Autosys and DAC.
  • Experienced in ETL design and solutions to integrate data from various source systems and OLTP into the Enterprise Data Warehouse/hub/lake.
  • Experienced in version control tools such as WinCVS, PVCS and Git.
  • Experienced in preparing project documentation and providing analysis and production support for the business.
  • Demonstrated strong leadership skills by successfully leading medium to large projects.

TECHNICAL SKILLS:

ETL/BI: Informatica V7.x, V8.x, V9.x and V10.x, Informatica Power Exchange, Informatica Big Data Edition, Talend, Alteryx, SSIS, Teradata, Power BI, Cognos, BusinessObjects, Tableau.

Databases: Oracle, HIVE, DB2, SQL Server, Siebel

Cloud: GCP

WORK EXPERIENCE:

Confidential

Lead Data Engineer

  • Involved in a Platform Modernization project to migrate data into GCP.
  • Designed and built data pipelines to load data into the GCP platform (a minimal load sketch follows this role's environment line).
  • Interacted with the business to gather requirements and propose effective, efficient designs.
  • Implemented best practices, methodologies and frameworks for the enterprise data management department.
  • Provided consistent project delivery, business process improvements, data management and cost-effective solutions.
  • Conducted data profiling to understand source data structure and content, including data granularity, format and volume.
  • Migrated SAS datasets into the big data platform using Hive scripts and created a reporting layer enabling business lines to build reports, including summary tables at various granularities.
  • Performed data analysis to drive the target data architecture.
  • Created data curation specifications based on the target architecture.
  • Developed data transformation/loading pipelines according to the data curation specifications.
  • Created an Enterprise Data Lake (EDL) integrating all source system data using HiveQL and ETL/BI tools.
  • Created project documents and templates, and contributed to POCs and code reviews.

Environment: GCP, DataFlow, BigQuery, Bigdata, HIVE, Sqoop, Spark, Oozie, Power BI, Talend, Alteryx, Informatica V10.x, SAS, Tidal, UNIX
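
A minimal sketch of the kind of GCS-to-BigQuery batch load such a pipeline typically performs, assuming the google-cloud-bigquery Python client; the project, dataset, table and bucket names below are illustrative placeholders, not details of the actual engagement.

from google.cloud import bigquery

# Placeholder identifiers -- the real project, dataset and bucket are not named above.
PROJECT_ID = "analytics-platform-dev"
TABLE_ID = f"{PROJECT_ID}.staging.claims_daily"
SOURCE_URI = "gs://landing-bucket/claims/2023-01-01/*.csv"

client = bigquery.Client(project=PROJECT_ID)

# Truncate-and-reload batch load from Cloud Storage into a staging table.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(SOURCE_URI, TABLE_ID, job_config=job_config)
load_job.result()  # block until the load job completes

table = client.get_table(TABLE_ID)
print(f"Loaded {table.num_rows} rows into {TABLE_ID}")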

Confidential

Lead Informatica Consultant

  • Technical lead responsible for delivering complex solutions and projects.
  • Interacted with the business to gather requirements and to propose effective and efficient designs.
  • Created an Enterprise Data Hub (EDH) that integrated all source system data using Informatica Big Data Edition (BDE).
  • Created a data cleansing/data validation tool for various source system data.
  • Used Power Exchange to process EBCDIC files from the mainframe server (an illustrative decoding sketch follows this role's environment line).
  • Used pushdown optimization and incremental aggregation for performance tuning.
  • Led a team of four Informatica consultants and monitored their delivery output.
  • Created project documents and templates, and was involved in POCs and code reviews.

Environment: Informatica V9.6 BDE, Informatica V10.x, HIVE, Oracle, Power Exchange 9.x, Autosys, UNIX
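
Power Exchange performs the EBCDIC-to-ASCII conversion natively when reading mainframe files; purely to illustrate what that step involves, here is a hedged Python sketch that decodes fixed-length records using code page 037. The record length and file name are assumptions, not details from the project.

import codecs

RECORD_LENGTH = 80  # hypothetical fixed record length taken from a copybook

def read_ebcdic_records(path, record_length=RECORD_LENGTH):
    """Yield fixed-length mainframe records decoded from EBCDIC (code page 037)."""
    with open(path, "rb") as handle:
        while True:
            raw = handle.read(record_length)
            if len(raw) < record_length:
                break
            yield codecs.decode(raw, "cp037")

for record in read_ebcdic_records("mainframe_extract.dat"):  # placeholder file name
    print(record.rstrip())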

Confidential

Solution Architect

  • Analyzed, defined and coordinated the participation of others in the definition of efficient, cost-effective business intelligence solutions that supported client business processes and functional requirements.
  • Examined business requirements and defined how the IT solution would address them.
  • Prepared documentation on expected benefits, use cases, current/proposed processes, workflows, data flows, process re-engineering studies, functional specifications, risk integration, end-user adoption plans/guides and organization structures.
  • Organized assumptions, probed for refined data and provided analysis.
  • Proposed system requirements and/or business process changes to fulfill business user requirements, ensuring appropriate system development or business process solutions.
  • Managed the ETL and report functional detailed design documents and the source-to-target mapping matrix.
  • Served as subject matter expert for business, project managers, development, QA and production support teams.
  • Ensured projects align with business and IT strategies by influencing the development of plans and processes in supported organizations.

Environment: Informatica V9.x, Oracle, SQL Server, Control M, UNIX and Siebel

Confidential

Senior ETL Consultant

  • Handled all ETL modules in an eCommerce implementation at Confidential.
  • Created ETL for the Product Management system.
  • Interacted with the business and various downstream systems to gather requirements and to create design and mapping documents.
  • Conducted ETL activities such as table inserts and fact and dimension loading (a load sketch follows this role's environment line).
  • Used pushdown optimization and incremental aggregation for performance tuning.
  • Created and updated extracts using SSIS.
  • Created and enhanced project templates for the technical requirements, design and testing documents.
  • Reviewed ETL code and provided sign-off.

Environment: SSIS 2012, SQL Server
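
A sketch of the fact and dimension loading pattern referenced above, assuming a SQL Server target reached through pyodbc; the connection string, schemas and column names are hypothetical, since the actual model is not described here.

import pyodbc

# Hypothetical connection details and table names.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=etl-sql;DATABASE=Sales;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Upsert the product dimension from staging, then load the fact table
# against the dimension's surrogate keys.
cursor.execute("""
    MERGE dbo.DimProduct AS tgt
    USING stg.Product AS src
        ON tgt.ProductCode = src.ProductCode
    WHEN MATCHED THEN
        UPDATE SET tgt.ProductName = src.ProductName
    WHEN NOT MATCHED THEN
        INSERT (ProductCode, ProductName)
        VALUES (src.ProductCode, src.ProductName);
""")
cursor.execute("""
    INSERT INTO dbo.FactSales (ProductKey, OrderDate, Quantity, Amount)
    SELECT d.ProductKey, s.OrderDate, s.Quantity, s.Amount
    FROM stg.Sales AS s
    JOIN dbo.DimProduct AS d ON d.ProductCode = s.ProductCode;
""")
conn.commit()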

Confidential

Informatica Consultant

  • Interacted with business to gather requirements and proposed effective and efficient designs.
  • Developed extracts for vendors based on the financial data.
  • Created and enhanced project templates for the technical requirements, design and testing documents.
  • Provided code review for ETL code and gave sign off.
  • Used Power Exchange to process EBCDIC files.
  • Developed scheduling using Autosys for Informatica workflows.
  • Used pushdown optimization and incremental aggregation for performance tuning.
  • Created Informatica mappings to load fact and dimension tables.
  • Assisted and supported team members in performance tuning activities in Informatica.

Environment: Informatica V9.x Power Centre and Power Exchange 8.x, Oracle, Autosys, UNIX, PL/SQL

Confidential, Halifax, NS

Lead Informatica Consultant (Permanent)

  • Developed a data warehouse for a leading Health Insurance provider using Informatica as the ETL tool.
  • Led a team of two and was responsible for offshore deliverables.
  • Interacted with the client and business to gather requirements and was actively involved in the design phase of the projects.
  • Involved in a project that rewrote PL/SQL code into Informatica ETL.
  • Implemented CDC using Informatica Power Exchange (an illustrative change-classification sketch follows this role's environment line).
  • Contributed to all data analysis and provided ETL solutions based on complex data.
  • Developed extracts for vendors based on the Enterprise data.
  • Created and enhanced project templates for technical requirements, design and testing documents.
  • Created standards and best practices for Informatica coding.
  • Used WinCVS as the version control tool.
  • Used pushdown optimization and incremental aggregation for performance tuning.
  • Contributed to the code review for ETL code and gave sign off.
  • Developed scheduling system using TIDAL for Informatica workflows.
  • Created Informatica mappings to load fact and dimension tables.
  • Assisted and supported team members in performance tuning activities in Informatica.

Environment: Informatica V9.x, Oracle, Tidal, UNIX, WinCVS, PL/SQL
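
Power Exchange CDC captures inserts and updates from the database logs; the Python sketch below only illustrates the equivalent classification by comparing a source snapshot with the rows already in the target. The member and plan fields are made-up examples, not the client's schema.

def classify_changes(source_rows, target_rows, key="member_id"):
    """Split source rows into inserts and updates relative to the target."""
    target_by_key = {row[key]: row for row in target_rows}
    inserts, updates = [], []
    for row in source_rows:
        existing = target_by_key.get(row[key])
        if existing is None:
            inserts.append(row)
        elif existing != row:
            updates.append(row)
    return inserts, updates

inserts, updates = classify_changes(
    [{"member_id": 1, "plan": "A"}, {"member_id": 2, "plan": "B"}],
    [{"member_id": 1, "plan": "A"}],
)
print(len(inserts), len(updates))  # 1 0 -- member 2 is new, member 1 is unchanged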

Confidential

Lead IT Analyst (Permanent)

  • Developed data warehouses for leading retail, manufacturing, telecommunications and banking clients using Informatica as the ETL tool.
  • Led a team of eight and was responsible for offshore deliverables.
  • Created data mappings and data transformation rules from source systems to the target ODS/EDW.
  • Created a supply chain management application for a manufacturing client by building a data mart.
  • Deployed admin code to the QA, UAT and production environments.
  • Created standards and best practices for Informatica coding.
  • Used pushdown optimization and incremental aggregation for performance tuning.
  • Actively involved in the code review for ETL code and gave sign-off.
  • Created test data management processes for testing.
  • Handled QA and UAT testing, interacting with the QA team and the business on requirement analysis, and fixed defects.

Environment: Informatica V8.x, Oracle, DB2, Talend, SQL Server, SSIS, Siebel and UNIX

Confidential

Programmer Analyst (Permanent)

  • Developed data warehouses for leading telecommunication clients in the US.
  • Served as a team member on various data warehousing projects and successfully implemented data warehouse concepts.
  • Designed and automated a rejection handling system that receives source files and automatically sends bad records back to the business (a minimal sketch follows this role's environment line).
  • Involved in QA and UAT testing, interacting with the QA team and the business on requirement analysis and defect fixes.
  • Handled independent modules with on-time delivery and minimal defects, and was involved in data modeling.
  • Received the SCOPE (Special Contribution for Project Excellence) award from Cognizant for on-time delivery of the project.
  • Created a POC using DataFlux as a data quality tool for a telecommunications client.

Environment: Informatica V7.x, V8.x, Oracle, Talend, Appworx, UNIX
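
A minimal sketch of the rejection handling idea described above: split an incoming delimited file into accepted records and a reject file that is returned to the business. The field names and the validation rule are assumptions chosen for illustration.

import csv

REQUIRED_FIELDS = ("account_id", "usage_minutes")  # hypothetical required columns

def split_rejects(source_path, good_path, reject_path):
    """Route records with missing required fields to a reject file for the business."""
    with open(source_path, newline="") as src, \
         open(good_path, "w", newline="") as good, \
         open(reject_path, "w", newline="") as bad:
        reader = csv.DictReader(src)
        good_writer = csv.DictWriter(good, fieldnames=reader.fieldnames)
        bad_writer = csv.DictWriter(bad, fieldnames=reader.fieldnames)
        good_writer.writeheader()
        bad_writer.writeheader()
        for row in reader:
            writer = good_writer if all(row.get(f) for f in REQUIRED_FIELDS) else bad_writer
            writer.writerow(row)

split_rejects("usage_feed.csv", "usage_feed_accepted.csv", "usage_feed_rejects.csv")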
