
ETL Developer/Data Engineer Resume


Plymouth, MN

SUMMARY

  • 13+ years of experience in the analysis, design, development, testing, and implementation of business application systems and ETL processes for the banking and financial sectors.
  • 7+ years of strong ETL experience using IICS and Informatica PowerCenter 10.4.0/9.6/9.5 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools, along with programming in SQL, PL/SQL, and stored procedures.
  • Highly experienced in implementing data integration projects end to end, from scratch through post-production, using Informatica tools on multiple platforms.
  • Strong experience in extracting, transforming, and loading (ETL) data from various sources into data warehouses using IICS and Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor) on Cassandra, Oracle, DB2, and SQL Server databases.
  • Expertise in designing conformed and traditional ETL architectures involving source systems (mainframe COBOL files, Oracle, SQL Server, Cassandra) and target systems (Oracle, DB2, SQL Server, flat files, XML, Cassandra).
  • Built reusable transformations and Mapplets wherever logic would otherwise be duplicated.
  • Strong exposure to dimensional data modeling, data warehouse design, and ETL development standards and best practices.
  • Expertise in slowly changing dimensions, maintaining historical as well as incremental data using Type I, Type II, and Type III strategies (a minimal Type II sketch follows this list).
  • Experience in creating Azure Data Factory pipelines to load data from various sources into Azure Storage, and in migrating on-premises databases to Azure Data Lake and Azure SQL DB using Azure Data Factory pipelines.
  • Experience in implementing ETL processes in Azure using Azure Databricks, Azure Data Factory, Event Hubs, Azure Storage, and Cosmos DB.
  • Experience in developing Spark applications using PySpark in Databricks for data extraction, transformation, and aggregation across multiple file formats (see the PySpark sketch after this list).
  • Experience in using Azure Key Vault to store secrets and consume them in Azure Data Factory and Databricks (see the Key Vault sketch after this list).
  • Experience in implementing CI/CD using GitHub and Azure DevOps.
  • Strong experience in preparing high-level and low-level designs, source-to-target mapping specs, unit test cases, estimates, code reviews, and migration and deployment templates and documents.
  • Experience in writing UNIX shell scripts and automating ETL processes with UNIX shell scripting.
  • Experience in using automation scheduling tools such as CA7, Stonebranch, and Autosys.
  • Hands-on experience in creating Python scripts to extract data from databases (SQL/T-SQL).
  • Extensive experience in the banking and mutual funds domains.
  • Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance-tuning mappings and sessions.
  • Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.
  • Expertise in Agile and Waterfall methodologies.
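
The Type I/II/III strategies above differ in how history is kept; the sketch below shows only the Type II pattern, in Python via pyodbc against SQL Server. The dim_customer table, its tracked columns, and the eff_start/eff_end/is_current tracking scheme are hypothetical placeholders, not code from an actual engagement.

    import pyodbc

    def apply_scd_type2(conn, staged_rows):
        """Expire changed dimension rows, then insert fresh current versions."""
        cur = conn.cursor()
        for customer_id, name, address in staged_rows:
            # Step 1: close out the current version if a tracked attribute changed.
            cur.execute("""
                UPDATE dim_customer
                   SET eff_end = GETDATE(), is_current = 0
                 WHERE customer_id = ? AND is_current = 1
                   AND (name <> ? OR address <> ?)
            """, customer_id, name, address)
            # Step 2: insert a new current version when none exists
            # (covers both changed rows and brand-new customers).
            cur.execute("""
                INSERT INTO dim_customer
                       (customer_id, name, address, eff_start, eff_end, is_current)
                SELECT ?, ?, ?, GETDATE(), NULL, 1
                WHERE NOT EXISTS (SELECT 1 FROM dim_customer
                                   WHERE customer_id = ? AND is_current = 1)
            """, customer_id, name, address, customer_id)
        conn.commit()

A Type I variant would simply update the row in place, and Type III would carry a "previous value" column instead of extra rows.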
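
A minimal PySpark sketch of the multi-format extract/transform/aggregate pattern described above, as it might run in a Databricks notebook; the mount paths, column names, and aggregation are illustrative assumptions.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("txn_aggregation").getOrCreate()

    # The same transaction feed arrives as both CSV and Parquet drops.
    csv_df = spark.read.option("header", "true").csv("/mnt/raw/txns_csv/")
    parquet_df = spark.read.parquet("/mnt/raw/txns_parquet/")

    # Align the schemas, union the feeds, and aggregate per account per day.
    txns = (csv_df.select("account_id", "txn_date", F.col("amount").cast("double"))
            .unionByName(parquet_df.select("account_id", "txn_date", "amount")))

    daily = (txns.groupBy("account_id", "txn_date")
             .agg(F.sum("amount").alias("total_amount"),
                  F.count("*").alias("txn_count")))

    daily.write.mode("overwrite").parquet("/mnt/curated/daily_txn_summary/")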
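
And a short sketch of the Key Vault usage noted above: in Databricks, a Key Vault-backed secret scope lets notebooks read secrets without hard-coding credentials. The scope, key, server, and table names below are hypothetical.

    # dbutils is available implicitly in Databricks notebooks.
    jdbc_password = dbutils.secrets.get(scope="kv-backed-scope", key="sqldb-password")

    # Use the secret to read a table over JDBC.
    df = (spark.read.format("jdbc")
          .option("url", "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb")
          .option("dbtable", "dbo.accounts")
          .option("user", "etl_user")
          .option("password", jdbc_password)
          .load())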

TECHNICAL SKILLS

Operating Systems: Windows, Linux, z/OS

DBMS: IBM DB2, Cassandra, Oracle, SQL Server, Cosmos DB, Azure SQL

Languages: Python, SQL, T-SQL, Java, C++

Tools: Informatica PowerCenter 10.4.0/9.6/9.5, Stonebranch Scheduler, CA7 Scheduler, Control-M, Autosys, MOVEit, CA DB2 products, File-AID, VersionOne, Jira

PROFESSIONAL EXPERIENCE

Confidential, Plymouth, MN

ETL Developer/Data Engineer

Responsibilities:

  • Working as an Informatica developer and data engineer.
  • Coordinate with the business team and other tech teams to plan and execute all ETL integration activities.
  • Work with the business team on requirements gathering and creating functional design documents.
  • Coordinate with cross-functional teams to finalize data sourcing and mapping.
  • Participated in the ETL profiling process and impact analysis.
  • Design the ETL process and define strategies for data loads.
  • Developed a number of complex ETL mappings to load data into Dev/QA and data warehouse targets.
  • Built ETL mappings, Mapplets, workflows, and worklets using Informatica PowerCenter 9.6/10.4.0 on a Linux server.
  • Created mappings to load complex XMLs in Informatica using flat files and Oracle as sources.
  • Created mappings to load data into Cassandra and Oracle using XML and flat files as sources.
  • Involved in development and ongoing maintenance of data warehouse activities.
  • Identify efficiencies and ways to improve design and development processes.
  • Performed data analysis and data mapping, and generated data feeds to online banking front-end applications through the ETL process.
  • Used Oracle SQL to write queries that create/alter/drop tables and to extract the necessary data.
  • Developed PL/SQL and stored procedures in both Oracle and SQL Server.
  • Built Azure Data Factory pipelines to migrate data from on-premises databases to Azure.
  • Implemented an ETL process in Azure by reading data from Event Hubs, processing it in Databricks, and loading it into Cosmos DB (an illustrative streaming sketch follows this list).
  • Worked on loading real-time data into Blob Storage using Azure Event Hubs.
  • Worked on integrating the in-house SQL database and loading data into Azure Cosmos DB and Blob Storage using Azure Data Factory.
  • Created Python scripts to implement transformation logic in Databricks.
  • Created Python/SQL scripts to extract data from SQL Server and Oracle databases.
  • Implemented performance tuning on a variety of Informatica mappings to improve throughput.
  • Drive all ETL-related projects, tasks, and activities and their deliverables.
  • Streamline the ETL and load processes and prepare documentation for various applications.
  • Involved in the analysis and decommissioning of unused ETL flows.
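
An illustrative sketch of the Event Hubs-to-Databricks-to-Cosmos DB flow from the bullets above, assuming the azure-eventhubs-spark and azure-cosmos-spark (Spark 3 OLTP) connectors are installed on the cluster; every name, path, and option value is a placeholder, not a record of the actual implementation.

    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    # The connection string comes from a Key Vault-backed secret scope.
    conn_str = dbutils.secrets.get(scope="kv-backed-scope", key="eventhub-conn")
    eh_conf = {"eventhubs.connectionString":
               spark._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(conn_str)}

    schema = StructType([StructField("account_id", StringType()),
                         StructField("amount", DoubleType())])

    # Parse the Event Hubs payload (the body arrives as binary JSON).
    events = (spark.readStream.format("eventhubs").options(**eh_conf).load()
              .select(F.from_json(F.col("body").cast("string"), schema).alias("e"))
              .select("e.*"))

    # Stream the parsed events into a Cosmos DB container.
    (events.writeStream
     .format("cosmos.oltp")
     .option("spark.cosmos.accountEndpoint", "https://myaccount.documents.azure.com:443/")
     .option("spark.cosmos.accountKey",
             dbutils.secrets.get(scope="kv-backed-scope", key="cosmos-key"))
     .option("spark.cosmos.database", "banking")
     .option("spark.cosmos.container", "transactions")
     .option("checkpointLocation", "/mnt/checkpoints/txn_stream")
     .start())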

Confidential, Plymouth, MN

ETL Developer/Data Engineer

Responsibilities:

  • Worked as an Informatica developer and data engineer on the Digital Banking team.
  • Coordinated with the business team and other tech teams to plan and execute all ETL integration activities.
  • Worked with the business team on requirements gathering and creating functional design documents.
  • Participated in the ETL profiling process, impact analysis, and design of strategies for data loads.
  • Built ETL mappings, Mapplets, workflows, and worklets using Informatica PowerCenter 9.6 on a Windows server.
  • Created mappings to load complex XMLs in Informatica using flat files and Oracle as sources.
  • Created mappings to load data into Cassandra and Oracle using XML and flat files as sources.
  • Created processes to consume COBOL files and flat files and load them into the data warehouse.
  • Involved in development and ongoing maintenance of data warehouse activities.
  • Performed data analysis and data mapping, and generated data feeds to online banking front-end applications through the ETL process.
  • Developed PL/SQL and stored procedures in both Oracle and SQL Server.
  • Implemented performance tuning on a variety of Informatica mappings to improve throughput.
  • Drove all ETL-related projects, tasks, and activities and their deliverables.
  • Streamlined the ETL and load processes and prepared documentation for various applications.

Confidential, Denver, CO

Project Lead (ETL/Data)

Responsibilities:

  • Worked as an Informatica developer and data engineer, and acted as project lead for the offshore team.
  • Worked with the business team for requirements gathering and creating functional design documents.
  • Coordinated with the business team and other tech teams to plan and execute all ETL integration activities.
  • Involved in the development and testing of different mutual fund mergers.
  • Involved in a merger with another company; responsible for data analysis, data mapping, and developing data feeds for the merger project.
  • Participated in the ETL profiling process and impact analysis.
  • Designed the ETL process and defined strategies for data loads.
  • Built ETL mappings, Mapplets, workflows, and worklets using Informatica PowerCenter 9.5.
  • Created mappings and workflows to read COBOL files and flat files, generate extracts, and load them into an Oracle database.
  • Created workflows to generate extracts for different vendors.
  • Involved in development and ongoing maintenance of the existing data warehouse system.
  • Responsible for developing and maintaining applications for mutual funds.
  • Also worked as lead, gathering business requirements and coordinating the work delivered by the offshore team.
  • Reviewed the work done by the offshore team and obtained client approval.

Confidential, Weehawken, NJ

System Analyst/Project Lead

Responsibilities:

  • Interacted with the client team / onsite team for scope sign-offs.
  • Gained an understanding of the existing applications in document management projects.
  • Developed and maintained banking applications; code changes were made using COBOL, CICS, and DB2.
  • Provided production support for the applications.
  • Performed incident management support for the applications.
  • Debugged and fixed issues reported by the business.
  • Identified jobs consuming high CPU time.
  • Analyzed the root cause of high CPU consumption and applied fixes or workarounds to reduce CPU time.
  • Completed unit testing and system testing of the jobs.
  • Compared results before and after optimization and quantified the cost benefits.

Confidential, Chicago, IL

Mainframe Developer

Responsibilities:

  • Reviewed the design document and prepared a development document to carry out the development activities.
  • Prepared test plans and test cases per the requirements.
  • Wrote LE/non-LE Assembler routines and macros for interfacing with C++ modules.
  • Wrote C++ programs that invoke Assembler programs to use IBM Assembler macros and CA's existing service macros.
  • Developed new Assembler macros and service routines.
  • Wrote Assembler wrapper modules for using CA's existing services.
  • Wrote C++ modules for the Command Manager and Object Manager components.
  • Wrote COBOL-DB2 test drivers for testing some of the features.
  • Wrote COBOL test drivers.
  • Used the XDC/SE debugger for debugging.
  • Conducted code reviews and testing per the test cases and test plans prepared earlier.
  • Adhered to quality standards set forth by CA.
  • Participated in weekly customer calls.
