
Hadoop Engineer and ETL Lead Developer Resume

Minneapolis, MN


  • Lead/Senior ETL Engineer with 14+ years of experience in the data domain, across business areas including Actimize, HMDA, BSA & Compliance, banking, healthcare, customer segmentation, and insurance; experienced in the design, development, and implementation of data warehousing and integration, data enrichment and cleansing, data transformation and load strategy, end-to-end enterprise data warehouse design, data lake implementation, data governance, and audit automation
  • Hands-on technology professional accustomed to working in complex, project-based environments. Multifaceted experience in software development, software quality assurance, and user-acceptance testing.
  • Backed by strong credentials including a master's degree in computer science and applications, Informatica developer certifications, AHM 250 healthcare certification, Java knowledge, extensive experience with Informatica tools, and well-trained, strong exposure to Hadoop technologies (Spark, Scala, Hive, and Beeline) and cloud-based (AWS and Snowflake) solutions


Informatica Tools (Primary skills): Informatica 6/7/8/9/10, PowerCenter, DEI/DEQ 10.4, BDM, Axon 5.2/7.0, Data Quality, Data Explorer, Data Validation, PowerExchange. Administration (installation & maintenance, patch upgrades, user security control, migration, deployment, performance tuning, backup & recovery, high availability, grid setup, etc.). Informatica Address Validator, Informatica Stencil, Metadata Manager, and Business Glossary.

Hadoop Programming skills: Spark, Scala, HDFS, Hive, Beeline scripting

Database Programming skills: SQL, PL/SQL, Hive, T-SQL, UNIX Shell Scripting, BTEQ, Oracle & Teradata SQL utilities

Secondary Programming skills: Java, Spring Boot, Thymeleaf, Bootstrap & Apache Camel.

Databases: Teradata, MS SQL Server, Oracle 9/10/11g, DB2

Data Modeling: Erwin, Reverse Engineering, forward engineering, canonical data modeling, data merging and metadata modeling

Data Warehouse Concepts: Snowflake schema, dimensional modeling, Kimball approach, Inmon approach, and traditional approach.

Agile Concepts & Tools: VersionOne, JIRA, methodology, stories, sprints, Product Owner, Scrum Master, and Scrum Team

Scheduling: Control-M, CA7, Crontab, Autosys, ESP Workstation, Tidal, Tivoli and Stonebranch

BI Analytics: Dremio, Tableau, Cognos 8/10 Report Authoring, Cognos 8/10 Framework Manager

CI /CD & Configuration: Jenkins, Rancher, Kubernetes, GitHub, Serena, Informatica with Shell scripting Automated deployment process

Methodologies: SDLC, Waterfall, Agile, Scrum, Lean, Test-driven and Model-driven development, Extreme Programming, Informatica Velocity.

Cloud: AWS, Snowflake Cloud Data Platform (DWH)


Hadoop Engineer and ETL lead developer

Confidential, Minneapolis, MN


  • Understand Actimize Risk Case Manager (RCM) processing and automation solution systems, with detailed functional knowledge of the creation, modification, and review of the Currency Transaction Report (CTR), Monetary Instrument Log (MIL), Designation of Exempt Person (DOEP), and Suspicious Activity Report (SAR)
  • Recent exposure to the Confidential merger with Chemical Bank, working on BSA Risk and Compliance data: ETL using Informatica, customer deduplication, transactions, transfers, and CTRs and SARs in the Actimize application
  • Design and develop Informatica mappings and workflows to consume and merge Chemical Bank data into the existing data model
  • Write SQL and Hive queries to read data from the Enterprise Data Platform
  • Write stored procedures, packages, and views to meet business requirements.
  • Tune Informatica code, SQL, and PL/SQL, and create indexes and stored procedures to improve performance.
  • Utilize visualization techniques to display data and analysis results in clear, straightforward presentations that can be understood by non-technical readers.
  • Design workflows for management of alerts and cases in the Risk Case Manager, including filing of Suspicious Activity Reports (SARs) and Currency Transaction Reports (CTRs) with the Financial Crimes Enforcement Network (FinCEN).
  • Design and implement an Audit Balance and Control (ABC) framework using Informatica PowerCenter, Java with Spring Boot and Thymeleaf, and MS SQL Server.
  • Package all table objects, stored procedures, views, batch scripts, Informatica mappings and workflows, and parameter files for production
  • Participate in code review, peer review & deployment
  • Designed and implemented the customer dedup process as part of the merger
  • Designed and implemented the customer lineage process as part of the merger
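The Audit Balance and Control (ABC) framework mentioned above can be sketched as a minimal row-count balance check that logs its result to an audit table. This is an illustrative sketch in Python with SQLite, not the actual PowerCenter/Spring Boot implementation; all table and column names are hypothetical.

```python
import sqlite3
from datetime import datetime

def abc_balance_check(conn, source_table, target_table):
    """Compare source and target row counts and log the result to an audit table.
    Table and column names are hypothetical illustrations of the ABC pattern."""
    cur = conn.cursor()
    cur.execute("""CREATE TABLE IF NOT EXISTS abc_audit (
        run_ts TEXT, source_table TEXT, target_table TEXT,
        source_count INTEGER, target_count INTEGER, status TEXT)""")
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    status = "BALANCED" if src == tgt else "OUT_OF_BALANCE"
    cur.execute("INSERT INTO abc_audit VALUES (?, ?, ?, ?, ?, ?)",
                (datetime.now().isoformat(), source_table, target_table,
                 src, tgt, status))
    conn.commit()
    return status
```

In a real framework the audit table would also carry batch IDs and reject counts; the point here is only the balance-and-log cycle after each load.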


Hadoop Engineer


  • Informatica job performance tuning and ensuring development follows standards
  • Informatica administration activities such as migration, environment maintenance, user and role management, and setup of Informatica Data Quality 10, Business Glossary, Metadata Manager, LDAP security, SSL certificates, and roles and groups
  • Balancing report development in Java with Spring Boot
  • Support database packages and scripts, and develop PL/SQL and stored procedures, in both Oracle and SQL Server
  • Batch scripting, Apache Tomcat server builds, and support of the Actimize application
  • Driving all ETL-related projects, tasks, and activities and their deliverables
  • Setup and support offshore team for application maintenance and enhancement and for data analysis
  • Define ETL road map, ETL development Guiding Principles, deployment process and code review check list
  • Design and development, requirements gathering, and coordination with the offshore team.
  • Involved in preparing the support model and in production support: break fixes, tuning and performance improvement, and building best practices and naming and development standards
  • Streamlining the ETL and load processes; preparing documentation for various applications
  • As Informatica administrator, responsible for maintenance of the Informatica server, including installation, patch upgrades, domain and user security, and application maintenance
  • Assisting the data governance team with data lineage, data profiling, and scorecard building
  • Assisting data modelers and data analysts with data flow, data traceback, and tracing data transformations back to source
  • Define the ETL design/framework, prepare technical design documentation, and communicate with business stakeholders for detailed requirements or clarifications
  • Ensuring smooth operation of environments and applications in production
  • Responsible for following the processes for change requests, production control and deployment, access requests, database table changes, and production job scheduling
  • Preparation of the support model plan, documentation, and transition plan when taking over support of new applications
  • Responsible for ensuring required documentation is completed by the team

Senior Solution Engineer



  • Responsible for requirements gathering and preparing technical design and ETL mapping documents.
  • Development of new enhancements to the existing Confidential
  • Worked on an RFP for a Data Quality project in Informatica for the client DMV, FL at Confidential
  • Provided Informatica training to junior developers as a value-add to the organization at Confidential

Technical Lead



  • Built a Credit Data Mart using Agile methodology in a span of six months at Confidential, effectively using an onshore/offshore model
  • Understanding the FSDM model and its architecture for data integration from various source systems into FSDM; heavy data loads using Informatica Pushdown Optimization (PDO) and Teradata Parallel Transporter (TPT), using ELT methods vs. the ETL method
  • Coordination between business stakeholders, SMEs, IAs, and DAs for functional clarifications
  • Involved in migration, implementation, support, and data mapping activities from a technical perspective.
  • Involved in setting up automated job scheduling using CA7 and in development (implementation) and admin tasks. Prepared Confidential coding standards, guidelines, and best practices.
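The ELT-vs-ETL distinction above (pushdown-style loading) can be illustrated with a minimal sketch: instead of transforming rows one at a time in the ETL tool, a single set-based INSERT ... SELECT is pushed down to the database engine, which is the essence of pushdown optimization. The sketch below uses Python with SQLite and hypothetical staging/fact table names, not the actual FSDM model.

```python
import sqlite3

def elt_load(conn):
    """ELT-style load: the transform runs inside the database as one set-based
    INSERT ... SELECT instead of row-by-row processing in the ETL tool.
    Table names (stg_txn, fact_txn) are illustrative only."""
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS stg_txn (acct TEXT, amount REAL);
        CREATE TABLE IF NOT EXISTS fact_txn (acct TEXT, amount_usd REAL);
        -- filter and load run entirely inside the database engine
        INSERT INTO fact_txn
        SELECT acct, amount FROM stg_txn WHERE amount > 0;
    """)
    conn.commit()
```

With PDO/TPT the same idea applies at Teradata scale: the engine, not the mapping, does the set-based work.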

Technical Environment: Informatica PowerCenter, IDQ, Informatica Metadata manager, Business Glossary, Teradata, Oracle, MS-SQL Server, UNIX, Mainframe, HP QC and CA7.

Senior ETL Developer



  • Designed and implemented one of the more complex load-control Informatica mappings, which records mapping load details in control tables, tracks workflow start and completion date/time, daily, monthly, and quarterly loads, and workflow status, and inserts a new record for the next run.
  • Using Informatica, designed and developed complex mappings for the IBR and OBR PeopleSoft source systems, with header-and-detail record layouts, high-priority validation rules, and complex business logic, and implemented the target load order concept.
  • Built a self-sustaining support model for Confidential support jobs and other data service applications running in production
  • Strong experience in preparing high-level and low-level designs, source-to-target mapping specs, unit test cases, estimations, code reviews, and Informatica administration, migration, and deployment templates and documents.
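The load-control pattern described above can be sketched as a small control table: insert a record with the start timestamp and a RUNNING status when a workflow begins, then stamp the completion time and final status when it ends. This is a minimal illustrative sketch in Python with SQLite; the table and column names are hypothetical, not the actual Informatica control tables.

```python
import sqlite3
from datetime import datetime

def start_workflow_run(conn, workflow_name, load_type):
    """Insert a new control record when a workflow starts (status RUNNING).
    Table/column names are hypothetical illustrations of the pattern."""
    cur = conn.cursor()
    cur.execute("""CREATE TABLE IF NOT EXISTS load_control (
        run_id INTEGER PRIMARY KEY AUTOINCREMENT,
        workflow_name TEXT, load_type TEXT,
        start_ts TEXT, end_ts TEXT, status TEXT)""")
    cur.execute("INSERT INTO load_control (workflow_name, load_type, start_ts, status) "
                "VALUES (?, ?, ?, 'RUNNING')",
                (workflow_name, load_type, datetime.now().isoformat()))
    conn.commit()
    return cur.lastrowid

def complete_workflow_run(conn, run_id, status="SUCCEEDED"):
    """Stamp the completion time and final status on the control record."""
    conn.execute("UPDATE load_control SET end_ts = ?, status = ? WHERE run_id = ?",
                 (datetime.now().isoformat(), status, run_id))
    conn.commit()
```

Each run leaves a durable row, so the next run (daily, monthly, or quarterly) can query the table to find the last successful load window.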

Environment: Informatica 8.6, Oracle, UNIX, Toad, My Eclipse.

Senior Developer

Confidential, Chicago, IL


  • Designed and developed Informatica mappings and workflows to load data into the Enterprise Data Warehouse on the Teradata database.
  • Involved in production support, resolving high-priority issues within SLA.
  • Designed and implemented generic error-handling concepts for all fact loading, using Informatica mapplets as reusable code.

Environment: Informatica 6/7.1, Teradata, Oracle, UNIX, Autosys, ESP Workstation, HPOV, MicroStrategy, Harvest.

ETL Developer



  • Used Informatica Designer to extract and transform data from various source systems, incorporating various business rules and transformations.
  • Handled Type 1 and Type 2 slowly changing dimensions to populate current and historical data in dimension and fact tables in the data warehouse.
  • Involved in performance tuning of sessions, SQL, PL/SQL, and stored procedures.
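A Type 2 slowly changing dimension like the one described above keeps history by expiring the current dimension row and inserting a new current row whenever a tracked attribute changes (a Type 1 change would overwrite in place instead). Below is a minimal illustrative sketch in Python with SQLite; the dim_customer table and its columns are hypothetical, not the actual warehouse model.

```python
import sqlite3
from datetime import date

def scd2_upsert(conn, cust_id, name, as_of=None):
    """Type 2 SCD: expire the current row and insert a new current row when a
    tracked attribute changes; no-op when nothing changed. The dim_customer
    table and its columns are hypothetical illustrations."""
    as_of = (as_of or date.today()).isoformat()
    cur = conn.cursor()
    cur.execute("""CREATE TABLE IF NOT EXISTS dim_customer (
        cust_id INTEGER, name TEXT,
        eff_date TEXT, end_date TEXT, is_current INTEGER)""")
    row = cur.execute(
        "SELECT name FROM dim_customer WHERE cust_id = ? AND is_current = 1",
        (cust_id,)).fetchone()
    if row and row[0] == name:
        return  # attribute unchanged: keep history as-is
    if row:
        # expire the previously current version of this customer
        cur.execute("UPDATE dim_customer SET end_date = ?, is_current = 0 "
                    "WHERE cust_id = ? AND is_current = 1", (as_of, cust_id))
    # insert the new current version with an open-ended end date
    cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
                (cust_id, name, as_of))
    conn.commit()
```

Fact loads then join on `is_current = 1` (or on the effective-date range for point-in-time reporting).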
