
Sr. ETL Informatica Developer Resume

Southfield, MI


  • Over 11 years of IT experience in business/application software development and support in Oracle environments, mainly client/server architecture under UNIX and Windows. Strong, proven skills across the software development life cycle (SDLC), in conformance with SEI-CMM Level 5 quality procedures.
  • Worked across industry domains such as Life Science, Healthcare, Financial, Commodities Exchange, and Media & Entertainment. Good communication skills; has dealt with a variety of global clients.
  • Experience in the design and development of ETL (Extract, Transform, Load) methodology supporting data transformation and processing in a corporate-wide ETL solution. Experience working with DAC 11g/10g to define Informatica workflows and to sequence and schedule them.
  • Experience in Informatica mapping specification documentation, tuning mappings to increase performance, proficient in creating and scheduling workflows and expertise in Automation of ETL processes with scheduling tools such as Autosys and Tidal.
  • Experience in Performance Tuning at Mapping, Session and database level. Identifying and resolving performance bottlenecks at various levels in Business Intelligence applications. Applied the Mapping Tuning Techniques such as Pipeline Partitioning to speed up data processing. Conducted Session Thread Analysis to identify and fix Performance Bottlenecks.
  • Substantial experience in code design, development, and debugging, as well as offshore team management. Experience leading projects and managing timelines for self and for the team.
  • Substantial experience with PL/SQL programming best practices, performance tuning, and database design.
  • Strong analytical abilities; organizes and manages time efficiently and performs well under pressure. Experience handling the project management aspects of live projects, including implementation scope finalization, project planning, process implementation, team management, requirements analysis, cost and effort estimation, risk management, project tracking, customer interaction, release management, defect tracking tools, and deliverables.
  • Worked with the business team to develop project schedules, provide status reports, identify risks and mitigation plans, and lead issue resolution through deployment and support.
  • Good knowledge of tools such as TOAD, PL/SQL Developer, Informatica, DIH, Hadoop, HDFS, the Hadoop ecosystem, and Hive. Experience analysing data using HiveQL and importing/exporting data between relational database systems and HDFS using Sqoop.
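As a concrete illustration of the Sqoop usage described above, the sketch below assembles a typical Oracle-to-HDFS import command. All connection details, credentials, and table names are hypothetical placeholders, and the command is printed rather than executed, so the sketch stays self-contained and needs no Hadoop cluster.

```shell
# Build a hypothetical Sqoop import command (Oracle -> HDFS).
# Host, service name, user, table, and target path are placeholders;
# the command is echoed, not run.
sqoop_import_cmd() {
  table="$1"
  target="$2"
  echo "sqoop import" \
    "--connect jdbc:oracle:thin:@//dbhost:1521/ORCL" \
    "--username etl_user --password-file /user/etl/.pwd" \
    "--table $table" \
    "--target-dir /data/raw/$target" \
    "--num-mappers 4" \
    "--fields-terminated-by ,"
}

# Show the command that would import a hypothetical CLAIMS table.
sqoop_import_cmd CLAIMS claims
```

Wrapping the call in a function like this is a common pattern in UNIX load scripts, so the same template can be reused per source table.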


ETL Tools: Informatica PowerCenter 10.1/9.6, PowerExchange 9.5/9.0.1/8.x/7.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server, Administration Console), Metadata Manager, DataStage, IDQ, IDE, MDM

Reporting Tools: Qlikview, SSRS, SSIS

Databases: Oracle 12c/Exadata/11g/10g, MS SQL Server 2008/2005/2000, MS Access, IBM DB2, Netezza, Teradata

Data Modelling: Dimensional Data Modelling, Star Schema Modelling, Snowflake Modelling, Fact and Dimension Tables, Physical and Logical Data Modelling

DB Tools: TOAD, SQL Developer, SQL Assistant, Visio, ERWIN

Operating System: UNIX, Linux, Windows XP/7

Languages: SQL, PL/SQL, XML

Scheduling Tools: Tivoli, Autosys, Control-M, Dollar Universe


Confidential, Southfield, MI

Sr. ETL Informatica Developer


  • Analysing requirements for loading data from the EDW to the internal database server.
  • Involved in designing the Workflows, Worklets, Mappings, Sessions and configuring the Informatica Server using Informatica Power Center.
  • Integrated the Informatica server with DAC and customized the data warehouses in DAC.
  • Performance tuning of Informatica mappings and data warehouse loads, including the Informatica server manager and PL/SQL procedures. Performed unit testing at various levels of the ETL and wrote SQL queries to validate the data loaded into the target.
  • Developed metadata tables, views, PL/SQL stored procedures, for moving the data from staging area to data mart. Created scripts to create new tables, views, queries for new enhancement in the application using TOAD. Created indexes on the tables for faster retrieval of the data to enhance database performance.
  • Worked on Informatica Power Center tools - Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Developer. Created mappings using transformations like Source Qualifier, Aggregator, Expression, lookup, Router, Filter, Update Strategy, Joiner, Normalizer, Union, Stored procedure, and XML transformations.
  • Converted existing PL/SQL Packages to ETL Mappings using Informatica Power Center.
  • Creating the technical documents. Debugging code, testing and fixing of defects through ClearQuest.
  • Implemented Informatica MDM workflows, including data profiling configuration specifications. Defined and built best practices for creating business rules within the Informatica MDM solution.
  • Created tables to provide infrastructure for data loads and reporting.
  • Version control of documents using JIRA. Development in accordance with Agile Methodology framework.
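Workflows like those described above are commonly started from UNIX shell wrappers via Informatica's pmcmd utility. The sketch below assembles such a call with hypothetical domain, integration service, folder, and workflow names; the command is echoed rather than run, since pmcmd requires a live PowerCenter environment.

```shell
# Assemble a hypothetical pmcmd invocation to start a PowerCenter workflow.
# Domain, integration service, user, folder, and workflow names are
# placeholders; the command is echoed instead of executed.
start_workflow() {
  wf="$1"
  echo "pmcmd startworkflow" \
    "-sv INT_SVC_DEV -d Domain_Dev" \
    "-u etl_user -p '********'" \
    "-f EDW_LOADS -wait $wf"
}

# Show the command for a hypothetical staging-to-mart load workflow.
start_workflow wf_load_stg_to_mart
```

The -wait flag makes pmcmd block until the workflow finishes, which lets a scheduler such as Autosys or Tidal key off the script's exit code.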

Environment: UNIX, Oracle, Shell Scripting, Informatica PowerCenter, MDM, TOAD, SQL Developer


Database Developer, Onsite Team Lead


  • Analysing requirements for creating reports on Claims and their utilization patterns. Responsible for requirement gathering and for assigning responsibilities to the offshore team.
  • Backend development using SQL and PL/SQL scripting. Created report code in SQL and tables to provide infrastructure for reporting. Performance tuning of queries.
  • Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.
  • Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL*TRACE. Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.
  • Effectively made use of Table functions, Indexes, Table partitioning of fact tables, Analytical functions and views. Developed complex database objects like stored procedures, functions, Packages and Triggers using SQL and PL/SQL.
  • Worked on a database migration project from Oracle 12c to Exadata.
  • Managed team for design, development, testing and implementation of jobs from dev to prod environment. Coordinated with multiple teams for successful code deployment.
  • Created Informatica DIH workflows and mappings for data load.
  • Involved in creation of dimensional schema design including dimension tables, fact tables. Good knowledge on Query optimizer, Executions plans and Indexes.
  • Created IBM-Datastage parallel jobs to extract source data to load into new datawarehousing schema.
  • Creating the technical documents. Testing, debugging and fixing of defects through Squids.
  • Version control of documents using TFS. Code deployment in accordance with the Agile methodology framework.
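The SQL*Loader work mentioned above typically pairs a control file with a sqlldr call issued from a UNIX script. Below is a minimal sketch with hypothetical table, column, and file names; the control file is written to a temp directory and the sqlldr command is echoed rather than run, so no Oracle client is needed.

```shell
# Generate a minimal SQL*Loader control file and show the sqlldr call.
# Table, columns, and file names are hypothetical; sqlldr itself is not
# invoked here.
workdir="$(mktemp -d)"
cat > "$workdir/claims.ctl" <<'EOF'
LOAD DATA
INFILE 'claims.csv'
APPEND INTO TABLE claims_stg
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(claim_id, member_id, claim_amt, service_date DATE 'YYYY-MM-DD')
EOF

# The command a wrapper script would run against a real database:
echo "sqlldr etl_user@ORCL control=$workdir/claims.ctl log=$workdir/claims.log"
```

Keeping the control file generated by the script (rather than hand-edited) makes it easy to regenerate per feed and to version alongside the loader script.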

Environment: UNIX, Oracle 12c, Exadata, PL/SQL scripting, Shell Scripting, Informatica DIH, DataStage, DB2, QlikView, Tortoise SVN, Dollar Universe


Database Developer/Hadoop Developer, Offshore Team Lead


  • Leading a module. Working with clients to gather and analyse requirements.
  • Understanding the functional specifications document. Writing the technical design document. Unit and Integration Testing. Bug Fixing of the application in various areas
  • Backend development using SQL and PL/SQL scripting. Created report code in SQL and tables to provide infrastructure for reporting. Performance tuning of queries.
  • Experience in writing NZSQL, NZLOAD and stored procedures. Experience in Netezza database design and workload management
  • Managing knowledge gained for reuse in other module testing. Communicating with stakeholders: business partners and IT management.
  • Improving the quality of application by using Performance Tuning concepts.
  • Worked with semi structured and unstructured data.
  • Worked on analysing Hadoop cluster and different big data analytics tools including Hive, Spark, Sqoop.
  • Involved in importing and exporting data between the local/external file system, RDBMS, and HDFS.
  • Designed data warehouse using Hive. Created and managed Hive tables in Hadoop.
  • Created and maintained Technical documentation for launching Hadoop clusters and executing Hive queries.
  • Prepared LLD documents per the Waterfall model followed by the client; estimated new requirements and prepared plans to meet deadlines.
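The Hive warehouse design mentioned above usually starts with external-table DDL over HDFS directories. The sketch below writes a hypothetical HiveQL script (table, column, and path names are placeholders) and shows the hive invocation; nothing is actually submitted to a cluster.

```shell
# Write a hypothetical HiveQL DDL script for an external table over
# delimited HDFS data, then show how it would be submitted. All names
# and paths are placeholders; the hive command is echoed, not executed.
hqldir="$(mktemp -d)"
cat > "$hqldir/create_claims.hql" <<'EOF'
CREATE EXTERNAL TABLE IF NOT EXISTS claims_ext (
  claim_id     BIGINT,
  member_id    BIGINT,
  claim_amt    DECIMAL(12,2),
  service_date STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/raw/claims';
EOF

# On a real cluster this script would be submitted with:
echo "hive -f $hqldir/create_claims.hql"
```

Using an EXTERNAL table over the Sqoop landing directory means dropping the Hive table leaves the underlying HDFS files intact, which is the usual choice for raw-layer data.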

Environment: HP Itanium Processors, Oracle RAC Database, Windows XP, UNIX, PL/SQL, Shell Scripting, Informatica 9.1, Oracle 10g, TOAD, Dollar Universe, SQL Developer, Netezza, Hadoop
