
Senior ETL Informatica Developer Resume

Cincinnati, OH


  • Over 7 years of experience in Information Technology with emphasis on Data Warehouse/Data Mart development, designing strategies for extraction, transformation, and loading (ETL) in Informatica PowerCenter from various database sources.
  • Experienced with the full Software Development Life Cycle (Planning, Analysis, Design, Development, Testing, Integration and Support).
  • Extensively worked on different connectors (AWS S3, Workday, Twitter, Glassdoor) to retrieve data using ICS (Informatica Cloud Services).
  • Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, Type 2, Type 3), Change Data Capture, Dimensional Data Modeling, the Ralph Kimball approach, Star/Snowflake modeling, Data Marts, OLAP, Fact and Dimension tables, and Physical and Logical data modeling.
  • Extensive experience writing UNIX shell scripts and automating ETL processes using UNIX shell scripting.
  • Strong work experience in the Data Warehouse development life cycle; performed ETL procedures to load data from sources like SQL Server, Oracle, Mainframe, Teradata and flat files into data marts and data warehouses using Informatica PowerCenter - Designer, Workflow Manager, and Workflow Monitor.
  • Excellent at designing ETL procedures and strategies to extract data from heterogeneous source systems like Oracle 11g/10g, SQL Server 2008/2005, DB2 10, flat files, XML, SAP R/3, etc.
  • Created mappings in Mapping Designer to load data from various sources using transformations like Transaction Control, Lookup (connected and unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure and more.
  • Have experience with IDQ, MDM with knowledge on Big Data Edition Integration with Hadoop and HDFS.
  • Developed Oracle PL/SQL stored procedures, functions, packages and SQL scripts to support the functionality of various modules.
  • Extensive knowledge of various Performance Tuning Techniques on Sources, Targets, Mappings and Workflows using Partitions/Parallelization and eliminating Cache Intensive Transformations.
  • Strong RDBMS concepts and experience in creating, maintaining and tuning Views, Stored Procedures User Defined Functions and System Functions using SQL Server, T-SQL.
  • Involved in the Analysis, Design, Development, Testing and Implementation of business application systems for Health care, Pharmaceutical, Financial, Telecom and Manufacturing Sectors.
  • Hands on experience working in LINUX, UNIX and Windows environments.
  • Good knowledge of data quality measurement using IDQ and IDE.
  • Strong experience in Dimensional Modeling using Star and Snow Flake Schema, Identifying Facts and Dimensions, Physical and logical data modeling using Erwin and ER-Studio.
  • Working experience using Informatica Workflow Manager to create Sessions, Batches, Worklets and reusable Tasks, schedule workflows, and monitor sessions.
  • Proficient with Informatica Data Quality (IDQ) for data cleansing and standardization in the staging area.
  • Designed and developed IDQ mappings for address validation/cleansing, doctor master data matching, data conversion, exception handling, and exception data reporting.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Experience in handling initial/full and incremental loads.
  • Expertise in scheduling workflows using the Windows scheduler, UNIX, and scheduling tools like Control-M & Autosys.
  • Designed, Installed, Configured core Informatica/Siperian Hub components such as Informatica Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, IDD & Data Modeling.
  • Experience in support and knowledge transfer to the production team.
  • Worked with Business Managers, Analysts, Development, and end users to correlate Business Logic and Specifications for ETL Development.
  • Experienced in Quality Assurance, Manual and Automated Testing Procedures with active involvement in Database/ Session/ Mapping Level Performance Tuning and Debugging.
  • Excellent communication, presentation, project management skills, a very good team player and self-starter with ability to work independently and as part of a team.
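As a quick illustration of the SCD Type 2 pattern mentioned above, here is a minimal sketch in Python with SQLite; the table, column names and dates are hypothetical stand-ins, not the actual PowerCenter mappings this resume describes:

```python
import sqlite3

# Minimal SCD Type 2 sketch: expire the current row and insert a new version
# when a tracked attribute changes. Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        eff_date    TEXT,
        end_date    TEXT,      -- NULL means "current version"
        is_current  INTEGER
    )
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Cincinnati', '2020-01-01', NULL, 1)")

def apply_scd2(conn, customer_id, new_city, load_date):
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,))
    row = cur.fetchone()
    if row is None:                      # brand-new key: plain insert
        conn.execute("INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
                     (customer_id, new_city, load_date))
    elif row[0] != new_city:             # changed: end-date old row, add new one
        conn.execute(
            "UPDATE dim_customer SET end_date = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (load_date, customer_id))
        conn.execute("INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
                     (customer_id, new_city, load_date))
    # unchanged rows are left alone

apply_scd2(conn, 1, "Columbus", "2021-06-15")
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_date").fetchall()
print(rows)   # [('Cincinnati', 0), ('Columbus', 1)]
```

In Informatica terms, the lookup corresponds to a connected Lookup with dynamic cache and the branching to an Update Strategy transformation.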


ETL Tools: Informatica PowerCenter 10.x/9.5.x/9.1/8.6.x/8.5.x, Informatica PowerExchange 9.x, Big Data Edition 9.6.1, DataStage, Pentaho, Informatica Data Explorer (IDE), Informatica Data Quality (IDQ), SSIS.

Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimension Data Modeling.

Databases: Oracle 11g/10g/9i, SQL Server 2008/2005, IBM DB2, Sybase, MS Access

Scheduling Tools: Control-M, CA7 Scheduler, CA ESP Workstation, Autosys, Informatica Scheduler

Programming: SQL, PL/SQL, Transact SQL, HTML, XML, C, C++, Korn Shell, Bash, Perl

Operating Systems: Windows 7/XP/NT/95/98/2000, DOS, UNIX and LINUX

Big Data / Hadoop: HDFS, Hive, Spark, HBase, and Sqoop.

Other Tools: SQL*Plus, Toad, SQL Navigator, Putty, WINSCP, MS-Office, SQL Developer.


Senior ETL Informatica Developer

Confidential, Cincinnati, OH


  • Gathered user Requirements and designed Source to Target data load specifications based on business rules.
  • Used Informatica PowerCenter 9.0.1 for extraction, transformation and loading (ETL) of data into the Data Mart.
  • Participated in review meetings with the functional team to sign off on the Technical Design document.
  • Involved in the Design, Analysis, Implementation, Testing and Support of ETL processes.
  • Worked with the Informatica Data Quality 9.6 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.
  • Validated HIPAA EDI transactions such as 837 (Health Care Claims or Encounters), 835 (Health Care Claim Payment/Remittance Advice), 270/271 (Eligibility Request/Response) and 834 (Enrollment/Disenrollment in a Health Plan) by developing mappings.
  • Developed IDQ mappings using various transformations like Labeler, Standardization, Case Converter, Match & Address validation Transformation.
  • Designed, Developed & Supported Extraction, Transformation & Load Process (ETL) for data migration with Informatica Power Center.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
  • Worked extensively with the connected lookup Transformations using dynamic cache.
  • Worked with complex mappings having an average of 15 transformations.
  • Coded PL/SQL stored procedures and successfully used them in the mappings.
  • Coded UNIX scripts to capture data from different relational systems into flat files for use as source files in the ETL process, and to schedule automatic execution of workflows.
  • Scheduled jobs using the Informatica Scheduler & Jobtrac.
  • Created and scheduled sessions and jobs to run on demand, on a schedule, or only once.
  • Monitored Workflows and Sessions using Workflow Monitor.
  • Performed Unit testing, Integration testing and System testing of Informatica mappings.
  • Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning at various levels like mapping level, session level, and database level.
  • Provided production support by monitoring the processes running daily.
  • Participated in weekly status meetings, and conducting internal and external reviews as well as formal walk through among various teams and documenting the proceedings.
  • Coordinated with the offshore team and directly interacted with the client for clarifications & resolutions.
  • Introduced and created many project related documents for future use/reference.
  • Designed and developed ETL Mappings to extract data from Flat files and Oracle to load the data into the target database.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, Mapping Parameters, Mapping Variables, Mapplets & Parameter files in Mapping Designer.
  • Built complex reports using SQL scripts.
  • Created complex calculations, various prompts, conditional formatting and conditional blocking etc., accordingly.
  • Created complex mappings to load the data mart & monitored them. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator transformations.
  • Ran the workflows on a daily and weekly basis using workflow monitor.
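The UNIX extract scripts described above (relational source to flat file, feeding an ETL session) can be sketched as follows; this is a hedged Python/SQLite illustration with hypothetical table and file names, not the original shell scripts against Oracle/SQL Server:

```python
import csv
import sqlite3

# Sketch of an extract step: pull rows from a relational source and land them
# as a delimited flat file for a downstream ETL session. The source table and
# file name are hypothetical stand-ins.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id INTEGER, amount REAL, status TEXT)")
conn.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                 [(101, 250.0, 'PAID'), (102, 75.5, 'DENIED')])

def extract_to_flat_file(conn, query, out_path, delimiter="|"):
    """Run a query and write the result set to a delimited file with a header row."""
    cur = conn.execute(query)
    header = [col[0] for col in cur.description]
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh, delimiter=delimiter)
        writer.writerow(header)
        writer.writerows(cur.fetchall())
    return header

header = extract_to_flat_file(conn, "SELECT claim_id, amount, status FROM claims",
                              "claims_extract.dat")
print(header)          # ['claim_id', 'amount', 'status']
```

The pipe-delimited output with a header record matches the flat-file source definitions Informatica's Source Analyzer can import directly.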

Environment: Informatica PowerCenter 9.0.1/8.6.1/9.5, Informatica Data Quality (IDQ) 9.6, Oracle 11g/9i, SQL, PL/SQL, UNIX, Informatica Scheduler, SQL*Loader, SQL Developer, Framework Manager, Transformer, Teradata, TOAD, Windows Server 2008.

Senior Informatica PowerCenter Developer

Confidential, St Louis, MO


  • Assisted Business Analyst with drafting the requirements, implementing design and development of various components of ETL for various applications.
  • Worked closely with ETL Architect and QC team for finalizing ETL Specification document and test scenarios.
  • Extracted data from oracle database and spreadsheets, CSV files and staged into a single place and applied business logic to load them in the central oracle database.
  • Designed and developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Migrated code between environments and maintained code backups.
  • Integration of various data sources like Oracle, SQL Server, Fixed Width & Delimited Flat Files, DB2.
  • Involved in the Unit Testing and Integration testing of the workflows developed.
  • Extensively worked with Korn-Shell scripts for parsing and moving files and even for re-creating parameter files in post-session command tasks.
  • Imported Source/Target Tables from the respective databases and created reusable transformations like Joiner, Routers, Lookups, Filter, Expression and Aggregator and created new mappings using Designer module of Informatica.
  • Used AddressDoctor to validate addresses and performed exception handling, reporting and monitoring. Created rules as mapplets, Logical Data Objects (LDO) and workflows; deployed the workflows as an application to run them and tuned the mappings for better performance.
  • Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Profiled data on Hadoop to understand the data and identify data quality issues.
  • Imported and exported data between relational databases and the Hadoop Distributed File System (HDFS) using Sqoop.
  • Developed shell scripts for running batch jobs and scheduling them.
  • Handled User Acceptance Testing & System Integration Testing, apart from Unit Testing, using Quality Center as the bug-logging tool. Created & documented a Unit Test Plan (UTP) for the code.
  • Involved in Production Support.
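A Sqoop import like the one mentioned above cannot be run outside a Hadoop edge node, so the sketch below only assembles the command line; the JDBC URL, username, table and target directory are hypothetical placeholders:

```python
# Sketch of the kind of `sqoop import` used to move relational data into HDFS.
# All connection details below are hypothetical.
def build_sqoop_import(jdbc_url, user, table, target_dir, mappers=4):
    """Assemble the argv list for a `sqoop import` invocation."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--username", user,
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(mappers),
        "--fields-terminated-by", "|",
    ]

cmd = build_sqoop_import("jdbc:oracle:thin:@//dbhost:1521/ORCL",
                         "etl_user", "CLAIMS", "/staging/claims")
print(" ".join(cmd))
```

In practice the list would be passed to the shell (e.g. via `subprocess.run(cmd)`) on a node where Sqoop and the JDBC driver are installed.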

Environment: Informatica PowerCenter 9.6, Oracle 11g, SQL Server, PL/SQL, Unix and WINSCP, Bigdata Edition 9.6.1, Hadoop, HDFS, HIVE, Sqoop.

Informatica Developer

Confidential, Schenectady, NY


  • Met with business stakeholders and other technical team members to gather and analyze application requirements.
  • Worked in Source Analyzer, Target Designer, Mapping and Mapplet Designer, Workflow Manager & Workflow Monitor.
  • Created mappings for initial load in Power Center Designer using the transformations Expression, Router and Source Qualifier.
  • Created complex mappings for full load into target in Power Center Designer using Sorter, Connected Lookup, Unconnected Lookup, Update Strategy, Router, Union etc.
  • Created Mapplets to reuse all the set of transformations for all mappings.
  • Created Mappings, Mapplets and Sessions for data loads and data cleansing, and enhanced existing mappings where changes were required, using Informatica PowerCenter.
  • Involved in developing Logical and Physical data models that capture the current state; developed and tested all Informatica data mappings, sessions and workflows, involving several tasks.
  • Responsible for creating and scheduling sessions.
  • Worked on SAS Data management.
  • Created various tasks to give various conditions in the workflows.
  • Extracted data from Oracle and flat files. Developed and implemented various enhancements to the application through production and new production rollouts.
  • Worked on identifying facts, dimensions and various other concepts of dimensional modeling which are used for data warehousing projects.
  • Extensively worked on conformed Dimensions for the purpose of incremental loading of the target database.
  • Created parameters and variables for the reusable sessions.
  • Analyzed the various bottlenecks at source, target, mapping and session level.
  • Tuned mappings and SQL scripts for better performance.
  • Performed Unit testing on the Informatica code by running in the debugger and writing simple test scripts in the database thereby tuning it by identifying and eliminating the bottlenecks for the optimum performance.
  • Assigned work and provided technical expertise for the design and execution of ETL projects to onshore and offshore developers.

Environment: Informatica 8.6, IDQ 8.6.1, Teradata, Oracle 10g, PL/SQL, DB2, XML, SQL*Plus, MS Excel, UNIX (AIX), UNIX Shell

ETL/Informatica Developer

Confidential, McLean, VA


  • Understood the Business Design Documents & created an overall design based on requirements & business reviews.
  • Designed and reviewed the ETL solutions in Informatica Power Center.
  • Analyzed the requirements to identify the necessary tables that need to be populated into the staging area.
  • Developed Informatica mappings to load data into various fact tables and dimension tables.
  • Created Mappings using Mapping Designer to load data from various sources like Oracle, Flat Files, MS SQL Server and XML.
  • Used various transformations of Informatica, such as Source Qualifier Transformation, Expression Transformation, Look-up transformation, Update Strategy transformation, Filter transformation, Router transformation, Joiner transformation etc. for developing Informatica mappings.
  • Prepared ETL standards, naming conventions and wrote ETL flow documentation.
  • Imported source and target tables from their respective databases.
  • Created and Monitored Workflows using Workflow Manager and Workflow Monitor.
  • Used shortcuts (Global/Local) to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Performed Repository Administration tasks (Creating Repositories, Users, Assigning privileges, creating backups and recovery).
  • Involved in designing the ETL testing strategies for functional, integration & system testing for Data warehouse implementation.
  • Implemented versioning of folders in the Informatica Designer tool.
  • Used Parameter files to specify DB Connection parameters for sources.
  • Used debugger to test the mapping and fixed the bugs.
  • Created mappings with flat files from different ERP systems, used for Change Data Capture to reduce the load on SAP BI.
  • Prepared unit test cases to meet the business requirements and performed unit testing of mappings.
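Parameter files like the ones used above to hold DB connection details follow Informatica's section-heading layout ([Folder.WF:workflow_name] followed by $- or $$-prefixed assignments). The minimal parser below is a hypothetical sketch of that format, not a tool used on the project:

```python
# Minimal, hypothetical parser for the Informatica parameter-file layout:
# section headings in brackets, parameter=value lines beneath them.
def parse_param_file(text):
    params = {}
    section = None
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#") or line.startswith(";"):
            continue                       # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]
            params[section] = {}
        elif "=" in line and section is not None:
            key, _, value = line.partition("=")
            params[section][key.strip()] = value.strip()
    return params

sample = """
[FIN_DW.WF:wf_daily_load]
$DBConnection_Source=Oracle_FIN_SRC
$$LoadDate=2015-06-30
"""
parsed = parse_param_file(sample)
print(parsed["FIN_DW.WF:wf_daily_load"]["$DBConnection_Source"])  # Oracle_FIN_SRC
```

Keeping connection names in the parameter file rather than in the session lets the same workflow run unchanged across Dev, QA and Production.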

Environment: Informatica PowerCenter 9.5.1, Teradata 14, Oracle 11g, SQL Server 2012, PL/SQL, T-SQL, Toad, Erwin, Teradata SQL Assistant, SQL Server Management Studio, JIRA, Unix, Win 7.

ETL Developer



  • Analyzed the project plan to gather business requirements.
  • Designed and customized data models for a Data Mart supporting data from multiple sources in real time.
  • Extracted data from flat files, Oracle (via SQL*Plus) and MS SQL Server 2008, and loaded the data into the target database.
  • Extensively used Informatica PowerCenter 7.1, an ETL tool, to extract, transform and load data from remote sources to the data warehouse.
  • Created complex mappings using transformations like Filter, Expression, Joiner, Aggregator, Router and Stored Procedure to populate target tables efficiently.
  • Developed complex joins in the mappings to process data from different sources.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors of target data load.
  • Developed workflow tasks like reusable Email, Event Wait, Timer, Command and Decision tasks.
  • Performed unit testing of Informatica sessions, batches and the target Data.
  • Involved in performance tuning of mappings, transformations and (workflow) sessions to optimize session performance.
  • Effectively utilized shared/ persistent caching techniques and incremental aggregation strategies to improve performance.
  • Designed and developed UNIX shell scripts as part of the pre-session and post-session command to automate the process of loading, pulling, renaming and pushing data from and to different servers.
  • Designed and developed SQL, PL/SQL and UNIX shell scripts.

Environment: Informatica PowerCenter Designer 7.1, Oracle 9.x/10g, TOAD, flat files, UNIX, MS SQL Server 2008, SQL, PL/SQL, and SQL*Plus

Informatica ETL Developer


  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL specifications.
  • Involved in designing dimensional modeling and data modeling using Erwin tool.
  • Created high-level Technical Design Document and Unit Test Plans.
  • Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
  • Wrote complex SQL override scripts at source qualifier level to avoid Informatica joiners and Look-ups to improve the performance as the volume of the data was heavy.
  • Responsible for creating workflows; created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
  • Prepared user requirement documentation for mapping and additional functionality.
  • Extensively used ETL to load data using PowerCenter from source systems like flat files into staging tables and then into the target Oracle database. Analyzed the existing systems and made a feasibility study.
  • Analyzed current systems and programs and prepared gap analysis documents.
  • Experience in performance tuning & optimization of SQL statements using SQL Trace.
  • Involved in Unit, System integration, User Acceptance Testing of Mapping.
  • Supported the process steps under development, test and production environment
  • Participated in the technical design along with customer team, preparing design specifications, functional specifications and other documents.
  • Used Transformation Developer to create the reusable Transformations.
  • Used Informatica Powercenter Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
  • Wrote SQL Scripts for the reporting requirements and to meet the Unit Test Requirements.
  • Validated the Mappings, Sessions & Workflows, Generated & Loaded the Data into the target database.
  • Used Informatica’s features to implement Type I, II changes in slowly changing dimension tables and also developed complex mappings to facilitate daily, weekly and monthly loading of data.
  • Extensively worked on Oracle SQL's for Data Analysis and debugging.
  • Wrote scripts to pre-validate source file structures before loading into staging by comparing source file headers against the baselined header.
  • Worked on Teradata utilities (MultiLoad, FastLoad, and Export/Import) to improve performance.
  • Wrote shell scripts and control files to load data into staging tables and then into Oracle base tables using SQL*Loader.
  • Used the pmcmd command to automate PowerCenter sessions and workflows through UNIX.
  • Involved in troubleshooting existing ETL bugs.
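The header pre-validation described above (comparing a source file's header record against the baselined header before loading to staging) can be sketched as follows; the column names and delimiter are hypothetical:

```python
# Sketch of source-file header pre-validation: before loading a file into
# staging, compare its header record against the baselined column list and
# reject the file on any mismatch. File layout here is hypothetical.
def validate_header(first_line, baseline_columns, delimiter="|"):
    """Return (ok, problems) for a source file's header record."""
    actual = [col.strip() for col in first_line.rstrip("\n").split(delimiter)]
    problems = []
    if actual != baseline_columns:
        missing = [c for c in baseline_columns if c not in actual]
        extra = [c for c in actual if c not in baseline_columns]
        if missing:
            problems.append(f"missing columns: {missing}")
        if extra:
            problems.append(f"unexpected columns: {extra}")
        if not problems:
            problems.append("columns out of order")
    return (not problems, problems)

baseline = ["CLAIM_ID", "MEMBER_ID", "AMOUNT"]
ok, problems = validate_header("CLAIM_ID|MEMBER_ID|AMOUNT\n", baseline)
print(ok)                     # True
bad_ok, bad_problems = validate_header("CLAIM_ID|AMOUNT|MEMBER_ID\n", baseline)
print(bad_ok, bad_problems)   # False ['columns out of order']
```

Rejecting a malformed file before the load starts is cheaper than rolling back a partial load from the staging tables.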

Environment: Informatica PowerCenter 8.6, ETL, flat files, Oracle 10g, MS SQL Server 2008, PL/SQL, Shell Programming, TIBCO, SQL*Loader, Toad, Excel, UNIX scripting, Sun Solaris, Windows 2002
