
Sr. ETL Informatica Developer Resume

Dayton, OH


  • 8+ years of professional IT experience in the analysis, design, and development of ETL applications and Business Intelligence solutions for data warehousing and reporting across multiple databases.
  • Expertise in full life cycle development of data warehouses.
  • Strong experience providing ETL solutions using Informatica PowerCenter 9.x/8.x/7.x.
  • Experience with dimensional modeling using star schema and snowflake models.
  • Highly proficient in integrating data across multiple databases including Teradata, Oracle, MySQL, SQL Server, DB2, and Mainframe, as well as flat files (delimited and fixed-width).
  • Developed complex mappings in Informatica using various transformations like Source Qualifier, Joiner, Aggregator, Update Strategy, Rank, Router, Java, Lookup - Connected & Unconnected, Sequence Generator, Filter, Sorter, Stored Procedure transformation etc.
  • Good experience with Change Data Capture (CDC) for pulling delta data.
  • Worked on Initial Load Mapping templates for historical load.
  • Expertise in handling and creating processes for Slowly Changing Dimension (SCD) Type 1 and Type 2 to maintain history in dimension tables.
  • Proficiency in developing SQL with various Relational Databases.
  • Experience in creating Stored Procedures, Functions, Views and Triggers.
  • Solid experience with combined Informatica and Teradata load processes.
  • Strong experience using Teradata utilities such as MLOAD, FLOAD, TPUMP, FastExport, and TPT to improve Teradata target load performance; also created BTEQ scripts to load data into base tables.
  • Strong experience in performance tuning of Teradata BTEQ scripts.
  • Strong experience in tuning of different relational databases.
  • Extensively worked with Informatica performance tuning involving source level, target level and mapping level bottlenecks.
  • Expertise in creating mappings for data quality, data cleansing, and data validation.
  • Created shell scripts to run Informatica workflows and control the ETL flow.
  • Experience providing DWH production support on a rotational basis.
  • Experience with scheduling tools: Informatica Scheduler, Maestro, cron, Control-M, and Autosys.
  • Experienced in working in both Waterfall and Agile methodologies.
  • Experience with Agile tools (Rally, Jira) and the Scrum process.
  • Experience with offshore and onsite coordination.
  • Able to work independently and collaborate proactively & cross functionally within a team.
  • Good team player with ability to solve problems, organize and prioritize multiple tasks.
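As a hedged sketch of the SCD Type 2 processing described above, a shell wrapper can generate a BTEQ script that expires changed rows and inserts new versions. The table and column names (stg_customer, customer_dim, eff_start_dt, eff_end_dt) and the logon placeholders are illustrative assumptions, not details from this resume.

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that maintains an SCD Type 2 dimension.
# Table/column names and the .LOGON placeholders are hypothetical examples.
gen_scd2_bteq() {
  out="$1"
  cat > "$out" <<'EOF'
.LOGON tdpid/etl_user,password;

/* Expire current rows whose attributes changed in staging */
UPDATE customer_dim
SET eff_end_dt = CURRENT_DATE - 1, current_flag = 'N'
WHERE current_flag = 'Y'
  AND cust_id IN (
    SELECT s.cust_id FROM stg_customer s
    JOIN customer_dim d
      ON s.cust_id = d.cust_id AND d.current_flag = 'Y'
    WHERE s.cust_name <> d.cust_name);

/* Insert new versions for new and changed customers */
INSERT INTO customer_dim (cust_id, cust_name, eff_start_dt, eff_end_dt, current_flag)
SELECT s.cust_id, s.cust_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg_customer s
LEFT JOIN customer_dim d
  ON s.cust_id = d.cust_id AND d.current_flag = 'Y'
WHERE d.cust_id IS NULL;

.LOGOFF;
.QUIT;
EOF
}

gen_scd2_bteq scd2_load.btq
```

In practice the same expire-then-insert pattern is often built directly as an Informatica mapping with a Lookup and Update Strategy; the generated script above is only one way to run it via Teradata BTEQ.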


  • Informatica Power Center - 8 Plus Years
  • Oracle, Teradata, MySQL, SQL Server, DB2 UDB/BLU, Sybase
  • File System - Flat Files, XML
  • Unix/Linux
  • ETL, DB Performance Tuning
  • Maestro, Crontab, Control-M, Autosys, and Informatica Scheduler
  • Domain knowledge - Health Care, Telecommunications, Banking & Finance


ETL Tools: Informatica Power Center 9.6.1/9.5.1/8.6.1/8.1.1

RDBMS: Oracle 10g/9i/8i, Teradata 15/14/12, DB2, SQL Server, MySQL, Sybase

Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, Erwin, Microsoft Visio.

Reporting Tools: Cognos, Business Objects, Dashboard

QA Tools: Quality Center

Languages: Java, XML, UNIX Shell Scripting, SQL

Operating System: Unix, Linux, Windows


Confidential, Dayton, OH

Sr. ETL Informatica Developer


  • Analyzed Business Requirement Documents (BRD) and Source-to-Target Mapping documents (STM) to define the ETL process before development.
  • Used Informatica PowerCenter to pull data from different source systems and files, then cleanse, transform, and load the data into Teradata using Teradata utilities.
  • Extracted data from the Oracle GoldenGate system.
  • Helped establish the data pumps for Oracle GoldenGate.
  • Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
  • Used most of the transformations, such as Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookup, Joiner, Java, Update Strategy, and Stored Procedure.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
  • Used the Command task extensively to execute UNIX scripts from Informatica.
  • Developed mappings for SCD Type 1 and SCD Type 2 dimensions.
  • Created process to handle Change Data Capture (CDC).
  • Involved in source data profiling and tuning of queries for source data extraction.
  • Involved in Initial loads, Incremental loads and Daily loads to ensure that the data is loaded in the tables in a timely and appropriate manner.
  • Involved in Enhancing existing Production Objects for additional reporting requirements.
  • Tuned the performance of mappings by following Informatica best practices and applied several methods to reduce workflow run times.
  • Worked extensively on Informatica partitioning when dealing with huge data volumes, and partitioned tables in Teradata for optimal performance.
  • Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
  • Extensively worked in the performance tuning of Teradata BTEQ.
  • Worked with PowerExchange to extract data from the mainframe.
  • Worked extensively with different Caches such as Index cache, Data cache and Lookup cache (Static, Dynamic and Persistence) while developing the Mappings.
  • Involved in migrating a big data HiveQL process to Informatica and Teradata.
  • Involved in analysis for a future migration from Informatica to SSIS, and gained knowledge of the SSIS tool.
  • Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
  • Involved in DWH data analysis for reporting and wrote SQL for ad hoc reporting purposes.
  • Wrote UNIX wrapper scripts for various purposes: invoking the built-in pmcmd commands, archiving logs, purging data, and FTPing files.
  • Coordinated with the reporting team for a better understanding of DWH data.
  • Involved in writing technical design documentation along with deployment activities.
  • Scheduled Informatica jobs and implemented dependencies where necessary using Maestro.
  • Involved in heavy production support on a rotational basis.
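One of the UNIX wrapper tasks mentioned above, archiving old logs, can be sketched as follows; the SessLogs directory name and the seven-day retention default are assumptions for the example, not values from this resume.

```shell
#!/bin/sh
# Illustrative log-archiving wrapper for Informatica session/workflow logs.
# LOG_DIR and RETENTION_DAYS are assumed defaults -- adjust per environment.
LOG_DIR="${LOG_DIR:-./SessLogs}"
RETENTION_DAYS="${RETENTION_DAYS:-7}"

mkdir -p "$LOG_DIR"

archive_logs() {
  # Compress plain .log files older than the retention window, in place.
  find "$LOG_DIR" -type f -name '*.log' -mtime +"$RETENTION_DAYS" \
    -exec gzip -f {} \;
}

archive_logs
```

A wrapper like this would typically be run from cron or Maestro after the nightly load window, keeping recent logs readable while reclaiming space.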

Environment: Informatica PowerCenter 9.6.1/9.5.1, SSIS, Teradata 15/14/13.10, Oracle 10g, Oracle GoldenGate, DB2, Hadoop, HiveQL, XML, Flat Files, Teradata SQL Assistant, Toad, SQL, PL/SQL, UNIX shell scripting, Cognos, SVN, UNIX, Windows.

Confidential, Alpharetta, GA

Sr. ETL Informatica Developer


  • Worked with Business Analysts to understand requirements.
  • Worked on the Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Designed reference data and data quality rules using IDQ and was involved in cleansing data in the Informatica Data Quality 9.5 environment.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Review existing code, lead efforts to tweak and tune the performance of existing Informatica processes.
  • Scheduled sessions to extract, transform, and load data into the warehouse database per business requirements.
  • Led the Informatica team, coordinating with various groups to understand requirements, test code, migrate code, and support production.
  • Hands-on experience integrating Informatica with Teradata.
  • Worked in Power Exchange to connect and import sources from external systems like SAP R/3, DB2, Salesforce, Mainframes, and AS/400.
  • Supported team members in programming, review and coding of ETL applications.
  • Designed the production support architecture to schedule tasks without manual intervention using UNIX shell scripting.
  • Experience in unit testing and system testing while moving code from development to production.
  • Created Teradata utility scripts (MLOAD, FLOAD, and TPump) for loading data in various phases.
  • Worked on Real time component like registration and restart token setup of PowerExchange.
  • Created sessions, configured workflows to extract data from various sources, transformed data, and loading into data warehouse.
  • Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and backup of repository and folder.
  • Involved in performance tuning at the source, target, mapping, session, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
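The pmcmd usage described above (starting a workflow from a shell script and acting on its return code) might be sketched like this; the Integration Service name, domain, folder, and credential variables are placeholder assumptions.

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper: start a workflow and wait for completion.
# IntService, Domain_Dev, and the INFA_* variables are placeholders;
# PMCMD can be overridden to point at a stub for testing.
PMCMD="${PMCMD:-pmcmd}"

run_workflow() {
  folder="$1"
  workflow="$2"
  # -wait blocks until the workflow finishes, so $? reflects its outcome.
  "$PMCMD" startworkflow -sv IntService -d Domain_Dev \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f "$folder" -wait "$workflow"
  rc=$?
  if [ "$rc" -ne 0 ]; then
    echo "Workflow $workflow failed with return code $rc" >&2
  fi
  return "$rc"
}

# Example invocation (requires a running PowerCenter Integration Service):
# run_workflow DWH_FOLDER wf_load_customer_dim
```

Schedulers such as Autosys or cron can then treat the wrapper's exit code as the job status, which is what makes hands-off scheduling possible.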

Environment: Informatica Power Center 9.5.1/9.1.1, IDQ 9.5.1, Power Exchange, SQL, PL/SQL, Oracle 11g, Flat Files, Autosys, Cron Job, Sybase, UNIX AIX, Agile, Teradata V14, Toad 9.0, Cognos 8.

Confidential, Phoenix, AZ

Sr. ETL/Informatica Developer


  • Interacted with Business Analyst to understand the business requirements.
  • Involved in Understanding the logical and physical data models with Modellers & Data Architects.
  • Involved in staging the data from external sources and was responsible for moving the data into the Warehouse using ETL Informatica.
  • Involved in building the ETL architecture using Informatica 9.1.1/ 8.6.1 and Source to Target mapping to load data into Data warehouse.
  • Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Lookups (Connected, Unconnected), Expression, Aggregator, Update strategy & stored procedure transformation.
  • Used SCD Type 2 to populate the data in a generic way; a single mapping is used to load multiple tables.
  • The scope of this design includes a generic design for implementing tables whose content will be managed in part by end-users of the Data Warehouse. The content changes can be made via the standard production change request process.
  • Created mappings using flat files and relational databases as sources to build update Mappings.
  • Created reusable transformations and mapplets and used them in mappings.
  • Wrote SQL override queries in the Source Analyzer to customize mappings.
  • Debugged mappings using the Informatica Debugger to gain troubleshooting information about data and error conditions.
  • As the requirement is to maintain the history of every change for all columns, have implemented the slowly changing dimension type 2 with effective start and end date of the record.
  • Provided periodic updates to the customer on coding, unit testing, and releases, and acted as coordinator between the development and business teams.
  • Handled UNIX system tasks by generating Pre and Post-Session UNIX Shell Scripts.
  • Analyzed source data and formulated the transformations to achieve the customer requested reports.
  • Performed Unit testing and moved the data into QA.
  • Participated in scheduling workflows, error checking, production support, maintenance as well as in testing of ETL procedures using Informatica session logs.
  • Involved in Performance tuning on sources, targets mappings and SQL (Optimization) tuning.
  • Documented project activity throughout the course of the project, along with failure recovery plans.

Environment: Informatica PowerCenter 9.5.1/9.1.1/8.6.1, MS SQL Server 2005/2008, MySQL, Oracle 11g, PL/SQL, TOAD, Flat Files, Windows, UNIX.

Confidential, Phoenix, AZ

Informatica Developer


  • Designed and developed ETL strategies and mappings from source systems to target systems; the ETL strategies were designed to cater to both initial and incremental loads.
  • Worked with source teams to resolve data quality issues raised by end users.
  • Mappings were developed in Informatica employing transformations like Joiner, Expression, Aggregate, Rank, Lookup, Update Strategy, Filter and Router.
  • Applied Slowly Changing Dimension and Dynamic lookup techniques.
  • Used debugger to validate the mappings and gain troubleshooting information about data and error conditions.
  • Developed pre- and post-session shell scripts and scheduled event-based sessions in the Informatica Workflow Manager.
  • Informatica Metadata repository was created using the Repository Manager as a hub for interaction between the various tools.
  • Used Metadata manager for validating, promoting, importing and exporting repositories from development environment to testing environment.
  • Performed data quality analysis, gathered information to determine data sources, data targets, data definitions, data relationships, and documented business rules.
  • Worked on loading Flat Files in to Data warehouse.
  • Created Stored procedures, collections and packages.
  • Used the Informatica PowerCenter Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.
  • Implemented performance-tuning techniques by identifying and resolving the bottlenecks in source, target, transformations, mappings and sessions.
  • Responsible for identifying the missed records in different stages from source to target and resolving the issue.
  • Extensively used PL/SQL procedures and functions to build business rules.
  • Used parameters and variables (sessions/mappings) extensively for incremental loads.
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Analyzed Session Log files in case the session failed to resolve errors in mapping or session configurations.
  • Set up tasks to schedule loads at the required frequency using the PowerCenter Workflow Manager.
  • Performed DB2 performance tuning of both DB and DBM configuration at the system and database management levels; analyzed ad hoc and online queries and provided trace and log analysis.
  • Worked with DB2 EXPLAIN, RUNSTATS, REORG, and other online and offline utilities.
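The session-log analysis mentioned above can be sketched as a small grep-based scanner; the ERROR/FATAL severity keywords are typical of PowerCenter session logs but are an assumption here and should be checked against the actual log format.

```shell
#!/bin/sh
# Sketch: scan an Informatica session log for error lines to speed up
# failure analysis after a session fails.
scan_session_log() {
  log="$1"
  errors=$(grep -c -E 'ERROR|FATAL' "$log")
  if [ "$errors" -gt 0 ]; then
    echo "$errors error line(s) found in $log:"
    grep -n -E 'ERROR|FATAL' "$log"
    return 1
  fi
  echo "No errors found in $log."
  return 0
}

# Example (hypothetical log file name):
# scan_session_log s_m_load_customer.log
```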

Environment: Informatica PowerCenter 8.6.1/8.1, Oracle 10g/9i, DB2, Flat Files, Toad, PuTTY, WinSCP, UNIX, Windows.


Jr. ETL/Informatica Developer


  • Used update strategy to effectively migrate data from source to target.
  • Moved the mappings from development environment to test environment.
  • Designed ETL Process using Informatica to load data from Oracle, Flat Files to target Oracle Data Warehouse.
  • Interacted with the business community and database administrators to identify business requirements and data realities.
  • Created various transformations as per the business logic like Source Qualifier, Normalizer, Lookup, Stored Procedure, Sequence Generator, Router, Filter, Aggregator, Joiner, Expression and Update Strategy.
  • Created design documents for Informatica mappings based on business requirements.
  • Created Informatica mappings using various Transformations like Joiner, Aggregate, Expression, Filter and Update Strategy.
  • Wrote stored procedures, functions, Packages and used in many Forms and Reports.
  • Wrote database triggers to automatically update tables and views.
  • Involved in the performance improvement project.
  • Involved in designing the testing plan (unit testing and system testing).
  • Tested scripts by running workflows and assisted in debugging the failed sessions.
  • Improving workflow performance by shifting filters as close as possible to the source and selecting tables with fewer rows as the master during joins.
  • Used persistent caches whenever data from workflows were to be retained.
  • Used connected and unconnected lookups whenever appropriate, along with the use of appropriate caches.
  • Created tasks and workflows in the Workflow Manager and monitored the sessions in the Workflow Monitor.
  • Performed maintenance, including managing space, removing bad files, removing cache files, and monitoring services.
  • Set up Permissions for Groups and Users in all Development Environments.
  • Involved in the team meetings and providing status report to project manager.

Environment: Informatica Power Center 7.1/8.1.1, Oracle 9i, Flat Files, OBIEE, PL/SQL, UNIX, Autosys, ERWIN, TOAD, UNIX Shell Scripting.
