
Sr. ETL/Informatica Developer Resume


Cedar Rapids, IA


  • About 7 years of experience in Information Technology with a strong background in database development, data warehousing and ETL processes using Informatica Power Center 9/8.x/7.1.3/7.1.1/6.2 and Power Exchange, including Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Metadata Manager, Mapplet Designer, Transformation Developer), Repository Manager, Repository Server, Workflow Manager and Workflow Monitor.
  • Experience integrating various data sources with multiple relational databases like DB2, Oracle and SQL Server, and integrating data from flat files (fixed-width and delimited).
  • Extensive experience developing ETL processes to support data extraction, transformation and loading using Informatica Power Center.
  • Expertise in full life cycle of ETL (Extraction, Transformation and Loading) using Informatica Power Center (Repository Manager, Server Manager, Mapping Designer, Workflow Manager, Workflow monitor).
  • Involved in the data analysis for source and target systems.
  • Implemented performance tuning techniques at the application, database and system levels.
  • Experience in data extraction from heterogeneous sources using Informatica Power Center.
  • Experience in UNIX shell programming.
  • Solid understanding of Relational (ROLAP) and Multidimensional (MOLAP) modeling, broad understanding of data warehousing concepts, star and snowflake schema design methodologies and metadata management.
  • Expertise in working with Oracle stored programs, packages, cursors, triggers, database links, snapshots, roles, privileges, tables, constraints, views, indexes, sequences, synonyms, Dynamic SQL and SQL*Loader in distributed environments.
  • Experience in using Informatica to populate data into a Teradata DWH.
  • Experience in SQL tuning using hints and materialized views.
  • Expertise in understanding fact tables, dimension tables and summary tables.
  • Involved in the designing and building of Universes using Designer.
  • Good exposure to development, testing, debugging, implementation, documentation, user training & production support.
  • Ability to work effectively in supervisory and non-supervisory environments, as a team member as well as an individual.
  • Excellent analytical, programming, written and verbal communication skills with ability to interact with individuals at all levels.
  • Have good experience with onsite and offshore coordination


ETL Tools: Informatica (Power Center/Power Mart) 9/8.6.1/8.1/7.1.2/6.2/5.1, Power Exchange, Power Connect, SQL Server SSIS/ DTS, Datastage 8.x/7.x

Databases: Oracle 10g/9i/8i, IBM DB2 UDB, Sybase, MS SQL Server 2008/2005/2000, Teradata V2R12/V2R6/V2R5

Programming Languages: C, C++, SQL, PL/SQL, UNIX Shell Scripting, XML, Java, .NET, C#

BI Tools: Business Objects XI r3.1/r2/6.5.1/6.1a/5.1, Cognos

Web Technologies: JavaScript 1.2, HTML 4.0

Others: Erwin 4.1.2/3.5.2, TOAD, SQL*Loader, MS Office, WinSCP (FTP), AutoSys, Rational ClearCase/ClearQuest/ReqPro, Control-M, Tivoli (IBM), MS Visio, Harvest, Mercury Quality Center (defect tracking)


Confidential, Cedar Rapids, IA

Sr. ETL/Informatica Developer


  • Responsible for gathering both functional and technical requirements and for documentation.
  • Worked with the business analysts in requirement analysis to implement the ETL process.
  • Performed requirement analysis and documentation; developed functional and technical specifications, DW and ETL designs, detailed mapping specifications, DFDs and scheduling charts.
  • Developing data models and designing data warehouse in view of the project requirements.
  • Designing, developing, testing, performance tuning and scheduling Datastage jobs.
  • Developing and implementing data masking, encoding, decoding measures using the Datastage and UNIX scripting.
  • Extensively worked on Datastage routines, custom stage and wrappers to handle complex transformations, calculations, encoding, decoding etc.
  • Configuring the Datastage server for enhanced performance and resolving memory scarcity issues.
  • Developed both batch and real time ETL and reporting process in Datastage.
  • Optimized SQL query and streamlined the data flow processes.
  • Analyzed the business requirements and functional specifications.
  • Extracted data from Oracle databases and spreadsheets, staged it in a single location and applied business logic to load it into the central Oracle database.
  • Extensively worked on Informatica Data Explorer (IDE) to profile data and monitor the data issues.
  • Used Informatica Power Center 9 for extraction, transformation and load (ETL) of data in the data warehouse.
  • Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator and Stored Procedure.
  • Developed complex mappings in Informatica to load the data from various sources.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Used FTP connections to write the target to a different remote location.
  • Parameterized the mappings and increased the re-usability.
  • Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Created procedures to truncate data in the target before the session run.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
  • Used the PL/SQL procedures for Informatica mappings for truncating the data in target tables at run time.
  • Extensive knowledge in creating slowly changing dimensions (Type-1 and Type-2).
  • Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.
  • Created a list of the inconsistencies in the data load on the client side so as to review and correct the issues on their side.
  • Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.
  • Wrote documentation to describe program development, logic, coding, testing, changes and corrections.
  • Created Test cases for the mappings developed and then created integration Testing Document.
  • Followed Informatica recommendations, methodologies and best practices.
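The Type-1/Type-2 dimension loads above were implemented inside PowerCenter mappings (Lookup plus Update Strategy); as a minimal, tool-independent sketch of the Type-2 logic only, the current row is expired and a new version inserted. The table and column names here are hypothetical, not from the actual project:

```python
import sqlite3

# Hypothetical customer_dim table standing in for a real warehouse dimension.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE customer_dim (
    cust_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, current_flag INTEGER)""")
conn.execute(
    "INSERT INTO customer_dim VALUES (101, 'Austin', '2010-01-01', '9999-12-31', 1)")

def scd_type2_upsert(conn, cust_id, city, load_date):
    """Expire the current version on change and insert a new one --
    the role the Lookup/Update Strategy transformations played."""
    cur = conn.execute(
        "SELECT city FROM customer_dim WHERE cust_id=? AND current_flag=1",
        (cust_id,))
    row = cur.fetchone()
    if row and row[0] == city:
        return "unchanged"
    if row:  # attribute changed: close out the current version
        conn.execute(
            "UPDATE customer_dim SET end_date=?, current_flag=0 "
            "WHERE cust_id=? AND current_flag=1", (load_date, cust_id))
    conn.execute(
        "INSERT INTO customer_dim VALUES (?,?,?,'9999-12-31',1)",
        (cust_id, city, load_date))
    return "versioned" if row else "inserted"

scd_type2_upsert(conn, 101, "Dallas", "2011-06-01")  # city changed: new version
```

A Type-1 variant would simply UPDATE the attribute in place instead of versioning.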

Environment: Informatica Power Center 8.6/9, Oracle 10g/11g, MS Access, MS SQL Server 2008, SQL, PL/SQL, T-SQL, SQL*Plus, TOAD, Erwin, Windows XP, UNIX, Oracle Applications 11i.

Confidential, CA

Sr. ETL Developer

  • Analyzing the source data coming from different sources and working with business users and developers to develop the Model.
  • Involved in dimensional modeling to design and develop a Star Schema, using ERwin to design Fact and Dimension Tables.
  • Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.
  • Worked with the MLOAD, FASTLOAD, TPUMP and BTEQ utilities of Teradata for faster loading and to improve performance.
  • Created customized MLOAD scripts on the UNIX platform for Teradata loads.
  • Created test tables and work tables on development and production Teradata systems.
  • Developed views on the departmental and claims engine databases to get the required data.
  • Developed application views in Teradata and, using Expression and Router transformations, implemented the Change Data Capture (CDC) process.
  • Involved in source data profiling and source system analysis.
  • Involved in designing the logical and physical data models for the data warehouse.
  • Developed SQL code for data validations and the data computation process on the source DB2 transaction system and on the target warehouse.
  • Involved in massive data cleansing prior to data staging.
  • Developed shell scripts and cron jobs for job execution and automation on the server side.
  • Define strategy for ETL processes, procedures and operations
  • Prepared a handbook of standards and Documented standards for Informatica code development.
  • Administrated users and user profiles and maintained the repository server.
  • Tuning the complex mappings based on source, target and mapping, session level
  • Extensively used pmcmd commands on the command prompt and executed Unix shell scripts to automate workflows and populate parameter files.
  • Extensively used transformations like lookup, router, Aggregator, sequence generator, filter, update strategy, joiner.
  • Handled slowly changing dimensions of Type1/ Type 2 to populate current and historical data to dimensions and fact tables in the Data Warehouse.
  • Document the process for further maintenance and support.
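Workflows like these are typically driven by pmcmd together with parameter files. Below is a small sketch of generating a PowerCenter-style parameter file (a `[Folder.WF:workflow]` section followed by `$$name=value` lines) before the pmcmd call; the folder, workflow and parameter names are hypothetical:

```python
from datetime import date

def write_param_file(path, folder, workflow, params):
    """Write a PowerCenter-style parameter file: one [Folder.WF:workflow]
    section header followed by $$name=value lines."""
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"$${name}={value}" for name, value in params.items()]
    with open(path, "w") as fh:
        fh.write("\n".join(lines) + "\n")
    return path

# Hypothetical folder/workflow/parameter names for illustration.
write_param_file(
    "wf_claims_load.par", "CLAIMS", "wf_claims_load",
    {"LoadDate": date(2011, 3, 1).isoformat(), "SourceSystem": "DB2"})

# A shell wrapper would then invoke the workflow, e.g.:
#   pmcmd startworkflow -sv <service> -d <domain> -f CLAIMS \
#         -paramfile wf_claims_load.par -wait wf_claims_load
```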

Environment: Informatica Power Center 8.5.1/7.1.4, Teradata v2r12, Business Objects XI, Oracle 10g, MS SQL Server, Toad, PL/SQL, SQL, XML

Confidential, Houston TX



  • Involved with requirement gathering and analysis for the data marts focusing on data analysis, data quality, data mapping between ODS, staging tables and data warehouses/data marts.
  • Designed and developed processes to support data quality issues and detection and resolutions of error conditions.
  • Working with the Business Analysts and the QA team for validation and verification of the development.
  • Extracted data from flat files, Oracle, DB2, Mainframe files and SQL Server 2008, and loaded it into the target database.
  • Wrote T-SQL scripts to validate and correct inconsistent data in the staging database before loading data into databases.
  • Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.
  • Implemented various scenarios related to slowly growing targets and slowly changing dimensions (Type 1, Type 2, Type 3).
  • Implemented various business rules of data transformations using Informatica transformations like Normalizer, Source Qualifier, Update Strategy, Lookup (connected/unconnected, static cached/dynamic cached), Sequence Generator, Expression, Aggregator, XML (source and generator) and Stored Procedure.
  • Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
  • Worked with newer Informatica transformations like Java transformation, Transaction Control.
  • Used Teradata as a Source and a Target for few mappings. Worked with Teradata loaders within Workflow manager to configure Fast Load and MultiLoad sessions.
  • Experience with Teradata as the target for the datamarts, worked with BTEQ, Fast Load and MultiLoad
  • Provided administrative functions like creating repositories, backing up repositories, setting up users, assigning permissions and setting up folders in Repository manager.
  • Wrote shell script utilities to detect error conditions in production loads and take corrective actions, wrote scripts to backup/restore repositories, backup/restore log files.
  • Heavily involved with performance tuning of Oracle database - using TKProf utility, working with partitioned tables, implementing layer of materialized views to speed up lookup queries, working with Bitmap indexes for dimension tables, DBMS Stats package to update statistics, using SQL hints.
  • Wrote PL/SQL stored procedures/functions to read and write data for the Control Processes at ODS and CDM levels.
  • Extensively used pmcmd command to invoke the workflows from Unix shell scripts
  • Scheduled workflows using an AutoSys job plan.
  • Did QA of ETL processes, migrated Informatica objects from development to QA and production using deployment groups.
  • Provided production support and involved with root cause analysis, bug fixing and promptly updating the business users on day-to-day production issues.
  • Co-ordinated with the off-shore teams and mentored junior developers.
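The error-detection utilities above were shell scripts; the core idea can be sketched as a source-vs-target row-count reconciliation. The table names and in-memory databases below are stand-ins for the real ODS and warehouse connections:

```python
import sqlite3

def reconcile_counts(src, tgt, tables):
    """Compare row counts between source and target connections and
    return the tables that are out of sync, so a wrapper script can
    alert or take corrective action."""
    mismatches = {}
    for table in tables:
        n_src = src.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        n_tgt = tgt.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        if n_src != n_tgt:
            mismatches[table] = (n_src, n_tgt)
    return mismatches

# Demo: two in-memory databases standing in for ODS and warehouse.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for c in (src, tgt):
    c.execute("CREATE TABLE claims (id INTEGER)")
src.executemany("INSERT INTO claims VALUES (?)", [(1,), (2,), (3,)])
tgt.executemany("INSERT INTO claims VALUES (?)", [(1,), (2,)])  # one row short
```

In production the same check would run against the real databases after each load, with mismatches written to the exception report.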

Environment: Informatica Power Center 8.6, Oracle 10g/9i, AutoSys, Erwin 4.5, CMS, MS PowerPoint, MS Visio, TOAD 9.0, PL/SQL, UNIX, SQL*Loader, MS SQL Server 2005/2008.

Confidential, Des Moines, IA

Role: Informatica Developer


  • Responsible for gathering both functional and technical requirements and for documentation.
  • Worked with the business analysts in requirement analysis to implement the ETL process
  • Created Data dictionary using ERWIN modeler.
  • Involved in low-level design for the scripts of the database sequences, constraints, triggers and stored procedures.
  • Created low-level documents for creating maps to load the data from the ODS through the warehouse.
  • Merging several flat files into one XML file.
  • Used loading techniques like Slowly Changing Dimensions and Incremental Loading using parameter files and mapping variables
  • Experience with Data Quality.
  • Used the Power Center client tools Designer, Workflow Manager, and Workflow Monitor
  • Worked with various Active transformation like Filter, Sorter, Aggregator, Router, and Joiner transformations
  • Worked with Slowly Changing Dimensions Type1, Type2, and Type3 for Data Loads
  • Developed a batch file to automate the task of executing the different workflows and sessions associated with the mappings. Created workflows using Workflow Manager for different tasks like sending email notifications, timers that trigger when an event occurs, and sessions to run a mapping.
  • Used designer debugger to test the data flow and fix the mappings. Tuned Informatica Mappings and Sessions for optimum performance
  • Implemented integration of Datamart with Weblogic Server for creating connection pools and Data Sources of Oracle, SQL drivers.
  • Responsible for Performance Tuning at the Mapping Level, Session Level, Source Level and Target Level
  • Worked with the Joiner transformation using Normal Join, Master outer join, Detail Outer Join and Full Outer Join
  • Worked with connected and unconnected Lookups and configured them for implementing complex logic.
  • Designed Mappings using Informatica Designer to load incrementally.
  • Created re-usable transformations and mapplets
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning
  • Worked with Session Logs, and Workflow Logs for Error handling and Troubleshooting in DEV environment
  • Involved in Unit Testing, User Acceptance Testing (UAT) to check whether the data loads into target are accurate, which was extracted from different source systems according to the user requirements.
  • Worked as an onshore-offshore coordinator
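Incremental loading with parameter files and mapping variables, as described above, boils down to persisting a high-water mark between runs. A minimal sketch of that pattern outside the tool, using a hypothetical orders table:

```python
import sqlite3

def incremental_extract(conn, state):
    """Pull only rows newer than the persisted high-water mark, then
    advance the mark -- the role a persisted mapping variable plays."""
    last = state.get("last_ts", "1900-01-01")
    rows = conn.execute(
        "SELECT id, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last,)).fetchall()
    if rows:
        state["last_ts"] = rows[-1][1]  # persist for the next run
    return rows

# Demo with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
conn.executemany("INSERT INTO orders VALUES (?,?)",
                 [(1, "2009-01-01"), (2, "2009-02-01"), (3, "2009-03-01")])
state = {}
first = incremental_extract(conn, state)   # initial run: everything
second = incremental_extract(conn, state)  # next run: nothing new
```

In PowerCenter the `state` dictionary corresponds to a mapping variable saved in the repository (or a value written to a parameter file) after each session run.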

Environment: Informatica Power Center 7.1.4, Informatica Power Exchange, Oracle 9i/10g, SQL Server 2005, Toad 8.6, DB2, UNIX AIX 5.2, ERWIN 4.0, Flat Files, Hyperion 8.3

Confidential, NY

Sr. DW Consultant


  • Worked with users to understand their reporting requirements and translate those requirements to extract data, and load data in the form of a report.
  • Tested Informatica 8.0 with all functionalities for migration of Informatica 6.2/7.1.2 to 8.0
  • Migrated repository & folders from 7.1.2 to 8.0
  • Analyzed business and systems specifications and developed logic flowcharts
  • Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Java, Update Strategy, joiner and Rank transformations.
  • Implemented Informatica Power Center for building Star Schema in Oracle Data Warehouse from different OLTP systems
  • Implemented bulk load method with SQL Loader for loading history data to staging area.
  • Defined UNIX Batch scripts for automation of execution of Informatica workflows.
  • Executed Multi load scripts for daily batch jobs.
  • Responsible to tune ETL procedures and schemas to optimize load and query Performance.
  • Implemented business rules by using database triggers.
  • Generated Drill Up, Drill Down and Drill Through reports using Business objects based on user requirements.
  • Improved Application performance by fine-tuning application using TKPROF and EXPLAIN PLAN
  • Created several materialized views for reporting purpose and better performance.
  • Expertise in setting up UNIX cron jobs using crontab.
  • Scheduled Informatica jobs using the AutoSys scheduler to run at regular intervals.
  • Experience in SQL tuning using hints and materialized views.
  • Extensively worked in the performance tuning for the mappings and ETL Procedures both at designer and session level.
  • Using Dynamic SQL and SQL*Loader in distributed environment.
  • Expertise in working with Oracle Stored Programs, Packages, Cursors, Triggers, Database Link, Snapshot, Tables, Constraints, Views, Indexes, Sequences.
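The SQL*Loader history loads above follow the usual bulk pattern: parse the delimited file and insert it in one batch rather than row by row. A minimal sketch with a hypothetical `id|name` layout (the real layouts were defined in SQL*Loader control files):

```python
import csv
import io
import sqlite3

# Hypothetical pipe-delimited history extract, standing in for a real file.
raw = io.StringIO("1|alpha\n2|beta\n3|gamma\n")
rows = [(int(r[0]), r[1]) for r in csv.reader(raw, delimiter="|")]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hist_stage (id INTEGER, name TEXT)")
# One bulk batch instead of per-row inserts, mirroring direct-path loading.
conn.executemany("INSERT INTO hist_stage VALUES (?,?)", rows)
conn.commit()
```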

Environment: Informatica Power Center 8.0/7.1.2, Oracle 10g, SQL, PL/SQL, Business Objects, UNIX, Shell Scripts, TOAD


DW Developer


  • Gathered requirements and understood the functional business processes and requirements given by the Business Analyst.
  • Involved in designing high-level technical documentation based on specifications provided by the Manager.
  • Created mappings and workflows for Nissan Extended Services North America (NESNA) and Nissan Acceptance Holding Company (NAHC).
  • Performed extraction, loading and unit testing using both SAP and non-SAP sources.
  • Worked on different parallelism concepts in Ab Initio.
  • Experienced in ETL administration responsibilities for enterprise data using the Informatica tool.
  • Involved in writing shell scripts to run ETL jobs.
  • Performed Power Exchange Change Data Capture for data updates.
  • Produced technical documentation, with business logic, for all the mappings.
  • Worked as an offshore coordinator.
  • Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various needs of the transformations while loading the data.
  • Used Informatica Designer for developing mappings, using transformations for aggregation, updates, lookups and summation. Developed sessions using Server Manager and improved their performance.
  • Created transformations like Aggregate, Expression, Filter, Sequence Generator, Joiner, and Stored procedure transformations.
  • Created reusable transformations and mapplets, and used them wherever transformations were reused across different mappings.
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades. Extensively used Informatica for loading the historical data from various tables for different departments.
  • Involved in creating Technical Specification Document (TSD) for the project.
  • Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
  • Involved in the development of Data Mart and populating the data marts using Informatica.
  • Created sessions to run the mappings. Created mapplets to improve the Performance.
  • Worked with offshore clients and maintained a good relation with them.
