
ETL Informatica Developer Resume

Rockville, MD


  • 5+ years of experience with Informatica Power Center in all phases of analysis, design, development, implementation, and support of data warehousing applications using Informatica Power Center 10.x/9.x/8.x, IDQ, Informatica Developer, IDE, SSIS, and IDS.
  • Experience in the analysis, design, and development of enterprise-level data warehouses using Informatica, and in the complete Software Development Life Cycle (SDLC): requirement analysis, design, development, and testing.
  • Experience in data modeling and data analysis using dimensional and relational data modeling, Star/Snowflake schema modeling, fact and dimension tables, and physical and logical data modeling.
  • Expertise in developing standard and reusable mappings using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Lookup, and Router.
  • Experienced in integrating various data sources, such as Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005/2000, Teradata, Netezza, Sybase, DB2, flat files, XML files, and Salesforce, into staging areas and different target databases.
  • Designed complex mappings, with expertise in performance tuning and in loading slowly changing dimension and fact tables.
  • Excellent working experience in the insurance industry, with strong business knowledge of the Auto, Life, and Health Care lines of business.
  • Worked with scheduling tools: Informatica Scheduler, AutoSys, Tivoli/Maestro, and Control-M.
  • Experience in PL/SQL programming, including writing stored procedures and functions.
  • Experience in creating complex mappings using various transformations and in developing strategies for Extraction, Transformation, and Loading (ETL) using Informatica 10.x/9.x/8.x/7.x/6.x.
  • Experience in source-system analysis and data extraction from various sources such as flat files, Oracle 11g/10g/9i/8i, IBM DB2 UDB, and XML files.
  • Extensively worked with Informatica Data Quality and Informatica Power Center throughout a complete IDQ project.
  • Designed and developed IDQ mappings for address validation/cleansing, doctor master-data matching, data conversion, exception handling, and reporting of exception data.
  • Documented source/target row counts, analyzed the rejected rows, and reloaded them.
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Experience in UNIX shell scripting, Perl scripting and automation of ETL Processes.
  • Experience in support and knowledge transfer to the production team.
  • Prepared user requirement documentation for mapping and additional functionality.
  • Extensively used ETL to load data via Power Center/Power Exchange from source systems such as flat files and Excel files into staging tables, and loaded the data into the target Oracle database. Analyzed the existing systems and performed a feasibility study.


ETL Tools: Informatica Power Center 10.x/9.x/8.x, Informatica Master Data Management (MDM), Informatica Data Quality (IDQ), Informatica Cloud (basic)

Databases: Oracle 12c/11g/10g/9i, SQL Server 2014/2008/2005, Teradata

Data Modeling Tools: Erwin 4.1, MS Visio, SQL*Loader, Star and Snowflake schema modeling

Scheduling Tools: AutoSys, Control-M, Informatica Scheduler

Languages: SQL, T-SQL, PL/SQL, C, C++

Scripting Languages: UNIX shell scripting (Korn shell, Bash)

Operating Systems: Windows, MS-DOS, Linux, UNIX


Confidential - Rockville, MD

ETL Informatica Developer


  • Gathered requirements and analyzed, designed, coded, and tested highly efficient, scalable integration solutions using Informatica, Oracle, SQL, and various source systems.
  • Involved in the technical analysis of data profiling, mappings, formats, and data types, and in the development of data movement programs using Power Exchange and Informatica.
  • Developed the ETL mapping document, covering the data model implementation, the incremental/full load logic, and the ETL methodology.
  • Wrote Teradata SQL bulk programs and performed tuning of Teradata SQL statements, using Teradata EXPLAIN and PMON to analyze and improve query performance.
  • Developed various ETL processes for complete end-to-end data integration between DB2 and Oracle.
  • Designed and developed mappings, transformations, sessions, workflows, and ETL batch jobs to load data from sources to stage using Informatica, T-SQL, UNIX shell scripts, and Control-M scheduling.
  • Developed various mappings for extracting data from different source systems using Informatica and PL/SQL stored procedures.
  • Developed mappings for extracting data from different types of source systems (flat files, XML files, relational sources, etc.) into the data warehouse using Power Center.
  • Used Informatica Power Center 10.1.0 to extract, transform, and load data into the Netezza data warehouse from sources such as Oracle and flat files, and was responsible for creating shell scripts to invoke Informatica workflows through the command line.
  • Developed UNIX shell scripts for data extraction and for running pre/post processes and PL/SQL procedures.
  • Used Teradata SQL Assistant, Teradata Administrator, and PMON, along with data load/export utilities such as BTEQ, FastLoad, MultiLoad, FastExport, TPump, and TPT, in UNIX/Windows environments to run batch processes for Teradata.
  • Developed standard mappings using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Router, Lookup, and Filter.
  • Participated in business analysis and technical design sessions with business and technical staff to develop entity-relationship/data models, requirements documents, and ETL specifications; dug deep into complex T-SQL queries and stored procedures to identify items that could be converted to Informatica Cloud ISD.
  • Gathered and documented business requirements from end clients and translated them into report specifications for the Tableau platform.
  • Created Informatica mappings with PL/SQL stored procedures/functions to incorporate critical business functionality into data loads.
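The shell scripts that invoke Informatica workflows from the command line typically wrap pmcmd, Informatica's command-line client. A minimal sketch, in which the service, domain, folder, and workflow names are hypothetical placeholders (the flags follow pmcmd's documented startworkflow syntax; -uv/-pv name environment variables holding the credentials rather than hard-coding them):

```shell
#!/bin/sh
# Sketch of a wrapper for starting an Informatica workflow via pmcmd.
# All names below (service, domain, folder, workflow) are placeholders,
# not values from an actual project.

INFA_SERVICE="IS_DEV"        # Integration Service (placeholder)
INFA_DOMAIN="Domain_Dev"     # Informatica domain (placeholder)
INFA_FOLDER="DW_LOADS"       # repository folder (placeholder)

build_pmcmd_cmd() {
    # -uv/-pv name environment variables holding the user and password;
    # -wait blocks until the workflow finishes, so the script's exit code
    # can be checked by the scheduler (e.g. Control-M or Tivoli)
    printf 'pmcmd startworkflow -sv %s -d %s -uv INFA_USER -pv INFA_PASSWD -f %s -wait %s\n' \
        "$INFA_SERVICE" "$INFA_DOMAIN" "$INFA_FOLDER" "$1"
}

# Print the command rather than executing it, since pmcmd is only
# available on a machine with the Informatica client installed:
build_pmcmd_cmd "wf_load_stage"
```

In a real script the built command would be executed and its exit status checked, with a non-zero status propagated back to the scheduler.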

Environment: Informatica Power Center 10.2.0, Power Exchange, Oracle 12c, Netezza, Netezza Aginity, Teradata, SQL & PL/SQL, SQL Server, Korn shell scripting, XML, T-SQL, TOAD, UNIX, Linux, Windows, Excel, Tableau, flat files, Tivoli.

Confidential, Columbus, OH

ETL/Informatica Developer


  • Participated in design and analysis sessions with business analysts, source-system technical teams, and end users.
  • Responsible for developing and maintaining ETL jobs, including ETL implementation and enhancements, testing and quality assurance, troubleshooting issues, and ETL/query performance tuning.
  • Designed, developed, maintained, and supported data warehouse and OLTP processes via Extract, Transform and Load (ETL) software using Informatica.
  • Managed and expanded the current ETL framework for enhanced functionality and expanded sourcing.
  • Translated business requirements into ETL and report specifications. Performed error handling using session logs.
  • Worked with Power Center tools: Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Involved in creating database objects such as tables, views, procedures, triggers, and functions using T-SQL to provide definition and structure and to maintain data efficiently.
  • Performed code reviews of ETL and SQL processes. Worked on upgrading Informatica from version 9.x to 10.x.
  • Created complex mappings in Power Center Designer using Aggregator, Expression, Filter, and Sequence Generator transformations.
  • Identified performance issues in the code and tuned it to complete jobs within the given SLA window.
  • Performed tuning at the Informatica and database levels depending on the bottlenecks identified in the jobs.
  • Developed mappings using Informatica Power Center transformations: Lookup (connected and unconnected), Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, and Sequence Generator.
  • Effectively used Informatica parameter files to define mapping variables, workflow variables, FTP connections, and relational connections.
  • Worked with the Informatica Data Quality 9.x (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Performed relational data modeling, ER diagrams (forward and reverse engineering), dimensional modeling, OLAP multidimensional cube design and analysis, definition of slowly changing dimensions, and surrogate key management.
  • Worked with the testing team to resolve bugs in day-one ETL mappings before production.
  • Maintained an ETL release document for every release and migrated code to higher environments through deployment groups.
  • Created change requests before code deployment to production to obtain the required approvals from the business.
  • Created weekly project status reports, tracking the progress of tasks according to schedule and reported any risks and contingency plan to management and business users.
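Informatica parameter files of the kind described above are plain text with bracketed scope headers naming the folder, workflow, and session, followed by variable assignments. An illustrative fragment, in which the folder, workflow, session, connection, and path names are all hypothetical:

```
[DW_FOLDER.WF:wf_daily_load.ST:s_m_load_customers]
$$LOAD_DATE=2015-06-30
$$SOURCE_SYSTEM=CRM
$DBConnection_Source=Oracle_Src_Conn
$DBConnection_Target=SQLServer_Tgt_Conn
$InputFile_Customers=/data/inbound/customers.dat
```

Keeping connections and file paths in the parameter file lets the same mapping run unchanged across development, QA, and production environments.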

Environment: Informatica Power Center 10.x/9.x, IDQ, OLAP, Oracle, MS SQL Server, PL/SQL, SSIS, SSRS, SSAS, SQL*Plus, SQL*Loader, Windows, UNIX.

Confidential, Mayfield Heights, OH

ETL/Informatica Developer


  • Interacted with Business Analysts to understand the business requirements.
  • Involved in understanding the logical and physical data models with modelers and data architects.
  • Involved in staging data from external sources and responsible for moving the data into the warehouse using Informatica ETL.
  • Involved in building the ETL architecture using Informatica 9.1.1/8.6.1 and source-to-target mappings to load data into the data warehouse.
  • Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Lookups (Connected, Unconnected), Expression, Aggregator, Update strategy & stored procedure transformation.
  • Used SCD Type 2 to populate the data in a generic way; a single mapping is used to load multiple tables.
  • The scope of this design included a generic approach for tables whose content is managed in part by end users of the data warehouse; content changes can be made via the standard production change-request process.
  • Created mappings using flat files and relational databases as sources to build update mappings.
  • Created reusable transformations and mapplets and used them in mappings.
  • Wrote SQL override queries in the Source Analyzer to customize mappings.
  • Debugged mappings using the Informatica Debugger to obtain troubleshooting information about data and error conditions.
  • As the requirement was to maintain a history of every change for all columns, implemented Slowly Changing Dimension Type 2 with effective start and end dates on each record.
  • Provided periodic updates to customers on coding, unit testing, and releases, and acted as coordinator between the development and business teams.
  • Handled UNIX system tasks by generating pre- and post-session UNIX shell scripts.
  • Analyzed source data and formulated transformations to achieve the customer-requested reports.
  • Performed unit testing and moved the data into QA.
  • Participated in scheduling workflows, error checking, production support, and maintenance, as well as in testing ETL procedures using Informatica session logs.
  • Involved in performance tuning of sources, targets, mappings, and SQL (optimization).
  • Documented project activity throughout the course of the project, along with failure recovery plans.
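Post-session shell scripts of the kind mentioned above commonly archive the processed flat file so a rerun cannot pick it up twice. A minimal sketch, with hypothetical directory and file names:

```shell
#!/bin/sh
# Sketch of a post-session archive step; the paths shown are placeholders.
# Moves the processed flat file into an archive directory with a timestamp
# suffix so the next session run starts from a clean inbound directory.

archive_flat_file() {
    src="$1"        # processed source file
    arcdir="$2"     # archive directory
    [ -f "$src" ] || { echo "source file not found: $src" >&2; return 1; }
    mkdir -p "$arcdir"
    stamp=$(date +%Y%m%d%H%M%S)
    mv "$src" "$arcdir/$(basename "$src").$stamp"
}

# Example (hypothetical paths):
#   archive_flat_file /data/inbound/customers.dat /data/archive
```

Wired in as a post-session success command, the function's non-zero return on a missing file surfaces the problem in the session log instead of silently continuing.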

Environment: Informatica Power Center 9.5.1/9.1.1/8.6.1, MS SQL Server 2005/2008, MySQL, Oracle 11g, PL/SQL, TOAD, flat files, Windows, UNIX.
