
Data Warehouse Analyst/ETL Developer Resume



  • 8 Years of IT industry experience including analysis, design, development, testing, and defect tracking of Data Warehousing and Client Server Applications.
  • Highly proficient in Data Warehousing ETL using Informatica PowerCenter 6.1/7.1/8.1/8.6/9.0.1, Oracle 10g/9i/8.x/7.x, Business Objects 5.0, Cognos Impromptu 5.0, Transformer 6.5, Power Play 6.5.
  • Expertise in Data Warehousing concepts like Ralph Kimball Methodology, Bill Inmon Methodology, Star Schema, Snowflake Schema, Fact Table, Dimension Table, and Dimensional Data Modeling using Erwin 4.2/4.0/3.5.5/3.5.2, Microsoft Visio.
  • Extensive integration experience working with different data sources such as Oracle, MS SQL Server, SAS, and flat files.
  • Expertise in building Enterprise Data Warehouses (EDW), Operational Data Store (ODS), Data Marts and Decision Support Systems (DSS) using Multidimensional and Dimensional modeling (Star and Snowflake schema) Concepts.
  • Extensive experience in implementing SCD Type 2 using Informatica Power Exchange 9.x/8.x/7.x.
  • Extensive work experience in the Healthcare, Finance, Insurance, Banking, and Pharmaceutical domains.
  • Extensively worked on Informatica Designer components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
  • Have clear understanding of Data warehousing and Business Intelligence concepts with emphasis on ETL and life cycle development Using Power Center, Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Hands on experience with complex mappings from varied transformation logics like Unconnected and Connected lookups, Router, Aggregator, Joiner, Update Strategy and re-usable transformations.
  • Extensive knowledge in architecture design of Extract, Transform, Load environment using Informatica Power Mart and Power Center.
  • Involved in all aspects of ETL- requirement gathering, coming up with standard interfaces to be used by operational sources, data cleaning, coming up with data load strategies, unit testing, integration testing, regression testing and UAT in development.
  • Actively involved in performance tuning of targets, mappings, and sessions, as well as error handling and product support on various platforms.
  • Excellent knowledge of studying the data dependencies using Metadata stored in the Informatica Repository and preparing batches for the existing sessions to facilitate scheduling of multiple sessions.
  • Extensive experience with Callidus TrueComp
  • Extensively worked on ETL mapping analysis and design and on documentation of OLAP report requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Experienced in various data sources like SQL Server, Oracle, DB2, Fixed Width and Delimited Flat Files, Informix and Sybase etc.
  • Expert in Unix Shell Scripting
  • Excellent analytical and logical programming skills with a good understanding at the conceptual level, plus strong presentation and interpersonal skills and a strong desire to achieve specified goals.
  • Strong inclination to put my career on the fast track by learning and implementing new technologies.
  • Ability to function both in a team environment as well as individual capacity.


Environment : UNIX, Sun Solaris 5.8/5.6, AIX 5.3/4.3, HP-UX, DOS, Linux, Windows 98/NT/2000/XP/Vista/7, SOAP.

Database : Oracle 11g/10g/9i/8i, SQL Server, IBM DB2, Teradata, Netezza

Languages : XML, UML, UNIX Shell Scripting (Bourne, Korn), SQL, PL/SQL, T-SQL, C#, .NET framework, Java.

Tools : PuTTY, F-Secure SSH Client 5.3, SQL*Plus, TOAD, Eclipse, SQL*Loader.

ETL Tools : Informatica 9/8.x/7.x, Oracle Warehouse Builder, SQL Server (DTS)

Reporting Tools : OBIEE 11.1/10.x, Microsoft Analysis Services, Oracle reports

Data Modeling : Star Schema, Snowflake Schema, Extended Star Schema, Physical and Logical Modeling, Ralph Kimball Methodology, Bill Inmon Methodology, MS Visio.

Applications : Siebel


Confidential, MI

Data Warehouse Analyst/ETL Developer


  • Created documentation of Design Specifications and Technical Sessions.
  • Extensively used Informatica Power Center 9.0.1 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.
  • Involved in design and development of complex ETL mappings and stored procedures in an optimized manner.
  • Implemented partitioning and bulk loads for loading large volume of data.
  • Involved in loading the data from Source Tables to ODS (Operational Data Store) Tables using Transformation and Cleansing Logic using Informatica.
  • Used Informatica debugging techniques to debug mappings and used session log files and bad files to trace errors that occurred while loading data.
  • Based on the requirements, used various transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner, and Stored Procedure transformations in the mappings.
  • Developed mapplets and worklets for reusability.
  • Implemented weekly error tracking and correction process using Informatica.
  • Involved in performance tuning of mappings, transformations and (workflow) sessions to optimize session performance.
  • Worked on Informatica to load data into Netezza.
  • Developed Informatica SCD Type I, Type II, and Type III mappings. Extensively used almost all of the Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, mapplets, and others.
  • Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.
  • Created Stored Procedures, Packages in PL/SQL with Oracle in order to create, update several tables like Order processing Information table and Audit Log tables.
  • Used the pmcmd command to start, stop, and ping the server from UNIX, and created UNIX shell scripts to automate the process.
  • Extensively used Informatica Data Analyzer to create reports for corporate business planning modules such as Demand Manager, Demand Fulfillment, Master Planner, and Material Allocator.
  • Extensive hands on Schema design, XML import/export, Scheduling, System management using Data Analyzer.
  • Creating Test cases and detailed documentation for Unit Test, System, Integration Test and UAT to check the data quality.
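
The pmcmd automation mentioned above can be sketched as a small POSIX shell wrapper. This is a minimal illustration rather than the project's actual script: the Integration Service, domain, folder, and workflow names are placeholder assumptions, and the command line is built separately from execution so it can be logged or dry-run first.

```shell
#!/bin/sh
# Minimal sketch of a pmcmd automation wrapper. The service, domain,
# folder, and workflow names below are placeholders, not project values.
INFA_SERVICE="IS_DW"          # hypothetical Integration Service name
INFA_DOMAIN="Domain_DW"       # hypothetical domain name
INFA_FOLDER="FLD_SALES"       # hypothetical repository folder

# Build the pmcmd command line for a given action and workflow.
# Separating construction from execution makes the script easy to
# dry-run and to log before anything is actually started.
build_pmcmd() {
    action="$1"; workflow="$2"
    case "$action" in
        start) echo "pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -f $INFA_FOLDER -wait $workflow" ;;
        stop)  echo "pmcmd stopworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -f $INFA_FOLDER $workflow" ;;
        ping)  echo "pmcmd pingservice -sv $INFA_SERVICE -d $INFA_DOMAIN" ;;
    esac
}

# Dry run: print the command instead of executing it.
build_pmcmd start wf_LOAD_ODS
```

Replacing the final `echo`-style dry run with actual execution (and adding credentials, typically read from a secured file) turns this into the real call.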

Environment: Informatica PowerCenter 8.6.1/9.0.1 (Designer, Workflow manager, Workflow monitor), Oracle 10g/9i, XML Files, Flat Files, TOAD 9.7.2, Netezza, PL/SQL, ERWIN 7.3, SQL* PLUS, Teradata, Windows NT, UNIX Shell scripting, Autosys.

Confidential, Towson, MD

Role: Sr. ETL Developer/Informatica


  • Obtained customer data from different systems and aggregated it within the data warehouse using Informatica.
  • Worked with data modelers in preparing logical and physical data models and adding/deleting necessary fields using Erwin
  • Implemented slowly changing dimensions to maintain both current and historical information in dimension tables.
  • Worked on different data sources such as TeraData, Oracle, DB2, Flat files, etc.
  • Used UNIX Korn shell scripts to load data from flat files into Teradata development and production environments. Created UNIX scripts to pre-process the flat files before the MLoad/FastLoad process.
  • Created and scheduled jobs on Windows to extract data from Oracle, DB2, SQL Server and Excel using the utility OLELoad to load into Teradata.
  • Wrote DB2 stored procedures for implementing business rules and transformations.
  • Involved in the preparation of Technical design documents, Source to target (S2T) document, Review checklist and Program Specifications or Technical Specifications.
  • Developed and tested Store procedures, Functions and packages in PL/SQL for Data ETL.
  • Prepared test cases and involved in unit testing of mappings, system testing and user acceptance testing.
  • Written documentation to describe program development, logic, coding, testing, changes and corrections.
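
The flat-file pre-processing ahead of the MLoad/FastLoad runs can be sketched as below. This is a common minimal version (drop the header row, strip Windows carriage returns, remove blank lines); the file layout is an assumption rather than the project's actual format.

```shell
#!/bin/sh
# Sketch of a flat-file pre-processing step before a Teradata
# MLoad/FastLoad run. File names and layout are illustrative only.
preprocess_flat_file() {
    in_file="$1"; out_file="$2"
    # 1d: drop the header row; tr: strip carriage returns from files
    # delivered by Windows systems; grep -v: remove blank lines.
    sed '1d' "$in_file" | tr -d '\r' | grep -v '^$' > "$out_file"
}
```

A real script would typically also validate record counts against a control file before handing the output to the load utility.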

Environment: Informatica Power Center 9.X/8.6.1, OBIEE, UDB DB2 8.1, Oracle 10g/9i, SQL, PL/SQL, XML, Cognos Series 8.3/7.0, MS Access, Windows 2003, UNIX, Business Objects 6.5, Solaris 10.

Confidential, MD

Role: Sr. ETL Informatica Developer


  • Written documentation to describe program development, logic, coding, testing, changes and corrections.
  • Documented technical design documents based on the BRD and SRS.
  • Automated transformation of unstructured (spreadsheets, documents), semi-structured (HL7, SWIFT), and complex structured (ACORD XML) data using Informatica Power Exchange.
  • Collected and linked metadata from diverse sources into a central catalog and automatically connected to different data sources to accelerate the extraction and linkage of metadata using Metadata Manager.
  • Knowledge in upgrading from Informatica version 7.1.4 to 8.6.0.
  • Worked with command line program pmcmd to interact with the server to start and stop sessions and batches, to stop the Informatica server and recover the sessions.
  • Wrote SQL, PL/SQL, stored procedures & triggers, cursors for implementing business rules and transformations.
  • Created complex Cognos reports using calculated data items, multiple lists in a single report.
  • Prepared functional and technical documentation of the reports created for future references

Environment: Informatica Power Center 8.6.1/7.1.4, Oracle 10g/9i, SQL, PL/SQL, XML, Cognos Series 8.3/7.0, AS/400, SAP BW, MM, MS Access, Windows 2003, UNIX, UDB DB2 8.1, Business Objects 6.5

Confidential, San Francisco, CA

Data Warehouse Analyst/ETL Developer


  • Worked with power center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
  • Worked with Informatica Metadata Manager to analyze production data issues.
  • Worked with the Informatica Data Quality (IDQ) tool to build profiles and ensure data quality for consumers.
  • The second process of stage 1 involved generating statistical reports in SAS that could be read by the application software used by the analytical team.
  • Worked on Callidus TrueComp to collect and integrate the different data feeds used to compile sales data.
  • Involved in Data Profiling and Data cleansing process.
  • Responsible for determining the Mapping bottlenecks with Informatica and fixing the issues by tuning the Mappings for better performance.
  • Coordinate production change requests and production releases.
  • Troubleshooting load failure issues and data quality issues on a day to day basis.
  • Maintain the daily ETL schedule and recover the daily failures and generate the daily reports for users.
  • Maintained and enhanced the existing data warehouse, exports, and reports using SQL.
  • Used Informatica Debugger to troubleshoot data and error conditions.
  • Maintain documents for all the Development work done and error fixings performed.
  • Involved in deploying and redesigning of several ETL processes for the existing research line of business
  • Experience in preparing reports on the performance of the UNIX operating systems and applications run on these systems.
  • Wrote code in languages such as C, Perl, and shell.
  • Involved in training the users about working on various UNIX based applications.
  • Responsible for Best Practices like naming conventions, and Performance Tuning.
  • Developed Reusable Transformations and Reusable Mapplets.
  • Worked extensively with session parameters, Mapping Parameters, Mapping Variables and Parameter files for Incremental Loading
  • Used tools like TOAD and SQL navigator to run the queries and validate the data loaded into Marts and different other layers like Data Warehouse and Persistent Staging.
  • Took knowledge transfer from a different vendor two weeks before Go Live and handled various performance issues effectively.
  • Ran UNIX scripts to start the Java web service that generates load-confirmation emails to users.
  • Coordinate with Offshore team on daily basis to handle production support issues and get daily updates from the team on the development work.
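
The parameter-file-driven incremental loading mentioned above is often implemented with a small shell step that rewrites the parameter file from a persisted watermark before each run. Below is a sketch under assumed folder, workflow, session, and parameter names (none taken from the project).

```shell
#!/bin/sh
# Sketch of generating an Informatica parameter file for incremental
# loading: the last successful run timestamp is kept in a state file
# and exposed to the mapping as a mapping parameter. All names here
# (folder, workflow, session, $$LAST_RUN_TS) are illustrative.
STATE_FILE="${STATE_FILE:-/tmp/last_run.dat}"
PARAM_FILE="${PARAM_FILE:-/tmp/incr_load.par}"

write_param_file() {
    # Default to a wide-open window on the very first run.
    last_run=$(cat "$STATE_FILE" 2>/dev/null || echo '1970-01-01 00:00:00')
    {
        echo "[FLD_SALES.WF:wf_INCR_LOAD.ST:s_m_LOAD_FACT]"
        echo "\$\$LAST_RUN_TS=$last_run"
    } > "$PARAM_FILE"
}

mark_success() {
    # Called after a successful load: advance the watermark.
    date '+%Y-%m-%d %H:%M:%S' > "$STATE_FILE"
}
```

A scheduler entry would typically call `write_param_file`, start the workflow with the generated parameter file, and call `mark_success` only if the load completed cleanly.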

Environment: Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer, Worklet Designer), Callidus TrueComp, SAS, IDQ, Power Exchange (SAP), UNIX, Oracle 9i/11g, SQL, PL/SQL, TOAD, Informatica Metadata Manager, Informatica Data Quality, Erwin, Teradata

Confidential, CA

ETL Developer/Database Developer


  • Developed ETL procedures to ensure conformity, compliance with minimal redundancy, translated business rules and functionality requirements into ETL procedures.
  • Performed Data mapping, logical data modeling, created class diagrams and ER diagrams and used SQL queries to filter data.
  • Created a test plan and a test suite to validate the data extraction, data transformation and data load and used SQL and Microsoft Excel.
  • Imported source/target tables from the respective databases and created reusable transformations and mappings using the Designer tool set of Informatica.
  • Developed Informatica mappings by usage of aggregator, SQL overrides in lookups, source filter and source qualifier and data flow management into multiple targets using router transformations. Creating sessions and work-flows for the Informatica mappings.
  • Used various transformations of Informatica, such as Source Qualifier Transformation, Expression Transformation, Look-up transformation, Update Strategy transformation, Filter transformation, Router transformation, Joiner transformation etc., for developing Informatica mappings
  • Created Informatica mappings to load the data from staging to dimensions and fact tables.
  • Loaded data into a staging area with SQL*Loader and then transformed it; reduced load time by using partitions and running concurrent sessions.
  • Involved in developing Logical and Physical Data Models based on functional design using Erwin Data Modeler.
  • Used External tables in transformation and loading of data with a single SQL DML statement.
  • Used automated shell scripts to automate the export data into flat files for backup and delete data from staging tables for the given time period and sending emails about the statistics on regular basis.
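
The export-and-purge automation in the last bullet can be sketched roughly as follows. Directory names and the retention window are illustrative assumptions, and the email notification is reduced to a summary line.

```shell
#!/bin/sh
# Sketch of a backup/cleanup step: compress flat-file extracts into a
# backup area, purge staging files older than a retention window, and
# report a count. Paths and retention are assumptions, not project values.
archive_and_purge() {
    src_dir="$1"; backup_dir="$2"; keep_days="$3"
    mkdir -p "$backup_dir"
    # Back up every extract file as a gzipped copy.
    for f in "$src_dir"/*.dat; do
        [ -f "$f" ] || continue
        gzip -c "$f" > "$backup_dir/$(basename "$f").gz"
    done
    # Drop staging files past the retention period.
    find "$src_dir" -name '*.dat' -mtime +"$keep_days" -exec rm -f {} \;
    # A mail step (e.g. mailx) would normally report this count instead.
    echo "archived: $(ls "$backup_dir" | wc -l | tr -d ' ')"
}
```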

Environment: Informatica Power Center 7.1.1, PL/SQL, UNIX Shell Scripting, DB2 SQL, Oracle 10g, XML, Erwin, Business Objects 5.1, UML, SQL*Loader, eclipse.

Confidential, Columbus, OH

Informatica Developer


  • Involved in all phases of the SDLC, from requirements, design, development, testing, pilot, and training through rollout to field users and production support.
  • Extensively Designed, Developed, Tested Informatica mappings, mapplets and Reusable Transformations to load data from various sources
  • Extensively Worked on Informatica tools such as Source Analyzer, Data Warehouse Designer, Transformation Designer, Mapplet Designer and Mapping Designer.
  • Extensively used all the transformations, like Source Qualifier, Aggregator, Filter, Joiner, Sorter, Lookup, Update Strategy, Router, Sequence Generator, etc., and used transformation language features like expressions, constants, system variables, and data format strings.
  • Created workflow design document, mapping design document, Integration test plan document, Unit test plan document etc.
  • Extensive experience in loading the data from multiple sources like flat files, other RDBMS databases.
  • Involved in running the loads to the data warehouse and data mart involving different environments.
  • Used the update strategy to effectively flag the data from source to target.
  • Used PL/SQL programming in back-end and front-end functions, procedures, and packages to implement business rules.
  • Created, Configured and Scheduled the Sessions and Batches for different mappings using workflow Manager and UNIX Scripts.
  • Integrated various sources into the Staging area in Data warehouse.
  • Created necessary Repositories to handle the Metadata in the ETL process.
  • Created the Source and Target Definitions in Informatica Power Center Designer.
  • Developed shell scripts to automate the data loading process and to cleanse the flat file inputs
  • Involved in Data Quality Assurance, created validation routines, unit test cases and test plan.
  • Provide production support as needed.
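
One hedged sketch of the flat-file cleansing step described above: rows whose field count does not match the expected layout are diverted to a reject file before the load. The pipe delimiter and the field count are assumptions, not the project's actual format.

```shell
#!/bin/sh
# Sketch of flat-file input cleansing ahead of a data load: split an
# input file into loadable rows and rejects based on field count.
# The '|' delimiter and expected column count are illustrative.
cleanse_file() {
    in_file="$1"; good="$2"; bad="$3"; nfields="$4"
    awk -F'|' -v n="$nfields" -v bad="$bad" '
        NF == n { print }          # well-formed row: pass through
        NF != n { print > bad }    # malformed row: divert to rejects
    ' "$in_file" > "$good"
}
```

Reject counts from the bad file are a natural input to the validation routines and unit test cases mentioned above.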

Environment: Informatica Power Center 8.1.1, Power Connect, Oracle 9i, PL/SQL, SQL, SQL*PLUS, SQL*LOADER, RDBMS, Teradata, ERWIN 3.5, UNIX Shell Scripting.


ETL Developer


  • Performed ETL using Informatica (Power Center) to load data from MS SQL Server 6.5 and Sybase into the target Oracle 8i database.
  • Worked on Informatica Source Analyzer, Data warehousing designer, Mapping Designer and Transformations.
  • Worked with different transformations like Stored Procedure, Filter, Expression, Aggregator and Joiner.
  • Involved in the development of Informatica mappings and also tuned them for better performance.
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.
  • Created Catalogs, Filters, Calculations, and Prompts using Impromptu Administrator.
  • Created different report files like IMR, IQD and Created Powercubes in Transformer.
  • Created transformations like Stored Procedure, Aggregate, and Expression.
  • Involved in Logical and Physical modeling using ERwin.
  • Tuned the mappings for better performance and maximum efficiency.

Environment: Informatica Power Center 5.1, Cognos Impromptu Administrator, Transformer, IWR, Oracle 8i, SQL, PL/SQL, IIS Server 5.0, Microsoft Analysis Server 2000, JSP, XML, MS SQL Server 2000, ERWIN 4.0, Microsoft PowerPoint, Visio, Word, Excel, UNIX-HP, Windows NT 4.0.


Informatica Developer


  • Creation of mappings using Informatica data integration technology to Extract, Transform and Load data.
  • Test Cases Preparation and Execution using Toad.
  • Implemented Slowly Changing Dimension (SCD) Type 1 and Type 2 for inserting and updating Target tables for maintaining the history with the help of Parameters or Variables and session variables where necessary.
  • Coding and Unit-Testing of Informatica mappings.
  • Supported code migration across environments, from Development to Test, Test to Stage, and on to Production.
  • Prepared various technical and functional documents (where applicable, depending on the request type): Defect Report document, Unit Test Cases document, Code Review Checklist, Checklist for Test Case Preparation, and Template for Unit Test Case document.
  • Played the role of SCM coordinator.
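
The change detection at the heart of the SCD Type 2 loads above can be illustrated on flat extracts. In the real mappings this comparison lived in Lookup and Update Strategy transformations; the pipe-delimited key|attribute layout here is a simplifying assumption.

```shell
#!/bin/sh
# Sketch of SCD Type 2 change detection on pipe-delimited
# key|attribute extracts: unchanged rows are ignored, new keys become
# inserts, and changed keys produce both an expire (close old row)
# and an insert (open new row). File layout is an assumption.
detect_scd2_changes() {
    current="$1"; incoming="$2"; expire="$3"; insert="$4"
    # Pass 1 loads the current dimension; pass 2 classifies incoming rows.
    awk -F'|' -v expire="$expire" -v insert="$insert" '
        NR == FNR     { cur[$1] = $2; next }     # current dimension
        !($1 in cur)  { print > insert; next }   # brand-new key
        cur[$1] != $2 {                          # changed attribute:
            print $1 "|" cur[$1] > expire        #   close the old row
            print > insert                       #   and open a new one
        }
    ' "$current" "$incoming"
}
```

In a Type 1 load, the expire file would be replaced by in-place overwrites; in Type 2, the expire list drives end-dating of the current rows while the insert list becomes the new current versions.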

Environment: Windows, Informatica 7.1.4, Oracle, TOAD, SharePoint.
