
Informatica Developer Resume Profile



  • Overall 7 years of professional IT experience in Data Warehousing: design, modeling, development, analysis, implementation and testing.
  • Involved in Full Life Cycle Development (Waterfall and Agile) of building a Data Warehouse on Windows and UNIX platforms for the Investment Banking, Financial and Health Care industries.
  • Expert knowledge of Data Warehousing/ETL tools such as Informatica PowerCenter 9.5/9.1/8.6/8.1/7.1/6.1/5.1, PowerMart and PowerExchange.
  • Extensively used Informatica Warehouse Designer to create and manipulate Source and Target definitions, Mappings, Mapplets and Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, XML, Sorter and Sequence Generator.
  • Involved in the POC of a Big Data solution to implement efficient summarization DW processes using the Hadoop platform with Vertica as the DW database.
  • Designed and developed Data Marts using Star Schema and Snowflake Schema methodologies.
  • Developed and executed load scripts using the Teradata client utilities MultiLoad, FastLoad and BTEQ.
  • Well experienced in error handling and troubleshooting using various log files.
  • Experience in implementing update strategies, incremental loads and Change Data Capture (CDC), and in handling SCDs (Slowly Changing Dimensions) using Informatica.
  • Extensively worked on data migration, data cleansing and data staging of operational sources using ETL processes, and provided data mining features for data warehouses.
  • Extensive experience in performance tuning: identified and fixed bottlenecks and tuned complex Informatica mappings for better performance.
  • Good command of databases such as Oracle 11g/10g/9i/8i, BT 13, SQL Server 2000 and MS Access 2003.
  • Worked in the Industrial, Retail, Financial, Healthcare and Banking sectors.
  • Experience in UNIX working environments, writing UNIX shell scripts for Informatica pre- and post-session operations.
  • Experience working through the phases of the Rational Unified Process (RUP).
  • Performed system analysis and QA testing, and was involved in production support.
  • Excellent communication and interpersonal skills; able to work effectively as a team member as well as individually.
  • Worked on HP Application Lifecycle Management (HP ALM) to write test cases.
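The Teradata load scripts mentioned above typically wrap a BTEQ control file in a shell script. Below is a minimal sketch; the TDPID, credentials, table and file names (tdprod, etl_user, stg.customer, /tmp/customer.dat) are all hypothetical placeholders, and the script only prints the generated control file when the Teradata bteq client is not installed.

```shell
#!/bin/sh
# Sketch of a shell wrapper around a BTEQ load script (all names are
# hypothetical: tdprod, etl_user, stg.customer, /tmp/customer.dat).
# The control file is executed only when the Teradata bteq client is
# installed; otherwise the script performs a dry run and prints it.

BTEQ_SCRIPT=/tmp/load_customer.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.IMPORT VARTEXT '|' FILE=/tmp/customer.dat;
.QUIET ON
.REPEAT *
USING (cust_id VARCHAR(10), cust_name VARCHAR(50))
INSERT INTO stg.customer (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.QUIT
EOF

if command -v bteq >/dev/null 2>&1; then
    bteq < "$BTEQ_SCRIPT"
else
    echo "bteq client not found - dry run; generated script follows:"
    cat "$BTEQ_SCRIPT"
fi
```

The `.IMPORT VARTEXT '|'` / `.REPEAT * ... USING` pattern is the standard BTEQ idiom for loading a delimited flat file row by row; MultiLoad or FastLoad would replace it for high-volume loads.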

Technical Skills:

Data Warehousing/ ETL

Informatica PowerCenter 9.5/9.1/8.6/8.1/7.1, Informatica PowerMart 9/8.x/7.x, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Metadata, Data Mart, OLAP, OLTP, Cognos 7.0/6.0, ERwin 4.x/3.x, ODI (Oracle Data Integrator)

Dimensional Data


Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling, ERwin and Oracle Designer.

Database Tools

Oracle 11g/10g/9i/8i/8.x, Teradata, Vertica, DB2 UDB 8.5, MS SQL Server 2005/2000/7.0/6.5, OEM Grid Control, RMAN, SQL Plus, SQL Loader, Net8, TOAD.

Scheduling Tools and Servers

Appworx 6.0, Informatica Workflow Manager, Tidal, Maestro/Tivoli, Apache, Tomcat, IIS.

Programming Languages

Unix Shell Scripting, SQL, PL/SQL, Perl.


Operating Systems

UNIX, Windows XP/NT 4.0, Sun Solaris 2.6/2.7, HP-UX 10.20/9.0, IBM AIX 4.2/4.3

Professional Experience:


Role: Informatica Developer

Confidential is an American online brokerage company based in Omaha, Nebraska. Its brokerage platforms, Think or Swim and Trade Architect, generate most of the firm's revenue. TD Ameritrade Holding Corporation (NYSE: AMTD) is the owner of TD Ameritrade Inc. Services offered include common and preferred stocks, futures, ETFs, option trades, mutual funds, fixed income, margin lending, and cash management services.


  • Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Extensively worked on Power Center 9.1 Designer client tools like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Worked on Teradata, Oracle 11g and AS/400 databases.
  • Developed and executed load scripts using the Teradata client utilities MultiLoad, FastLoad and BTEQ.
  • Extensively worked on transformations like Lookup, Joiner, SQL and Source Qualifier in the Informatica Designer.
  • Extensive experience with SSIS and the Data Dictionary for SQL Server database migration tasks.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup and Router transformations to populate target tables efficiently.
  • Modified existing and developed new ETL programs, transformations, indexes, data staging areas, summary tables, and data quality routines based upon redesign activities.
  • Worked extensively with Teradata SQL Assistant to analyze the existing data, and implemented new business rules to handle various source data anomalies.
  • Worked on performance tuning of the ETL processes; optimized/tuned mappings for better performance and efficiency.
  • Defined a Target Load Order Plan to load data correctly into different target tables.
  • Used the Informatica Debugger to test the data flow and fix the mappings.
  • Loaded data to and from flat files and databases like Oracle, Teradata and DB2.
  • Moved mappings, sessions, workflows and mapplets from one environment to another.
  • Worked on UNIX shell scripting and called several shell scripts using the Command task in Workflow Manager.
  • Developed UNIX shell scripts to archive files after extracting and loading data to the warehouse.
  • Used Informatica PowerExchange 9.1 for Change Data Capture (CDC).
  • Monitored scheduled, running, completed and failed sessions using the Workflow Monitor, and debugged mappings for failed sessions.
  • Involved in unit and integration testing of Informatica sessions, batches and the target data.
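The post-load archiving step described above can be sketched as a small shell script. The directory names, the `.dat` extension and the date-stamp convention are hypothetical; the sample file is created only so the sketch demonstrates itself end to end.

```shell
#!/bin/sh
# Sketch: archive flat files after a load completes (hypothetical paths).
# Each processed file is compressed and moved to a date-stamped folder.

SRC_DIR=/tmp/etl_inbound
ARCHIVE_DIR=/tmp/etl_archive/$(date +%Y%m%d)

mkdir -p "$SRC_DIR" "$ARCHIVE_DIR"
echo "1|sample" > "$SRC_DIR/customer.dat"   # stand-in for an extracted flat file

for f in "$SRC_DIR"/*.dat; do
    [ -f "$f" ] || continue                 # skip when no .dat files are present
    gzip -f "$f"                            # compress in place -> customer.dat.gz
    mv "$f.gz" "$ARCHIVE_DIR"/
    echo "archived: $(basename "$f").gz"
done
```

In practice such a script would be called from a post-session command task, with the directories passed in as parameters rather than hard-coded.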

Environment: Informatica Power Center 9.1, Power Exchange 9.1, Oracle 11g, Netezza, Windows 7, SQL Plus, Toad, AS/400, UNIX.


Role: ETL Tester/ Informatica Developer

Confidential is a leading global financial services firm with assets of $2 trillion and operations in more than 60 countries. The firm is a leader in investment banking, financial services for consumers, small business and commercial banking, financial transaction processing, asset management, and private equity.


  • Design, development and documentation of the ETL (Extract, Transform, Load) strategy to populate the Data Warehouse from the various source systems.
  • Prepared data marts on policy data, policy coverage, claims data, client data and risk codes.
  • Extensively used Informatica PowerCenter 8.6 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.
  • Involved in the design and development of complex ETL mappings and stored procedures in an optimized manner. Used PowerExchange for mainframe sources.
  • Involved in loading data from source tables to the ODS (Operational Data Store) and XML files, using transformation and cleansing logic in Informatica.
  • Based on the logic, used various transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner, XML and Stored Procedure in the mappings.
  • Involved in performance tuning of mappings, transformations and workflow sessions to optimize session performance.
  • Used SSIS and the Data Dictionary for migration of data in the SQL Server database.
  • Developed Informatica SCD Type I, Type II and Type III mappings and tuned them for better performance. Extensively used almost all of the Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, mapplets and others.
  • A Snowflake Schema was mainly used, with Geography, Customer, Product and Time as basic dimensions.
  • Created test cases for unit, system, integration and UAT testing to check data quality.
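The SCD Type II mappings described above implement a close-and-insert pattern: the current dimension row is expired, then a new current version is inserted. Below is a database-neutral sketch of that pattern; the schema (dim_customer and its columns) is hypothetical, and sqlite3 is used purely for illustration in place of the warehouse database.

```shell
#!/bin/sh
# Sketch of the SCD Type II "close and insert" pattern (hypothetical schema),
# run against a throwaway SQLite database for illustration only.

sqlite3 /tmp/scd_demo.db <<'SQL'
DROP TABLE IF EXISTS dim_customer;
CREATE TABLE dim_customer (
    cust_id TEXT, cust_name TEXT,
    eff_date TEXT, end_date TEXT, current_flag TEXT);

-- existing current row in the dimension
INSERT INTO dim_customer VALUES ('C1', 'Acme Inc', '2010-01-01', NULL, 'Y');

-- a change arrives from the source: close the old version...
UPDATE dim_customer
   SET end_date = '2013-06-30', current_flag = 'N'
 WHERE cust_id = 'C1' AND current_flag = 'Y';

-- ...and insert the new current version
INSERT INTO dim_customer VALUES ('C1', 'Acme Corp', '2013-07-01', NULL, 'Y');

SELECT cust_name || '|' || current_flag FROM dim_customer ORDER BY eff_date;
SQL
```

In an Informatica mapping the same logic is split across a Lookup on the dimension, an Expression comparing attributes, and an Update Strategy routing rows to DD_UPDATE (expire) and DD_INSERT (new version).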

Environment: Informatica Power Center 8.6.1/8.1.1, Oracle Business Intelligence (OBIEE), Oracle 10g/9i, MS SQL Server 2008, TOAD for SQL Server, Flat Files, PL/SQL, IBM DB2 Mainframes, Windows 2000, Teradata 12, XML.


Role: ETL Informatica Developer

Confidential is one of the premier non-conforming wholesale mortgage lenders in the United States providing innovative mortgage products and fast closings to independent mortgage brokers throughout the country. Assigned as ETL Developer at EquiFirst to design, develop, and implement the ETL process to load the Products Data Mart, which helped generate Business Intelligence and Performance Management reports.


  • Interacted with the business users to analyze the business requirements, High Level Document (HLD) and Low Level Document (LLD), and to transform the business requirements into technical requirements.
  • Developed data models, created source-to-target mappings and technical specifications for the development of Informatica ETL mappings to load data into various target tables, and defined ETL standards.
  • Involved in the POC of a Big Data solution to implement efficient summarization processes using the Hadoop platform with Vertica as the DW database.
  • Contributed towards documenting IDW process flows as per Agile methodology.
  • The project involved production support and development for the systems Order Management (Metasolv), Billing (Arbor BP), Customer Relationship Management (Oracle CRM) and Integration Manager (TIBCO), as well as implementing new Change Requests.
  • Developed database schemas such as Star schema and Snowflake schema, used in relational, dimensional and multidimensional data modeling with ERwin and XSD (XML Schema Definition).
  • Designed and developed an Oracle PL/SQL package for initial loading and processing of derivative data.
  • Worked with various Informatica client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Workflow Manager.
  • Implemented a weekly error tracking and correction process using Informatica.
  • Developed and executed load scripts using the Teradata client utilities MultiLoad, FastLoad and BTEQ.
  • Implemented performance tuning in mappings and sessions by identifying the bottlenecks, and implemented effective transformation logic.
  • Extensively used stored procedures, functions and packages in PL/SQL for creating Stored Procedure transformations.
  • Used massively parallel processing (MPP) architectures to provide high query performance and platform scalability.
  • Developed UNIX Korn shell wrapper scripts to accept parameters, and scheduled the processes using Autosys.
  • Developed Business Objects in accordance with the client's needs and requirements, and implemented Business Objects development and testing.
  • Extensively used ODI (Oracle Data Integrator) to migrate data from different sources to targets in the data warehouse.
  • Involved in unit testing and User Acceptance Testing (UAT) to check that data extracted from the different source systems was loaded into the targets accurately, according to the user requirements.
  • Performed data quality checks and developed ETL and UNIX shell script processes to ensure a flow of data of the desired quality.
  • Performed unit and integration testing of Informatica sessions, batches and target data.
  • Actively involved in production support, and transferred knowledge to other team members.
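Wrapper scripts of the kind mentioned above usually validate their parameters, run the job, log the result, and hand an exit code back to the scheduler (Autosys here). A minimal sketch follows; the job name, date format and log location are hypothetical, and the actual job invocation (pmcmd, bteq, etc.) is simulated.

```shell
#!/bin/ksh
# Sketch of a parameterized Korn shell wrapper (hypothetical job/parameter
# names) of the kind a scheduler such as Autosys would invoke.
# POSIX-compatible, so it also runs under plain sh.

usage() { echo "usage: $0 <job_name> <run_date YYYYMMDD>"; exit 2; }

JOB_NAME=${1:-demo_job}                 # defaulted so the sketch is runnable
RUN_DATE=${2:-$(date +%Y%m%d)}

# basic parameter validation: the run date must be eight digits
case "$RUN_DATE" in
    [0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]) ;;
    *) usage ;;
esac

LOG_FILE=/tmp/${JOB_NAME}_${RUN_DATE}.log
echo "starting $JOB_NAME for $RUN_DATE" > "$LOG_FILE"

# a real wrapper would call pmcmd or bteq here and capture $? for the
# scheduler; success is simulated in this sketch
rc=0
echo "finished $JOB_NAME rc=$rc" >> "$LOG_FILE"
cat "$LOG_FILE"
```

A production version would end with `exit $rc` so Autosys can distinguish success from failure and trigger alarms or restarts accordingly.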

Environment: Informatica Power Center 9.1/8.6.1, Oracle 11g/10g, Teradata V2R5, Vertica, TOAD for Oracle, SQL Server 2008, PL/SQL, DB2, SQL, Erwin 4.5, Business Objects, Unix Shell Scripting (PERL), UNIX (AIX), Windows XP, Autosys.


Role: DW Informatica Developer

Confidential is a provider of banking, mortgage, investing, credit card, insurance, and consumer and commercial financial services. My responsibilities included the implementation and maintenance of databases, application development, generating reports and analyzing data.


  • Gathered user requirements, performed source system analysis, and established mappings between source and target attributes. Produced source data analysis and design documentation.
  • Parsed the high-level design spec into simple ETL coding and mapping standards.
  • Designed and developed ETL mappings using Informatica to extract data from mainframe DB2 tables, flat files and Oracle, and to load the data into the target database.
  • Extensively used various transformations like Source Qualifier, Joiner, Aggregator, Update Strategy, Lookup, Rank and Filter.
  • Created PL/SQL procedures to populate the base data mart aggregation structure. Analyzed and fine-tuned PL/SQL scripts.
  • Developed the ETL into the data mart for phase-II data elements from staging.
  • Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence Generator.
  • Defined a Target Load Order Plan and constraint-based loading to load data correctly into different target tables.
  • Administered Informatica PowerCenter, including migrations, repository backups/restores and upgrades.
  • Administered user privileges, groups and folders, including creation, update and deletion. Migrated new and changed Informatica objects across environments using Folder-to-Folder and Deployment Group methods.
  • Used predefined shell scripts to run jobs and data loads.
  • Worked with SQL tools like TOAD to run SQL queries to validate the data.
  • Created, updated and maintained ETL technical documentation.
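The aggregation procedures described above typically reduce to an INSERT ... SELECT ... GROUP BY over the base data mart tables. Below is a database-neutral sketch of that core statement; the schema (fact_sales, agg_sales_by_region) is hypothetical, and sqlite3 stands in for the warehouse database purely for illustration.

```shell
#!/bin/sh
# Sketch: populate an aggregate table from a base fact table
# (hypothetical schema), using a throwaway SQLite database.

sqlite3 /tmp/agg_demo.db <<'SQL'
DROP TABLE IF EXISTS fact_sales;
DROP TABLE IF EXISTS agg_sales_by_region;

CREATE TABLE fact_sales (region TEXT, amount REAL);
INSERT INTO fact_sales VALUES ('EAST', 100), ('EAST', 50), ('WEST', 75);

CREATE TABLE agg_sales_by_region (region TEXT, total_amount REAL);

-- the core of the aggregation procedure: summarize base rows
INSERT INTO agg_sales_by_region
SELECT region, SUM(amount) FROM fact_sales GROUP BY region;

SELECT region || '=' || total_amount FROM agg_sales_by_region ORDER BY region;
SQL
```

In the PL/SQL version this statement sits inside a procedure that truncates or merges into the aggregate first, so reloads are idempotent.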

Environment: Informatica PowerCenter 7.1.3, Teradata V2R6, Oracle 10g, DB2, Trillium 7.6, PL/SQL, UNIX Shell Scripting, Erwin 3.5, Windows XP, AIX, Microsoft Visio, SQL Plus, TOAD, Business Objects 6.5.


Role: Informatica Developer

Confidential is a national nonprofit health system which operates in 17 states and includes 80 hospitals, 40 long-term care, assisted and residential living facilities, home health agencies and two community health services organizations.


  • The prime responsibility of the team was to develop a Patient Care Data Warehouse (PCDW). This data warehouse is used to access detailed clinical data on one platform and to facilitate enterprise-wide data analysis within the health care environment.
  • Designed and deployed the overall ETL strategy, including CDC, SCDs, partition management, materialized views and other complex mapping logic.
  • Migrated from Informatica PowerCenter Repository version 7.1 to 8.1 and applied patches over 8.1.
  • Implemented appropriate error handling logic and data load methods, and captured invalid data from the source system for further data cleanup.
  • Involved in logical and physical database design using the ERwin tool.
  • Developed shared objects in Informatica for source/target/lookup transformations; developed complex mappings, sessions/workflows/worklets and database connections.
  • Entered metadata descriptions at both the transformation and port level in complex mappings.
  • Designed the ETL process and scheduled the stage and mart loads for the data mart.
  • Created customized MLoad scripts on the UNIX platform for Teradata loads.
  • Debugged and troubleshot Informatica mappings.
  • Tuned SQL queries using Explain Plan, optimizer hints, indexes and histograms.
  • Involved in post-production support and enhancements.
  • Analyzed the specifications and was involved in identifying the source data that needed to be moved to the data warehouse.
  • Involved in project scheduling and project estimation.
  • Worked closely with the Requirements Manager and business analysts on the requirements collection process.
  • Provided technical support to the IT staff in identifying and resolving problems; acted as a liaison with internal teams to research, analyze and propose solutions to technical, operational and test scenarios.
  • Assisted the testers in developing test cases and reviewed them.
  • Thorough experience in defect management.
  • Supervised the environment migration process from Development to Test.
  • Good experience with ODI (Oracle Data Integrator) for migration of data from different sources such as Oracle Database and SQL Server Database.

Environment: Informatica 8.5.1, Oracle 10g, Informatica Power Center 8.1.1, Teradata, Erwin 4.5, Oracle Applications, Flat files, PL/SQL, TOAD 9.0, SQL, UNIX, Mainframes (JCL Jobs), Quality Center 9.0.


Role: ETL developer, Report Generator

Confidential is an Indian multinational information technology (IT) services, business solutions and consulting company headquartered in Mumbai, Maharashtra. It is a subsidiary of the Tata Group and is listed on the Bombay Stock Exchange and the National Stock Exchange of India. Its main function is to provide IT services and solutions at various client locations through onsite as well as offsite support, especially in the USA and European countries. TCS is the largest Indian company by market capitalization and the largest India-based IT services company by 2013 revenues.


  • Analyzed user requirements and developed business rules associated with the ETL processes; understood the functional requirements.
  • Wrote UNIX scripts for SFTP, zipping, unzipping and archiving the flat files.
  • Developed various sessions and batches for all mappings, loading data from source TXT files and tables into target tables.
  • Created mappings for dimension tables using SCD Type 2.
  • Used the Debugger wizard to remove bottlenecks at source, transformation and target for optimum loads.
  • Designed and developed complex mappings using Lookup, mapplets, stored procedures and Router transformation rules to generate consolidated data identified by dimensions, using the Informatica ETL tool.
  • Ran Appworx chains/modules to call UNIX scripts.
  • Monitored the loads as a check on data loaded into the target tables; validated the source data against the target data to find mismatches.
  • Designed and implemented the test plan and test approach; reported defects using Remedy.

Environment: Informatica Power Center 7.1.3, Workflow Manager, Workflow Monitor, Business Objects 6.5, Universe Designer, BO Designer, Remedy, Unix, Oracle 9i, Windows Server 2003.
