
Informatica Developer Resume

St Louis, MO


  • 8 years of IT experience in analysis, design, development and implementation of Business Intelligence solutions using Data Mart/Data Warehouse design, ETL, client/server and web applications on UNIX and Windows platforms in various domains.
  • Strong data processing experience in designing and implementing Data Warehouse and Data Mart applications, mainly transformation processes, using the ETL tool Informatica PowerCenter and UNIX shell scripting.
  • Excellent proficiency in data extraction, transformation and loading, database modeling, and data warehousing tools and technologies such as Ab Initio, ODI and Erwin.
  • Proficient in dimensional data modeling and relational star/snowflake schema models, fact and dimension tables, physical and logical data modeling, and Ralph Kimball methodology.
  • Proficient with Informatica PowerCenter and MDM 9.5 (Master Data Management) methodology. Hands-on experience identifying the most critical information within an organization and creating a single source of truth to power business processes.
  • Strong database experience using Oracle, SQL Server, Teradata, SAP HANA, SQL, PL/SQL, SQL*Loader, stored procedures, TOAD, MS Access and UNIX Korn shell scripting.
  • Extensively worked in ETL processes consisting of data sourcing, mapping, transformation, conversion and loading.
  • Proficiency in data warehousing techniques for data cleansing and slowly changing dimensions (Types 1, 2 and 3).
  • Expertise in SQL/PL-SQL, developing and executing stored procedures, functions and triggers, and tuning queries while extracting and loading data.
  • Extensively worked with Teradata utilities like BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems, including flat files.
  • Actively monitored Informatica job runs.
  • Experience in Performance Tuning and Debugging of existing ETL processes.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Experience in UNIX shell scripting and configuring cron jobs for Informatica job scheduling and backup of repositories and folders.
  • Extensive experience and knowledge of the project lifecycle, including requirements gathering, development and execution.
  • Excellent knowledge in data analysis, data validation, data cleansing, data verification and identifying data mismatches.
  • Expertise in OLTP/OLAP system study, analysis and E-R modeling, and in developing database schemas like star schema and snowflake schema used in relational, dimensional and multidimensional modeling.
  • Effectively followed industry standards for HIPAA, ANSI 837 and PHI concepts.
  • Involved with every phase of the SDLC, including feasibility studies, design, and coding for large and medium business intelligence projects and continually provided value-added services to the clients.
  • Ability to work in groups as well as independently with minimum supervision and initiative to learn new technologies and tools quickly.
  • Coordinated offshore efforts and day-to-day activities.
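The cron-scheduled repository backups mentioned above can be sketched as a small wrapper script. The repository name, backup directory and crontab schedule below are illustrative assumptions, and pmrep is assumed to be already connected to the repository; by default the wrapper only prints the command (dry run).

```shell
#!/bin/sh
# Sketch of a cron-driven Informatica repository backup.
# PMREP defaults to a dry-run echo; point it at the real pmrep binary
# (after `pmrep connect`) in an actual environment.
PMREP=${PMREP:-"echo pmrep"}

backup_repo() {
    repo=$1
    out_dir=$2
    stamp=$(date +%Y%m%d)
    mkdir -p "$out_dir"
    # pmrep backup writes the repository contents to a single .rep file.
    $PMREP backup -o "$out_dir/${repo}_${stamp}.rep"
}

# Example crontab entry (daily at 2 AM; paths are illustrative):
#   0 2 * * * /opt/infa/scripts/backup_repo.sh >> /var/log/repo_backup.log 2>&1
```

The date stamp in the file name keeps a rolling history of backups that an archival job can later purge.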


ETL: Informatica PowerCenter 9.x/8.x/7.x/6.x (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager and knowledge of Informatica Administration), IDQ, ODI, SSIS 2005.

Reporting: Business Objects XI, Cognos.

Data Modeling Tools and others: Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, Fact, Dimensions), ER Diagrams, ERwin 4.0/3.5.2, Cron, Tivoli, Autosys, Quest TOAD, PL/SQL, SQL*Loader, XML.

Databases: Teradata 12.0/13.0/13.10, Oracle 11g/10g/9i/8i/7.0, MS SQL Server 2005/2000/7.0/6.5/6.0, MS Access 2003/2000/97.

Languages: C, C++, SQL, PL/SQL, VB, Java.

Operating Systems: UNIX (Sun Solaris, AIX), Windows.


Confidential, St Louis, MO

Informatica Developer


  • Interacted with the Business Users to analyze the business requirements and transform the business requirements into the technical requirements.
  • Prepared technical specifications for the development of Informatica (ETL) mappings to load data into various target tables and defining ETL standards.
  • Working as an Informatica developer loading mortgage data into the target SQL Server database.
  • Defined solutions and features and worked in an Agile environment.
  • Created ETL mapping documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
  • Created mappings using different transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy and Sequence Generator to extract data from SQL Server, load into SQL Server, and generate flat files.
  • Used shared folders to share mappings, transformations, sources and targets.
  • Worked in an Agile methodology and was involved in installing Informatica, adding users, creating folders and granting permissions to users.
  • Developed mapplets and worklets for reusability.
  • Created post-session and pre-session shell scripts and mail-notifications.
  • Scheduled workflows and shell scripts using Autosys.
  • Designed and created complex mappings using SCD Type II, involving transformations such as Expression, Joiner, Aggregator, Lookup, Update Strategy and Filter.
  • Designed and implemented the ETL code for address verification and identity checking by working with IDQ.
  • Designed IDQ mappings, which are used as mapplets in PowerCenter.
  • Developed numerous mappings using the various transformations including Address Validator, Association, Case Converter, Classifier, Comparison, Consolidation, Match, Merge, and Parser in IDQ.
  • Involved in performance tuning of mappings and sessions by identifying bottlenecks and implementing effective transformation logic.
  • Involved in creating shell scripts and involved in production support.
  • Involved in Performance tuning of Informatica mappings, workflows and SQL queries/procedures.
  • Worked with testing team in tracking and resolving the defects/Bug fixing and executing Ad-hoc Requests.
  • Involved in Unit Testing, User Acceptance Testing (UAT) to check whether the data loads into target are accurate, which was extracted from different source systems according to the user requirements.
  • Responsible for creating Reusable Transformation objects in the Shared Folder.
  • Responsible for creating Parameter files for different Database Connections.
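The parameter files for different database connections (last bullet above) can be sketched as a small generator script. The folder, workflow and connection names below are illustrative placeholders, not the actual project objects.

```shell
#!/bin/sh
# Sketch: generate a PowerCenter parameter file for a given environment.
# Folder, workflow and connection names are placeholders.
ENV=${1:-DEV}
PARAM_FILE=${2:-param_${ENV}.txt}

case "$ENV" in
    DEV)  DB_CONN=Oracle_DEV ;;
    TEST) DB_CONN=Oracle_TEST ;;
    PROD) DB_CONN=Oracle_PROD ;;
    *)    echo "unknown environment: $ENV" >&2; exit 1 ;;
esac

# The section header names the folder and workflow the parameters apply to.
cat > "$PARAM_FILE" <<EOF
[MyFolder.WF:wf_load_mortgage]
\$DBConnection_Src=$DB_CONN
\$DBConnection_Tgt=SQLServer_$ENV
\$\$LOAD_DATE=$(date +%Y-%m-%d)
EOF
```

Each workflow then references the file through its "Parameter Filename" session/workflow property, so the same mapping runs unchanged across DEV, TEST and PROD.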

Environment: Informatica PowerCenter 9.6.1 (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Workflow Manager, Workflow Monitor, Repository Manager), IDQ, Informatica Analyst, PowerExchange, Oracle 11g, SQL Server, SQL Developer, SQL Server Management Studio, Business Objects, Autosys, UNIX, Flat Files, Microsoft Visio, Windows, TFS.

Confidential, Seattle, WA

Informatica Developer


  • Interacted with business users, analysts to gather, understand, analyze, program and document the requirements of Data Warehouse.
  • Focused on new systems and tools with minimal supervision and trained team members on the project architecture.
  • Proactively analyzed the semantic/application layer to identify bottlenecks across various aspects, such as data type and hash analysis, Latin vs. Unicode analysis, checking table skew, and finding non-performing views based on their logic, as part of the performance tuning process.
  • Worked on identifying Mapping Bottlenecks and improved session performance through error handling.
  • Worked on Mapplets and created parameter files wherever necessary to facilitate reusability.
  • Used Error Handling to capture error data in tables for handling nulls, analysis and error remediation Process.
  • Effectively worked on Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Used Informatica Data Quality (IDQ) for data quality, integration and profiling.
  • Used different types of profiling methods i.e. mid-stream profiling and Join analysis profiling.
  • Built IDQ mapplets to perform address cleansing and standardizing the Telephone and SSN details of customers using Informatica Developer tools.
  • Extensively used the Sequence Generator in all mappings; fixed bugs/tickets raised in production for existing mappings in the common folder for new files through versioning (check-in and check-out), and urgently supported QA in component unit testing and validation.
  • Used shortcuts for sources, targets, transformations, Mapplets, and sessions to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Created Tidal jobs to schedule Informatica workflows and executed workflows using the Tivoli scheduler.
  • Prepared the Standard Operating Procedure (Knowledge Transfer) document, which provides necessary information, required for the Maintenance and Operation of the application.
  • Performed code migration of mappings and workflows from development to test and production servers through deployment groups for the DEV, TEST and PROD repositories, retaining shortcuts and dependencies with versioning.
  • Monitored data integration and data migration processes.
  • Worked with UNIX commands and used UNIX shell scripting to automate jobs. Wrote Unix scripts to back up the log files in QA and production.
  • Effectively interpreted session error logs and used the Debugger to test mappings and fix bugs in DEV, following change procedures and validation.
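The QA/production log-backup scripts mentioned above can be sketched as follows; the directory layout and one-day retention are illustrative assumptions, not the actual server paths.

```shell
#!/bin/sh
# Sketch: tar up session/workflow logs older than one day into a dated archive.
archive_logs() {
    log_dir=$1
    archive_dir=$2
    stamp=$(date +%Y%m%d)
    mkdir -p "$archive_dir"
    list=$(mktemp)
    # -mtime +0 selects files last modified more than 24 hours ago,
    # leaving today's logs in place for the support team.
    find "$log_dir" -name '*.log' -mtime +0 -print > "$list"
    if [ -s "$list" ]; then
        tar -czf "$archive_dir/sesslogs_${stamp}.tar.gz" -T "$list"
    fi
    rm -f "$list"
}
```

A cron entry would then invoke `archive_logs` nightly against the integration service's session-log directory in QA and production.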

Environment: Informatica PowerCenter 9.6.1, Informatica IDQ, Oracle 11g, Oracle SQL*Loader, Flat Files, UNIX Shell Scripting, TOAD, SQL, WinSCP.

Confidential, Durham, NC

Sr. Informatica Developer - Workplace Investments/Plan Record Keeping


  • Worked with business owners, analysts, solution engineers, development teams and infrastructure services to communicate application and data architectures.
  • Actively Involved with System analysts and business in creating Software Design Specification (SDS).
  • Identified necessary elements according to the business requirements and created a Road map and Technical specification document (TSD).
  • Provided expert advice, counsel, Production Support and technical expertise to the project team to help assure that Informatica solutions are designed and developed in the optimal manner and in accordance with industry and Informatica best practices.
  • Produced detailed design documents and low-level design documents, and was involved in code/document reviews.
  • Participated in the maintenance and enhancement of the application and performed bug fixing.
  • Designed and developed ETL mappings to load, transform data from source to Target using Informatica Power Center 9.5.1.
  • Worked with PowerCenter Designer tools in developing mappings and mapplets to extract and load the data from various sources.
  • Gathered reporting requirements from the users and created functional and technical specifications.
  • Developed various Informatica mappings using various transformations like Sorter transformation, expression, lookups, Aggregator transformation, Normalizer Transformation and Joiner transformation.
  • Created mappings using Informatica Designer to build business rules to load data and tuned them to enhance the performance.
  • Tested all the business application rules with test & live data and automated, monitored the sessions using Work Flow Manager and Workflow Monitor.
  • Created various plans in IDQ to enhance the source data by comparing with reference tables and dictionaries.
  • Analyzed customer information with IDQ to standardize the data per US postal codes.
  • Used different types of profiling methods i.e. mid-stream profiling and Join analysis profiling.
  • Performed Unit Testing and Involved in tuning Session and Workflows for better Performance.
  • Created UNIX scripts to dynamically create data interface files and SFTP files to target Axway servers.
  • Created UNIX scripts to Archive and purge Data Files and log files.
  • Implemented data replication and partitioning for archival and disaster recovery.
  • Scheduled Informatica jobs using the Control-M scheduler and generated e-mails to notify the support team of success/failure.
  • Created Production Install Plan and involved in Production support.
  • Proactively monitored data integration processes and routed e-mail alerts to responsible teams.
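The archive-and-purge routine for data and log files described above can be sketched as below; the retention windows and the compress-then-delete policy are illustrative assumptions.

```shell
#!/bin/sh
# Sketch: compress files older than keep_days, then delete compressed
# archives older than twice that window.
purge_old_files() {
    dir=$1
    keep_days=$2
    # gzip preserves the original mtime, so aged archives naturally fall
    # through to the delete pass once they exceed the second window.
    find "$dir" -type f ! -name '*.gz' -mtime +"$keep_days" -exec gzip -f {} \;
    find "$dir" -type f -name '*.gz' -mtime +$((keep_days * 2)) -exec rm -f {} \;
}
```

Running this nightly against the data-file and log directories keeps disk usage bounded without touching files still inside the retention window.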

Environment: Informatica PowerCenter 9.5.1 (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Workflow Manager, Workflow Monitor, Repository Manager), IDQ, Oracle 11g, SQL Server, SQL Developer 3.2, SQL Server 2012, SQL Server Management Studio, Business Objects, HP Service Manager, Control-M, UNIX, Flat Files, Microsoft Visio, Windows.

Confidential, Dubuque, IA

Informatica Developer - Business Process Integration


  • Involved in business requirement gathering, which includes conducting Workshop sessions and translating the business inputs from workshop sessions into ETL design/technical documents.
  • Involved in End to End Data flow design in ETL Informatica mappings.
  • Extensively worked on Informatica Designer, Work Flow Manager, Work Flow Monitor and Repository Manager.
  • Created several complex Informatica mappings, Mapplets depending on client requirements. Extensively worked on Connected & Unconnected Lookups, Router, Expression, Source Qualifier, Aggregator, Filter, and Sequence Generator.
  • Developed SQL Scripts and PL/SQL Scripts to extract data from different sources.
  • Performed data cleansing: standardized incorrect data formats, fixed misspellings and redundant or missing data values, and performed de-duplication of customer data.
  • Extensively worked with Lookup caches like Static cache, Dynamic cache & Persistent cache.
  • Involved in the optimization of SQL queries which resulted in substantial performance improvement for the conversion processes.
  • Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
  • Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages and query groups.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Used Hierarchies tool for configuring entity base objects, entity types, relationship base objects, relationship types, profiles, put and display packages and used the entity types as subject areas in IDD.
  • Defined the Trust and Validation rules and setting up the match/merge rule sets to get the right master records.
  • Configured match rule set property by enabling search by rules in MDM according to Business Rules.
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
  • Involved with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Responsible for performance tuning, which included tuning sources, targets, mappings and sessions to improve performance. To get optimal performance, used parameter files, mapping parameters and variables in mappings, and session variables in workflows.
  • Used Workflow Manager to schedule and run batches, sessions, as well as to check session logs and other session related activities.
  • Developed UNIX Shell Scripts for automating Batches and Sessions.
  • Responsible for Error handling, bug fixing, Session monitoring and log analysis.
  • Involved in unit testing, UAT testing and integrated testing of entire process flow.
  • Deployed Informatica jobs from Development environment to Test environment and Test to Production Environment using Informatica export/import methodology.
  • Developed Business Process diagrams to understand the business and process and developed data flow diagrams using Visio.

Environment: Informatica PowerCenter 9.5.1 (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformations Developer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica MDM, Oracle 11g, Toad, Java, NetBeans, Viewpoint, Erwin 7, MS Visio, Windows, Autosys.

Confidential, Iselin, NJ

ETL Developer


  • Analyzed the source systems' data, imported the sources into Informatica PowerCenter using Informatica PowerExchange, and designed ETL mappings.
  • Designed various mappings in PowerCenter using Application Source Qualifier, Expression, Lookup, Aggregator, Router, Sorter, Rank, Stored Procedure, Input/Output and Update Strategy transformations.
  • Experience in data analysis of various source systems and normalizing the schema objects as per 3NF.
  • Loaded client feed files into the ODS (Operational Data Store) and EDW (Enterprise Data Warehouse).
  • Designed ETL using ODI to integrate various source systems to Essbase and planning applications.
  • Maintained the data in the EDW (Teradata) and the ODS staging area (Oracle) for other downstream applications.
  • Created various plans in IDQ to enhance the source data by comparing with reference tables and dictionaries.
  • Analyzed customer data with IDQ to standardize it per US postal codes.
  • Tuned Informatica mappings and SQL for optimal load performance.
  • Improved performance by emphasizing operations like joins, aggregations and lookups using SQL overrides.
  • Involved in migration of objects from Dev to Test and Prod repositories.
  • Ensured continuous support for user testing, troubleshooting, and issue resolution.
  • Scheduled workflows in Informatica and Tivoli.
  • Provided ongoing support for maintenance and enhancing the application to meet the changing business requirements through a Change Management process.
  • Involved in production support; carefully watched the Workflow Monitor to verify that all workflows and tasks ran properly and that data loaded the right way, to the right targets, at the right time.

Environment: Informatica PowerCenter 9.1, IDQ, Oracle 10g, ODI, SQL, PL/SQL, TOAD 8.5, BO XI, Tivoli, UNIX and Windows.

Confidential, New Orleans, LA

Sr. ETL Developer - Whitney & Hancock Actimize Implementation


  • Studied the existing systems to merge the Whitney and Hancock banks into a centralized Actimize Data Warehouse.
  • Worked with different LOBs, like checking accounts, savings accounts, ATM and wire transfers, extracted from different source systems.
  • Applied the Fidelity software and the Actimize standards to identify potential fraud and money-laundering transactions.
  • Created documents such as Business Requirement (BRD) and Technical Design (TDD).
  • Created the ETL / STM specs out of the requirements specified as per business rules.
  • Profiled the data using Informatica Data Explorer (IDE) to standardize from different Source systems.
  • Created datamaps using Informatica PowerExchange (PWX), making use of COBOL copybooks and data files.
  • Created mappings extracting source data from different sources like flat files and Oracle relational databases using Informatica PowerCenter (PWC).
  • Applied the business logic and rules on different LOBs records using different transformation objects such as Expression, Aggregator, Joiner, Sorter, Router, Lookup, Update Strategy objects.
  • Completed the Unit Testing (UT) and generated the Unit Test Cases.
  • Held Quality Review (QR) meetings with the database admin and Informatica admin to meet corporate standards.
  • Migrated the code from DEV to QA to perform System Integration Testing (SIT) and User Acceptance Testing (UAT) and eventually to PROD environment.
  • Set up the environment with Parameter files and Audit Balance Control mechanism.
  • Created the Implementation Plan and uploaded all the documents to PVCS.
  • Supported the Implementation and Post-Implementation process.

Environment: Informatica PowerCenter 8.6.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), PowerExchange Navigator 8.6.1, IDE, SQL Query Analyzer 8.0, Oracle 9i/10g, SQL Server 2005, SQL Server Management Studio, Toad, SQL Developer 1.1.2, PVCS, Tivoli, AIX, Sun Solaris & Windows NT, Shell Scripting.

Confidential, Waukegan, IL

Application Developer


  • Studied the existing environment and gathered requirements by querying clients on various aspects.
  • Identified various data sources and the development environment.
  • Performed data modeling and design for the data warehouse and data marts in a star schema methodology with dimension and fact tables.
  • Prepared user requirement documentation for mapping and additional functionality.
  • Responsible for generating complex source views and target definitions.
  • Tuned SQL by running Parallel queries after Analyzing Execution Plan and applied partitions.
  • Optimized target load by dropping and rebuilding indexes using stored procedure pre-sql and post-sql commands.
  • Responsible for tuning ETL procedures and 3NF to optimize load and query performance.
  • Extensively used ETL to load data using PowerCenter / Power Exchange from source systems like Flat Files and Excel Files into staging tables and load the data into the target database Oracle.
  • Completed the data profiling of various subject areas attributes using IDE.
  • Accomplished the proof of concept (POC) of deploying the Workbench Plans in IDQ Workbench environment.
  • Prepared technical specification to load data into various tables in DataMarts.
  • Applied the concept of slowly changing dimensions Type2 / Type1.
  • Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.
  • Maintained the documents with various versions using PVCS.
  • Worked extensively on Mappings, Mapplets, Sessions and Workflows.
  • Scheduled and monitored the jobs using Tivoli Systems (TWS).
  • Used the pmcmd command to start, stop and ping the server from UNIX, and created shell scripts to automate the process.
  • Led the activities, growth and professional development of three staff members.
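The pmcmd automation in the bullets above can be sketched as a thin wrapper. The service and domain names are placeholders and authentication flags are omitted for brevity; setting PMCMD=echo lets the wrapper be exercised without a PowerCenter install.

```shell
#!/bin/sh
# Sketch of a thin pmcmd wrapper for starting workflows and pinging the
# Integration Service. Names are placeholders; set PMCMD=echo for a dry run.
INFA_SERVICE=${INFA_SERVICE:-IS_Dev}
INFA_DOMAIN=${INFA_DOMAIN:-Domain_Dev}
PMCMD=${PMCMD:-pmcmd}

pmcmd_start() {
    folder=$1
    workflow=$2
    # -wait blocks until the workflow finishes, so cron can chain jobs.
    "$PMCMD" startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
        -f "$folder" -wait "$workflow"
}

pmcmd_ping() {
    "$PMCMD" pingservice -sv "$INFA_SERVICE" -d "$INFA_DOMAIN"
}
```

A scheduler entry would call `pmcmd_ping` first and only invoke `pmcmd_start` when the service responds.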

Environment: Informatica PowerCenter 8.6.1/8.1.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), PowerExchange Navigator 8.6.1/8.1.1, IDQ Workbench, SQL Server 2008, SQL Enterprise Manager, SQL Query Analyzer, Teradata 13.10, TPT 13.0, SQL Assistant, DB2, Java, Tivoli, AIX, Sun Solaris & Windows.

Confidential, Richmond, VA

ETL Developer (Migration from DataStage to Informatica)


  • Performed the migration of mappings from Datastage tool to Informatica ETL tool.
  • With limited guidelines and with no documentation available, studied the existing jobs in DataStage and created specifications for respective mappings.
  • Developed the corresponding low-, medium- and high-complexity mappings, addressing any defects found in the old processes.
  • Created Triggers and Stored Procedures using PL/SQL.
  • Generated Parameter Files for all the mappings using KSH.
  • Developed the nightly batch-run KSH script to update parameter files using the batch id.
  • Developed the nightly integrity run to e-mail a succinct report to the Integrity group.
  • Performed the Unit Testing and Integration testing with the help of Integrity queries.
  • Validated the results using MicroStrategy reports with both ETL tools.
  • Involved in migrating the newly developed Informatica process in QA and PROD environments.
  • Created and maintained documentation related to production batch jobs.
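The nightly KSH parameter-file refresh described above can be sketched as a sed-based in-place update; the `$$BATCH_ID` parameter name and the file layout are illustrative assumptions.

```shell
#!/bin/sh
# Sketch: replace the $$BATCH_ID line of a PowerCenter parameter file.
update_batch_id() {
    param_file=$1
    batch_id=$2
    # Portable in-place edit via a temp-file swap (no sed -i assumption).
    sed 's/^\$\$BATCH_ID=.*/$$BATCH_ID='"$batch_id"'/' "$param_file" \
        > "$param_file.tmp" && mv "$param_file.tmp" "$param_file"
}
```

The nightly batch would call this for each workflow's parameter file before kicking off the run, so every session logs under the current batch id.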

Environment: Informatica PowerCenter 8.1.3 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Ascential DataStage, MicroStrategy 7.1.5, Oracle 9i, TOAD 8.6.1, Sun Solaris & Windows NT, Shell Scripting.

Confidential, Woodcliff Lake, NJ

Programmer Analyst / ETL Developer


  • Worked as a mentor in explaining the inbound and outbound processes.
  • Responsible for generating XML source and target definitions.
  • Responsible for tuning ETL procedures and 3NF to optimize load and query performance.
  • Wrote UNIX scripts and SQL cards/scripts for implementing business rules.
  • Compared code between the COBOL development and its corresponding Informatica development.
  • Validated output against production mainframe files.
  • Generated e-mail notifications through scripts run in post-session implementations.
  • Developed Database Triggers and Stored Procedures in Teradata.
  • Generated minor reports and created universes using Business Objects XI.
  • Collaborated with Business Analysts to ascertain the issues with the existing data warehouse and modified mappings to conform to business rules.
  • Worked heavily on reading data from VSAM files and transferring it into the staging area.
  • Tuned performance of Informatica sessions for large data files by increasing block size, data cache size and sequence buffer length.
  • Worked with the DBA on SQL scripts to automate the process of populating various columns in the tables with surrogate keys.
  • Involved in preparing detailed ETL design documents.

Environment: Informatica PowerCenter 6.1.2, 7.1.3 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Teradata 7.1.0 (Administrator, SQL Assistant, Performance Monitor), Teradata servers, Business Objects, Shell Scripting, UNIX, Windows.


ETL Developer


  • Prepared the Source-To-Target (STM) Specifications for Informatica Mappings with the Business Rules for the Transformations.
  • Developed the ETL mappings with transformations from source to target using Informatica.
  • Developed the workflows for the mappings.
  • Debugged the Informatica mappings based on SPRs and made the necessary changes for closure.
  • Migrated the workflows and mappings from the development to the test environment.
  • Tested the mapping thoroughly using unit and regression testing.
  • Developed the Informatica Mappings using various Transformations like Source, Target, Expressions, Joiner, and Router.
  • Assisted with migration from version 1.1 of the .NET Framework to 2.0 and further to Visual Studio Team System.

Environment: Informatica PowerCenter 6.1, 7.1, SQL Server Integration Services 2005, Oracle 8i, Toad, MS Access, Business Objects, UNIX & Windows NT, Shell/Perl Scripting, .NET 2.0 Framework, C#, Visual Studio.
