
Sr. Informatica Consultant Resume


Washington, DC

PROFESSIONAL SUMMARY:

  • Over 8 years of IT experience in designing and developing Data Warehouse applications using ETL and Business Intelligence tools such as Informatica Power Center and Power Exchange MQ (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server Administration Console and Server Manager).
  • Experience in Installation, Configuration and Administration of Informatica Power Center 6.x/7.x/8.x/9.x Client and Server on Windows and UNIX.
  • Highly experienced in Extraction, Transformation and Loading of legacy data into the Data Warehouse using ETL tools.
  • Conducted extensive data profiling and cleansing using PL/SQL and created reports to present the findings to the business users and source system SMEs.
  • Performed Data Mining, Data Validation and Data Analysis.
  • Designed and executed unit test plans and gap analysis to ensure that business requirements and functional specifications were tested and fulfilled.
  • Expertise in SQL and PL/SQL programming, developing & executing Packages, Stored Procedures, Functions, Triggers, Table Partitioning, Materialized Views.
  • Extracted data from Teradata and loaded into various other target systems.
  • Experience in Integration of various data sources like Oracle, Teradata, SQL Server, DB2 and Flat Files in various formats like fixed width, CSV and Excel.
  • Expertise in error handling and reprocessing of error records during extraction and loading of data into enterprise warehouse objects.
  • Good hands-on experience in creating, modifying and implementing UNIX shell scripts for running Informatica workflows and for pre-processing and post-processing validations (a representative wrapper script is sketched after this list).
  • Experience with Informatica Identity Resolution in searching and identifying the data records with high accuracy.
  • Experienced in performance tuning of Informatica code.
  • Performed data cleansing and cache optimization.
  • Experience with Business Glossary and Metadata Manager in Informatica Power Center.
  • Experience in designing and deploying UNIX shell scripts.
  • Documented ETL development standards as per client requirements.
  • Experience in debugging targets, sources, mappings and Sessions.
  • Extracted data from Flat files, DB2, SQL Server and Oracle to build an Operational Data Store. Applied business logic to load the data into the Global Data Warehouse, modeled using the Erwin tool.
  • Created and monitored Sessions and various other Tasks such as Event-Raise Task, Event-Wait Task, Decision Task, Email Task, Assignment Task, Command Task etc. using Informatica Workflow Manager.
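A minimal sketch of the kind of UNIX wrapper script referenced above for running an Informatica workflow with pre- and post-processing validations; the domain, service, folder, workflow and file names are illustrative placeholders, not actual project objects.

    #!/bin/ksh
    # Illustrative wrapper: validate the inbound file, launch an Informatica
    # workflow through pmcmd, and check the return code afterwards.
    INFA_USER=etl_user
    INFA_PWD=********             # normally read from a secured parameter file
    DOMAIN=Domain_DEV             # placeholder domain name
    INT_SVC=IS_DEV                # placeholder Integration Service name
    FOLDER=DW_FOLDER              # placeholder repository folder
    WORKFLOW=wf_load_account_dim  # placeholder workflow name
    SRC_FILE=/data/inbound/account_feed.dat

    # Pre-processing validation: the source file must exist and be non-empty
    if [ ! -s "$SRC_FILE" ]; then
        echo "Source file $SRC_FILE missing or empty" >&2
        exit 1
    fi

    # Start the workflow and wait for it to finish
    pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
        -u "$INFA_USER" -p "$INFA_PWD" \
        -f "$FOLDER" -wait "$WORKFLOW"
    RC=$?

    # Post-processing validation: a non-zero return code means the load failed
    if [ $RC -ne 0 ]; then
        echo "Workflow $WORKFLOW failed with return code $RC" >&2
        exit $RC
    fi
    echo "Workflow $WORKFLOW completed successfully"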

TECHNICAL SKILLS:

ETL Tools: Informatica 6.x/7.x/8.x/9.x, Power Exchange 8.6.1, DataStage 8.1/8.5

Data Modelling Tools: Dimensional Data Modelling, Star Schema Modelling, Snowflake Modelling, Fact and Dimension Tables, Physical and Logical Data Modelling using Erwin

Other ETL Associated Tools: Informatica Mapping Analyst, Informatica IDQ

RDBMS: Oracle 11g/10g/9i/8.x, SQL Server 2008/2005/2000, MySQL, DB2, MS Access

Database Tools: TOAD, SQL*Plus, SQL Developer, SQL*Loader, Teradata SQL Assistant, DynamoDB

Languages: SQL, PL/SQL, Shell Scripting, C, C++

Operating Systems: MS-DOS, Windows 7/Vista/XP/2003/2000/NT, UNIX, AIX, Sun Solaris

PROFESSIONAL EXPERIENCE:

Confidential, Washington, DC

Sr. Informatica Consultant

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data for a new client. Used Informatica Power Center Designer, Workflow Manager and Workflow Monitor.
  • Developed Source to Target Mapping documentation based on business requirement specifications.
  • Extensively developed various Mappings that incorporated business logic, which performed Extraction, Transformation and Loading of Source data into OLAP schema.
  • Extensively used Transformations like Source Qualifier, Expression, Lookup, Update Strategy, Aggregator, Stored Procedure, Filter, Router, Joiner etc.
  • Implemented Lookup transformation to update already existing target tables.
  • Experience with Salesforce.com for improved Customer Relationship Management.
  • Implemented Type 1 and Type 2 SCDs based on the business requirements and the nature of the source data.
  • Extensively worked on the newly developed reporting data mart to meet current capabilities while improving flexibility and scalability.
  • Extensively used re-usable Objects like Mapplets, sessions and Transformations in the Mappings.
  • Implemented Constraint Based Load order and Target Load Plan.
  • Developed sequential, concurrent sessions and validated them. Scheduled these sessions by using Workflow manager.
  • Tuned target, source, transformation, mapping and session to increase Session performance.
  • Implemented Pushdown Optimization to reduce the burden on the Integration service and thereby increase the performance.
  • Conducted extensive data profiling and cleansing using PL/SQL and created reports to present the findings to the business users and source system SMEs.
  • Performed Data Mining and Data Analysis.
  • Significant dashboard design and build experience in MicroStrategy and Business Objects.
  • Performed ad hoc reporting using Business Objects and MicroStrategy.
  • Understood the Reporting requirements.
  • Developed Reports and created metrics, hierarchies and filters using MicroStrategy Desktop.
  • Developed stored procedures and tested the application with TOAD.
  • Developed PL/SQL procedures and functions implementing business rules to extract data from source and load it into the target (a representative procedure is sketched after this list).
  • Optimized queries to improve performance using indexes and hints.
  • Extensively used debugger to find out errors in mappings and later fixed them.
  • Ran and controlled workflows using the pmcmd command-line utility.
  • Helped data modeler in designing data model for Data warehouse using Erwin tool.
  • Proactively evaluated the quality and integrity of data through unit and system testing of Informatica mappings according to the business needs.
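A minimal sketch of a PL/SQL load procedure of the kind described in the responsibilities above, deployed through SQL*Plus from a shell script; the connect string, schema, table and column names are hypothetical placeholders.

    #!/bin/ksh
    # Illustrative deployment of a PL/SQL procedure that applies a simple
    # business rule while moving rows from a staging table to a target table.
    sqlplus -s etl_user/********@DWPROD <<'EOF'
    CREATE OR REPLACE PROCEDURE load_customer_dim AS
    BEGIN
      -- Insert only customers not already present in the dimension,
      -- standardising the name and deriving a status code on the way in
      INSERT INTO dw.customer_dim (customer_id, customer_name, status)
      SELECT stg.customer_id,
             INITCAP(TRIM(stg.customer_name)),
             CASE WHEN stg.active_flag = 'Y' THEN 'ACTIVE' ELSE 'INACTIVE' END
      FROM   stg.customer_stg stg
      WHERE  NOT EXISTS (SELECT 1
                         FROM   dw.customer_dim tgt
                         WHERE  tgt.customer_id = stg.customer_id);
      COMMIT;
    END;
    /
    EOF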

Environment: Informatica Power Center 9.1.0/8.6.1 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Teradata, Oracle, Flat files, AutoSys, Erwin.

Confidential, Miami, FL

Sr. Informatica Consultant

Responsibilities:

  • Successfully design and develop ETL mappings to Extract, Transform and Load data from various data sources to the Enterprise Data Warehouse. The data warehouse contains credit card and bank account data, and the project aims to support decision-making for credit card and bank account improvements.
  • Create the ETL mappings using various Informatica transformations: Source qualifier, Data Quality, Lookup, Expression, Filter, Router, Sorter, Aggregator etc.
  • Create and maintain detailed support documentation for all ETL processes and develop solutions, including detailed flow designs and drafts.
  • Work with the DW architect to prepare the ETL design document.
  • Develop transformation logic to cleanse the source data of inconsistencies during the source to stage loading.
  • Create sessions and workflows and set up task dependencies in the Workflow Manager.
  • Used business glossary and metadata manager in Informatica PowerCenter to manage the changes in data, reduce the errors caused due to changes and to ensure data Integrity.
  • Design and develop an entire Data Mart from scratch.
  • Perform Data Mining, extracting patterns from large data sets.
  • Design, develop and automate the Monthly and weekly refresh of the Data mart.
  • Develop Informatica mappings, transformations and reusable objects using the Mapping Designer, Transformation Developer and Mapplet Designer in Informatica PowerCenter.
  • Create reusable transformations and Mapplets and used them in mappings.
  • Use Informatica PowerCenter for extraction, transformation and loading (ETL) of data into the data warehouse.
  • Use Informatica Power Center Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
  • Experience with Informatica Identity Resolution (IIR) in searching and identifying data records with high accuracy.
  • Create various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process. Maintain the batch processes using Unix Shell Scripts.
  • Experienced in performance tuning of Informatica code.
  • Extract data from flat files, the Oracle database and DynamoDB, and apply business logic to load them into the Oracle warehouse.
  • Experience in debugging targets, sources, mappings and Sessions.
  • Create DDLs for new tables and updating DDLs for the existing tables in the target systems in coordination with the DW architect.
  • Provide the dimensional data model to give optimal performance by changing the Primary Index of tables and applying various performance tuning techniques on Oracle 11g data warehouse environment.
  • Monitor data quality and generate weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhance existing production ETL scripts.
  • Work with pre- and post-session SQL commands to drop and recreate indexes on the data warehouse using the Source Qualifier transformation of Informatica Power Center (see the sketch after this list).
  • Attend SCRUM meetings regularly to discuss the day-to-day progress of the individual teams and the overall project.
  • Develop several complex Mappings, Mapplets and Reusable Transformations to facilitate one-time, daily, weekly and monthly loading of data.
  • Work with the scheduler to run sessions on a daily basis and send email notifications after loads complete.
  • Prepare the unit test cases for all the mappings to test the code.
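A minimal sketch of the pre-/post-session index maintenance mentioned above, shown here as standalone SQL*Plus calls from a shell script rather than as session properties; the index, table and connection names are hypothetical placeholders.

    #!/bin/ksh
    # Illustrative equivalent of pre-/post-session SQL: drop an index before a
    # bulk load and recreate it (plus refresh statistics) afterwards.

    # Pre-load: drop the index so the bulk insert is not slowed down
    sqlplus -s etl_user/********@DWPROD <<'EOF'
    DROP INDEX dw.idx_txn_fact_acct;
    EOF

    # ... the Informatica session loads dw.txn_fact at this point ...

    # Post-load: rebuild the index and gather fresh optimizer statistics
    sqlplus -s etl_user/********@DWPROD <<'EOF'
    CREATE INDEX dw.idx_txn_fact_acct ON dw.txn_fact (account_id);
    EXEC DBMS_STATS.GATHER_TABLE_STATS('DW', 'TXN_FACT');
    EOF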

Environment: Informatica Power Center 9.1.0/8.6.1 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Teradata, Oracle, Flat files, MicroStrategy, DB2, SQL, Erwin, AutoSys.

Confidential, Bloomington, IL

Sr. Informatica Developer

Responsibilities:

  • Interacted with end users to clarify incomplete requirements and developed code that satisfied the client.
  • Performed source system data analysis as per the business requirements. Consolidated distributed data residing in heterogeneous data sources onto the target.
  • Developed Mappings, Sessions, Workflows and Shell Scripts to extract, validate, and transform data according to the business rules.
  • Built the necessary staging tables and work tables in the Oracle development environment (a representative DDL sketch follows this list).
  • Used Power Exchange to extract data and to load data from and to existing mainframe systems.
  • Sourced the data from XML files, flat files, SQL server tables and Oracle tables.
  • Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Mapplets and Transformation objects. Partitioned the sessions to reduce the load time.
  • Performed data cleansing and cache optimization.
  • Developed complex mappings in Informatica to load data from source files using different transformations like Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Joiner, Filter, Stored Procedure, Normalizer, Router and Mapplets.
  • Developed Mappings that extract data to UNIX server and monitored the Daily, Weekly, Monthly and Quarterly Loads.
  • Created high-level and detailed design documents and was involved in creating ETL functional and technical specifications.
  • Documented ETL development standards as per client requirements.
  • Worked in Business Objects environment to test reports generated in BO.
  • Created dummy data for BO reports generation and testing using Oracle Toad by analyzing various business logics and data standards.
  • Involved in review of the mappings and enhancements for better optimization of the Informatica mappings, sessions and workflows.
  • Extensively worked in performance tuning of programs, ETL procedures and processes.
  • Performed Unit Testing of the mappings. Involved in writing the Test Cases and also assisted the users in performing UAT.
  • Created integration services, repository services and migrated the repository objects.
  • Used heterogeneous data sources (Oracle, DB2, XML files and flat files) and imported stored procedures from Oracle for transformations.
  • Handled various additional responsibilities like team status tracking and reporting to manager in status call.
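A minimal sketch of the kind of staging table and work table DDL referenced in the responsibilities above, run against the Oracle development environment; schema, table and column names are hypothetical placeholders.

    #!/bin/ksh
    # Illustrative creation of a staging table plus an empty work table used
    # to hold rejected rows for reprocessing.
    sqlplus -s etl_user/********@DWDEV <<'EOF'
    CREATE TABLE stg.claim_stg (
        claim_id       NUMBER(12),
        policy_number  VARCHAR2(20),
        claim_amount   NUMBER(14,2),
        claim_date     DATE,
        load_timestamp DATE DEFAULT SYSDATE
    );

    -- Work table with the same structure, created empty (WHERE 1 = 0)
    CREATE TABLE stg.claim_stg_rej AS
    SELECT * FROM stg.claim_stg WHERE 1 = 0;
    EOF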

Environment: Informatica Power Center 8.3, Informatica Power Exchange 8.6.1, Oracle 10g, SQL, DB2, XML, PL/SQL, TOAD, Shell Scripts, UNIX (AIX).

Confidential

Informatica Consultant

Responsibilities:

  • Worked with Business Analyst in requirements gathering, business analysis and project coordination.
  • Analyzed the source data coming from different sources like DB2, Flat files.
  • Worked with business users and developers to develop the model and documentation for the new projects like marketing and ODS building.
  • Responsible for developing complex Informatica mappings using different transformations.
  • Responsible for creating workflows and sessions using the Informatica Workflow Manager and monitoring workflow runs and statistics in the Informatica Workflow Monitor.
  • Created and Monitored Sessions and various other Tasks such as Event-Raise Task, Event-Wait Task, Decision Task, Email Task, Assignment Task, Command Task etc. using Informatica Workflow.
  • Responsible for defining mapping parameters and variables and session parameters according to the requirements, and for using workflow variables to trigger emails.
  • Responsible for tuning the Informatica mappings to increase the performance.
  • Implemented complex ETL logic using SQL overrides in the Source Qualifier.
  • Extracted data from Mainframe systems using Power Exchange connectors.
  • Performed unit testing of development work and validated the results with the Business Analyst.
  • Developed Unix Scripts for updating the control table parameters based on the environments.
  • Used the DB2 loader to load data from flat files into the DB2 target database and created exception tables to capture rejected records (a representative load command is sketched after this list).
  • Created various data marts for analyzing different business units.
  • Responsible for providing written status reports to management regarding project status, task, and issues/risks.
  • Created various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process.
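A minimal sketch of the DB2 flat-file load with an exception table described above; the database, file and table names are hypothetical placeholders, and the exception table is assumed to already exist with a matching structure.

    #!/bin/ksh
    # Illustrative DB2 load of a delimited flat file; rows that violate
    # constraints are diverted to the exception table for later reprocessing.
    db2 connect to DWPROD user etl_user using ********

    db2 "LOAD FROM /data/inbound/party_feed.del OF DEL
         INSERT INTO ods.party
         FOR EXCEPTION ods.party_exc"

    # Quick check on how many rows were rejected into the exception table
    db2 "SELECT COUNT(*) FROM ods.party_exc"

    db2 terminate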

Environment: Informatica 8.6.1, Oracle 10g, SQL, PL/SQL, TOAD, Shell Scripts, UNIX (AIX), AutoSys, MQ Migration Tools, Power Exchange 8.x

Confidential

ETL Developer

Responsibilities:

  • Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
  • Designed ETL specification documents for all the projects.
  • Created tables, keys (unique and primary) and indexes on the DB2 server (illustrated after this list).
  • Extracted data from Flat files, DB2, SQL Server and Oracle to build an Operational Data Store. Applied business logic to load the data into the Global Data Warehouse.
  • Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
  • Extensively used the Add Currently Processed Flat File Name port to load the flat file name, and the contract number derived from it, into the target.
  • Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
  • Extensively used workflow variables, mapping parameters and mapping variables.
  • Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
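A minimal sketch of the table, key and index creation on DB2 mentioned above; the schema, table and column names are hypothetical placeholders.

    #!/bin/ksh
    # Illustrative DDL for a DB2 target table with a primary key and a
    # supporting unique index.
    db2 connect to GDWDEV user etl_user using ********

    db2 "CREATE TABLE dw.contract_dim (
            contract_id    INTEGER NOT NULL,
            contract_num   VARCHAR(20) NOT NULL,
            source_file    VARCHAR(100),
            effective_date DATE,
            PRIMARY KEY (contract_id)
         )"

    db2 "CREATE UNIQUE INDEX dw.idx_contract_num ON dw.contract_dim (contract_num)"

    db2 terminate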
