Informatica Consultant Resume

SUMMARY:

  • Overall 7+ years of work experience in ETL (Extraction, Transformation and Loading) of data from various sources into EDW, ODS and data marts using the data integration tool Informatica 9.x/8.x and Oracle, across the Banking, Insurance, Retail, Telecom and Healthcare (Medicare/Medicaid) domains
  • Experience with the migration of Informatica from version 9.1 to 9.6
  • Experience in performance tuning of database queries running on voluminous data
  • Hands-on experience in performance tuning and optimization in Informatica
  • Experience in production support and hot fixes
  • Experience in data migration from Oracle to Netezza
  • Extensive experience with Slowly Changing Dimensions (Type 1 and Type 2) in different mappings as per requirements
  • Implemented pushdown optimization in Informatica
  • Experience in performance tuning Oracle queries for better data distribution
  • Experience in full-lifecycle implementation of Enterprise Data Warehouses, Operational Data Stores (ODS) and business data marts with dimensional modeling techniques (Star and Snowflake schemas) using Informatica
  • Hands-on experience with reconciliation processes and audit tables
  • Strong experience in ETL design, ETL architecture solutions, development and maintenance
  • Expertise in Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads
  • Certified experience in Informatica performance tuning of targets, sources, sessions, mappings and transformations
  • Experience in managing onsite-offshore teams and coordinating test execution across locations
  • Extensively worked with Informatica mapping variables, mapping parameters and parameter files
  • Worked with databases such as Oracle, SQL Server and Netezza, and integrated data from flat files (fixed-width/delimited), XML files and COBOL files
  • Experience in writing stored procedures, functions, triggers and views in Oracle
  • Extensively worked on monitoring and scheduling jobs using UNIX shell scripts
  • Proficient at using Excel for data analysis, including advanced functionality such as pivot tables, VLOOKUP and graphs
  • Worked with pmcmd to interact with the Informatica server from the command line and execute shell scripts (see the sketch after this list)
  • Involved in Unit testing, Functional testing and User Acceptance testing
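
For illustration, a minimal sketch of the kind of shell wrapper used with pmcmd is shown below. The Integration Service, domain, folder, workflow and credential names are hypothetical placeholders, not details from any actual engagement:

    #!/bin/sh
    # Minimal sketch: start a PowerCenter workflow from the command line
    # via pmcmd. Service, domain, folder and workflow names are hypothetical.
    INFA_SERVICE="IS_DEV"            # hypothetical Integration Service
    INFA_DOMAIN="Domain_Dev"         # hypothetical domain
    FOLDER="SALES_DM"                # hypothetical repository folder
    WORKFLOW="wf_load_sales_fact"    # hypothetical workflow

    # -wait blocks until the workflow finishes, so the script's exit
    # code reflects the workflow outcome.
    pmcmd startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
          -u etl_user -p "$ETL_PWD" -f "$FOLDER" -wait "$WORKFLOW"

    if [ $? -ne 0 ]; then
        echo "Workflow $WORKFLOW failed" >&2
        exit 1
    fi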

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.x/8.x

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling

Database: Oracle, SQL Server, Netezza

DB Tools: TOAD, SQL*Plus, PL/SQL Developer, SQL*Loader, Visual Studio

Programming Languages: PL/SQL, UNIX Shell Scripting, Java

Environment: Windows, Linux

Schedulers: Informatica Scheduler, Control-M, TIBCO Admin

Operating System: Unix/Linux, Mainframe, Windows

PROFESSIONAL EXPERIENCE:

Confidential

Informatica Consultant

Responsibilities:

  • Interacted with business analysts to analyze, inspect and translate business requirements into technical specifications.
  • Participated in system analysis and data modeling, which included creating tables, views, triggers, indexes, functions, procedures and cursors.
  • Involved in creating fact and dimension tables using a star schema.
  • Worked extensively with transformations such as Source Qualifier, Filter, Joiner, Aggregator, Expression and Lookup.
  • Used session logs, workflow logs and the Debugger to debug sessions and analyze problems associated with mappings and generic scripts.
  • Designed and developed complex Informatica mappings, including SCD Type 2 (Slowly Changing Dimension Type 2).
  • Extensively worked in Workflow Manager, Workflow Monitor and Worklet Designer to create, edit and run workflows.
  • Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.
  • Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
  • Extensively used data cleansing and data conversion functions in various transformations.
  • Worked extensively with data modelers to implement logical and physical data modeling for an enterprise-level data warehouse.
  • Created and modified PL/SQL stored procedures for data retrieval from the database.
  • Automated mappings to run using UNIX shell scripts, which included pre- and post-session jobs, and extracted data from the transaction system into the staging area (see the sketch after this list).
  • Extensively used Informatica Power Center to extract data from various sources and load in to staging database.
  • Designed the mappings between sources (external files and databases) to operational staging targets.
  • Extensive work experience in the areas of Banking, Finance, Insurance and Manufacturing Industries.
  • Involved in data cleansing, mapping transformations and loading activities.
  • Performed performance tuning in Oracle.
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades. Extensively used Informatica for loading the historical data from various tables for different departments.
  • Performing ETL & database code migrations across environments.
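
A minimal sketch of such a wrapper script is shown below, assuming a hypothetical PL/SQL staging-preparation procedure and workflow; none of these object names come from the actual project:

    #!/bin/sh
    # Minimal sketch: a pre-session step calls a PL/SQL staging-preparation
    # procedure, then the extraction workflow is launched. All object names
    # are hypothetical.
    ORA_CONN="etl_user/${ETL_PWD}@DWDEV"   # hypothetical connect string

    # Pre-session job: prepare the staging area before extraction.
    printf '%s\n' \
        'WHENEVER SQLERROR EXIT FAILURE' \
        'EXEC stg_pkg.prepare_staging;' \
        'EXIT' | sqlplus -s "$ORA_CONN"

    if [ $? -ne 0 ]; then
        echo "Pre-session staging preparation failed" >&2
        exit 1
    fi

    # Launch the extraction workflow once staging is ready.
    pmcmd startworkflow -sv IS_DEV -d Domain_Dev -u etl_user \
          -p "$ETL_PWD" -f EDW -wait wf_stg_extract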

Environment: Informatica PowerCenter 9.6, Oracle 11g, flat files (fixed-width/delimited), XML, SharePoint, JIRA, Quality Center.

Confidential, Dallas, TX

Informatica Developer

Responsibilities:

  • Worked on data mart maintenance (developing and monitoring Informatica workflows).
  • Designed solutions for existing business requirements and generated XML reports for downstream systems.
  • Created Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.
  • Read from and wrote to dynamic flat files.
  • Identified conformed dimensions and used them for MDM (Master Data Management).
  • Developed ETL mappings (Informatica 9.x) using various transformations and scheduled the workflows.
  • Worked extensively on creating fact and dimension tables using a star schema.
  • Used session logs, workflow logs and the Debugger to debug sessions and analyze problems associated with mappings and generic scripts.
  • Designed and developed complex Informatica mappings, including Slowly Changing Dimension Types 1 and 2.
  • Extensively worked in Workflow Manager, Workflow Monitor and Worklet Designer to create, edit and run workflows.
  • Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
  • Tuned and monitored the performance of existing code.
  • Handled incidents for breaks between upstream and downstream applications (see the monitoring sketch after this list).
  • Provided hotfixes on the production system to correct data issues.
  • Performance tuning in Oracle and Informatica.
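
As an illustration of this kind of break detection, a minimal monitoring sketch is shown below; the folder and workflow names are hypothetical, and the exact status text printed by pmcmd getworkflowdetails can vary by PowerCenter version:

    #!/bin/sh
    # Minimal sketch: check the last run status of a workflow and flag a
    # failure for incident triage. Names are hypothetical; the status line
    # matched here follows the usual pmcmd output format.
    pmcmd getworkflowdetails -sv IS_PROD -d Domain_Prod \
          -u etl_user -p "$ETL_PWD" -f MART_XML wf_gen_xml_reports \
          | grep -q 'Workflow run status: \[Succeeded\]'

    if [ $? -ne 0 ]; then
        echo "wf_gen_xml_reports did not succeed on its last run" >&2
        exit 1   # non-zero exit lets the scheduler raise an incident
    fi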

Environment: Informatica PowerCenter 9.1/9.5, Oracle 10g, PowerMart, PL/SQL, flat files (fixed-width/delimited), PuTTY, WinSCP, Linux, XML, SharePoint, Quality Center.

Confidential, Little Rock, AR

ETL Programmer Analyst

Responsibilities:

  • Worked closely with the Business Analysts team to understand the User Stories.
  • Responsible for raising tickets.
  • Monitored and provided solutions for existing ETL batch jobs.
  • Configured integration checks.
  • Involved in the full-lifecycle design and development of the data warehouse.
  • Documented the STM (Source-to-Target Mapping) document.
  • Worked on high-level design documents (HLDs) and low-level design documents (LLDs).
  • Responsible for designing and building dimension and fact tables in ETL.
  • Developed an ETL process for validation and comparison of environments (see the sketch after this list).
  • Tuned the performance of existing jobs.
  • Diagnosed broken SSRS reports and assigned fixes to the development team.
  • Developed new ETL processes for existing DB jobs.
  • Worked with the UAT team on defect resolution.
  • Responsible for implementing Teradata best practices and optimizations.
  • Implemented Informatica pushdown optimization.
  • Involved in migrating code from Dev to upper environments.
  • Involved in requirements gathering.
  • Analyzed releases for schema changes.
  • Performed model validations (loading model changes).
  • Performed data integrity tasks (comparing primary databases to failover databases, de-duplicating, filling gaps, etc.).
  • Performed environment comparisons, validations and fixes.
  • Provided production support and hot fixes.
  • Ensured that ETL and data replication processes remained running in the lower environments.
  • Helped manage archiving of shell scripts and DDL/DML scripts in TFS.
  • Performed ETL and database code migrations across environments.
  • Tuned mappings using alternative logic to provide maximum efficiency and performance.
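
A minimal sketch of one such environment-compare check is shown below: row counts for a single table in two environments. The connect strings, passwords and table name are hypothetical placeholders:

    #!/bin/sh
    # Minimal sketch: compare row counts for one table across two
    # environments. Connect strings and the table name are hypothetical.
    count_rows() {
        printf '%s\n' \
            'SET HEADING OFF FEEDBACK OFF' \
            'SELECT COUNT(*) FROM edw.claims_fact;' \
            'EXIT' | sqlplus -s "$1"
    }

    DEV_CNT=$(count_rows "etl_user/${DEV_PWD}@EDWDEV" | tr -d '[:space:]')
    UAT_CNT=$(count_rows "etl_user/${UAT_PWD}@EDWUAT" | tr -d '[:space:]')

    if [ "$DEV_CNT" != "$UAT_CNT" ]; then
        echo "Row-count mismatch on edw.claims_fact: dev=$DEV_CNT uat=$UAT_CNT" >&2
        exit 1
    fi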

Environment: Informatica PowerCenter 9.6.1, SSIS, SharePoint, Netezza, Control-M, Mainframe, SQL Server, Oracle, Windows.

Confidential, New Orleans, LA

Informatica Developer

Responsibilities:

  • Responsible for understanding the business requirements and Functional Specifications documents; prepared the Source-to-Target Mapping document and helped build the new ETL design.
  • Optimized loads using pushdown optimization (PDO).
  • Developed test cases for unit, integration and system testing.
  • Coordinated between onsite and offshore teams.
  • Defined various facts and dimensions in the data mart, including factless fact, aggregate and summary fact tables.
  • Extracted, scrubbed and transformed data from mainframe files, Oracle, SQL Server and Teradata, then loaded it into the Oracle database using Informatica and SSIS.
  • Worked on optimizing the ETL procedures in Informatica.
  • Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Implemented logical and physical data modeling with star and snowflake techniques using Erwin in the data warehouse as well as the data mart.
  • Used Type 1 and Type 2 mappings to update Slowly Changing Dimension tables.
  • Involved in the performance tuning process by identifying and optimizing source, target, mapping and session bottlenecks.
  • Configured incremental aggregator transformation functions to improve the performance of data loading. Worked on database-level tuning and SQL query tuning for the data warehouse and OLTP databases.
  • Used Informatica Data Quality (IDQ 8.6.1) for initial data profiling, data quality measurement, and matching and removing duplicate data.
  • Used the ActiveBatch scheduling tool for scheduling jobs.
  • Checked session and error logs to troubleshoot problems and used the Debugger for complex troubleshooting.
  • Handled Teradata loads after implementing the primary index.
  • Negotiated with superiors to acquire the resources necessary to deliver the project on time and within budget; brought resources onsite when required to meet deadlines.
  • Delivered projects in an onsite-offshore model; directly responsible for deliverables.
  • Developed UNIX shell scripts to call Informatica mappings and run tasks on a daily basis.
  • Created and automated UNIX scripts to run sessions and handle dynamic files at the desired date and time for imports (see the sketch after this list).
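
A minimal sketch of this dated-file handling is shown below: the script waits for the day's source file to land, then starts the import workflow. The paths, file name, and workflow details are hypothetical:

    #!/bin/sh
    # Minimal sketch: wait for the day's dated source file to land, then
    # start the import workflow. Paths and names are hypothetical.
    RUN_DATE=$(date +%Y%m%d)
    SRC_FILE="/data/inbound/policy_extract_${RUN_DATE}.dat"

    # Poll for up to an hour (12 checks, 5 minutes apart).
    i=0
    while [ ! -f "$SRC_FILE" ] && [ "$i" -lt 12 ]; do
        sleep 300
        i=$((i + 1))
    done

    if [ ! -f "$SRC_FILE" ]; then
        echo "Source file $SRC_FILE never arrived" >&2
        exit 1
    fi

    pmcmd startworkflow -sv IS_PROD -d Domain_Prod -u etl_user \
          -p "$ETL_PWD" -f POLICY_DM -wait wf_load_policy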

Environment: Informatica PowerCenter 9.6.1, Teradata 14/14.10, SSIS, Mainframe, SharePoint, Oracle 12, TOAD.

Confidential, Chicago, IL

Data Integration Programmer

Responsibilities:

  • Worked on establishing the ETL design so that each ETL job corresponds to a data model subject area.
  • Responsible for building the system design, ETL data flow diagrams, the data mapping sheet and the run book.
  • Monitored loads and troubleshot any issues that arose.
  • Built a dynamic-parameter ETL process for importing data from the mainframe.
  • Set up dynamic parameter files for source and target connections, updated per batch (see the sketch after this list).
  • Built an AddressDoctor transformation to validate addresses.
  • Built in-depth test cases, automated them by writing stored procedures, and performed impact analysis for downstream applications.
  • Worked on data profiling and automated the test-case process.
  • Prepared data mapping sheets for subject areas such as Claims, Procedures, Drugs, Providers, Recipients (Members), Finance and Prior Authorization.
  • Completed training on HIPAA compliance policies.
  • Built views for validating data mapping sheets.
  • Built views in the integration layer.
  • Built views for data validation of source-system SAKs.
  • Wrote functions and procedures for data validation.
  • Wrote shell scripts for pre- and post-session commands.
  • Set up the reconciliation ETL design for the staging and integration layers.
  • Set up Event IDs and Batch IDs for the ETL process.
  • Deployed objects using shell scripting and created deployment groups in Informatica.
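
A minimal sketch of this per-batch parameter file setup is shown below; it stamps a fresh Batch ID into the parameter file that the workflow reads via -paramfile. The folder, workflow, parameter and connection names are hypothetical:

    #!/bin/sh
    # Minimal sketch: stamp a fresh BATCH_ID into a parameter file before
    # each run, then pass the file to the workflow with -paramfile.
    # Folder, workflow, parameter and connection names are hypothetical.
    BATCH_ID=$(date +%Y%m%d%H%M%S)   # simple surrogate batch identifier
    PARAM_FILE="/opt/infa/params/wf_stage_claims.prm"

    {
        echo '[MMIS.WF:wf_stage_claims]'
        echo '$$BATCH_ID='"$BATCH_ID"
        echo '$$SRC_CONN=ORA_MAINFRAME_STG'
        echo '$$TGT_CONN=ORA_INTEGRATION'
    } > "$PARAM_FILE"

    pmcmd startworkflow -sv IS_DEV -d Domain_Dev -u etl_user \
          -p "$ETL_PWD" -f MMIS -paramfile "$PARAM_FILE" -wait wf_stage_claims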

Environment: Informatica PowerCenter 9.1, PowerExchange 9.1, Control-M, Oracle 11g, SQL Server, MS Visio, ALM, Mainframes, ERP, UNIX, SharePoint
