Informatica Consultant Resume
SUMMARY:
- Over 7 years of work experience in ETL (Extraction, Transformation and Loading) of data from various sources into EDW, ODS and data marts using the data integration tool Informatica 9.x/8.x and Oracle, across the Banking, Insurance, Retail, Telecom and Healthcare (Medicare/Medicaid) domains
- Experience in migrating Informatica from version 9.1 to 9.6
- Experience in performance tuning of database queries running on voluminous data
- Hands-on experience in performance tuning and optimization in Informatica
- Experience in production support and hot fixes
- Experience in data migration from Oracle to Netezza
- Extensive experience with Slowly Changing Dimensions (Type 1 and Type 2) in different mappings as per requirements
- Implemented pushdown optimization in Informatica
- Experience in performance tuning Oracle queries for better distribution of data
- Experience in full-lifecycle implementation of Enterprise Data Warehouses, Operational Data Stores (ODS) and business data marts with dimensional modeling techniques (star schema and snowflake schema) using Informatica
- Hands-on experience with reconciliation processes and audit tables
- Strong experience in ETL design, ETL architecture solutions, development and maintenance
- Expertise in Informatica mappings, mapplets, sessions, workflows and worklets for data loads
- Certified experience in Informatica performance tuning of targets, sources, sessions, mappings and transformations
- Experience in managing onsite-offshore teams and coordinating test execution across locations
- Extensively worked with Informatica mapping variables, mapping parameters and parameter files
- Worked with databases such as Oracle, SQL Server and Netezza, and integrated data from flat files (fixed-width/delimited), XML files and COBOL files
- Experience in writing stored procedures, functions, triggers and views on Oracle
- Extensively worked on monitoring and scheduling of jobs using UNIX shell scripts
- Proficient at using Excel for data analysis, including advanced functionality such as pivot tables, VLOOKUP and graphs
- Worked with pmcmd to interact with the Informatica server from the command line and execute shell scripts
- Involved in unit testing, functional testing and user acceptance testing
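The pmcmd-driven workflow launches mentioned above typically amount to a short wrapper script. The sketch below is illustrative only: the domain, integration service, folder, workflow and credential names are hypothetical placeholders, not values from any of the projects listed here.

```shell
#!/bin/sh
# Hypothetical wrapper around pmcmd to start an Informatica workflow.
# IntSvc_Dev, Domain_Dev, the folder and the workflow names are all
# made-up placeholders for illustration.

build_pmcmd_cmd() {
    # $1 = repository folder, $2 = workflow name, $3 = parameter file
    echo "pmcmd startworkflow -sv IntSvc_Dev -d Domain_Dev" \
         "-u $REPO_USER -p $REPO_PASS" \
         "-f $1 -paramfile $3 -wait $2"
}

REPO_USER=etl_user
REPO_PASS=secret   # in practice read from a secured file, never hardcoded

CMD=$(build_pmcmd_cmd WF_FOLDER wf_load_customers /opt/infa/params/wf_load_customers.par)
echo "$CMD"
# eval "$CMD"      # uncomment on a host where pmcmd is actually installed
```

The command is built and echoed rather than executed so the script can be reviewed (or dry-run) on hosts without the Informatica client.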
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 9.x/8.x
Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling
Database: Oracle, SQL Server, Netezza
DB Tools: TOAD, SQL*Plus, PL/SQL Developer, SQL*Loader, Visual Studio
Programming Language: PL/SQL, Unix Shell Scripting, Java
Environment: Windows, Linux
Schedulers: Informatica Scheduler, Control-M, TIBCO Administrator
Operating System: Unix/Linux, Mainframe, Windows
PROFESSIONAL EXPERIENCE
Confidential
Informatica Consultant
Role and Responsibilities:
- Interacted with business analysts to analyze, inspect and translate business requirements into technical specifications.
- Participated in system analysis and data modeling, which included creating tables, views, triggers, indexes, functions, procedures and cursors.
- Involved in creating Fact and Dimension tables using star schema.
- Extensively worked on transformations such as Source Qualifier, Filter, Joiner, Aggregator, Expression and Lookup.
- Used session logs, workflow logs and the Debugger to debug sessions and analyze problems associated with mappings and generic scripts.
- Designed and developed complex Informatica mappings, including SCD Type 2 (Slowly Changing Dimension Type 2).
- Extensively worked in Workflow Manager, Workflow Monitor and Worklet Designer to create, edit and run workflows.
- Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.
- Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
- Extensively used various data cleansing and data conversion functions in transformations.
- Extensively worked with data modelers to implement logical and physical data modeling to create an enterprise-level data warehouse.
- Created and modified PL/SQL stored procedures for data retrieval from the database.
- Automated mappings to run using UNIX shell scripts, which included pre- and post-session jobs, and extracted data from the transaction system into the staging area.
- Extensively used Informatica Power Center to extract data from various sources and load in to staging database.
- Designed the mappings between sources (external files and databases) to operational staging targets.
- Extensive work experience in the areas of Banking, Finance, Insurance and Manufacturing Industries.
- Involved in data cleansing, mapping transformations and loading activities.
- Performed performance tuning in Oracle.
- Involved in the process design documentation of the Data Warehouse Dimensional Upgrades. Extensively used Informatica for loading the historical data from various tables for different departments.
- Performing ETL & database code migrations across environments.
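The pre- and post-session shell jobs described above often reduce to a short archiving step. The following is a generic sketch with made-up paths and file names, not this project's actual script.

```shell
#!/bin/sh
# Hypothetical post-session job: move a processed flat file into a
# date-stamped archive directory. All paths are illustrative only.

archive_file() {
    # $1 = processed source file, $2 = archive root directory
    src_file=$1
    archive_root=$2
    stamp=$(date +%Y%m%d)
    mkdir -p "$archive_root/$stamp"
    mv "$src_file" "$archive_root/$stamp/" || return 1
    echo "archived $(basename "$src_file") to $archive_root/$stamp"
}

# Example (paths hypothetical):
# archive_file /data/stage/customers.dat /data/archive
```

A script like this would be attached as a post-session success command so staged files are cleared before the next run.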
Environment: Informatica PowerCenter 9.6, Oracle 11g, Flat files (fixed width/delimited), XML, SharePoint, JIRA, Quality Center.
Confidential, Dallas, TX
Informatica Developer
Role and Responsibilities:
- Working on Data Mart Maintenance (Developing and Monitoring Informatica Workflows)
- Designing solutions for existing business requirements and generating XML reports for downstream systems.
- Creating Informatica mappings, mapplets, sessions, workflows and worklets for data loads.
- Reading from and writing to dynamic flat files.
- Identifying conformed dimensions and using them for MDM (Master Data Management).
- ETL (Informatica 9.x) development of mappings using various transformations, and scheduling of workflows.
- Extensively worked on creating Fact and Dimension tables using star schema.
- Used session logs, workflow logs and the Debugger to debug sessions and analyze problems associated with mappings and generic scripts.
- Designed and developed complex Informatica mappings, including Slowly Changing Dimensions Type 1 and 2.
- Extensively worked in Workflow Manager, Workflow Monitor and Worklet Designer to create, edit and run workflows.
- Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
- Tuning and Monitoring the Performance of Existing code.
- Handling the Incidents for Breaks between Upstream and Downstream applications.
- Providing hotfixes on production system for correcting data issues.
- Performance tuning in Oracle and Informatica.
Environment: Informatica PowerCenter 9.1/9.5, Oracle 10g, PowerMart, PL/SQL, Flat files (fixed width/delimited), PuTTY, WinSCP, Linux, XML, SharePoint, Quality Center.
Confidential, Little Rock, AR
ETL Programmer Analyst
Role and Responsibilities:
- Worked closely with the Business Analysts team to understand the User Stories.
- Responsible for raising and tracking tickets.
- Monitoring and providing the solutions to the existing ETL batch jobs.
- Configured Integration checks.
- Involved in full life cycle design and development of Data warehouse.
- Documented the STM (Source-to-Target Mapping) document.
- Worked on High-Level Design documents (HLDs) and Low-Level Design documents (LLDs).
- Responsible for designing and building dimension and fact tables in ETL.
- Developed ETL processes for validation/comparison of environments.
- Performance tuning of the existing jobs.
- Identified broken SSRS reports and assigned them to the development team for fixes.
- Developed new ETL process for the existing DB jobs.
- Worked with the UAT team for defect resolutions.
- Responsible for implementing Teradata best practices and optimization techniques.
- Implemented Informatica pushdown optimization.
- Involved in migration of code from Dev to higher environments.
- Involved in the requirement gathering.
- Analyze releases for schema changes.
- Model validations (load model changes).
- Data Integrity Tasks (compare primary databases to failover databases, de-dup, fill gaps, etc.).
- Environment compares and validations, fixes.
- Production support / Hot fixes.
- Ensured that processes (ETL and data replication) remained running in the lower environments.
- Helped manage archiving shell scripts and DDL/DML scripts in TFS.
- Performing ETL & database code migrations across environments.
- Tuned mappings using different logic to provide maximum efficiency and performance.
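The environment-comparison and data-integrity checks described above often boil down to comparing per-table row counts between two environments. This sketch assumes the counts have already been extracted (e.g. via nzsql or sqlplus) into sorted "table|count" files; the file format and table names are hypothetical.

```shell
#!/bin/sh
# Hypothetical environment-comparison check: compare per-table row counts
# from two environments and report any mismatches. Inputs are text files
# of "table|count" lines, sorted by table name.

compare_counts() {
    # $1 = counts file for env A, $2 = counts file for env B
    # Returns non-zero if any table's counts differ.
    join -t'|' -j 1 "$1" "$2" | awk -F'|' '
        $2 != $3 { print "MISMATCH " $1 ": " $2 " vs " $3; bad = 1 }
        END { exit bad }'
}

# Example (files hypothetical):
# compare_counts /tmp/prod_counts.txt /tmp/failover_counts.txt
```

In practice this would run after each replication cycle, with mismatches feeding the de-dup / gap-fill tasks mentioned above.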
Environment: Informatica Power Center 9.6.1, SSIS, Share Point, Netezza, Control M, Mainframe, SQL Server, Oracle, Windows.
Confidential, New Orleans, LA
Sr. Informatica Developer
Role and Responsibilities:
- Responsible for understanding the business requirements and Functional Specifications document and prepared the Source to Target Mapping document. Helped in building the New ETL Design.
- Optimized loads using pushdown optimization (PDO).
- Developed test cases for unit, integration and system testing.
- Coordinated between onsite and offshore teams.
- Defined various facts and dimensions in the data mart, including factless facts, aggregate facts and summary facts.
- Extracted, scrubbed and transformed data from mainframe files, Oracle, SQL Server and Teradata, then loaded it into the Oracle database using Informatica and SSIS.
- Worked on optimizing the ETL procedures in Informatica.
- Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
- Implementing logical and physical data modeling with STAR and SNOWFLAKE techniques using Erwin in Data warehouse as well as in Data Mart.
- Used Type 1 and Type 2 mappings to update Slowly Changing Dimension tables.
- Involved in the performance tuning process by identifying and optimizing source, target, mapping and session bottlenecks.
- Configured incremental aggregation to improve the performance of data loading. Worked on database-level tuning and SQL query tuning for the data warehouse and OLTP databases.
- Used Informatica Data Quality (IDQ 8.6.1) for initial data profiling, data quality measurement, and matching/removing duplicate data.
- Used the ActiveBatch scheduling tool for scheduling jobs.
- Checked session and error logs to troubleshoot problems, and used the Debugger for complex troubleshooting.
- Strong experience in handling Teradata loads after implementing the Primary Index.
- Negotiated with superiors to acquire the resources necessary to deliver the project on time and within budget, bringing resources onsite when required to meet deadlines.
- Delivered projects working in an onsite-offshore model; directly responsible for deliverables.
- Developed UNIX shell scripts for calling Informatica mappings and running tasks on a daily basis.
- Created and automated UNIX scripts to run sessions and handle dynamic files on the desired date and time for imports.
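The dynamic-file handling described above usually needs a small helper that resolves the date-stamped source file for a run and waits briefly if it has not yet landed. The naming convention and directories below are illustrative assumptions, not the project's actual layout.

```shell
#!/bin/sh
# Hypothetical helper for date-stamped dynamic imports: locate the
# expected feed file for a run date, retrying a few times if it has
# not arrived yet. File naming (<feed>_<YYYYMMDD>.dat) is assumed.

resolve_dynamic_file() {
    # $1 = landing directory, $2 = feed name, $3 = run date (YYYYMMDD)
    candidate="$1/${2}_${3}.dat"
    tries=0
    while [ ! -f "$candidate" ] && [ "$tries" -lt 3 ]; do
        sleep 1
        tries=$((tries + 1))
    done
    # Print the path only if the file actually exists.
    [ -f "$candidate" ] && echo "$candidate"
}

# Example (names hypothetical):
# SRC=$(resolve_dynamic_file /data/landing claims 20240101) || exit 1
```

A scheduler job would call this before the pmcmd launch so the session only starts once its source file is present.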
Environment: Informatica PowerCenter 9.6.1, Teradata 14/14.10, SSIS, Mainframe, SharePoint, Oracle 12, TOAD.
Confidential, Chicago, IL
Data Integration Programmer
Role and Responsibilities:
- Worked on establishing the ETL design so that each ETL job is tied to a data model subject area.
- Responsible for building the system design, ETL data-flow diagrams, data mapping sheets and the run book.
- Monitored loads and troubleshot any issues that arose.
- Built dynamic-parameter ETL processes for importing data from the mainframe.
- Set up dynamic parameter files for source and target connections, updated per batch.
- Built Address Doctor transformations to validate addresses.
- Built in-depth test cases and automated them by writing stored procedures, and performed impact analysis for downstream applications.
- Worked on data profiling and automated the test-case process.
- Prepared data mapping sheets for subject areas such as Claims, Procedure, Drugs, Provider, Recipient (Members), Finance and Prior Authorization.
- Completed training on HIPAA-compliant policies.
- Building Views for validating data mapping sheets
- Building views in the integration layer
- Building views for data validation of source system SAK
- Writing Functions/procedures for data validation
- Writing shell scripts for pre and post command sessions
- Setting up reconciliation ETL design for staging and Integration layer
- Setting up Event ID and Batch ID for the ETL process.
- Deployed objects using shell scripting and created deployment groups (DG) in Informatica.
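The per-batch dynamic parameter files mentioned above are typically regenerated by a small script before each run. The folder, workflow, connection and parameter names in this sketch are placeholders invented for illustration.

```shell
#!/bin/sh
# Hypothetical generator for a per-batch Informatica parameter file.
# Section header, $$ parameters and connection names are placeholders.

write_param_file() {
    # $1 = output path, $2 = batch id, $3 = run date (YYYYMMDD)
    {
        echo "[FOLDER_EDW.WF:wf_stage_claims]"
        echo "\$\$BATCH_ID=$2"
        echo "\$\$RUN_DATE=$3"
        echo "\$DBConnection_SRC=CONN_SRC_ORA"
        echo "\$DBConnection_TGT=CONN_TGT_SQL"
    } > "$1"
}

# Example (values hypothetical):
# write_param_file /opt/infa/params/wf_stage_claims.par 1042 20240101
```

The batch process would call this with the current batch/event IDs, then pass the file to pmcmd via -paramfile so each run picks up its own connections and dates.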
Environment: Informatica PowerCenter 9.1, PowerExchange 9.1, Control-M, Oracle 11g, SQL Server, MS Visio, ALM, Mainframes, ERP, UNIX, SharePoint.
Confidential, Houston, TX
Informatica Developer
Role and Responsibilities:
- Worked with Informatica PowerCenter to move data from multiple sources into a common target area (an ERP system).
- Coordinated with Business System Analysts (BSAs) to understand and gather requirements.
- Documented user requirements, translated them into system solutions, and developed the implementation plan and schedule.
- Responsible for designing and developing complex Informatica mappings, including Type 2 slowly changing dimensions.
- Involved in writing the Technical Requirements Document, Mapping Documents, and Low-Level Design Documents.
- Designed and developed mappings using Source Qualifier, Aggregator, Joiner, Lookup, Sequence Generator, Stored Procedure, Expression, Filter and Rank transformations.
- Used the Mapping Designer module for ETL tasks to load data from various centers, held in different source systems, to build data marts.
- Created Informatica parameter files with workflows and sessions.
- Extensively used SQL overrides at Source Qualifier and Lookup Transformations while extracting data from multiple tables.
- Tuned mappings to perform better using different logics to provide maximum efficiency and performance.
- Categorized different alerts and exception types to be built from User’s Data Validation list and built mappings to generate exceptions and alerts accordingly.
- Tested the target data against source system tables.
- Used Command task to move the parameter files to the desired position at the start of the session.
- Worked with pre-session and post-session SQL and stored procedures.
- Worked with Performance tuning of the Mappings, Sessions and SQL Queries Optimization.
- Coordinated with the OBIEE team on reporting data for Answers and BI Publisher reports, and on dashboard designs.
- Involved in requesting input forms and creating log-in Windows for users to input the feeds.
- Performed Validations and Worked on Quality center tool for defect resolution.
- Performed Informatica Code Migration across environments.
- Documentation of ETL extracts developed using the Informatica extract tool.
Environment: Informatica Power Center 8.x, Oracle, XML Files.