Summary of Experience:
- Around 6+ years of IT experience as a Technical Consultant for Data Warehouse ETL, experienced in requirement analysis, data analysis, application design, modeling, development, testing and support across the life cycle of Data Warehouse applications using Informatica 9.x, 8.x, 7.x.
- Vast experience in Data Warehousing across the Healthcare, Pharma, Banking and Financial Services domains.
- Experience in Full Life Cycle Development of building a Data Warehouse.
- Good Understanding of Ralph Kimball and Bill Inmon approaches.
- Excellent knowledge of OLTP and OLAP system study, analysis and E-R modeling.
- Designed and developed logical and physical models based on Star and Snow Flake schemas, identification of fact and dimension tables.
- Played significant role in various phases of project life cycle, such as requirements analysis, functional & technical design, testing, production support, implementation and scheduling.
- Extensively worked on data warehousing and decision support systems with relational databases such as Oracle, with design and database development using SQL, PL/SQL, SQL*Plus and TOAD.
- Highly proficient in writing, testing and implementation of Triggers, stored procedures, functions using PL/SQL.
- Experience in installing, configuring and updating Informatica server and Client.
- Prominent experience in Data Integration, Extraction, Transformation and Loading of data from multiple heterogeneous source systems such as Oracle, SQL Server, DB2, Teradata, XML and flat files into the Data Warehouse using Informatica.
- Expertise in working with Informatica tools - Power Center Client tools - Designer, Repository manager, Workflow Manager, Workflow Monitor and Server tools - Informatica Server, Repository Server manager.
- Proficient in creating transformations and mappings using Informatica Designer to implement business rules, and in processing tasks using Workflow Manager to move data from sources to targets.
- Worked extensively on various data transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator and Stored Procedure transformations in Informatica Power Center Designer.
- Expertise in working with Slowly Changing Dimensions Type 1 and Type 2.
- Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations using Informatica debugger.
- Implemented various Performance Tuning techniques on Sources, Targets, Mappings and sessions levels.
- Experience in designing and implementing partitioning to improve performance while loading large data volume.
- Hands on experience in UNIX shell scripting.
- Strong exposure to working on scheduling tools like Autosys and Control-M.
- Ensured that user requirements were effectively and accurately communicated to the other members of the development team, and facilitated communication between business users, developers and testing teams.
- Excellent problem-solving and trouble-shooting capabilities. Quick learner, highly motivated, result oriented and an enthusiastic team player.
- Good interpersonal skills, experience in handling communication and interactions between different teams.
Operating Systems: Windows NT/XP/Vista, Windows Server 2000/2003/2008, UNIX, Linux.
ETL Tools: Informatica PowerCenter 9.x, 8.x, 7.x.
Databases: Oracle 11g/10g/9i/8i, SQL Server 2008/2005, DB2 UDB, Teradata, MS Access.
Database Tools: SQL*Plus, SQL*Loader, TOAD, DB2 Import, DB2 Export.
Languages: C, C++, Java, SQL, PL/SQL, Shell scripting.
Data Modeling: Star-Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Erwin.
Internet Technologies and Microsoft Tools: HTML, XML, MS Office (Word, Excel, Visio, PowerPoint).
Reporting Tools: SSRS, Business Objects, Cognos.
Confidential, White Plains, NY Sep' 12 - Present
Confidential works collaboratively with Medicaid agencies, state and local governments, health plans, employers and labor trust groups to design and deliver services and solutions that meet today's healthcare challenges. The BIC (Business Intelligence Center) project mainly involves receiving claims, eligibility and provider-related data from different clients and loading it into the C3 data warehouse. The entire ETL process consists of flat files, a landing area, a staging area, an ODS layer and the data warehouse.
- Interacted with the Business Users to analyze the Business Requirements and transform the business requirements into the technical requirements.
- Extracted data from incoming files from different customers using Informatica PowerCenter 9.1 and loaded it into the Operational Data Store, a SQL Server database.
- Worked on Power Center Tools like designer, workflow manager, workflow monitor and repository manager.
- Worked on Designer Tools like source analyzer, transformation developer, mapplet designer and mapping designer.
- Created new mappings and updated existing mappings according to changes in business logic.
- Developed complex mappings using different transformations such as Source Qualifier, connected Lookup, unconnected Lookup, Expression, Aggregator, Joiner, Filter, Normalizer, Sequence Generator and Router transformations.
- Worked with SQL Override in the Source Qualifier and Lookup transformation.
- Extensively used various Functions like LTRIM, RTRIM, ISNULL, ISDATE, TO_DATE, Decode, Substr, Instr and IIF function.
- Developed Re-Usable Transformations and Mapplets.
- Used Update Strategy DD_INSERT, DD_UPDATE to insert and update data for implementing the Slowly Changing Dimension Logic.
- Developed Slowly Changing Dimensions Mapping for Type 1 SCD and Type 2 SCD.
- Defined Target Load Order Plan for loading data into Target Tables.
- Reduced the amount of data moving through flows, significantly improving mapping performance.
- Responsible for Performance Tuning at the Mapping Level and Session level.
- Used Debugger to troubleshoot the mappings.
- Prepared a unit test plan and efficient unit test documentation, along with unit test cases for the developed code, to ensure the test results matched the client requirements.
- Worked on production issues, providing support and appropriate solutions to resolve them.
- Extensively used Active Batch tool to Schedule ETL jobs.
- Prepared detailed design documentation for the QA and production support departments to use as a guide for future production runs before code migration.
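The Slowly Changing Dimension handling above (Update Strategy with DD_INSERT/DD_UPDATE flags) can be sketched outside Informatica roughly as follows. This is a minimal illustration: the business key, attribute names and row layout are assumptions for the example, not the project's actual schema.

```python
from datetime import date

# Illustrative dimension rows keyed by business key (not the project's real schema).
dim = {
    "C100": {"name": "Acme", "city": "Austin", "current": True, "eff_date": date(2011, 1, 1)},
}

def apply_scd(row, dim, scd_type):
    """Return the actions an Update Strategy transformation would flag:
    DD_INSERT for new keys; DD_UPDATE (Type 1 overwrite) or expire-then-insert
    (Type 2 new version) when a tracked attribute changes."""
    key = row["key"]
    actions = []
    if key not in dim:
        actions.append(("DD_INSERT", row))             # brand-new dimension member
    elif dim[key]["city"] != row["city"]:              # tracked attribute changed
        if scd_type == 1:
            actions.append(("DD_UPDATE", row))         # Type 1: overwrite in place
        else:
            actions.append(("DD_UPDATE_EXPIRE", key))  # Type 2: close the old version...
            actions.append(("DD_INSERT", row))         # ...and insert the new one
    return actions                                     # unchanged rows pass through untouched

# A changed city: Type 2 expires the old row and inserts a new current version.
changed = {"key": "C100", "name": "Acme", "city": "Dallas"}
print(apply_scd(changed, dim, scd_type=2))
```

The same comparison drives both SCD types; only the emitted flags differ, which is why a single mapping with a type parameter is a common design.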
Environment: Informatica Power Center 9.1, Flat files, SQL Server 2008, Active Batch V7, Jira, Microsoft Visio, and Windows NT.
Confidential, Honolulu, HI Sep' 11 - Sep' 12
Confidential is a member of the Blue Cross Blue Shield Association, an association of independent medical insurance providers. A nonprofit, mutual benefit association founded in 1938, HMSA covers more than half of the state's population. The project deals with the membership system replacement process, which provides extracts to the vendor and conversion of elements.
- Worked closely with the business analyst and Data warehouse architect to understand the source data and did the entire source to target documentation for all kinds of sources.
- Involved in requirements gathering, functional/technical specification, and design and development of the end-to-end ETL process for the Claims Data Warehouse.
- Developed star schemas, identifying facts and dimensions and designing the tables.
- Upgraded Informatica PowerCenter from version 8.6.1 to 9.0.1.
- Worked with Informatica Power Center 9.0.1 for extraction, transformation and loading (ETL) of data in the data warehouse.
- Worked with Repository Manager to manage repositories and to create users, user groups and folders.
- Used PL/SQL for small, straightforward loads via an ODBC driver connection.
- Extensively used Informatica client tools to extract data from different sources, including flat files, Oracle and mainframe DB2, and loaded it into a single data warehouse repository (SQL Server).
- Experience in developing and documenting Data Mappings, Transformations and Sessions.
- Created complex mapping using various transformations like Expression, Filter, Aggregator, Joiner, Union, Router, Sorter, Source Qualifier, Stored procedure, Java, Normalizer, Look up and Update Strategy.
- Created reusable transformations and used in various mappings.
- Used PowerExchange Change Data Capture for data updates.
- Used Update Strategy to insert and update data when implementing the Slowly Changing Dimension logic.
- Built mapping variables/parameters and created parameter files to allow flexible workflow runs based on changing variable values and to filter the source data.
- Responsible for creating workflows with multiple sessions using Informatica Workflow Manager, and for monitoring workflow runs and statistics in Informatica Workflow Monitor.
- Used Incremental Aggregation technique to load data into Aggregation tables for improved performance.
- Responsible for Performance Tuning at the Mapping, Session, Source and the Target levels.
- Used the pmcmd command to execute workflows, and created shell scripts to copy files from one folder to another.
- Used the version control provided by Informatica on every object.
- Created debugging sessions to validate transformations, and extensively ran existing mappings in debug mode to identify errors by setting breakpoints and watching the debug monitor.
- Analyzed Session Log files in case the session failed to resolve errors in mapping or session configurations.
- Performed unit testing and validated the data.
- Worked with the scheduling team to schedule the sessions and workflows that loaded data into the data warehouse on a daily basis.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
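The parameter-file mechanism mentioned in the bullets above can be sketched as follows. The folder, workflow and parameter names here are hypothetical; the `[Folder.WF:Workflow]` section header and `$$NAME=value` lines follow the usual PowerCenter parameter-file convention.

```python
def write_param_file(path, folder, workflow, params):
    """Write an Informatica-style parameter file: a [Folder.WF:Workflow]
    section header followed by $$NAME=value lines the mappings can reference."""
    with open(path, "w") as f:
        f.write(f"[{folder}.WF:{workflow}]\n")
        for name, value in params.items():
            f.write(f"$${name}={value}\n")

def read_params(path):
    """Parse the $$NAME=value lines back into a dict (section headers skipped)."""
    params = {}
    for line in open(path):
        line = line.strip()
        if line.startswith("$$") and "=" in line:
            name, value = line[2:].split("=", 1)
            params[name] = value
    return params

# Hypothetical run: restrict the source extract to one load date via $$LOAD_DATE.
write_param_file("wf_claims.par", "HMSA_DW", "wf_load_claims",
                 {"LOAD_DATE": "2012-06-30", "SRC_SYSTEM": "MEMBERSHIP"})
print(read_params("wf_claims.par"))
```

Regenerating this file per run (e.g. from a shell script before calling pmcmd) is what makes the same workflow reusable for different dates and source systems.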
Environment: Informatica Power Center 9.0.1/8.6.1, Power Exchange, Flat files, Erwin, DB2, Oracle 11g/10g, Windows NT, SQL Server 2008/2005, Toad, PL/SQL, UNIX, Autosys, SSRS.
Confidential, Woonsocket, RI Oct' 10 - Sep' 11
Confidential is an integrated pharmacy services provider, combining a United States pharmaceutical services company with a U.S. pharmacy chain. The project, SSR (Stock Status Report), is aimed at reporting the stock status of drugs. The Individual Business Data Warehouse integrates information from various data sources into one central repository, providing the organization a cross-enterprise consolidated view of the information.
- Interacted closely with Business Users in the Requirements gathering phase.
- Responsible for the definition, development and testing of the processes/programs necessary to extract data from the client's operational databases (RDBMS tables and flat files), transform and cleanse the data, and load it into the target database.
- Imported source/target tables from the respective databases and built the code using various transformations, reusable transformations and mapplets in the Designer module of Informatica.
- Extensively used various transformations such as Lookup, Update Strategy, Expression, Aggregator, Filter, Stored Procedure and Joiner.
- Used Workflow Manager for creating, validating, testing and running sessions and batches to load the data into the target database.
- Used Target Load Plan in order to execute the mappings sequentially in a pipeline structure.
- Used mapping parameters and variables.
- Partitioned sessions for concurrent loading of data into the target tables.
- Used Constraint Based Loading, partitioning, performance tuning on existing mapping.
- Developed workflows with sequential and parallel sessions.
- Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.
- Tuned the performance of Informatica sessions for large data files by increasing the block size, data cache size and sequence buffer length.
- Errors were detected and corrected using Session Log messages and Server Messages.
- Performed unit testing and tuned the Informatica mappings for better performance.
- Extensively used SQL*Loader to load data from flat files to database tables in Oracle.
- Developed ETL mapping documents for every mapping, and a data migration document, for smooth transfer of the project from the development to the testing environment and then to production.
- Attended daily production support calls to troubleshoot production issues and provided solutions.
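The target load plan mentioned above amounts to ordering targets so that parent tables load before the children that reference them, which is also the core of constraint-based loading. A rough sketch, with hypothetical table names standing in for the project's actual targets:

```python
from graphlib import TopologicalSorter

# Hypothetical foreign-key dependencies: each table lists the parent tables it
# references, so constraint-based loading must populate those parents first.
fk_parents = {
    "DIM_DRUG": [],
    "DIM_STORE": [],
    "FACT_STOCK_STATUS": ["DIM_DRUG", "DIM_STORE"],
    "AGG_STOCK_MONTHLY": ["FACT_STOCK_STATUS"],
}

# Topological order: dimensions first, then the fact, then the aggregate.
load_order = list(TopologicalSorter(fk_parents).static_order())
print(load_order)
```

In PowerCenter this ordering is configured declaratively (Target Load Plan / constraint-based loading on the session) rather than computed by hand, but the dependency reasoning is the same.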
Environment: Informatica Power Center 8.6.1, Erwin, Oracle 10g, Oracle 9i, Windows NT, Flat files, SQL, PL/SQL, SQL*Loader, Autosys, Cognos 8.0, Business Objects, UNIX, UNIX Shell Scripts.
Confidential, Cincinnati, OH July' 09 - Sept' 10
Confidential is one of the leading community banks in the US. The objective of this project was consolidated reporting: building a data mart for the Retail Customer Service department. Source systems populate the database, which is pulled to design the data mart.
Assisted in creating physical models and used Erwin for dimensional data modeling. Worked with flat files, XML and other RDBMS databases to load data.
- Involved in loading the data from Source Tables to ODS (Operational Data Store) Tables using Transformation and Cleansing Logic using Informatica.
- Worked with informatica tools to create mappings, mapplets and Reusable Transformations to load data from various sources to target. Created Mappings using Source Qualifier, Aggregator, Filter, Joiner, Sorter, Lookup, Update Strategy, Router, Sequence Generator and Stored procedure transformations.
- Developed Slowly Changing Dimensions Type 1 and Type 2 mappings. Created, configured and scheduled the sessions and batches for different mappings using Workflow Manager.
- Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.
- Created UNIX scripts to run batch jobs according to business requirements.
- Implemented partitioning and bulk loads for loading large volume of data.
- Performed Performance Tuning of sources, targets, mappings, transformations and sessions, by implementing various techniques like parameter files, variables, partitioning techniques and pushdown optimization, and also identifying performance bottlenecks.
- Reduced the load time of the daily loading process through performance tuning.
- Used Power Exchange to extract DB2 Source from mainframe server.
- Involved in scheduling the workflows using Autosys.
- Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.
- Performed unit testing to check data quality, and created test cases and detailed documentation for it.
- Used PMCMD commands to execute workflows in non-windows environments.
- Coordinated with the QA and Reporting teams, providing guidance and knowledge on the ETL process. Involved in the UAT and integration testing processes.
Environment: Informatica Power Center 8.6.1/8.1.1, Power Exchange, Oracle 10g, DB2, Flat Files, XML, Mainframe, PL/SQL, SQL server 2005, SQL*PLUS, Erwin 7.2,Cognos 8, SSRS, Unix Scripting, Windows NT, TOAD, Autosys.
Confidential, Dublin OH Oct' 08-July' 09
Confidential is a multinational healthcare company that offers a range of services and develops products to improve essential processes in the healthcare industry. The aim of the project was to create a Data Warehouse sourcing data from different departments, such as Finance, Sales and Marketing, and to provide complete analytical solutions. Decisions are based on the reports produced from the Data Warehouse.
- Translated user requirements into system requirements; responsible for business analysis and requirements gathering, creating a road map for the data warehouse design.
- Involved in Data Modeling sessions using Erwin.
- Involved in Data Analysis of the OLTP system to identify the sources for extraction of the data.
- Created and modified Oracle database tables per the design requirements, applied constraints to maintain complete referential integrity, and created indexes for performance.
- Used Informatica PowerCenter for extraction, transformation and loading (ETL) of data from heterogeneous source systems to the staging area.
- Developed mappings for loading data from multiple sources (flat files and Oracle) into the target Teradata tables.
- Created scripts using Teradata utilities (FastLoad, MultiLoad, FastExport, TPump).
- Extensively used loader utilities to load flat files into Teradata RDBMS.
- Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations, maintaining workflows.
- Created Worklets to run several sessions sequentially.
- Determined bottlenecks at various points (targets, sources, mappings, sessions, system, metadata management) and optimized performance, achieving better session performance and response times.
- Extensively worked on performance tuning, thereby decreasing the load time.
- Used various Teradata index techniques to improve query performance.
- Extensively worked with Lookup Caches like Persistent Cache, Static Cache, and Dynamic Cache to improve the performance of the lookup transformations.
- Extensively worked on UNIX shell scripts and invoked Oracle procedures and workflows via shell scripts.
- Responsible for error handling using Session Logs, Bad Files and Workflow Logs in the Workflow Monitor.
- Extensively worked with the Debugger for handling the data errors in the mapping designer.
- Involved in all the testing phases of the project to check whether the data is being processed accurately according to the user requirements and performance monitoring.
- Extensively used Control-M tool to Schedule, Execute and Monitor all ETL jobs.
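The static vs. dynamic lookup caches noted above differ in whether the cache is updated as rows pass through the session. A simplified sketch with illustrative keys and surrogate IDs (not the project's data):

```python
# Static cache: built once from the lookup table and never modified during the run.
# Dynamic cache: rows with unseen keys are inserted into the cache mid-run, so later
# rows can "see" them (useful when the lookup table is also the target being loaded).

lookup_table = {"CUST1": 101, "CUST2": 102}    # illustrative key -> surrogate id

def lookup_static(cache, key):
    return cache.get(key)                       # a miss stays a miss for the whole run

def lookup_dynamic(cache, key, next_id):
    if key not in cache:
        cache[key] = next_id[0]                 # insert the unseen key into the cache
        next_id[0] += 1
    return cache[key]

static_cache = dict(lookup_table)
dynamic_cache = dict(lookup_table)
next_id = [103]

print(lookup_static(static_cache, "CUST3"))             # None: static cache never grows
print(lookup_dynamic(dynamic_cache, "CUST3", next_id))  # 103: inserted on first sight
print(lookup_dynamic(dynamic_cache, "CUST3", next_id))  # 103 again: now a cache hit
```

A persistent cache (the third variant named above) is essentially a static cache saved to disk and reused across runs to skip the cache-build step.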
Environment: Informatica Power Center 8.1.1, Flat files, Erwin, Oracle 10g, SQL server 2005, Teradata, PL/SQL, Control-M, Business Objects 6.5, UNIX.
Confidential, Hyderabad, India Aug' 06- Sep' 08
This system helps customer service representatives deal and transact with customers' loans, credit, debit, portfolios, investments, etc. The operational data of different financial departments is loaded into a central Data Warehouse and transformed into different regional data marts. Informatica PowerCenter is used to extract into the base tables in the data warehouse, and the source databases include Oracle.
Responsibilities:
Worked on building up the database in Oracle.
- Created data structures, i.e. tables and views, and applied referential integrity.
- Worked as an administrator and assigned rights to the users, groups for accessing the database.
- Responsible for creating and modifying the PL/SQL procedure, function, triggers according to the business requirement.
- Created indexes, sequences and constraints.
- Created materialized views for summary tables for better query performance.
- Identified source system, their connectivity, related tables and fields and ensured data consistency for mapping.
- Worked closely with users, decision makers to develop the transformation logic to be used in Informatica Power Center. Converted the business rules into technical specifications for ETL process for populating fact and dimension table of data warehouse. Created mappings, transformations using Designer, and created sessions using Workflow Manager. Created staging tables to do validations against data before loading data into original fact and dimension tables.
- Involved in loading large amounts of data using utilities such as SQL*Loader.
- Designed and developed Oracle Reports for the analysis of the data.
Environment: Informatica Power Center 7.1.1, Oracle 9i/8i, Oracle Reports 6i, SQL Loader, UNIX, Windows NT.