- 7+ years of experience in data warehouse and data migration projects using the ETL tool Informatica Power Center 9.x/8.x, Power Exchange 8.6.1, Oracle, DB2, and UNIX, in industry verticals such as banking and insurance.
- Demonstrated expertise with ETL tools (Informatica 9.1/8.x), PL/SQL packages, and RDBMS platforms such as Oracle and DB2.
- Worked on all phases of the data warehouse development lifecycle, from ETL design and implementation to support of new and existing applications.
- Excellent technical and analytical skills with a clear understanding of ETL design and project architecture based on reporting requirements.
- Experienced in OLTP/OLAP system study, analysis, and ER modeling; developed database schemas such as Star and Snowflake schemas for relational databases.
- Experience in UNIX shell scripting (file validations, file movement, workflow execution).
- Experience in migration activities using Deployment groups.
- Experience in advanced Informatica concepts like pushdown optimization, Dynamic partitioning, concurrent workflow execution with different parameters.
- Experience using the Java transformation for dynamic expression logic; developed dynamic aggregation and dynamic sorting logic in Informatica.
- Experience in reading and writing XML files using XML transformations and worked on COBOL VSAM files as sources.
- Experience creating mappings that generate parameter files at run time and that validate metadata using Informatica repository tables/views.
- Working experience reading data from various sources: SAP via the SAP connector (file mode and stream mode), mainframe via IBM MQ Series, real-time CDC data via Power Exchange, and XML sources via the XML Source Qualifier.
- Experience in performance tuning of SQL and Informatica objects.
- Experience developing best practices and quality assurance standards for ETL and BI solutions; working knowledge of the reporting tool Cognos.
- Excellent problem-solving skills with a strong technical background and good interpersonal skills; quick learner and excellent team player.
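The file validation, runtime parameter-file generation, and workflow execution mentioned above are typically tied together in a shell wrapper. The sketch below is a hedged illustration, not taken from any actual project: all directories, file names, and the folder/workflow/session names are hypothetical placeholders, and the `pmcmd` call (Informatica's standard command-line client) is left commented out because it requires a live Integration Service.

```shell
#!/bin/sh
# Load-wrapper sketch: validate a feed file, build a runtime parameter file,
# trigger the workflow, then archive the feed. All names are placeholders.

SRC_DIR=/tmp/etl_demo/incoming
ARCH_DIR=/tmp/etl_demo/archive
PARM_FILE=/tmp/etl_demo/wf_daily_load.parm
mkdir -p "$SRC_DIR" "$ARCH_DIR"

# Simulate an inbound feed so the example is runnable end to end.
printf 'acct_id|balance\n101|250.00\n' > "$SRC_DIR/btr_feed.dat"

# 1. File validation: reject missing or zero-byte files before loading.
[ -s "$SRC_DIR/btr_feed.dat" ] || { echo "invalid feed file" >&2; exit 1; }

# 2. Generate the workflow parameter file at run time (escaped \$ keeps the
#    literal $$ prefix Informatica expects for workflow/session parameters).
cat > "$PARM_FILE" <<EOF
[DWH_FOLDER.WF:wf_daily_load.ST:s_m_load_accounts]
\$\$LOAD_DATE=$(date +%Y-%m-%d)
\$\$SRC_FILE=btr_feed.dat
EOF

# 3. Workflow execution via pmcmd; commented out, needs a live server.
# pmcmd startworkflow -sv INT_SVC -d DOMAIN -u PMUSER -p PMPASS \
#     -f DWH_FOLDER -paramfile "$PARM_FILE" wf_daily_load

# 4. File movement: archive the feed once the load has been triggered.
mv "$SRC_DIR/btr_feed.dat" "$ARCH_DIR/"
```

In practice a scheduler (Autosys, cron, etc.) would invoke such a wrapper, and the parameter-file section header must match the actual folder, workflow, and session names in the repository.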
ETL Tools: Informatica Power Center 9.x, 8.x
Data Modeling Tool: Erwin
Databases: Oracle 10g, DB2, SQL Server
Languages/Scripting: SQL, PL/SQL, UNIX Shell Scripting
Schemas: Star Schema, Snowflake Schema
Confidential, San Francisco, CA
Role: Sr. ETL Developer
Description: Confidential is a customer data reporting application responsible for the creation and distribution of statements. BEM passes registered-account information to the source systems via trigger files, which the source systems use to generate BTR/Payment feeds; to generate a trigger file, BEM extracts the registered-account information from a snapshot table created by Confidential.
- Involved in the complete life cycle of developing an Enterprise Data Warehouse application, including developing the ETL architecture using Informatica.
- Interacted with source-system SMEs to analyze how various business processes are tracked across the source tables.
- Designed Data warehouse target tables by using Dimensional Modeling Techniques – Star and Snowflake Schemas. Created Dimensions and Fact tables using Erwin.
- Extracted data from various sources such as flat files and DB2, then transformed and loaded it into targets using Informatica.
- Created various complex mappings using transformations such as Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
- Designed and developed initial and incremental data loads using Informatica.
- Designed and Developed several Mapplets and worklets for reusability.
- Developed sessions using different partitioning techniques, such as round-robin and hash-key partitioning, for better performance.
- Implemented audit process to ensure Data warehouse is matching with the source systems in all reporting perspectives.
- Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
- Involved in unit testing and user acceptance testing to verify that data extracted from different source systems was loaded into the targets accurately, per user requirements.
- Prepared the Standard Operating Procedure (Knowledge Transfer) document, which provides necessary information, required for the Maintenance and Operation of the application.
- Provided nightly batch-load support and implemented solutions to correct data issues raised by end users during the production support phase of the project.
Confidential, California, CA
Role: ETL Developer
Description: Developed software for a Loan Management System, which offers a wide-ranging platform for organizing and managing all stages of loan processing. Its main modules include Loan Origination, Analysis, Loan Servicing, and Push & Pull Reporting. The system development cycle comprised planning, design, delivery, validation, testing (unit, module, and system), and implementation.
- Involved in meetings with the technical architect and business users to understand business requirements.
- Involved in designing the dimensional model; implemented Star and Snowflake schemas.
- Worked in Erwin to identify fact and dimension tables by analyzing logical and physical models.
- Analyzed source files (flat files, VSAM, SQL Server) prior to extraction.
- Extracted data from various Heterogeneous sources like Flat files, SQL server, Oracle.
- Worked on Slowly Changing Dimensions Type 1 and Type 2.
- Developed complex mappings using various transformations such as Expression, Update Strategy, Router, Aggregator, Filter, Sorter, Sequence Generator, Lookup, and Joiner.
- Used the Debugger to identify errors in mappings and increased performance by applying various performance tuning techniques.
- Developed test cases based on business rules and modified mapping logic accordingly after identifying errors.
- Validated errors at various stages, during building mappings, and loading data into staging.
- Identified bottlenecks in Mappings, sessions and workflows and increased performance in Loading data.
- Optimized query performance by modifying SQL queries, removing unnecessary columns, eliminating redundant and inconsistent data, normalizing tables, establishing joins, and creating indexes where necessary.
- Used joins and other complex queries to produce results that support business decisions.
- Created database objects such as tables, indexes, views, user-defined functions, cursors, triggers, stored procedures, and constraints.
- Created stored procedures, scripts for the Bulk application, tested them on the test Servers and moved them to production.
- Involved in Unit testing of the mappings and migrated repository from 9.0 to 9.0.1.
Confidential, New York City, NY
- Involved in understanding requirements and analyzing new and current systems to quickly identify required sources and targets.
- Involved in preparing detailed ETL Design Documents.
- Involved in requirement definition and analysis in support of Data Warehouse and Data Mart efforts.
- Developed ETL mappings, transformations using Informatica Power Center 7.1.2
- Created mappings using Informatica Designer to build business rules to load data and tuned them to enhance the performance
- Extensively used ETL to load data from flat files, Oracle, and SQL Server into Oracle and SQL Server.
- Developed data Mappings between source systems and warehouse components using Mapping Designer.
- Extensively used the Informatica Debugger to verify that business rules were implemented correctly in the mappings.
- Worked extensively with different transformation types: Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Lookup, Sorter, Normalizer, Union, Stored Procedure, and Sequence Generator.
- Tested all business application rules with test and live data; automated and monitored sessions using Workflow Manager and Workflow Monitor.
- Created, launched, and scheduled workflows/sessions; involved in performance tuning of mappings and sessions.
- Involved in generating reports using Business Objects.
- Involved in fixing invalid Mappings, Testing of Informatica Sessions, Batches and the Target Data.
- Troubleshot connectivity problems; read session, event, and error logs for troubleshooting.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Design and Development of ETL (Informatica) mappings. Used various Transformations like Normalizer, Aggregator, Lookup, Expression and Filter Transformations as part of Enhancement requests.
- Preparation of Unit Test Cases and Unit testing of Informatica mappings.
- Involved in data analysis and handling the ad-hoc requests by interacting with business analysts, clients and customers to resolve the issues.
- Prepared technical design documents by converting business requirements into technical details.
- Worked extensively on ETL performance tuning and with DBAs on SQL query tuning; fine-tuned existing Informatica mappings for performance optimization.
- Tested data and data integrity among various sources and targets; used the Debugger with breakpoints to monitor data movement, and identified and fixed bugs.
- Handled daily loads and the month-end process as part of production support.
- Liaised with other team members on the following:
- Facilitating insurance coverage and payroll processing; formulating files (Carrier Feeds, HRIS Files, and Payroll Files).