
ETL Lead Resume


NY

Technical Skills

  • Proficient in Informatica
  • Proficient in Business Objects
  • Proficient in Oracle

Soft Skills

  • Strong documentation skills
  • Effective Communicator

Teamwork

  • Believes in synergy
  • Strong work ethic

Thank you for your time. Looking forward to working with your organization.

Professional Summary
  • Seven years of experience in analysis, design, development, and maintenance of various software applications, with an emphasis on data warehousing.
  • Seven years of Data Warehousing ETL experience using Informatica Power Center 9.x/8.x/7.x.
  • Experience in full-lifecycle implementation of data warehouses and business data marts with Star and Snowflake schemas.
  • Relational and Dimensional Data Modeling experience.
  • Extensive experience with data warehousing tools including Informatica, Business Objects, and Cognos against different databases.
  • Extensively worked on data warehousing with data marts, ODS, data modeling, data cleansing, the ETL tool Informatica, and the reporting tool Business Objects.
  • Experience in working with data extraction, transformation and loading using Informatica Power Center and Power Mart.
  • Experience working with Informatica client tools such as Server Manager, Repository Manager, and Designer.
  • Extensive experience in design and implementation of star and snowflake schemas used in multidimensional modeling.
  • Experience in integrating various data sources such as Oracle and flat files into the staging area.
  • Effective both on teams and working independently, with excellent organizational and interpersonal skills.
  • Excellent written and oral communication skills.
Work Experience

Confidential,
NY JAN 2012 – Present
ETL Lead

MVP is a major health care provider in New York. MVP merged with Preferred Care, and to stay efficient and competitive in the market, management's strategic plan was to merge the two organizations' data into a new EDW, referred to as FRDM. Amisys and Facets are the two source systems; data moves from them to stage, then to the ODS, the EDW, and finally the mart.
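
As a rough illustration of one hop in that stage-to-ODS-to-EDW flow, a set-based load from a staging table into the ODS might look like the sketch below; the table and column names (STG_CLAIM, ODS_CLAIM, and their columns) are illustrative assumptions, not the actual FRDM objects.

    -- Hypothetical sketch of a stage-to-ODS hop in the stage -> ODS -> EDW -> mart flow.
    -- Table and column names are illustrative only, not the actual FRDM objects.
    INSERT INTO ods_claim (claim_id, member_id, source_system, claim_amt, load_dt)
    SELECT s.claim_id,
           s.member_id,
           s.source_system,                -- e.g. 'AMISYS' or 'FACETS'
           s.claim_amt,
           SYSDATE
    FROM   stg_claim s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   ods_claim o
                       WHERE  o.claim_id      = s.claim_id
                       AND    o.source_system = s.source_system);
    COMMIT;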

Responsibilities:

  • Building Informatica Mapplets
  • Testing Informatica Mapplets
  • Preparing paperwork to promote Mapplets through the physical environments
  • Maintaining confidentiality and adhering to regulatory compliance requirements
  • Created Informatica mappings to send data to federal systems
  • Used Oracle tables, flat files, SQL Server tables, and XML as sources

Environment: Informatica Power Center 9.0, Cognos, Oracle 11g, SQL Server 2005, Control-M

Confidential,
OH JUL 2011 – JAN 2012
Data Warehouse ETL Lead

Responsible for the development, design, and implementation of the HCD Appliance for CMCD Dallas. Actively involved in the ETL mappings and file creation.

Responsibilities:

  • Worked on the ETL Detailed Design document, which serves as a guideline for ETL coding.
  • Supported reduction of data redundancy and fragmentation, elimination of unnecessary movement of data, and improvement of data quality.
  • Participated in and facilitated functional and technical designs of systems to ensure sound decisions are made with respect to current and future business strategies and opportunities.
  • Created guidelines and standards for the ETL environment.
  • Developed ETL code using Informatica.
  • Used Oracle tables, flat files, SQL Server tables, and XML as sources.

Environment: Informatica Power Center 9.1, DB2, SQL Server, Oracle 10g/9i, Sun Solaris UNIX, Business Objects, Autosys Scheduler, HP Quality Center, TOAD 9.1.0.62

Confidential,
TN SEP 2008 – JUL 2011
Data warehouse Developer

Responsible for the development, design, and implementation of a Managed Care Data Mart integrating both the RightStart and Ultracare programs. Actively involved in the architecture of the data flows. Designed reusable concepts for various extracts, audit trailing, and the reconciliation process.

Responsibilities:

  • Worked on the ETL Detailed Design document, which serves as a guideline for ETL coding.
  • Created new Informatica mappings and modified existing mappings for the Managed Care project
  • Involved in designing an error-handling strategy for data validation and error reporting.
  • Worked on the Informatica Power Center 8.1 tools - Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, and Reusable Transformations.
  • Populated data into staging tables and the MCDM schema from the Oracle source systems Proton and AMI, and applied business logic in transformation mappings for inserting and updating records.
  • Applied business logic in populating the necessary clinical facts and dimensions and ensured the bi-directional data was valid.
  • Using Informatica Designer, designed mappings that populated the data into the target Star Schema on an Oracle 10g instance.
  • Performed data manipulations using various Informatica Transformations like Joiner, Expression, Lookup, Aggregate, Filter, Update Strategy, Router, Normalizer, etc.
  • Conducted SQL testing of DB sources for insert and update timestamps, counts, data definitions, and control and error logic for fact and reference tables (see the count and timestamp sketch after this list).
  • Involved in performance tuning of mappings and SQL statements using query optimization and Explain Plan utilities for optimum performance, and used the Informatica Debugger.
  • Used Oracle tables, flat files, SQL Server tables, and XML as sources.
  • Used database objects such as materialized views, sequences, triggers, cursors, parallel partitioning, and stored procedures to handle complex logic and memory management.
  • Applied appropriate field-level validations, such as date validations and default values, for cleansing the data.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Prepared unit test plans and performed negative and positive testing.
  • Implemented the Slowly Changing Dimension strategy for the warehouse (a plain SQL sketch follows this list).
  • Converted heavy SQL overrides in existing mappings of the QSR and CMS projects to transformation objects
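
A minimal sketch of the kind of count and timestamp check behind that SQL testing, assuming a staging table and target dimension named STG_MEMBER and MCDM.DIM_MEMBER; these names and the audit columns are illustrative, not the project's actual objects.

    -- Hypothetical row-count reconciliation between a staging table and its target.
    SELECT (SELECT COUNT(*) FROM stg_member)      AS src_cnt,
           (SELECT COUNT(*) FROM mcdm.dim_member) AS tgt_cnt,
           (SELECT COUNT(*) FROM stg_member)
         - (SELECT COUNT(*) FROM mcdm.dim_member) AS diff
    FROM   dual;

    -- Spot-check that insert/update audit timestamps are internally consistent.
    SELECT COUNT(*) AS rows_outside_window
    FROM   mcdm.dim_member
    WHERE  insert_ts > update_ts
       OR  update_ts > SYSDATE;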
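
And a plain SQL sketch of a Type 2 Slowly Changing Dimension load, shown here instead of the actual Informatica Update Strategy mappings; DIM_PROVIDER, STG_PROVIDER, and their columns are assumed names for illustration only.

    -- Hypothetical Type 2 SCD load: expire changed current rows, then insert new versions.
    UPDATE dim_provider d
    SET    d.current_flag     = 'N',
           d.effective_end_dt = TRUNC(SYSDATE) - 1
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_provider s
                   WHERE  s.provider_id = d.provider_id
                   AND    (s.provider_name <> d.provider_name
                           OR s.specialty_cd <> d.specialty_cd));

    INSERT INTO dim_provider
           (provider_key, provider_id, provider_name, specialty_cd,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_provider_seq.NEXTVAL, s.provider_id, s.provider_name, s.specialty_cd,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   stg_provider s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_provider d
                       WHERE  d.provider_id  = s.provider_id
                       AND    d.current_flag = 'Y');
    COMMIT;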

Environment: Informatica Power Center 8.6.1, Business Objects, DB2, SQL Server, Oracle 10g/9i, HP-UX, Autosys Scheduler, HP Quality Center, TOAD 9.1.0.62

Confidential,
OH JAN 2008 – AUG 2008
Data warehouse Developer

Responsible for the development, design, and implementation of a Nationwide Bank Data Mart. Actively involved in the architecture of the data flows. Designed reusable concepts for various extracts, audit trailing, and the reconciliation process.

Responsibilities

  • Performed data gap analysis from source systems to target elements, and also from target elements to reporting elements.
  • Performed Source to Target Analysis to capture transformation rules/business validation rules.
  • Developed Common Information Model (CIM) data model.
  • Extensively worked with parameters and parameter files.
  • Developed ETL code using Informatica.
  • Designed ETL Workflow for different source systems.
  • Provided impact analysis on various source tables.
  • Implemented End-to-End Auditing and Reconciliation of records (sketched below).
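
A minimal, hypothetical sketch of the audit record written at the end of a load for that reconciliation; the ETL_AUDIT table, its sequence, and the STG_ACCOUNT/DW_ACCOUNT names are assumptions for illustration, not the project's actual objects.

    -- Hypothetical end-of-load audit entry comparing source and target row counts.
    INSERT INTO etl_audit
           (audit_id, workflow_name, load_dt, src_row_cnt, tgt_row_cnt, status)
    SELECT etl_audit_seq.NEXTVAL,
           'wf_load_accounts',
           SYSDATE,
           c.src_cnt,
           c.tgt_cnt,
           CASE WHEN c.src_cnt = c.tgt_cnt THEN 'BALANCED' ELSE 'OUT_OF_BALANCE' END
    FROM  (SELECT (SELECT COUNT(*) FROM stg_account) AS src_cnt,
                  (SELECT COUNT(*) FROM dw_account)  AS tgt_cnt
           FROM   dual) c;
    COMMIT;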

Environment: Informatica Power Center 7.1, Oracle 9i, MS SQL Server 9, Sun OS, Tivoli Workload Scheduler, Harvest Version Control Manager, HP Quality Center.

Confidential,
OH OCT 2007 – DEC 2007
Data warehouse Developer

Responsible for the development, design, and implementation of the KMV_RiskFrontier Integration application. This application consolidated investments and accounting data from legacy systems residing on multiple platforms and in multiple data types into an Oracle-based, third-party purchased investments analytical system, and then staged and loaded the data into the KMV Data Mart. The KMV Data Mart was used for reporting to calculate risk frontier.

Responsibilities

  • Performed data gap analysis from source systems to target elements and also from target elements to reporting elements.
  • Performed source-to-target analysis to capture transformation rules and business validation rules.
  • Created the data mappings from source to target.
  • Developed the Common Information Model (CIM) data model.
  • Developed ETL code.
  • Designed ETL Workflows for different source systems.
  • Performed impact analysis on various source tables.
  • Implemented End-to-End Auditing and Reconciliation of records

Environment: Informatica Power Center 7.1.4, Oracle 9i, MS SQL Server 9, Sun OS 5.8, Tivoli Workload Scheduler, Harvest Version Control Manager, HP Quality Center.

Confidential,
OH JAN 2007 – SEP 2007
Data warehouse Developer

The Institutional line of business retrieves information from a variety of sources and in a variety of formats. Some of the information is readily available and easy to obtain, but the majority of the information required to support the business is difficult to obtain and requires multiple data sources. Therefore, the overall objective of this program is to create a single repository, IHUB, for Institutional information that contains current, verified, accurate, and complete data that is easy to retrieve and manipulate and is available for use by other applications, processes, and users.

Responsibilities

  • Implemented logic to cleanse initial report source data and populate it into a staging layer (see the sketch after this list).
  • Informatica ETL process development
  • Developed Informatica PowerCenter mappings using Informatica version 8.1.1
  • Built Java transformations within Informatica 8.1.1 for improved performance
  • Migrated Funds Management ETL workflows and mappings from Informatica version 7.x to version 8.x
  • Defect resolution during unit testing.
  • Designed reusable functions and transformations.
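
A rough sketch of the kind of cleansing rules applied before the staging load, expressed as SQL rather than as the actual Informatica expression logic; RPT_SRC_RAW, STG_REPORT, and the specific rules are assumptions for illustration.

    -- Hypothetical cleansing pass from a raw report feed into a staging table.
    INSERT INTO stg_report (report_id, fund_cd, report_dt, amount)
    SELECT TRIM(r.report_id),
           NVL(UPPER(TRIM(r.fund_cd)), 'UNK'),            -- default for missing fund codes
           TO_DATE(r.report_dt, 'YYYYMMDD'),              -- dates assumed to arrive as YYYYMMDD text
           NVL(TO_NUMBER(REPLACE(r.amount, ',', '')), 0)  -- strip thousands separators, default nulls to 0
    FROM   rpt_src_raw r
    WHERE  r.report_id IS NOT NULL;
    COMMIT;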

Environment: Informatica Power Center 8.x and 7.x, Oracle 10g, Sybase Adaptive Server Enterprise 12.5.4, Sun OS 5.10, Control-M.

Confidential,
Detroit, MI MAR 2006 – DEC 2006
Data warehouse Developer

Blue Cross and Blue Shield of Michigan is one of the biggest health insurance companies, providing its customers with both group and individual coverage. The Blue Health Intelligence (BHI) warehouse is a centralized warehouse that maintains all claims-related, pharmacy-related, and member-related data. The BHI data warehouse is an Oracle-based warehouse which holds information related to member coverage, claims processed for that member, and how payments have been adjusted for those claims.

Responsibilities

  • Involved in Analyzing Business Requirements
  • Involved in database modeling based on the client requirements
  • Designed the ETL Mapping Documents for Informatica Code development
  • Performed data warehouse design for the corporate repository
  • Designed and developed the ETL process using Informatica tools such as Power Center, Power Exchange, and Power Channel.
  • Worked with Informatica 7.2 in developing mapplets, mappings, sessions, worklets, reusable components, and workflows
  • Created and scheduled load processes for Informatica as per business requirements
  • Developed Reusable Transformations & Mapplets and used them in the Mappings
  • Created tasks and workflows and monitored the sessions in Workflow Manager.
  • Extensively used almost all of the transformations of Informatica such as the Source qualifier, Aggregators, Connected & unconnected lookups, Filters & Routers, Update Strategy.
  • Implemented Change Requests (CRs) as per changed business requirements
  • Installed Informatica PowerCenter and Power Exchange.
  • Involved in tuning mappings and database objects (see the Explain Plan sketch after this list)
  • Created UNIX scripts for Merge, Archiving and Validation for ETL Loads
  • Extensively used Power Exchange data maps for pulling data from mainframe databases
  • Worked with Reformat and Discovery (a corporate data analysis tool) for analyzing the data
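
As a minimal sketch of how one such query might be checked during that tuning, assuming Oracle's Explain Plan facility and hypothetical CLAIM_FACT / MEMBER_DIM tables (not the actual BHI objects):

    -- Hypothetical tuning check: generate and display the execution plan for a lookup query.
    EXPLAIN PLAN FOR
    SELECT f.claim_id, m.member_nbr
    FROM   claim_fact f
    JOIN   member_dim m ON m.member_key = f.member_key
    WHERE  f.service_dt >= ADD_MONTHS(TRUNC(SYSDATE), -1);

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);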

Environment: Informatica Power Center 7.1.2 on HP-UX 11.0, SQL Server 2000, Data Junction 7.5, Toad, Erwin 4.0, Business Objects 5.0 Designer, SunOS 5.9.

Confidential,
Austin, TX JUN 2005 – JAN 2006
Data warehouse Developer

The scope of the NRD Reporting system is to provide consolidated, clean placement and reference data within a robust OLAP framework that includes advanced analytics and functional and drill-down reporting capabilities, enforces role-based security, and has the flexibility and scalability to meet the future needs of the business. NRD consolidates numerous data sources by extracting, transforming, and loading (ETL) placement and reference data into an NRD data mart. The scheduled and ad hoc reports are built using Cognos Impromptu.

Responsibilities

  • Involved in the Analysis of Physical Data Model for ETL mapping and the process flow diagrams.
  • Created mappings to extract data from data sources such as Oracle 9i and SQL Server.
  • Used Informatica Repository Manager to grant permissions to users and to create new users and repositories
  • Extensively used mapping parameters, variables, and parameter files in complex ETL mappings
  • Worked extensively with complex mappings using expressions, aggregators, filters, lookups, and procedures to develop and feed the data warehouse
  • Accessed mainframe data through Power Exchange and generated files to upload to the reporting data mart
  • Used SQL and PL/SQL for the stored procedures, packages, and function overrides required for the NRD ETL process (a minimal PL/SQL sketch follows this list)
  • Created the Worklets, Tasks, Timers, Reusable objects for scheduling the Workflows
  • Involved in performance tuning of the complex Informatica mapping for extracting & loading the data from GBT and resolving the production issues for Phase 1A
  • Created SQL*Loader scripts and used SQL*Loader for the GBT cleanup files and for data validation
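
A minimal, hypothetical sketch of the kind of PL/SQL function override used in such a process; the function name and the placement-status rule are illustrative assumptions, not the actual NRD code.

    -- Hypothetical PL/SQL helper of the kind invoked from the ETL process.
    CREATE OR REPLACE FUNCTION fn_placement_status (p_close_dt IN DATE)
      RETURN VARCHAR2
    IS
    BEGIN
      IF p_close_dt IS NULL THEN
        RETURN 'OPEN';                -- no close date recorded yet
      ELSIF p_close_dt > TRUNC(SYSDATE) THEN
        RETURN 'PENDING';             -- close date is in the future
      ELSE
        RETURN 'CLOSED';
      END IF;
    END fn_placement_status;
    /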

Environment: Informatica PowerCenter 6.2, Informatica Power Connect, TOAD, Erwin 3.5.2, Oracle 9i/8i, SQL Server 2000, Win NT, Sun Solaris 5.8, Cognos 6.6, Unix AIX 5.1/4.3.

Education: MS in Electrical and Computer Engineering
