
ETL Consultant Resume


Rhode Island

SUMMARY:

  • 12+ years of professional Information Technology experience, with expertise in Enterprise Data Warehousing, Data Integration, and Data/Code Migration.
  • Strong experience in Operational Data management and Data quality reporting to Business and Technical teams for Risk Management.
  • Gathered data and provided Agile metrics; experienced in coaching fellow Scrum Masters.
  • Used Erwin Model Mart for effective model management, enabling sharing, dividing, and reusing of model information and designs to improve productivity.
  • 4+ years of experience in Data Quality testing, support, and management.
  • Hands-on experience with Oracle 11g and Informatica IDQ Analyst.
  • Hands-on experience in writing and executing SQL queries.
  • Expertise in Data Quality validation, Data Profiling, and Data analysis in an Enterprise Data Warehouse environment.
  • Strong exposure to Data Warehouse concepts such as Star schema and Snowflake schema.
  • Strong ability to execute multiple Data Quality and Data Governance projects simultaneously and deliver within timelines.
  • 10+ years of experience using Informatica PowerCenter ETL v9.5.1/8.6.1/8.1.1/7.1.3/6.2/5.1 and Informatica PowerExchange 8 with industry best practices, and experience with reporting tools such as Business Objects.
  • Solid understanding of Ralph Kimball's and Bill Inmon's methodologies.
  • Experience in dimensional modeling (Star Schema & Snow Flake Schema).
  • Expert on various UNIX (AIX, SUN, HP-UX and SVR) systems and scheduling tools like AUTOSYS and CONTROL-M.
  • Good knowledge of Oracle RDBMS v10g/9i/8i with proficiency in writing SQL queries and PL/SQL procedures.
  • Experience in writing ksh UNIX scripts.
  • Good business knowledge in Auto, Healthcare, Banking, Pharmacy, Insurance & Manufacturing
  • Team player, self-starter with excellent communication and interpersonal skills.

TECHNICAL SKILLS:

DW & BI: Teradata V13.10, Informatica PowerCenter 9.5.1 IDQ, 8/7.x/6.x/5.x, OLAP, OLTP

Databases: Oracle 9i/8.x/7.x, MS SQL Server, MS Access, DB2, Teradata V2R5/V2R6

Programming: C, C++, UNIX Shell Scripting, PL/SQL

Environment: UNIX, Windows, Mainframes

Other Tools: WinRunner, Test Director, Quality Center, QTP, Toad 8.5, SQL Navigator, Control-M, AUTOSYS, Tivoli, VMware, ALM

Web Servers: IIS 6.0, Apache 1.3.x, Sun ONE Web Server, Tomcat.

PROFESSIONAL EXPERIENCE:

Confidential, Rhode Island

ETL Consultant

Responsibilities:

  • Strong knowledge of Informatica IDQ transformations and the PowerCenter tool.
  • Experience in monitoring and tracking the Data quality for enterprise data elements to provide faster decision making and analytics.
  • Created profiles and grouped them into categories.
  • Executed profiles, fetched metrics, and identified exceptions.
  • Responsible for Unit Test Execution (Component and Assemble testing).
  • Worked on mapping specification to build the Business rules.
  • Provided support to Testing Teams during SIT and UAT execution.
  • Implemented code re-usability by creating re-usable transformations.
  • Implemented Change Data Capture extensively to send only the delta records to the target systems (a representative extraction query is sketched after this list).
  • Experience in creating Migration documents & Informatica documents.
  • Extensively used ETL to load data using Power Center from source system into staging tables and to load the data into the target database.
  • Created PL/SQL Scripts and Stored Procedures for data transformation on the data warehouse.
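
The delta-only (Change Data Capture) extraction noted above can be illustrated with a small SQL sketch. The table and column names (CUSTOMER, STG_CUSTOMER, ETL_BATCH_CONTROL, LAST_UPDATE_TS) are hypothetical placeholders, not objects from the actual project; in practice the logic was built in PowerCenter mappings.

    -- Hypothetical CDC pull: stage only the rows changed since the last successful extract.
    INSERT INTO stg_customer (customer_id, customer_name, status, last_update_ts)
    SELECT c.customer_id,
           c.customer_name,
           c.status,
           c.last_update_ts
    FROM   customer c
    WHERE  c.last_update_ts > (SELECT MAX(b.last_extract_ts)
                               FROM   etl_batch_control b
                               WHERE  b.subject_area = 'CUSTOMER');

After a successful load, the control table's high-water mark would be advanced so the next run picks up only newer changes.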

Confidential, Herndon, VA

Sr. Informatica ETL Developer

Responsibilities:

  • Analyzed and fine-tuned the Match and Merge strategy based on customer requirements and the data; created a Match and consolidation strategy for complex Product data for Mortgage.
  • Data Profiling and Data Quality Analysis of the data given by customer to find out anomalies in the data and highlighting the effectiveness of the Informatica Products.
  • Worked on scorecards; worked with SAs and Data Modelers to understand the functionality.
  • Performed data analysis and built data transformation and data mapping documents.
  • Involved in discussions / facilitate sessions with technical developers, data modelers and business SMEs to provide solution.
  • Built Informatica Workflows using Informatica client Tools such as Repository Manager, Developer, Workflow Manager and Workflow Monitor.
  • Worked with the DQ Architect to understand the current state of the data.
  • Data Profiling, Data Cleansing, Data Standardization, and Data De-Duplication using Informatica Data Quality (a simplified SQL profiling example follows this list).
  • Generated and maintained IDQ workflows for defined rules.
  • Implemented word pattern changes, developed matching routines, and performed configuration management using Informatica DQ.
  • Worked on preparing test cases for component testing.
  • Responsible for Unit Test Execution (Component and Assemble testing).
  • Involved in Test Script and Test Data Preparation for Component Testing and Assemble Testing.
  • Provided support to Testing Teams during SIT and UAT execution.
  • Worked on fixing code defects and environment issues during Dev, SIT, and UAT.
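
As a rough illustration of the checks behind the scorecards mentioned above, the query below computes completeness and uniqueness counts for a single column. The PRODUCT table and PRODUCT_CODE column are hypothetical; the actual profiles and scorecards were built in Informatica Data Quality.

    -- Hypothetical column profile: completeness and uniqueness of PRODUCT_CODE.
    SELECT COUNT(*)                             AS total_rows,
           COUNT(*) - COUNT(product_code)       AS null_product_codes,
           COUNT(DISTINCT product_code)         AS distinct_product_codes,
           COUNT(product_code)
             - COUNT(DISTINCT product_code)     AS duplicate_product_codes
    FROM   product;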

Confidential, Bloomfield, CT

ETL Consultant

Responsibilities:

  • Worked with systems analysts to understand source system data to develop accurate ETL programs.
  • Experience in data analysis (field validation) for source, staging, and DW.
  • Increased performance through optimal use of database resources.
  • Used the Pushdown Optimization option to enhance IT agility and development productivity.
  • Monitored the load-control-table-based technique used to incrementally load data from the EDW into the data mart.
  • Writing and executing SQL queries for Data validation as per business rules.
  • Worked on data analysis and review for data quality failures.
  • Created EDW dimension table load programs as Slowly Changing Dimensions (SCD) with Type II logic (a simplified SQL equivalent is sketched after this list).
  • Implemented code re-usability by creating re-usable transformations.
  • Implemented the Change Data Capture extensively to send only the delta records to the target systems.
  • Loaded facts and dimensions from source to the target data mart.
  • Resolved memory related bottleneck issues like DTM buffer size, cache size to optimize workflow/session runs.
  • Performance tuning through SQL tuning, indexing, and Informatica mapping- and session-level optimization.
  • Experience in creating Migration documents & Informatica documents.
  • 24x7 On-call production support.
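
A simplified SQL equivalent of the SCD Type II dimension loads referenced above, assuming a hypothetical DIM_CUSTOMER dimension fed from an STG_CUSTOMER staging table and a DIM_CUSTOMER_SEQ sequence; the production implementation was done in Informatica mappings, so this is only a sketch of the expire-then-insert pattern.

    -- Step 1: expire the current dimension row when tracked attributes have changed.
    UPDATE dim_customer d
    SET    d.current_flag     = 'N',
           d.effective_end_dt = TRUNC(SYSDATE) - 1
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND   (s.customer_name <> d.customer_name OR s.status <> d.status));

    -- Step 2: insert a new current row for changed and brand-new customers.
    INSERT INTO dim_customer (customer_key, customer_id, customer_name, status,
                              effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.status,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id  = s.customer_id
                       AND    d.current_flag = 'Y');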

Environment: ALM, Teradata V13.10, Informatica PowerCenter 8.6.1, UNIX, Oracle 10g, Toad 9.5, PowerExchange 8, Microsoft Office, Windows 7.

Confidential, Hartford, CT

ETL Consultant

Responsibilities:

  • Experience in validating data quality & business rules by using Mapping document and FSD to maintain the data integrity.
  • Responsible for ensuring data validation is completed on time and meets all business requirements.
  • Experience in writing SQL test cases for Data Quality validation (representative examples follow this list).
  • Experience in various data validation and data analysis activities to perform data quality testing.
  • Participated actively in user meetings and collected requirements from users.
  • Used Informatica PowerCenter 8.1 for extraction, transformation and loading (ETL) of data in the Data Warehouse, following the Velocity methodology.
  • Ensure oversight and access management for the IDQ Repository.
  • Responsible for technical implementation of all Integration & IDQ routines.
  • Performed ETL into the Oracle Data Warehouse using Informatica mappings with Aggregator, Joiner, Lookup (connected and unconnected), Filter, Update Strategy, Stored Procedure, Router, Expression, SQL, and Sorter transformations.
  • Involved in design and development of ad-hoc data loads and business-critical code enhancements.
  • Created workflows and tested mappings and workflows in development, test and Production environment
  • Actively involved in Performance improvements of Mapping and Sessions and fine-tuned all transformations
  • Performed Informatica code Migration from Development/ QA/ Production and fixed and solved mapping and workflow problems
  • Developed and maintained optimized SQL queries in the Data Warehouse
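
Representative SQL test cases of the kind referenced above, using hypothetical SRC_POLICY and DW_POLICY_FCT tables; the real tests were driven by the mapping document and FSD for each subject area.

    -- Test 1 (hypothetical tables): source-to-target row count reconciliation for one load date.
    SELECT (SELECT COUNT(*) FROM src_policy    WHERE load_dt = DATE '2016-01-31') AS src_rows,
           (SELECT COUNT(*) FROM dw_policy_fct WHERE load_dt = DATE '2016-01-31') AS tgt_rows
    FROM   dual;

    -- Test 2: business rule check, POLICY_STATUS must never be NULL in the warehouse.
    SELECT COUNT(*) AS rule_violations
    FROM   dw_policy_fct
    WHERE  policy_status IS NULL;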

Environment: Informatica 8.1.1, UNIX Scripting, Toad, Oracle 11g, SQL, PL/SQL, Cognos 10.1.

Confidential

Informatica Developer

Responsibilities:

  • Studied the existing environment and gathered requirements by querying clients on various aspects.
  • Collecting new requirements from client.
  • Meeting with user groups to analyze requirements, proposed changes in design and specifications by the reports development group.
  • Prepared user requirement documentation for mapping and additional functionality.
  • Developed and implemented various enhancements to the application in the form of production and new production rollouts.
  • Analyzed the requirements and translated them into Informatica mappings.
  • Used Repository Manager to create user groups and users, and managed users by setting up their privileges.
  • Extensively used ETL to load data using Power Center from source system into staging tables and to load the data into the target database.
  • Worked on Power Center Designer client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
  • Created PL/SQL Scripts and Stored Procedures for data transformation on the data warehouse.
  • Testing Informatica Mappings for data accuracy.
  • Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.
  • Created, updated, ran, and scheduled batches and sessions.
  • Developed PL/SQL stored procedures for pre- and post-session commands (simplified examples appear after this list).
  • Analyzing specifications for various applications enhancements for compatibility within the system.
  • Involved in various discussions with end user.
  • Monitoring Data Loads.
  • Unit testing.
  • Designed and developed UNIX shell scripts.
  • On call Production Support.
  • Tuned performance of Informatica session for large volumes of data by increasing data cache size and target based commit interval.
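
Simplified examples of the pre- and post-session stored procedures mentioned above; the staging table STG_ORDERS and audit table ETL_SESSION_AUDIT are hypothetical stand-ins for the project's actual objects.

    -- Hypothetical pre-session procedure: clear the staging table before the load.
    CREATE OR REPLACE PROCEDURE pre_session_truncate_stage AS
    BEGIN
      EXECUTE IMMEDIATE 'TRUNCATE TABLE stg_orders';
    END pre_session_truncate_stage;
    /

    -- Hypothetical post-session procedure: record the session outcome in an audit table.
    CREATE OR REPLACE PROCEDURE post_session_audit (p_session_name IN VARCHAR2,
                                                    p_rows_loaded  IN NUMBER) AS
    BEGIN
      INSERT INTO etl_session_audit (session_name, rows_loaded, load_ts)
      VALUES (p_session_name, p_rows_loaded, SYSTIMESTAMP);
      COMMIT;
    END post_session_audit;
    /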

Environment: Informatica PowerCenter 7.1.3, UNIX Scripting, Oracle 10g, PL/SQL, DB2, Autosys.

Confidential, 60 Wall Street, New York

QA Analyst

Responsibilities:

  • Participated in business meetings and Involved in design and development and identifying data sources and targets
  • Translated business processes into Informatica mappings for building Data Marts using Informatica Designer, which populated the data into the target Star Schema on Oracle 9i.
  • Created business documents for existing Mappings and created INPUT, OUTPUT documents
  • Translated business processes into mappings for building Data Ware house by using Informatica 7.1.3
  • Created workflows for mappings and tested mappings and workflows
  • Extensively used transformations such as Filter, Router, Joiner, Lookup (connected and unconnected), Aggregator, Expression, Normalizer, and Update Strategy.
  • Scheduled code reviews with the team lead; used UNIX shell scripts for scheduling to automate the load process, and extensively used pmcmd to schedule and execute workflows during testing.
  • Involved in the upgrade project from Informatica 5.1.1 to Informatica 7.1.3 across Development, Test, and Production environments; actively involved in performance improvements and enhancements.
  • Actively involved in design, development of CAREMARK (Mail Order pharmacy/On-line pharmacy) project and created Mappings and Workflows
  • Involved in testing of CAREMARK mappings and workflows and performance tuning of mappings and sessions
  • Used Autosys to schedule sessions for the CAREMARK application.
  • Used debugger to test data flow from Source to Target and fix problems and used test load options
  • Wrote a stored procedure, tested it, and used it in a pre-session task to validate pharmacy names.
  • Extensively worked in the performance tuning of the mappings, ETL Procedures and processes.

Environment: Informatica 7.1.3/5.1.1, UNIX (Sun Solaris 8), Autosys, Oracle 9i, PL/SQL, COBOL, DB2, UNIX Scripting, TOAD 7.6, Cognos 7.x.

Confidential, Orlando, FL

Test Analyst

Responsibilities:

  • Extraction, Transformation and Load was performed using Informatica Power Center to build the Data Warehouse
  • Creation of Transformations like Sequence generator, Lookup, joiner and Update Strategy transformations in Informatica designer
  • Involved in performance tuning of the Informatica mapping using various components like Parameter files, variables and Dynamic Cache
  • Wrote PL/SQL stored procedures for dropping and re-creating indexes for efficient data loads (a simplified drop-and-rebuild sketch follows this list).
  • Worked on Data Modeling by involving in the design of the Subscription Data Mart, Staging Customer & Contact Model
  • Extensively used PL/SQL to load data and implement business rules before transforming it for the target database; after cleansing, used Informatica to extract, transform, and load the data from MS SQL Server and flat files into the target.
  • Tested all the applications and transported the data to the target Warehouse Oracle tables, schedule and run extraction and load process and monitor sessions and batches by using Informatica Workflow Manager
  • Created Migration Documentation and Process Flow for mappings and sessions
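
A minimal sketch of the drop-and-recreate-index approach used around bulk loads, as noted above; the index and table names (IDX_SUBSCRIPTION_CUST, SUBSCRIPTION_FCT) are illustrative only.

    -- Hypothetical: drop the index before a bulk load, then recreate it afterwards.
    CREATE OR REPLACE PROCEDURE drop_subscription_index AS
    BEGIN
      EXECUTE IMMEDIATE 'DROP INDEX idx_subscription_cust';
    END drop_subscription_index;
    /

    CREATE OR REPLACE PROCEDURE rebuild_subscription_index AS
    BEGIN
      EXECUTE IMMEDIATE
        'CREATE INDEX idx_subscription_cust ON subscription_fct (customer_key)';
    END rebuild_subscription_index;
    /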

Environment: Informatica PowerCenter 5.1, Erwin 4.1, Oracle 9i, PL/SQL, SQL Server 2005, DB2, XML, Microsoft Reporting Services, Test Director, UNIX, UNIX Shell Script, VSS, Itemfield Content Master Data Transformation Tool.

Confidential

QA Analyst

Responsibilities:

  • Creation of test data request to cover all the scenarios identified.
  • Extraction of data from source and validation in each of the ETL layers.
  • Ensured that the data required for testing was present or created in all the source systems.
  • Debugging of issues and identifying the root cause of the issue and working with the development team on its resolution
  • Responsible for the System Integration testing involving the billing, PeopleSoft, CDR data
  • Debugging of the jobs failed and analyzing the reason for the failure in the Data Stage Designer

Environment: Oracle, MS Word, MS Excel, Test Director.
