ETL/IDQ Tech Lead Resume
St. Louis, MO
PROFESSIONAL SUMMARY:
- 11+ years of IT experience in data warehousing using the Informatica ETL tool, with strong analytical skills and a consistent record of meeting deadlines. Enjoy sharing technical expertise and coordinating projects with team members.
- Experience in the requirement gathering and analysis, system design, and development phases of the Software Development Life Cycle (SDLC) under both Agile and Waterfall methodologies.
- 8+ years of strong data warehousing experience using Informatica PowerCenter/PowerMart versions 9.x/8.x/7.x.
- Extensively worked on Informatica tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, and Workflow Monitor.
- Experienced in designing and developing complex mappings using varied transformation logic such as Connected and Unconnected Lookup, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy transformations.
- Proficient in using Informatica Workflow Manager, Workflow Monitor, and pmcmd (the Informatica command-line utility) to create, schedule, and control workflows, tasks, and sessions.
- Experience in integrating various data sources such as Oracle, SQL Server, Teradata, Vertica, flat files, XML files, and DB2 into the data warehouse.
- Experience using Teradata SQL Assistant, Teradata Administrator, and data load/export utilities such as BTEQ, FastLoad, MultiLoad, and FastExport, with exposure to TPump, in UNIX/Windows environments, including running Teradata batch processes (see the BTEQ sketch following this summary).
- Experience developing data integration programs involving Teradata, Informatica, shell programming, and ETL scheduling tools in UNIX environments.
- Experienced in troubleshooting Teradata scripts, fixing bugs and addressing production issues.
- Worked as a Teradata and ETL developer, ensuring successful data warehouse implementations.
- Experienced in data cleansing, data profiling, and data analysis.
- Hands-on experience managing Slowly Changing Dimensions (SCD) Types 1, 2, and 3, as well as data cleansing, profiling, conversion, aggregation, performance optimization, and setting up Change Data Capture (CDC) mechanisms.
- Strong working experience in Data Warehousing concepts, Star Schema and Snowflake Schema methodologies.
- Expert in debugging, troubleshooting, monitoring, and performance tuning.
- Good knowledge and understanding of the scheduling tools TWS and Autosys.
- Experience in creating physical/logical data models, reverse engineering from RDBMS, forward engineering to create schemas, and complete compare using CA AllFusion ERwin 4.x.
- Experienced in coding using SQL.
- Experience in shell scripting on UNIX and Linux platforms.
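A minimal BTEQ batch sketch of the kind referenced above, assuming hypothetical TDPID, credentials, and table names (a real job would pull the logon from a secured logon file, not inline):

```sh
#!/bin/sh
# Hedged sketch only: tdprod, etl_user, and the stage/edw tables are assumptions.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.SET ERRORLEVEL UNKNOWN SEVERITY 8;

/* Refresh a reporting copy from staged data (illustrative tables) */
DELETE FROM edw_db.acct_dim_stg;
INSERT INTO edw_db.acct_dim_stg (acct_id, acct_name, balance)
SELECT acct_id, acct_name, balance
FROM   stage_db.acct_stg;

.LOGOFF;
.QUIT ERRORCODE;
EOF
rc=$?
[ $rc -eq 0 ] || echo "BTEQ step ended with return code $rc" >&2
exit $rc
```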
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 9.x/8.x/7.x/6.x, Data Validation Option, Data Profiling using Informatica IDQ, Data Transformation Studio
Reporting Tools: Business Objects XI R3, 6.5
Operating System: Windows NT/XP/7
Languages: SQL, UNIX Shell scripting, Windows batch scripting
Database: Oracle 9i, 10g, DB2, SQL Server
Config Mgmt. Tool: Microsoft VSS
Other Tools: Toad, Rapid SQL, Oracle SQL Developer, SQL Loader, Autosys Scheduling, HP Quality Center, Control M Scheduling, SVN, GIT/STASH
PROFESSIONAL EXPERIENCE:
Confidential, St. Louis, MO
ETL/IDQ Tech Lead
RESPONSIBILITIES:
- Involved in the design, development, and implementation of Confidential data management projects. Worked on high-visibility, high-priority projects such as Advisor Best Practices and the Brokerage Data Warehouse Refresh.
- Analyzed source systems, target systems, jobs, and deployment processes with minimal supervision.
- Worked with data analysts and application teams to gather requirements.
- Worked on data analysis, profiling, conversion, name/date-of-birth/SSN/address matching across various systems, error handling, reporting, and monitoring using IDQ.
- Designed ETL jobs for the data management team.
- Developed test cases using the Informatica DVO tool.
- Worked closely with the administrator to push code deployments to production.
- Automated scheduling of UNIX scripts and Informatica sessions using Autosys (see the wrapper sketch after this section).
- Participated in and conducted code reviews.
- Developed unit tests and integration tests for the code
- Created technical runbooks and playbooks listing all tasks and jobs and the sequence in which they must be executed in each cycle.
- Coordinated with the Quality Assurance (QA) team during QA and pre-production testing.
- Coordinated Quality Center activities: test result entry, defect tracking, and defect management.
- Analyzed and coordinated testing-related activities with various stakeholders and teams.
- Involved in Informatica code migration across various environments.
Environment: Informatica Power Center 9.x, UNIX Shell Scripting, DB2, Informatica Data Validation Option, Informatica IDQ, Oracle, TOAD, Autosys, CA - Harvest, HP Quality Center, Windows XP, Microsoft Visio
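A minimal sketch of the kind of wrapper script an Autosys cmd job could invoke for the scheduling bullet above; the Integration Service, domain, folder, and workflow names are hypothetical, and the password is read from an environment variable via pmcmd's -pv option:

```sh
#!/bin/sh
# Hedged sketch: IS_DEV, Domain_Dev, BDW_REFRESH, and the workflow name are assumptions.
INT_SVC=IS_DEV
INFA_DOMAIN=Domain_Dev
FOLDER=BDW_REFRESH
WORKFLOW=wf_bdw_daily_load

# -wait blocks until the workflow finishes so the exit code reflects the run;
# -pv reads the password from the INFA_PASSWD environment variable.
pmcmd startworkflow \
    -sv "$INT_SVC" -d "$INFA_DOMAIN" \
    -u etl_svc -pv INFA_PASSWD \
    -f "$FOLDER" -wait "$WORKFLOW"
rc=$?

# A nonzero exit marks the Autosys job as FAILURE so alerting can fire.
exit $rc
```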
Confidential, West Lake, TX
Sr. ETL/Informatica Developer
RESPONSIBILITIES:
- Involved in design, development and implementation of shared data services projects (using agile methodologies) in Fidelity pricing and cash management services group.
- Worked on the design, development, and delivery of data (both real-time and batch) from operational systems and flat files into a staging area, ODSs (operational data stores), the data warehouse, and downstream data marts.
- Developed Informatica mappings/sessions/workflows to create data feeds.
- Developed Informatica Parser projects to convert unstructured data to XML.
- Automated scheduling of Unix Scripts and Informatica sessions using Control M
- Participated in and conducted design/code reviews and demos with the production support team.
- Performed unit tests and integration tests for the code.
- Coordinated with Quality Assurance (QA) team during QA and UAT testing
Environment: Informatica Power Center 9.x, Informatica Data Studio, IDQ, UNIX Shell Scripting, Oracle 12c, Control M, HP Quality Center, Windows XP, Microsoft Visio, SVN, GIT/STASH
Confidential, Jersey City, NJ
Sr. ETL/Informatica Developer
RESPONSIBILITIES:
- Interacted with End Users for gathering requirements.
- Developed standards for ETL framework for the ease of reusing similar logic across the board.
- Rigorously involved in data profiling, cleansing, parsing, standardization, address validation, and match/merge of data using Informatica Developer (IDQ) and Analyst.
- Used Repository Manager to create repositories, user groups, and users, and managed users by setting up their privileges and profiles.
- Transformed data using the ELT (Extract, Load, and Transform) approach. Designed, customized, and analyzed Informatica workflows.
- Integrated IDQ mappings and mapplets into PowerCenter and executed them.
- Created new rules based on business requirements and implemented them through IDQ. Created profiles using Informatica Analyst.
- Analyzed requirements, created designs, and delivered documented solutions adhering to the prescribed Agile development methodology and tools.
- Developed mappings to extract data from different sources such as DB2 and XML files and load it into the data mart.
- Created complex mappings using transformations such as Filter, Router, Connected and Unconnected Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the data mart.
- Made use of Informatica Workflow Manager extensively to create, schedule, execute and monitor sessions, Worklets and workflows.
- Developed Slowly Changing Dimension logic for Type 1 and Type 2 (flag, version, and effective-date variants); see the Type 2 sketch after this section.
- Involved in designing logical/physical data models and reverse engineering the entire subject area across schemas using Erwin/TOAD.
- Wrote quick PL/SQL scripts.
- Scheduled and automated ETL processes with Autosys. Scheduled workflows using shell scripts.
- Created Informatica development standards: a document describing general guidelines for Informatica developers, naming conventions to be used in transformations, and development and production environment structures.
- Troubleshot databases, workflows, mappings, sources, and targets to identify bottlenecks and improve performance.
- Migrated Informatica mappings/sessions/workflows from development to test and production environments.
Environment: Informatica PowerCenter 9.x, IDQ, XML files, DB2, TOAD, Oracle 11g, PL/SQL, SQL Server 2008, T-SQL, UNIX Shell Scripts, Autosys.
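A minimal SCD Type 2 sketch in the spirit of the bullet above, written as plain SQL run through sqlplus; the cust_dim/cust_stg tables, columns, and connection variables are hypothetical, not the actual project objects (which were built as Informatica mappings):

```sh
#!/bin/sh
# Hedged sketch: expire changed current rows, then insert new versions.
sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
-- Expire current rows whose tracked attributes changed in staging
UPDATE cust_dim d
   SET d.curr_flag  = 'N',
       d.eff_end_dt = TRUNC(SYSDATE) - 1
 WHERE d.curr_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM cust_stg s
                WHERE s.cust_id = d.cust_id
                  AND (s.addr <> d.addr OR s.status <> d.status));

-- Insert new versions for changed and brand-new customers
INSERT INTO cust_dim (cust_id, addr, status, curr_flag, eff_start_dt, eff_end_dt)
SELECT s.cust_id, s.addr, s.status, 'Y', TRUNC(SYSDATE), DATE '9999-12-31'
  FROM cust_stg s
 WHERE NOT EXISTS (SELECT 1
                     FROM cust_dim d
                    WHERE d.cust_id  = s.cust_id
                      AND d.curr_flag = 'Y'
                      AND d.addr     = s.addr
                      AND d.status   = s.status);
COMMIT;
EOF
```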
Confidential, Charlotte, NC
Sr. ETL/Informatica Developer
RESPONSIBILITIES:
- Interacted with End Users for gathering requirements.
- Developed standards for ETL framework for the ease of reusing similar logic across the board.
- Rigorously involved in data profiling, cleansing, parsing, standardization, address validation, and match/merge of data using Informatica Developer (IDQ) and Analyst.
- Used Repository Manager to create repositories, user groups, and users, and managed users by setting up their privileges and profiles.
- Integrated IDQ mappings and mapplets into PowerCenter and executed them.
- Created new rules based on business requirements and implemented them through IDQ. Created profiles using Informatica Analyst.
- Developed mappings to extract data from different sources such as DB2 and XML files and load it into the data mart.
- Created complex mappings using transformations such as Filter, Router, Connected and Unconnected Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the data mart.
- Made use of Informatica Workflow Manager extensively to create, schedule, execute and monitor sessions, Worklets and workflows.
- Developed Slowly Changing Dimension logic for Type 1 and Type 2 (flag, version, and effective-date variants).
- Involved in designing logical/physical data models and reverse engineering the entire subject area across schemas using Erwin.
- Scheduled the workflows using Shell script.
- Created Informatica development standards: a document describing general guidelines for Informatica developers, naming conventions to be used in transformations, and development and production environment structures.
- Wrote SQL queries, triggers, and PL/SQL procedures to apply and maintain business rules.
- Troubleshot databases, workflows, mappings, sources, and targets to identify bottlenecks and improve performance.
- Used TOAD to run PL/SQL queries and validate data in the warehouse and mart (see the reconciliation sketch after this section).
- Conducted database testing to check constraints, indexes, procedures, and field sizes, and performed additional performance tuning at the database level.
- Migrated Informatica mappings/sessions/workflows from development to test and production environments.
Environment: Informatica PowerCenter 9.x, IDQ, XML files, DB2, Oracle 11g, SQL Server 2008, SQL, PL/SQL, UNIX Shell Scripts.
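A hedged sketch of the kind of post-load validation mentioned above, runnable from the command line or pasted into TOAD; the acct_stg/acct_dim tables and the load_dt column are hypothetical:

```sh
#!/bin/sh
# Hedged sketch: compares staged vs. loaded row counts and fails on a mismatch.
sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
SET SERVEROUTPUT ON
DECLARE
  v_src NUMBER;
  v_tgt NUMBER;
BEGIN
  SELECT COUNT(*) INTO v_src FROM acct_stg;
  SELECT COUNT(*) INTO v_tgt FROM acct_dim WHERE load_dt = TRUNC(SYSDATE);
  IF v_src <> v_tgt THEN
    RAISE_APPLICATION_ERROR(-20001,
      'Row count mismatch: staged=' || v_src || ' loaded=' || v_tgt);
  END IF;
  DBMS_OUTPUT.PUT_LINE('Counts match: ' || v_src);
END;
/
EOF
```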
Confidential, Durham, NC
Sr. ETL Informatica Developer
RESPONSIBILITIES:
- Responsible for Business Analysis and Requirements Collection.
- Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Parsed high-level design specification to simple ETL coding and mapping standards.
- Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
- Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
- Created mapping documents to outline data flow from sources to targets.
- Extracted data from Oracle, flat files, Teradata, and other RDBMS sources into the staging area and populated the data warehouse.
- Extensively used SQL*Loader to load data from flat files into database tables in Oracle (see the loader sketch after this section).
- Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Developed mapping parameters and variables to support SQL override.
- Analyzed query performance using the Teradata SQL Assistant (Queryman) front-end tool, issuing SQL commands against the Teradata RDBMS to match business requirements.
- Created Mapplets to use them in different mappings.
- Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
- Developed mappings to load into staging tables and then to Dimensions and Facts.
- Used existing ETL standards to develop these mappings.
- Extensively worked on loading data into, backing up, restoring, and recovering the Vertica database.
- Knowledgeable in creating and populating Vertica databases, troubleshooting performance issues, and optimizing query performance.
- Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
- Worked with different workflow tasks such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment, and Timer, as well as workflow scheduling.
- Modified existing mappings for enhancements of new business requirements.
- Created Autosys Scripts for installing Autosys Jobs for daily scheduling & alerting scenarios.
- Used Debugger to test the mappings and fixed the bugs.
- Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and backup of the repository.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
Environment: Informatica Power Center 9.5.1, Workflow Manager, Workflow Monitor, Informatica Power Connect / Power Exchange, Data Analyzer, PL/SQL, SQL Developer, UNIX, Oracle 11g, Erwin, Autosys, SQL Server, Sybase, Teradata 13.10, Teradata SQL Assistant, Vertica 6.1.x, VSQL.
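A hedged SQL*Loader sketch for the flat-file staging loads mentioned above; the control file contents, file names, and columns are hypothetical:

```sh
#!/bin/sh
# Hedged sketch: acct_feed.csv, acct_stg, and its columns are assumptions.
cat > acct_stg.ctl <<'EOF'
LOAD DATA
INFILE 'acct_feed.csv'
APPEND
INTO TABLE acct_stg
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(acct_id, acct_name, open_dt DATE "YYYY-MM-DD", balance)
EOF

sqlldr userid="$DB_USER/$DB_PASS@$DB_TNS" \
       control=acct_stg.ctl log=acct_stg.log bad=acct_stg.bad errors=50
rc=$?
# sqlldr returns 0 on success and 2 when some rows were rejected
[ $rc -eq 0 ] || { echo "SQL*Loader finished with status $rc" >&2; exit $rc; }
```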
Confidential, Chicago, IL
ETL Developer
RESPONSIBILITIES:
- Investigated and responded to business regarding their claim issues.
- Assisted in developing recommendations for improvements to the claim processing system.
- Developed documentation for business requirements, transformation logic, process descriptions / flow charts, detailed design specification, test plan and implementation plan.
- Used DataStage Manager to import table definitions into the repository, create table definitions for CSV and flat files, import and export projects, and release and package jobs.
- Wrote complex SQL queries to extensively test the ETL process, and wrote user-defined functions, routines, and transforms in DataStage BASIC for use in derivations.
- Used DataStage Director and its run-time engine to schedule and run jobs, test and debug their components, and monitor job runs (see the dsjob sketch after this section).
- Extensively worked on data integration at the claims, policy, and payment level.
- Designed and developed ETL jobs using DataStage Designer as per the mapping specifications using appropriate stages.
- Compiled, tested, and debugged the jobs. Built shared containers for use across different jobs. Ensured compliance with coding standards, source code version control, build procedures, and deployment procedures.
- Assisted in identifying current and future reporting requirements utilizing current successful strategies and developing business use cases.
- Identified business needs, evaluated business and technical alternatives, recommended solutions and participated in their implementation.
- Provided production support and performed enhancement on existing multiple projects.
Environment: IBM WebSphere DataStage 7.5 Enterprise Edition (Designer, Director, Manager, Parallel Extender), Sybase IQ, DB2 Mainframe, Crystal Reports, SQL Server 2005, Oracle 10g, Autosys
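A hedged sketch of running a DataStage job from the shell with the dsjob CLI, in the spirit of the Director bullet above; the project and job names, the RUN_DATE parameter, and the engine install path are assumptions, not the actual project objects:

```sh
#!/bin/sh
# Hedged sketch: CLAIMS_DW, jb_load_claims, RUN_DATE, and the path are assumptions.
DSHOME=${DSHOME:-/opt/Ascential/DataStage/DSEngine}
. "$DSHOME/dsenv"                      # load the DataStage engine environment

PROJECT=CLAIMS_DW
JOB=jb_load_claims

# -jobstatus waits for completion and derives the exit code from the job status
"$DSHOME/bin/dsjob" -run -jobstatus \
    -param RUN_DATE="$(date +%Y-%m-%d)" \
    "$PROJECT" "$JOB"
rc=$?
echo "dsjob finished $JOB with status code $rc"
exit $rc
```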