ETL Developer Resume
SUMMARY:
- 5+ years of experience in Data Warehouse/ETL development, loading data from various sources into the DW using Power Center Designer, Workflow Manager, and Workflow Monitor
- Experience in Developing Complex Mappings, Reusable Transformations, Sessions and Workflows using Informatica ETL tool for data extraction from various sources and loading into target tables
- Experience in performance tuning of mappings and workflows to reduce data load and processing time
- Experience in creating various transformations such as Aggregator, Lookup, Update Strategy, Joiner, Filter, Sequence Generator, Normalizer, Sorter, Router, and Stored Procedure in Informatica Power Center Designer
- Involved in data migration and data quality check through ETL processes
- Expertise in DW concepts like Star Schema, Snowflake Schema, Fact Tables and Dimension Tables
- Hands-on experience with the ERWIN data modeling tool
- Experience in creating sessions in Workflow Manager and monitoring and analyzing workflows in Workflow Monitor; hands-on experience with various Active and Passive Transformations
- Worked with reusable objects such as Reusable Transformations and Mapplets
- Extensively used the SQL Override feature in Source Qualifier transformations
- Extensive ETL Testing experience using Informatica Power Center 8.6/7.1 components (Designer, Workflow Manager, Workflow Monitor and Server Manager)
- Sound Knowledge & Profound Experience in various phases of Software Development Life Cycle (SDLC)
- Expertise in writing UNIX shell scripts to schedule sessions and workflows
- Experience in Production Support post implementation
- Experience working with TOAD Querying Tool
TECHNICAL SKILLS:
Tools: Informatica 8.6/9.x, TOAD
OFFICE S/W: MS Office (MS Word, MS Excel, MS PowerPoint)
Database: Oracle 10g/9i, DB2
Methodologies: Data Modeling - Logical, Physical. Dimensional Modeling - Star/Snowflake
Programming: UNIX Shell Scripting, SQL
Operating Systems: Unix, Windows XP/NT/2000/7
PROFESSIONAL EXPERIENCE:
Confidential
ETL Developer
Environment: Informatica 9.0.1, DB2, UNIX
Responsibilities:
- Created design pattern templates for deliverable objects
- Strong working experience with Informatica Power Center
- Experience in wrapping Informatica workflows and UNIX scripts with a control framework
- Involved in designing the process flow for extracting data across various interacting systems
- Involved in extracting the data from Flat Files and loading into staging tables
- Experience with SQL Override and source filter usage in Source Qualifiers and data flow management using Filter Transformation
- Created and used reusable transformations for Data type conversions
- Experience with transformations including complex LookUps, Stored Procedures, Update Strategy, etc.
- Expert in designing and developing the reusable sessions and Worklets
- Debugged Informatica mappings and validated the data in target tables after loading
- Involved in performance tuning by identifying and optimizing source, target, mappings and session bottlenecks
- Fine tuning Informatica mappings for Performance Optimization
- Problem solving and troubleshooting design and development issues and providing appropriate solutions
- Involved in deploying/migrating Informatica workflows, database objects, and metadata across UAT and Production environments
- Experienced in switching metadata between different schemas in environments such as UAT and Production
- Involved in deploying Informatica workflows using Subversion and the Quick-build process
- Troubleshot issues by checking session logs, workflow logs, and metadata
- Actively participated in database activities like checking the Primary Indexes, correctness of the data and field size validation, etc.
- Performed System Testing, Regression Testing, Acceptance Testing, Functional Testing and Stress Testing
- Worked closely with the Production Control team to schedule Informatica workflows and shell scripts (for archive and purge) in Control-M
- Prepared the Standard Operating Procedure (knowledge transfer) document, which provides the information required for maintenance and operation of the application
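A Control-M job step of the kind described above might be sketched as follows. This is a minimal illustration, not the project's actual script: the domain, Integration Service, folder, workflow name, archive path, and retention window are all placeholder assumptions.

```shell
#!/bin/sh
# Hypothetical Control-M job step: starts an Informatica workflow via pmcmd,
# then purges archive files older than a retention window. All object names
# and paths below are illustrative placeholders.

INFA_DOMAIN="Domain_ETL"
INFA_SERVICE="IS_ETL"
INFA_FOLDER="FLD_DW"
WORKFLOW="wf_daily_load"
RETENTION_DAYS=30

# Assemble the pmcmd command as a string so it can be echoed into the
# scheduler log before it runs; -wait blocks until the workflow finishes.
build_pmcmd_cmd() {
    echo "pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -f $INFA_FOLDER -wait $WORKFLOW"
}

# Remove archive files older than RETENTION_DAYS from the given directory.
purge_archives() {
    find "$1" -type f -mtime +"$RETENTION_DAYS" -delete
}

# A Control-M job step would then run, for example:
#   $(build_pmcmd_cmd) && purge_archives /data/archive
```

In practice the pmcmd call would also carry credentials (via `-u`/`-p` or an encrypted password file), and pmcmd's non-zero exit status on workflow failure is what lets Control-M flag the job as failed.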
Confidential
ETL Developer
Environment: Informatica 8.6, DB2 DB, UNIX
Responsibilities:
- Requirement analysis, Impact Analysis and Preparing detailed design
- Requirement gathering and detailed design walkthroughs for team members, Coding and Testing of ETLs, Coordination of deployment activities with other teams, Data validation, Preparing Unit Test cases, Triaging of defects, and Conducting code review sessions
- Interacting with senior peers or subject matter experts to learn more about the data
- Designing and creation of complex mappings using SCD Type II involving transformations such as Expression, Joiner, Aggregator, LookUp, Update Strategy, and Filter
- Defined Target Load Order Plan for loading data into Target Tables
- Used Informatica Power Center Workflow Manager to create sessions, batches, and workflows to run the logic embedded in the mappings
- Used Mapplets and Reusable Transformations to prevent redundant transformation logic and improve maintainability
- Performed functionality, Integration, positive and negative testing
- Worked on issues with ETL and DB migration from development to SIT, SIT to Pre-Production and Pre-Production to Production environments
- Reviewed ETL Mappings, Sessions, UNIX Scripts, Test Results, Parameter Files, Test Cases, Mapping Specification Documents, etc., developed by team members
- Assisted and mentored team members and provided guidance during ETL development
- Tracked project progress, reported status, identified risks, and communicated them through the proper channels
- Wrote UNIX shell scripts for the Informatica ETL tool to run sessions and schedule jobs
- Integrated Data Cleansing processes within the Mappings
- Ensured SLAs were met while providing production support during release cycles
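The SCD Type II work above was done inside Informatica mappings (Lookup plus Update Strategy); as a toy sketch of the underlying change-detection logic only, an awk-based classifier over pipe-delimited extracts could look like this. File layouts and the function name are assumptions for illustration.

```shell
#!/bin/sh
# Toy illustration of SCD Type II change detection. The real implementation
# lived in Informatica mappings; this sketch only shows the classification
# step. Inputs are pipe-delimited "key|attribute" files.

# Tag each source row: INSERT (new key), UPDATE (attribute changed, so the
# old version would be expired and a new version inserted), or UNCHANGED.
scd2_classify() {
    dim=$1; src=$2
    awk -F'|' '
        NR == FNR { current[$1] = $2; next }        # load current dimension
        {
            if (!($1 in current))       print "INSERT|" $0
            else if (current[$1] != $2) print "UPDATE|" $0
            else                        print "UNCHANGED|" $0
        }
    ' "$dim" "$src"
}
```

In the Informatica version, the same decision is made by a Lookup against the dimension followed by an Update Strategy assigning DD_INSERT or DD_UPDATE, with an end-date expression expiring the prior version.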
Confidential
ETL Developer
Environment: Informatica 8.6, ORACLE DB, UNIX
Responsibilities:
- Requirement gathering and analysis, detailed design walkthroughs for team members, Coding and Testing of ETLs, Coordination of deployment activities with other teams, Data validation, Preparing Unit Test cases, Triaging of defects, and Conducting code review sessions
- Interacting with senior peers or subject matter experts to learn more about the data
- Designing and creation of complex mappings using SCD Type II involving transformations such as Expression, Joiner, Aggregator, LookUp, Update Strategy, and Filter
- Defined Target Load Order Plan for loading data into Target Tables
- Used Informatica Power Center Workflow Manager to create sessions, batches, and workflows to run the logic embedded in the mappings
- Used Mapplets and Reusable Transformations to prevent redundant transformation logic and improve maintainability
- Performed functionality, Integration, positive and negative testing
- Worked on issues with ETL and DB migration from development to SIT, SIT to Pre-Production and Pre-Production to Production environments
- Reviewed ETL Mappings, Sessions, UNIX Scripts, Test Results, Parameter Files, Test Cases, Mapping Specification Documents, etc., developed by team members
- Assisted and mentored team members and provided guidance during ETL development
- Tracked project progress, reported status, identified risks, and communicated them through the proper channels
- Wrote UNIX shell scripts for the Informatica ETL tool to run sessions and schedule jobs
- Integrated Data Cleansing processes within the Mappings
- Ensured SLAs were met while providing production support during release cycles
Confidential, CA
ETL Developer
Environment: Informatica 8.x, DB2
Responsibilities:
- Participated in the Data Model design according to the Business Requirements
- Created Design Specification Documents including Source-to-Target mappings
- Translated Business Requirements into Informatica mappings to build the Data Warehouse using Informatica Designer, populating the Target Star Schema on an Oracle 9i instance
- Extensively used Router, Joiner, LookUp, Aggregator, Expression and Filter transformations in mappings
- Extensively worked on performance tuning of the mappings by implementing hash-key algorithms for the flat files
- Debugged and resolved load failures by verifying the log files. Supported QA-Testing in fixing the bugs and also helped to resolve various data issues
- Maintained a very good interaction with Analysts, Project Managers, Architects and Testers to have efficient and better results
- Partially involved in writing the UNIX shell scripts that trigger the workflows in a particular order as part of the daily load into the Data Warehouse
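The ordered triggering in the last bullet can be sketched as below, assuming pmcmd's `-wait` flag to block until each workflow completes. The workflow names, service, domain, and the `RUNNER` override hook are hypothetical, not the project's actual objects.

```shell
#!/bin/sh
# Hypothetical daily driver: starts workflows in a fixed order, waiting for
# each to finish before launching the next. RUNNER defaults to pmcmd but can
# be overridden (e.g. with "echo" for a dry run). Names are placeholders.

WORKFLOWS="wf_stage wf_load_dims wf_load_facts"   # assumed run order

run_in_order() {
    runner=${RUNNER:-pmcmd}
    for wf in $1; do
        echo "starting $wf"
        # -wait blocks until the workflow completes; a non-zero exit from
        # pmcmd aborts the rest of the chain.
        "$runner" startworkflow -sv IS_DW -d Domain_DW -f FLD_DW -wait "$wf" \
            || { echo "$wf failed; aborting chain" >&2; return 1; }
    done
}
```

Chaining on exit status this way means a failed fact load never runs against half-loaded dimensions, which is the point of enforcing the order in the daily schedule.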