Sr. ETL Developer Resume
Philadelphia, PA
SUMMARY:
- Over 9.5 years of professional IT experience, including over 9 years with Informatica Power Center 9.x/8.x/7.x/6.x/5.x, Oracle 11g/10g/9i, Teradata 15.x/12.x, DB2, SQL Server 2008/2005, PL/SQL, Erwin, UNIX, and Autosys, with excellent skills across the complete Data Warehousing life cycle.
- Strong working experience in Data Analysis, Design, Development, MDM and IDQ implementation, and Testing of Data Warehouses using ETL.
- Strong working experience in Informatica Data Integration Tools - Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Proficient in all phases of the System Development Life Cycle (SDLC): Requirements Gathering, Data Scrubbing, Design, Development, Performance Tuning, and Unit Testing.
- Experience in database design, Entity-Relationship Modeling and Dimensional Modeling.
- Extensively involved in creating complex mappings and reusable components such as Reusable Transformations, Mapplets, Worklets, and Control tasks to implement reusable business logic.
- Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, MultiLoad, TPump, and TPT to export and load data to/from different source systems, including flat files.
- Created a BTEQ script for pre-population of the work tables prior to the main load process.
- Wrote several BTEQ scripts for data manipulation and post-session processing.
- Wrote several Teradata BTEQ scripts to implement business logic.
- Excellent at coding SQL and PL/SQL procedures, functions, triggers, and packages.
- Proficient in SQL Tuning to ensure best query performance.
- Expertise in using ERwin, MS Visio, Oracle Designer to design process flow diagrams.
- Excellent experience in creating UNIX scripts, scheduling jobs using Autosys, Tivoli schedulers.
- Extensive experience in Administration/maintenance of Informatica Power Center including Installation, upgrading and patching.
- Assisted in Data modeling by creating Star schemas using MS Visio and Erwin tools.
- Cleansed and scrubbed data into uniform data types and formats using Informatica MDM and IDQ tools, loaded it to STAGE and HUB tables, then to the EDW, and finally to Dimension tables, rolling up/aggregating the data by business grain into the FACT tables.
- Primary activities included data analysis, identifying and implementing data quality rules in IDQ, and linking rules (including Address Doctor) to Power Center ETL processes for delivery to the MDM Data Hub and other data consumers.
- Developed business rules for cleansing/validating/standardization of data.
- Developed User Exits for implementing custom business rules.
- Defined System Trust and Validation rules for the base object columns.
- Developed MDM Hub Match and Merge rules, Batch jobs, and Batch groups.
- Created Query Groups and packages in MDM Hub Console.
- Developed Slowly Changing Dimension mappings of Type I and Type II.
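A pre-population BTEQ script of the kind described above is typically driven from a shell wrapper. The sketch below generates such a script; all database, table, and logon names are illustrative placeholders, not taken from an actual project:

```shell
#!/bin/sh
# Generate a BTEQ script that empties and pre-populates a work table
# before the main load process. Names and credentials are hypothetical.
BTEQ_SCRIPT=/tmp/prep_work_tables.btq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;

-- Empty the work table before the main load
DELETE FROM stage_db.work_orders;

-- Pre-populate with the keys the main load will need
INSERT INTO stage_db.work_orders (order_id, load_dt)
SELECT order_id, CURRENT_DATE
FROM   edw_db.orders
WHERE  status = 'OPEN';

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

echo "Wrote $BTEQ_SCRIPT"
# In a real environment the wrapper would then run:
# bteq < "$BTEQ_SCRIPT" > /tmp/prep_work_tables.log 2>&1
```

The `.IF ERRORCODE` check makes the load job fail fast if the pre-population step errors out, so the main load never runs against a half-prepared work table.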
TECHNICAL SKILLS:
ETL: Informatica Power Center 9.5/9.1/8.6/8.5/8.1/7.1
Operating Systems: Unix, Windows NT/2000/XP
Reporting tools: QlikView 11
DBMS: Oracle 11g/10g/9i, SQL Server 2005/2008, Teradata 11/12/13
Data modeling tool: Erwin, Oracle Designer 10g, MS Visio 2010
Data Base Tools: SQL* Loader, TOAD, PL/SQL Developer, SQL Developer
Languages: PL/SQL, UNIX Shell Scripting
ITIL Tools: ServiceNow
PROFESSIONAL EXPERIENCE:
Confidential - Philadelphia, PA
Sr ETL Developer
Responsibilities:
- Extensively used Informatica Power Center to create data mappings that extract data from various relational systems, apply appropriate transformations, and load the targets.
- Extensively used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Worked extensively with Teradata systems using utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump, and Teradata SQL.
- Created a BTEQ script for pre-population of the work tables prior to the main load process.
- Wrote several BTEQ scripts for data manipulation and post-session processing.
- Wrote several Teradata BTEQ scripts to implement business logic.
- Wrote Teradata SQL queries to join tables and apply modifications.
- Created customized MLoad scripts on the UNIX platform for Teradata loads.
- Implemented integrity constraints such as referential integrity using primary key and foreign key relationships.
- Developed numerous Complex Informatica Mapplets and Reusable Transformations as needed.
- Designed and created complex source-to-target mappings using transformations including Lookup, Aggregator, Joiner, Filter, Source Qualifier, Expression, and Router.
- Expertise in using different tasks (Session, Assignment, Command, Decision, Email, Event-Raise, Event- Wait, Control).
- Cleansed and scrubbed data into uniform data types and formats using Informatica MDM and IDQ tools, loaded it to STAGE and HUB tables, then to the EDW, and finally to Dimension tables, rolling up/aggregating the data by business grain into the FACT tables.
- Primary activities included data analysis, identifying and implementing data quality rules in IDQ, and linking rules (including Address Doctor) to Power Center ETL processes for delivery to the MDM Data Hub and other data consumers.
- Developed business rules for cleansing, validating, and standardizing data.
- Performed Data cleansing using external tools like Name Parser and Dataflow.
- Optimized Query Performance, Mapping Performance, Session Performance and Reliability.
- Configured mappings to handle updates while preserving existing records using the Update Strategy transformation (Slowly Changing Dimension, SCD Type 2).
- Implemented stored procedures, functions, views, triggers, and packages in PL/SQL.
- Implemented Source Pre-Load, Source Post-Load, Target Pre-Load and Target Post-Load functionalities.
- Extensive Performance Tuning of Sources, Targets, Mappings and Sessions.
- Used Debugger and breakpoints to view transformations output and debug mappings.
- Implemented Pipeline Partitioning to improve performance.
- Created UNIX shell scripts and cron jobs for batch processing; excellent experience using the Tivoli job scheduler.
- Used Test Director to log the defects and coordinated with Test team for a timely resolution.
- Provided Production Support at the end of every release.
- Documented technical specifications, business requirements, and functional specifications for all Informatica Extraction, Transformation and Loading (ETL) mappings.
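A batch wrapper of the kind scheduled via cron for the work above might look like the following minimal sketch; the log path, workflow name, and `pmcmd` invocation are illustrative assumptions, not details from the resume:

```shell
#!/bin/sh
# Nightly batch wrapper, intended to be scheduled with a crontab entry such as:
#   30 1 * * * /opt/etl/bin/nightly_load.sh
# All paths and the workflow name are hypothetical placeholders.

LOG=/tmp/nightly_load.log
WORKFLOW=wf_daily_sales          # illustrative Informatica workflow name

log() { echo "$(date '+%Y-%m-%d %H:%M:%S') $*" >> "$LOG"; }

log "Starting batch run for $WORKFLOW"

# In a real environment the workflow would be launched with pmcmd, e.g.:
# pmcmd startworkflow -sv IntSvc -d Domain -u user -p pass -f FOLDER -wait "$WORKFLOW"
RC=0   # stand-in for the pmcmd return code

if [ "$RC" -eq 0 ]; then
    log "Batch run for $WORKFLOW completed successfully"
else
    log "Batch run for $WORKFLOW failed with return code $RC"
fi
```

Keeping the scheduler entry thin and putting the logic in the wrapper makes the same script reusable from cron, Tivoli, or Autosys without change.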
Environment: Informatica Power Center 9.1/8.6/8.1, Oracle 11g/10g, QlikView 11, Teradata 13.0/12.0, PL/SQL, SQL*Loader, UNIX, Erwin, IDQ.
Confidential - Atlanta, GA
ETL Developer
Responsibilities:
- Extracted and transformed data from various sources such as Oracle and flat files, and loaded it into the Oracle target database using Informatica Power Center.
- Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Developed different types of transformations such as Source Qualifier, Expression, Filter, Aggregator, Lookup, Stored Procedure, and Update Strategy.
- Created Mapplets for reusable business rules.
- Ran workflows and sessions during production support and monitored workflow and session logs for errors.
- Used Debugger and breakpoints to view and edit transformations output and debug mappings.
- Created a BTEQ script for pre-population of the work tables prior to the main load process.
- Wrote several BTEQ scripts for data manipulation and post-session processing.
- Worked extensively with Teradata systems using utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump, Teradata SQL, and TPT.
- Wrote Teradata SQL queries to join tables and apply modifications.
- Created customized MLoad scripts on the UNIX platform for Teradata loads.
- Worked on an ETL strategy to store data validation rules and error-handling methods for both expected and unexpected errors, and documented it carefully.
- Used Update Strategies for cleansing, updating and adding to the data in the warehouse.
- Designed and developed UNIX shell scripts as part of the ETL process to compare control totals, automate the process of loading, pulling and pushing data from & to different servers.
- Extensively involved in unit and integration testing. Worked closely with QA team during the Testing phase and fixed bugs that were reported.
- Used Debugger and Breakpoints to view and edit transformations output and debug mappings.
- Optimized and performance-tuned mappings to achieve faster response times.
- Carried out unit and integration testing for Informatica mappings, sessions, and workflows.
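The control-total comparison mentioned above can be sketched as a small shell check; the file names and the control-file format (`RECORD_COUNT=<n>`) are assumptions made for illustration:

```shell
#!/bin/sh
# Compare a data file's record count against the count declared in a
# control file, as a basic load-validation step.
# File names and the control-file format are illustrative assumptions.

DATA=/tmp/orders.dat
CTRL=/tmp/orders.ctl

# Create small sample files so the sketch is self-contained.
printf '1001|A\n1002|B\n1003|C\n' > "$DATA"
printf 'RECORD_COUNT=3\n' > "$CTRL"

actual=$(wc -l < "$DATA")
expected=$(awk -F= '/^RECORD_COUNT=/ {print $2}' "$CTRL")

if [ "$actual" -eq "$expected" ]; then
    echo "CONTROL TOTALS OK: $actual records"
else
    echo "CONTROL TOTAL MISMATCH: expected $expected, got $actual" >&2
    exit 1
fi
```

A nonzero exit on mismatch lets the scheduler halt the downstream load instead of propagating a partial file.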
Environment: Informatica Power Center 8.6/8.1, Oracle 11g/10g, QlikView 11, Teradata 13.0/12.0, PL/SQL, SQL*Loader, UNIX, Erwin, IDQ.
Confidential - Phoenix, AZ
ETL Developer
Responsibilities:
- Created Informatica mappings, sessions, and workflows with tasks such as Command, Event Wait, Event Raise, Timer, and Assignment, based on business requirements.
- Involved in design, development and implementation of the Enterprise Data Warehouse (EDW) and Data Mart.
- Created a BTEQ script for pre-population of the work tables prior to the main load process.
- Wrote several BTEQ scripts for data manipulation and post-session processing.
- Used external tools like Address Doctor for cleansing the data in the source systems.
- Designed mappings using Source qualifier, Joiner, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy transformations and Mapplets to load data into the target involving slowly changing dimensions.
- Used Workflow Manager for creating and maintaining the Sessions and Workflow Monitor to monitor workflows.
- Enhanced existing UNIX shell scripts as part of the ETL process to schedule tasks/sessions.
- Coordinated with end users and reporting teams to correlate Business requirements
- Extracted, transformed, and loaded data from different sources such as flat files, SQL Server, and Power Exchange.
- Worked extensively with Teradata systems using utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump, Teradata SQL, and TPT.
- Wrote Teradata SQL queries to join tables and apply modifications.
- Used Debugger and breakpoints to view transformations output and debug mappings.
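Customized MLoad scripts like those referenced above are often generated from shell variables so one template serves many tables. A minimal sketch, with every database, table, file, and logon name a hypothetical placeholder:

```shell
#!/bin/sh
# Generate a minimal Teradata MultiLoad (mload) control script from
# shell variables. All names and credentials are hypothetical.

TARGET_TABLE=stage_db.customer_stg
INPUT_FILE=/tmp/customer.dat
SCRIPT=/tmp/load_customer.ml

cat > "$SCRIPT" <<EOF
.LOGTABLE stage_db.customer_log;
.LOGON tdprod/etl_user,etl_password;

.BEGIN IMPORT MLOAD TABLES $TARGET_TABLE;

.LAYOUT cust_layout;
.FIELD cust_id   * VARCHAR(10);
.FIELD cust_name * VARCHAR(50);

.DML LABEL ins_cust;
INSERT INTO $TARGET_TABLE (cust_id, cust_name)
VALUES (:cust_id, :cust_name);

.IMPORT INFILE $INPUT_FILE
        FORMAT VARTEXT '|'
        LAYOUT cust_layout
        APPLY ins_cust;

.END MLOAD;
.LOGOFF;
EOF

echo "Generated $SCRIPT"
# In a real environment: mload < "$SCRIPT" > /tmp/load_customer.log 2>&1
```

Because the heredoc is unquoted, `$TARGET_TABLE` and `$INPUT_FILE` expand at generation time, so the same wrapper can be parameterized per table from a scheduler.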
Environment: Informatica Power Center 8.6/8.1, Oracle 11g/10g, QlikView 11, Teradata 11.0/12.0, Autosys, PL/SQL, SQL*Loader, UNIX, Erwin.
Confidential - Chicago, Illinois
ETL Developer
Responsibilities:
- Created Technical Specification documents based on high-level requirement documents.
- Reviewed technical specification documents with the functional owners.
- Built a data movement process that loads data from DB2 into Teradata by developing Korn shell scripts using Teradata SQL and utilities.
- Worked with Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad) and improved the design of the BTEQ and MultiLoad scripts.
- Extracted, transformed, and loaded data from different sources such as flat files and SQL Server.
- Created and designed mappings and mapplets using Expression, Filter, Router, Joiner, Lookup, Update Strategy, Stored Procedure, Union, and other transformations.
- Worked on XML transformations to send input to Permits as per specifications.
- Reworked any discrepancies in the flat file extracts.
- Moved code between the development and system testing environments.
- Fixed bugs raised during System and Integration testing.
- Performed audit and reconciliation of the data during SIT.
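The DB2-to-Teradata movement described above typically pairs a DB2 export with a generated FastLoad script. A sketch under stated assumptions (every object, file, and logon name is a placeholder, and the DB2 export is simulated with a sample file):

```shell
#!/bin/sh
# Sketch of a DB2-to-Teradata movement step: a DB2 table is exported to a
# delimited file, and a FastLoad script is generated to load it into
# Teradata. All names and credentials are hypothetical placeholders.

EXPORT_FILE=/tmp/db2_accounts.del
FLOAD_SCRIPT=/tmp/load_accounts.fl

# In a real environment the extract would come from DB2, e.g.:
# db2 "EXPORT TO $EXPORT_FILE OF DEL SELECT acct_id, acct_name FROM accounts"
printf '9001|Checking\n9002|Savings\n' > "$EXPORT_FILE"   # sample stand-in

cat > "$FLOAD_SCRIPT" <<EOF
.LOGON tdprod/etl_user,etl_password;

DROP TABLE stage_db.accounts_err1;
DROP TABLE stage_db.accounts_err2;

SET RECORD VARTEXT '|';

DEFINE acct_id   (VARCHAR(10)),
       acct_name (VARCHAR(50))
FILE = $EXPORT_FILE;

BEGIN LOADING stage_db.accounts_stg
      ERRORFILES stage_db.accounts_err1, stage_db.accounts_err2;

INSERT INTO stage_db.accounts_stg (acct_id, acct_name)
VALUES (:acct_id, :acct_name);

END LOADING;
.LOGOFF;
EOF

echo "Generated $FLOAD_SCRIPT"
# In a real environment: fastload < "$FLOAD_SCRIPT" > /tmp/load_accounts.log 2>&1
```

Dropping the error tables up front lets the job rerun cleanly after a failed load attempt.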
Environment: Informatica Power Center 9.1/8.6/8.1, Oracle 11g/10g, Teradata 13.0/12.0, DB2 UDB, PL/SQL, SQL*Loader, UNIX, Erwin.
Confidential
ETL Developer
Responsibilities:
- Created the ETL design documentation, Mapping document, Migration document, Test cases.
- Extracted data from source systems, applied complex transformations, and loaded it into the target tables.
- Created Technical Specification documents based on high level requirement documents.
- Extensively worked in the performance tuning of programs, ETL procedures and processes.
- Performed error checking and testing of ETL procedures and programs using Informatica session logs.
- Worked with Teradata utilities: FastLoad, BTEQ, and MultiLoad.
- Performance-tuned Informatica targets, sources, mappings, and sessions for large data files by increasing data cache size, sequence buffer length, and target-based commit interval.
- Reviewed Technical specification documents with the functional owners.
- Developed parallel jobs using technical specification documents.
- Tested the jobs and data in Oracle.
- Reworked any discrepancies in the flat file extracts.
- Fixed bugs raised during System and Integration testing.
- Created Sequence jobs and scheduled them.
- Performed audit and reconciliation of the data during SIT.
Environment: Informatica Power Center 9.1/8.6/8.1, Oracle 11g/10g, Teradata 10, PL/SQL, SQL*Loader, UNIX, Erwin.
Confidential
ETL Consultant
Responsibilities:
- Involved in all phases of the SDLC from requirements, design, development, testing, and training through rollout to field users and warranty support for the production environment.
- Involved in preparing plans and effort estimations required to execute the project.
- Designed and built Informatica solutions, applying PDO (Pushdown Optimization) where required.
- Designed and built Teradata SQL, TPT, BTEQ, and UNIX shell scripts.
- Performance tuning of the data warehouse database (Teradata) and data warehouse operations.
- Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
- Reviewed Technical specification documents with the functional owners.
- Developed parallel jobs using technical specification documents.
- Tested the jobs and data in Oracle.
- Reworked any discrepancies in the flat file extracts.
- Fixed bugs raised during System and Integration testing.
- Created Sequence jobs and scheduled them.
- Performed audit and reconciliation of the data during SIT.
Environment: Informatica Power Center 9.1/8.6/8.1, Oracle 11g/10g, Teradata 10, PL/SQL, SQL*Loader, UNIX, Erwin.