
Sr. ETL Developer Resume Profile


PROFESSIONAL SUMMARY

  • 6 years of professional experience in Requirement Gathering/Analysis, Design, Development, Implementation, and Testing of Data Warehouses, Data Marts, and Decision Support Systems (DSS) using Informatica Power Center with Oracle, MS SQL Server, DB2, and Teradata databases.
  • 5 years of strong data warehousing experience using Informatica Power Mart 6.1/5.1/4.7, Power Center 9.5.0/9.1.0/8.6.1/8.1/7.1.3/7.0/6.2/5.1, Power Exchange, and Power Connect as ETL tools, plus IDE (Informatica Data Explorer) and IDQ (Informatica Data Quality).
  • 3 years of experience working with Teradata SQL Assistant, Teradata Administrator, PMON, and utilities such as BTEQ, FastLoad, MultiLoad, XML import, FastExport, TPump, and TPT. Exposure to TPump on UNIX/Windows/Mainframe environments and to running batch processes for Teradata CRM.
  • Extensive work experience in ETL processes consisting of data sourcing, data transformation, mapping and loading of data from multiple source systems into Data Warehouse using Informatica Power Center.
  • Extensively used Informatica client tools: Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Repository Manager, Workflow Manager, and Workflow Monitor.
  • Good exposure to the Hadoop open-source software framework.
  • Data modeling experience using Dimensional Data Modeling, Star Schema, Snowflake Schema, Fact and Dimension Tables, and Physical and Logical Data Modeling using Erwin 3.x/4.x.
  • Extensive experience in Client/Server technology area with Oracle Database, Teradata, SQL Server, DB2 and PL/SQL for the back end development of Packages, Stored Procedures, Functions and Triggers.
  • Experience in all phases of the Data warehouse life cycle involving Analysis, Design, Development and Testing of Data warehouses using ETL Logic.
  • Expert in designing Star Schema and well versed with UNIX shell wrappers, KSH and Oracle PL/SQL programming.
  • Involved in the complete Software Development Life Cycle (SDLC) of Data Warehousing and Decision Support Systems.
  • Worked extensively on transformations such as Lookup, Aggregator, Update Strategy, Stored Procedure, Sequence Generator, and Joiner.
  • Proficient in server-side programming: stored procedures, stored functions, database triggers, and packages using PL/SQL and SQL.
  • Experience in Data Extraction, Data Migration, Data Integration, Data Testing, Data Modeling and Data Warehousing using Informatica.
  • Developed scripts for data cleansing, data validation, data transformation for the data coming from different source systems.

PROFESSIONAL EXPERIENCE

Confidential

Sr ETL/Informatica Developer

Responsibilities:

  • Met with business/user groups to understand the business process, gather requirements, and carry the work through analysis, design, development, and implementation according to client requirements.
  • Extensively worked on data extraction, transformation, and loading from source to target systems using Informatica Power Center and Teradata utilities.
  • Designed and developed a number of complex mappings using various transformations such as Source Qualifier, Aggregator, Router, Joiner, Union, Expression, Lookup (connected and unconnected), Filter, Update Strategy, Stored Procedure, Sequence Generator, etc.
  • Performed Teradata and Informatica performance tuning
  • Worked with Workflow Manager to create various tasks such as Worklets, Sessions, Batches, Event Wait, E-mail notification, and Decision tasks, and to schedule jobs.
  • Extensively used Slowly Changing Dimension Type II logic in various data mappings to load dimension tables in the Data Warehouse (a SQL sketch of the expire-and-insert pattern follows this list).
  • Developed BTEQ Import, BTEQ Export, FastLoad, MultiLoad, FastExport scripts and shell scripts to move data from source systems to staging and from staging to Data warehouse in batch processing mode.
  • Extensively created and used Teradata SET tables, MULTISET tables, global temporary tables, and volatile tables (see the DDL sketch after this list).
  • Extensively involved in data transformations, validations, extraction, and loading processes. Implemented Teradata join types such as inner join, outer join, and self join, and join strategies such as merge join, nested join, and row-hash join.
  • Worked on Error handling and performance tuning in Teradata queries and utilities.
  • Implemented update strategies, incremental loads, Data capture and Incremental Aggregation.
  • Extensively worked on various Look up Caches like Static, Dynamic, Persistent, and Shared Caches.
  • Worked extensively with flat files carrying data from various legacy systems.
  • Involved in analyzing source systems and designing the processes for extracting, transforming, and loading the data into the Teradata database.
  • Developed mapping parameters and variables to support SQL overrides.
  • Wrote shell scripts to perform pre-session and post-session operations
  • Created debug sessions to validate the transformations and extensively ran existing mappings in debug mode for error identification, setting breakpoints and monitoring the Debugger.
  • Responsible for tuning report queries and ad hoc queries.
  • Exported data from Teradata database using BTEQ Export and FastExport.
  • Transferred files across various platforms using the secure FTP protocol.
  • Extensively worked on performance tuning to increase data-load throughput, for example reading from the source flat file and writing to a target flat file to isolate bottlenecks.
  • Responsible for managing, scheduling and monitoring the workflow sessions
  • Responsible for performance tuning for several ETL mappings, Mapplets, workflow session executions
  • Involved in conceptual, logical and physical data modeling and used star schema in designing the data warehouse
  • Responsible for Error handling, bug fixing, Session monitoring, log analysis
  • Involved in peer-to-peer reviews.
  • Involved in creating unit test plans and testing the data for various applications.
  • Involved in 24x7 production support
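
A minimal Teradata SQL sketch of the SCD Type II expire-and-insert pattern referenced in the list above. The schema, table, and column names (EDW.CUSTOMER_DIM, STG.CUSTOMER_STG, and so on) are hypothetical placeholders rather than the actual project objects, and the surrogate key is assumed to be generated elsewhere (for example by an identity column or an Informatica Sequence Generator):

    /* Step 1: expire the current dimension row when a tracked attribute has changed. */
    UPDATE EDW.CUSTOMER_DIM
    SET EFF_END_DT   = CURRENT_DATE - 1,
        CURRENT_FLAG = 'N'
    WHERE CURRENT_FLAG = 'Y'
      AND CUSTOMER_ID IN (
          SELECT S.CUSTOMER_ID
          FROM STG.CUSTOMER_STG S
          JOIN EDW.CUSTOMER_DIM D
            ON D.CUSTOMER_ID = S.CUSTOMER_ID
           AND D.CURRENT_FLAG = 'Y'
          WHERE COALESCE(D.CUSTOMER_NAME, '')    <> COALESCE(S.CUSTOMER_NAME, '')
             OR COALESCE(D.CUSTOMER_SEGMENT, '') <> COALESCE(S.CUSTOMER_SEGMENT, '')
      );

    /* Step 2: insert a new current version for changed customers and for brand-new customers. */
    INSERT INTO EDW.CUSTOMER_DIM
        (CUSTOMER_ID, CUSTOMER_NAME, CUSTOMER_SEGMENT, EFF_START_DT, EFF_END_DT, CURRENT_FLAG)
    SELECT S.CUSTOMER_ID, S.CUSTOMER_NAME, S.CUSTOMER_SEGMENT,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM STG.CUSTOMER_STG S
    LEFT JOIN EDW.CUSTOMER_DIM D
      ON D.CUSTOMER_ID = S.CUSTOMER_ID
     AND D.CURRENT_FLAG = 'Y'
    WHERE D.CUSTOMER_ID IS NULL;  /* no current row left: a new customer, or the row expired in step 1 */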
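
The Teradata table types named in the list above differ mainly in duplicate-row handling and lifetime. A brief DDL sketch, with illustrative object names only:

    /* SET table: rejects exact duplicate rows (the default in Teradata-mode sessions). */
    CREATE SET TABLE EDW.ORDER_FACT_SET
        (ORDER_ID INTEGER, ORDER_DT DATE, ORDER_AMT DECIMAL(18,2))
    PRIMARY INDEX (ORDER_ID);

    /* MULTISET table: allows duplicate rows, typical for staging loads. */
    CREATE MULTISET TABLE STG.ORDER_STG
        (ORDER_ID INTEGER, ORDER_DT DATE, ORDER_AMT DECIMAL(18,2))
    PRIMARY INDEX (ORDER_ID);

    /* Volatile table: lives only for the session; useful for intermediate results. */
    CREATE VOLATILE TABLE VT_DAILY_ORDERS AS (
        SELECT ORDER_ID, ORDER_DT, ORDER_AMT
        FROM STG.ORDER_STG
        WHERE ORDER_DT = CURRENT_DATE
    ) WITH DATA
    PRIMARY INDEX (ORDER_ID)
    ON COMMIT PRESERVE ROWS;

    /* Global temporary table: definition is permanent, contents are private to each session. */
    CREATE GLOBAL TEMPORARY TABLE EDW.GT_ORDER_WORK
        (ORDER_ID INTEGER, ORDER_AMT DECIMAL(18,2))
    ON COMMIT PRESERVE ROWS;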

Environment: Informatica Power Center 9.5.1/9.1.0 (Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager), Teradata 14.0/12.0, Teradata utilities (BTEQ, MultiLoad, FastLoad, FastExport, TPump, TPT), OBIEE, Cognos, SAS, Oracle 11g, MS SQL Server 2008, Teradata SQL Assistant 14.10, flat files, TOAD 10.5, Autosys, UNIX, Korn shell scripting

Confidential

ETL/ Informatica Developer

Responsibilities:

  • Met with business/user groups to understand the business process and gather requirements. Extracted and analyzed sample data from operational (OLTP) systems to validate the user requirements. Created high-level and detail-level design documents.
  • Participated in data model discussions with Data Modelers for creating both logical and physical data models.
  • Created Mapping Documents and ETL design documents for both Presentation Layer and Standard Layer. Followed Enterprise Level Metadata while creating design and source-to-target mapping documents.
  • Designed and developed a number of complex mappings using various transformations such as Source Qualifier, Aggregator, Router, Joiner, Union, Expression, Lookup (connected and unconnected), Filter, Update Strategy, Stored Procedure, and Sequence Generator, and used reusable transformations as well as mapplets.
  • Scheduling the Informatica sessions to automate the loading process.
  • Optimizing the Informatica sessions, SQL queries using TOAD.
  • Running SQL scripts from TOAD and creating Oracle objects such as tables, views, indexes, sequences, and synonyms.
  • Extensively used the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and Teradata Parallel Transporter (TPT), along with DDL and DML commands.
  • Created FastLoad, FastExport, MultiLoad, TPump, and BTEQ scripts to load data from the Oracle database and flat files into the primary data warehouse.
  • Created several BTEQ scripts involving derived tables and volatile/global temporary tables to extract data for several business users on a scheduled basis (a minimal BTEQ sketch follows this list).
  • Extensively created Teradata macros and stored procedures for repeated use across various applications (see the macro sketch after this list).
  • Used Informatica Data Quality (IDQ) to analyze, standardize, and cleanse the source data.
  • Created Informatica Exception Handling Mapping for Data Quality, Data Cleansing and Data Validations
  • Worked efficiently with Teradata Parallel Transporter (TPT) scripts and used them to move data from one environment to another.
  • Created complex Oracle and SQL reports, as well as Oracle triggers, procedures, and functions in PL/SQL.
  • Created packages used throughout the application and developed stored procedures to implement interface programs for data extraction, transformation and loading from and into Oracle database and Teradata.
  • Used ODI to handle non-Oracle targets as well as sources such as XML and flat files.
  • Authored the Unit testing and Code Review standard documents.
  • Extensive use of UNIX Scripts to build Interfaces between various Data Warehouse systems across the company.
  • Created connected and unconnected Lookup transformations to look up data from source and target tables.
  • Worked with Workflow Manager to create various tasks such as Worklets, Sessions, Batches, Event Wait, E-mail notification, and Decision tasks, and to schedule jobs.
  • Extensively used the Slowly Changing Dimensions-Type II in various data mappings to load dimension tables in Data warehouse.
  • Extensively worked on performance tuning to increase data-load throughput, for example reading from the source flat file and writing to a target flat file to isolate bottlenecks.
  • Scheduled Sessions and Batches on the Informatica Server using Informatica workflow Manager.
  • Created complex PL/SQL stored procedures and functions.
  • Developed and implemented the UNIX shell script for the start and stop procedures of the sessions
  • Monitored the ETL jobs and fixed bugs.
  • Developed, executed Test Plans and Test Cases to validate and check Referential integrity of data extracts before loading it to Data Warehouse.
  • Involved in 24x7 production support.
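
A minimal sketch of one of the scheduled BTEQ extracts described in the list above. The logon string, database, table, and file path are placeholders, and the real scripts were more involved:

    .LOGON tdprod/etl_user,********

    /* Hold the intermediate result in a session-local volatile table. */
    CREATE VOLATILE TABLE VT_ACCT_BAL AS (
        SELECT ACCT_ID, BAL_DT, SUM(BAL_AMT) AS TOT_BAL
        FROM EDW.ACCT_BALANCE
        WHERE BAL_DT = CURRENT_DATE - 1
        GROUP BY ACCT_ID, BAL_DT
    ) WITH DATA
    ON COMMIT PRESERVE ROWS;

    .EXPORT REPORT FILE = /data/extracts/acct_bal_daily.txt
    /* Pipe-delimited extract for the business users. */
    SELECT TRIM(CAST(ACCT_ID AS VARCHAR(20))) || '|' ||
           CAST(BAL_DT AS VARCHAR(10)) || '|' ||
           TRIM(CAST(TOT_BAL AS VARCHAR(25))) (TITLE '')
    FROM VT_ACCT_BAL
    ORDER BY ACCT_ID;
    .EXPORT RESET

    .LOGOFF
    .QUIT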
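
And a minimal sketch of a parameterized Teradata macro of the kind mentioned above, again with hypothetical object names:

    /* Define a reusable, parameterized macro. */
    CREATE MACRO EDW.DAILY_SALES_BY_STORE (RUN_DT DATE) AS (
        SELECT STORE_ID,
               SUM(SALE_AMT) AS TOTAL_SALES
        FROM EDW.SALES_FACT
        WHERE SALE_DT = :RUN_DT
        GROUP BY STORE_ID;
    );

    /* Execute it for a given business date. */
    EXEC EDW.DAILY_SALES_BY_STORE (DATE '2013-06-30');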

Environment: Informatica Power Center 9.1.0 (Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager), Power Exchange, Oracle 11g, TOAD 9.5, Teradata V12.0, Teradata utilities (BTEQ, MultiLoad, FastLoad, FastExport, TPump, TPT), MicroStrategy 9.3.1 (Administrator, Architect, Desktop, Narrowcast Server, Object Manager, Enterprise Manager), DB2, Teradata SQL Assistant, flat files, SAS, Control M, UNIX, Korn shell scripting

Confidential

ETL/ Informatica Developer

Responsibilities:

  • Managed the entire ETL process involving the access, manipulation, analysis, interpretation and presentation of information from both internal and secondary data sources.
  • Developed complex mappings using Informatica PowerCenter Designer to transform and load the data from various source systems like Oracle and Sybase into the Teradata target database.
  • Analyzed and understood all data in the source databases and designed the overall data architecture and all the individual data marts in the data warehouse.
  • Involved in the development and testing of individual data marts, Informatica mappings and update processes.
  • Identified and tracked slowly changing dimensions (Type 1 and Type 2) and heterogeneous sources, and determined the hierarchies in dimensions.
  • Design, development, and documentation of the ETL (Extract, Transform, Load) strategy to populate the Data Warehouse from the various source systems.
  • Worked on Informatica 9.1.0 client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.
  • Based on the requirements, used various transformations like Source Qualifier, Normalizer, Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator, Joiner and Sequence Generator in the mapping.
  • Developed Informatica SCD type-I, Type-II mappings. Extensively used almost all of the transformations of Informatica including complex lookups, Stored Procedures, Update Strategy, Mapplets and others.
  • Implemented update strategies, incremental loads, Data capture and Incremental Aggregation.
  • Extensively worked on various Look up Caches like Static, Dynamic, Persistent, and Shared Caches.
  • Developed workflow tasks like Email, Event wait, Event Raise, Timer, Command and Decision.
  • Created stored procedures in PL/SQL (a minimal sketch follows this list).
  • Involved in 24x7 production support.
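
A minimal sketch of a PL/SQL stored procedure of the kind mentioned in the list above; the schema, table, and column names are placeholders rather than the actual project objects:

    CREATE OR REPLACE PROCEDURE load_product_dim (p_load_dt IN DATE) AS
    BEGIN
        /* Insert only products not already present in the dimension (placeholder logic). */
        INSERT INTO dw.product_dim (product_id, product_name, load_dt)
        SELECT s.product_id, s.product_name, p_load_dt
        FROM   stg.product_stg s
        WHERE  NOT EXISTS (SELECT 1
                           FROM   dw.product_dim d
                           WHERE  d.product_id = s.product_id);
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            RAISE;  -- surface the error to the calling ETL session
    END load_product_dim;
    /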

Environment: Informatica Power Center 9.1.0 (Workflow Manager, Workflow Monitor), Oracle 10g, Teradata 14.0/12.0, Temporal, Teradata utilities (MultiLoad, FastLoad, FastExport, BTEQ, TPump), SAS, Sybase, Teradata Aprimo, Trillium, DB2, Teradata SQL Assistant, SQL Server, SAP, flat files, TOAD 9.x, SQL, Erwin, Control M, UNIX, Korn shell scripting

Confidential

ETL/ Informatica Developer

Responsibilities:

  • Developed ETL mappings and transformations using Informatica Power Center.
  • Extensively used Informatica tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Used advanced techniques of Informatica to load data from Teradata into the target flat file.
  • Used the Teradata utilities FastLoad and MultiLoad to load the data into the Informatica layer.
  • Used Teradata Fast Export connection in Session properties.
  • Designed reusable transformations and mapplets.
  • Worked extensively with flat files carrying data from various legacy systems.
  • Involved in analyzing source systems and designing the processes for extracting, transforming, and loading the data into the Teradata database.
  • Created and ran workflows using Workflow Manager to load the data into the Target Database.
  • Developed mapping parameters and variables to support SQL overrides (see the sketch after this list).
  • Wrote shell scripts to perform pre-session and post-session operations, and used shell scripts to check the flat files before loading the data into the QlikView server.
  • Created workflows in the Workflow Designer, executed tasks such as Sessions and Commands, and monitored transformation processes using the Workflow Monitor.
  • Created debug sessions to validate the transformations and extensively ran existing mappings in debug mode for error identification, setting breakpoints and monitoring the Debugger.
  • Used Teradata as the Database and loaded the data for Analysis.
  • Tested the data and data integrity among sources and targets using Unit Testing.
  • Involved in 24x7 production support.
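
A sketch of the mapping-parameter-driven SQL override referenced in the list above. $$LOAD_DT is a hypothetical mapping parameter whose value would come from the session parameter file, and the table and columns are placeholders; PowerCenter substitutes the parameter value before the query is sent to the database, so each batch run can pull a single business date without editing the mapping:

    /* Source Qualifier SQL override (illustrative only).
       The session parameter file would carry a line such as:  $$LOAD_DT=2010-06-30 */
    SELECT ORDER_ID,
           CUSTOMER_ID,
           ORDER_AMT,
           ORDER_DT
    FROM   STG.ORDER_STG
    WHERE  ORDER_DT = DATE '$$LOAD_DT'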

Environment: Informatica Power Center 8.6.0 (Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager), Teradata SQL Assistant, UNIX, Windows XP.
