
Sr. ETL Developer Resume


Columbus, OH

PROFESSIONAL SUMMARY:

  • Over 8.6 years of experience in the analysis, design, and development of enterprise-level data warehouses using Informatica. Knowledgeable and experienced in current data warehousing concepts and methodologies. Experience in the complete Software Development Life Cycle (SDLC): requirement analysis, design, development, and testing.
  • Extensive use of SQL for database development on MS SQL Server, Oracle, Teradata, DB2, and Netezza. Extensive experience in extraction, transformation and loading of data directly from different heterogeneous source systems like Flat files (Fixed width & Delimited), COBOL files, VSAM, IBM DB2, Excel, Oracle, MS SQL Server, Teradata and Netezza.
  • Experienced in loader utilities including SQL*Loader, FastLoad, and MultiLoad. Good knowledge of data migration from Oracle to Teradata RDBMS. Experienced in PL/SQL scripting with extensive use of bulk-load concepts. Experienced in UNIX shell scripting and the vi editor.
  • Able to document warehousing designs and clearly communicate alternatives to the client and the business. Designed complex mappings, with expertise in performance tuning and experience working on slowly changing dimension tables and fact tables.
  • Knowledge of different schemas (star and snowflake) to fit reporting, query, and business analysis requirements. Understands data warehouse Change Data Capture (CDC) processing and its challenges. Experienced in data modeling using both third normal form and dimensional techniques.
  • Expertise in UNIX shell scripting and automation of ETL processes using IBM Tivoli (Maestro). Experience in PL/SQL programming, including writing stored procedures and functions.
  • Knowledge of Informatica pushdown optimization. Experience in business analysis, developing plans and processes to accurately solicit and capture project requirements and goals. Designed Informatica mapping technical specifications based on functional requirements.
  • Experience working closely with project management to ensure timely delivery of solutions; learns business concepts quickly and relates them to specific project needs.
  • Designed, developed, and implemented data warehouse applications using Informatica PowerCenter and Teradata BTEQ scripts.
  • Worked as a data modeler creating logical and physical database designs. Involved in reviewing changes to the data model and in analysis of the enterprise data model.

EDUCATION:

  • Bachelors in Computer Science.

TECHNICAL SKILLS:

Operating Systems : UNIX, MS Windows 2000/98/95/XP, MVS OS/390/AS400

Languages : SQL, PL/SQL, BTEQ, COBOL, Shell, PERL

Databases : Oracle, DB2, Teradata, Netezza, MS SQL Server

Tools : Informatica PowerCenter 8.x/7.x, Informatica PowerExchange 8.x (Workflow Manager, Workflow Monitor, Repository Manager), SAS, MicroStrategy, Cognos, Visio, Erwin, Mercury Quality Center, IBM ClearQuest

Related Skills : CDC, Data Modeling (Dimensional, 3rd Normal Form), MS Office, FTP, NDM (Network Data Mover)

Files : VSAM, COBOL, Sequential and XML Files

PROFESSIONAL EXPERIENCE:

Confidential, Columbus, OH Dec 08 – Present
Sr. ETL Developer

Project:
Galaxy Data Warehouse will support the Nationwide Company’s Property & Casualty Insurance Operations (PCIO) by becoming Nationwide PCIO’s “authority source” for information and reporting within the organization. Nationwide will use this customer-centric foundation of data as a competitive advantage by leveraging the cross-functional relationships and opportunities inherent within the Auto and Commercial Insurance data. The project involves the analysis, development, and implementation effort required to integrate over 850 data elements and measurements into a common set of Teradata data tables.

Responsibilities:

  • The project involved the analysis, development and implementation effort required to integrate data elements and measurements into a common set of data tables.
  • Modified existing and developed new complex Informatica PowerCenter mappings to extract data according to the guidelines provided by the business users and populate it into the target systems.
  • Captured business rules for logical and physical modeling in metadata.
  • Extensive experience with ETL of data from disparate sources including Oracle, Teradata, and Netezza.
  • Designed and developed Informatica mappings enabling the extraction, transformation, and loading of data into target tables.
  • Assisted the business in the data analysis effort to resolve issues related to source data (Personal and Commercial) on IBM mainframe
  • Developed detailed ETL specifications based on business requirements
  • Designed and coded the ETL extraction strategy to extract data from the mainframe using COBOL.
  • Built Informatica mappings and Proof of Concept (POCs) for each phase of the ETL process and helped other ETL Developers in reusing the same for the other tables in the source data
  • Documented the mappings and sessions involved in the ETL process; developed and implemented error-handling strategies.
  • Developed Teradata SQL, including BTEQ scripts, and used Teradata utilities such as FastLoad, MultiLoad, and FastExport (see the sketch after this list).
  • Tested the ETL application following all the ETL standards and architecture
  • Assisted in the development, execution and documentation of system and integration test plans; updated and maintained metadata
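
Below is a minimal sketch of the kind of shell-wrapped Teradata FastLoad step referenced in the bullet on Teradata utilities; the TDPID, credentials, table, column, and file names are placeholders, not actual project values.

    #!/bin/ksh
    # Hypothetical FastLoad wrapper: bulk-loads a pipe-delimited extract into an
    # empty Teradata staging table (FastLoad requires the target to be empty).
    LOGFILE=/tmp/fload_policy_stg.log

    fastload > "$LOGFILE" 2>&1 <<'EOF'
    LOGON tdprod/etl_user,etl_password;
    DATABASE edw_stage;
    BEGIN LOADING edw_stage.policy_stg
        ERRORFILES edw_stage.policy_stg_e1, edw_stage.policy_stg_e2
        CHECKPOINT 100000;
    SET RECORD VARTEXT "|";
    DEFINE
        policy_nbr  (VARCHAR(20)),
        issue_dt    (VARCHAR(10)),
        premium_amt (VARCHAR(18))
    FILE = /data/in/policy.dat;
    INSERT INTO edw_stage.policy_stg (policy_nbr, issue_dt, premium_amt)
    VALUES (:policy_nbr, :issue_dt, :premium_amt);
    END LOADING;
    LOGOFF;
    EOF

    rc=$?
    if [ $rc -ne 0 ]; then
        echo "FastLoad failed with return code $rc" >&2
        exit $rc
    fi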

Environment: Informatica PowerCenter 7.1.4/8.6, Teradata, Oracle 10g, Netezza, MicroStrategy, MVS, AS400, COBOL, VSAM, JCL, FILEAID, CHANGEMAN, VISIO, XPEDITER, UNIX, SAR.

Confidential, Boston, MA Sep 08 – Nov 08
Sr. Informatica Developer

Harvard University is the oldest institution of higher learning in the United States. It is also the first and oldest corporation in North America. Harvard has its own data warehouse, which enables authorized users to have online access to management and reporting data; it also provides an improved information reporting and query environment for financial and human resource systems and Alumni Services.

Responsibilities:

  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Worked on Informatica PowerCenter tools - Source Analyzer, Target Designer, Mapping & Mapplet Designer, and Transformation Developer.
  • Used Transformations like Lookup, Joiner, Rank and Source Qualifier Transformations in the Informatica Designer.
  • Designed the mechanism to capture changes in slowly changing dimension tables.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Stored Procedure, dynamic Lookup, and Router transformations to populate target Oracle tables in an efficient manner.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Tuned Informatica mappings and sessions for optimum performance.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Used shell scripts to automate the execution of maps (see the sketch after this list). Managed change control implementation and coordinated daily and monthly releases and reruns.
  • Created and maintained several custom reports for the client using Business Objects.
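
A minimal sketch of the kind of shell wrapper used to automate map execution with pmcmd (PowerCenter 8.x syntax); the integration service, domain, folder, and workflow names are hypothetical.

    #!/bin/ksh
    # Hypothetical pmcmd wrapper for scheduled/automated workflow runs.
    # Service, domain, folder, and workflow names are placeholders.
    INFA_USER=etl_ops
    : "${INFA_PWD:?INFA_PWD must be set in the environment}"

    pmcmd startworkflow \
        -sv IS_DW_PROD -d Domain_DW \
        -u "$INFA_USER" -p "$INFA_PWD" \
        -f FIN_DW -wait wf_daily_gl_load

    rc=$?
    if [ $rc -ne 0 ]; then
        echo "wf_daily_gl_load failed (pmcmd return code $rc)" >&2
        exit $rc
    fi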

Environment: Informatica Power Center 8.6, PL/SQL, Oracle 10g/9i, Erwin 7.1/4.0, SQL, Business Objects 6.5/XIR2, UNIX, TOAD, PVCS

Confidential, MI Oct 07 – Aug 08
Sr. ETL Developer/Analyst

T-Systems is a service provider of information and communications technology (ICT) for the automobile and logistics sectors of Volkswagen. The tenure at TSNA included working on ‘Project SAGA’, which involved consolidating the collections department data that was spread across the country. The data provided the business with customized reports on the performance of departments in various locations.

Responsibilities:

  • Analyzed the complete Data warehouse ETLs and worked on improving performance.
  • Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
  • Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
  • Involved in Performance tuning for better performance.
  • Participated in weekly end-user meetings to discuss data quality, performance issues and ways to improve data accuracy and new requirements, etc.
  • Involved in Unit, System integration, User Acceptance Testing of Mappings
  • Wrote complex SQL override scripts at the Source Qualifier level to avoid Informatica Joiner and Lookup transformations and improve performance, as the data volume was heavy.
  • Responsible for creating workflows. Created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.
  • Extensively used ETL processes to load data from various source systems such as Oracle and flat files into the target Oracle and Teradata systems, applying business logic in the transformation mappings to insert and update records during the load.
  • Developed UNIX shell scripts to invoke the Informatica mappings and BTEQ Teradata procedures as part of the ETL control and email notifications (see the sketch after this list).
  • Supported the process steps under development, test and production environment
  • Created high-level Technical Design Document and Unit Test Plans.
  • Documented Informatica mappings, design and validation rules.
  • Worked on Business Objects XIR2 suite using Desktop Intelligence, CMC, Web Intelligence reporting and Publisher.
  • Created universes (business views of the database); retrieved data using universes, personal data files, stored procedures, and free-hand SQL; and created complex, ad hoc reports using Business Objects and Web Intelligence tools.
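
A minimal sketch of the ETL-control pattern described above: a shell script that runs a BTEQ procedure file and sends an email notification on failure. The TDPID, file paths, and mail address are placeholders.

    #!/bin/ksh
    # Hypothetical ETL control step: run a BTEQ script, capture the log, and
    # notify support by email if it fails. Names and paths are illustrative.
    LOG=/apps/etl/logs/load_collections_sum.log
    : "${TD_PWD:?TD_PWD must be set in the environment}"

    bteq > "$LOG" 2>&1 <<EOF
    .LOGON tdprod/etl_user,${TD_PWD}
    .RUN FILE = /apps/etl/sql/load_collections_sum.btq
    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0
    EOF

    rc=$?
    if [ $rc -ne 0 ]; then
        mailx -s "Collections summary load failed (rc=$rc)" etl-support@example.com < "$LOG"
        exit $rc
    fi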

Environment: Informatica Power Center 8.5/8.1.4/7.1.4, Oracle 10g/9i/8i, Teradata V2R6, Business Objects 6.5/XIR2, SQL, PL/SQL, TOAD, Perl, UNIX, ClearCase

Confidential, MI Feb 06 – Sep 07
Sr. ETL Developer

AAA is a major insurance company offering different kinds of insurance. The primary objective of the project was to build a data warehousing system around customer data. This process involved gathering the source data globally from its group of companies (in various proprietary formats such as RDBMS, flat files, etc.). The data was extracted, transformed, and loaded into the target database using different mappings and transformations. The EDW warehouse was developed to capture data from different systems and load it into the staging area, ODS, and data mart for analysis and reporting purposes.

Responsibilities:

  • Created sessions and workflows to load data from the IBM DB2 UDB 8 databases hosted on an HP-UX 11i RISC server.
  • Converted VSAM files using Informatica PowerExchange.
  • Developed different mapping logic using various transformations to extract data from different sources like flat files, IBM MQ series, Oracle.
  • Created UNIX shell scripts, JCL mainframe procedures, and processes to extract data from various sources such as Oracle.
  • Involved in developing design documents and mapping documents.
  • Modified the existing ETL code (Mappings, Sessions and Workflows) and the shell scripts as per the user requirements.
  • Resolved issues that caused production jobs to fail by analyzing the ETL code and the log files created by the failed jobs on the Informatica server.
  • Wrote complex SQL override scripts at the Source Qualifier level to avoid Informatica Joiner and Lookup transformations and improve performance, as the data volume was heavy.
  • Used Data clean for cleansing the source data coming from heterogeneous sources before loading into the targets.
  • Scheduled ETL jobs using Autosys (see the sketch after this list).
  • Used the Erwin tool for data model design.
  • Extensively involved in unit testing, integration testing and system testing of the mappings and writing Unit and System Test Plan.
  • Created high-level Technical Design Document.
  • Documented Informatica mappings, design and validation rules.
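
A minimal sketch of the kind of Autosys definition behind the scheduling bullet above, loaded through the jil utility; the job name, machine, owner, schedule, and paths are placeholders.

    #!/bin/ksh
    # Hypothetical Autosys job definition for a nightly EDW load, piped into jil.
    # All names, times, and paths below are illustrative only.
    jil <<'EOF'
    insert_job: edw_claims_daily_load   job_type: c
    command: /apps/etl/bin/run_wf_claims_daily.sh
    machine: etl_unix01
    owner: etladmin
    start_times: "02:00"
    days_of_week: all
    std_out_file: /apps/etl/logs/edw_claims_daily_load.out
    std_err_file: /apps/etl/logs/edw_claims_daily_load.err
    alarm_if_fail: 1
    EOF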

Environment: Informatica PowerCenter/PowerMart 7.1.1, Informatica PowerExchange 7.1.1, Autosys 4.5, HP-UX 11i, Oracle 10g/9i, PL/SQL, IBM DB2 UDB 8.5, MQ Series, VSAM files.

Confidential, WA Apr 04 – Jan 06
ETL Developer

An insurance data warehouse system was designed and developed using Informatica as the ETL tool and Oracle as the back end.

Responsibilities:

  • Worked on Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
  • Involved in designing the procedures for moving data from the various source systems into the data warehousing system. The data was standardized to store the various business units in tables.
  • Studied the existing PL/SQL-based data warehouse and worked on PL/SQL packages, stored procedures, triggers, and functions.
  • Used various transformations such as Aggregator, Expression, Lookup, Rank, Update Strategy, Stored Procedure, and Sequence Generator.
  • Extensively used tools such as FastLoad, FastExport, MultiLoad, and BTEQ for transforming and loading data from various sources into the Teradata data marts.
  • Worked with Heterogeneous targets in Power Center 6.1/7.1
  • Worked with different types of partition techniques like key range, pass through, Round Robin and Hash partitioning
  • Knowledge of slowly changing dimension tables and fact tables.
  • Involved in Performance tuning for better performance.
  • Responsible for automation of ETL processes using Autosys.
  • Worked with various business owners, SMEs, and work groups to design and develop operational processes.
  • Extensively wrote complex Teradata and Oracle SQL queries for FastExport, UNIX scripts, SQL overrides, and unit-testing scenarios for the mappings.
  • Wrote pre-session and post-session shell scripts for dropping and creating table indexes, Email tasks, and various other applications (see the sketch after this list).
  • Created different types of reports, such as master/detail, cross-tab, and chart (for trend analysis), using Business Objects.
  • Involved in Unit, System integration, User Acceptance Testing of Mappings
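
A minimal sketch of a pre-session shell script of the kind mentioned above: it drops a fact-table index before a bulk load, and a matching post-session script recreates it afterwards. The schema, index name, and connection details are placeholders.

    #!/bin/ksh
    # Hypothetical pre-session command: drop an index so the bulk load runs
    # faster; the post-session script rebuilds it. Names are illustrative.
    sqlplus -s "${ORA_USER}/${ORA_PWD}@${ORA_TNS}" <<EOF
    WHENEVER SQLERROR EXIT FAILURE
    DROP INDEX dw.idx_sales_fct_cust_key;
    EXIT
    EOF

    if [ $? -ne 0 ]; then
        echo "Pre-session index drop failed" >&2
        exit 1
    fi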

Environment: Informatica PowerCenter/ Powermart 6.x/7.x, Oracle 9i/8i, DB2, Teradata v2r5/v2r6, SQL Server 2000, Perl, Shell Scripting, SQL, PL/SQL, TOAD, Business Objects, Cognos and Sun Solaris UNIX, Windows XP.

Confidential, NC Jul 03 – Mar 04
Sr. ETL Developer

Time Warner Cable, Inc. is the second-largest cable operator in the U.S. and an industry leader in developing and launching innovative video, data, and voice services. The enterprise data warehouse helped resolve performance issues and enabled the company to continuously learn and retain knowledge of the new services and products being offered. The TWC data warehouse was built on an Oracle database. The sources were SQL Server, Oracle, and flat files, and they were loaded into the Oracle DW using Informatica.

Responsibilities:

  • Worked on Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
  • Involved in designing the procedures for moving data from the various source systems into the data warehousing system. The data was standardized to store the various business units in tables.
  • Designed data model for sales data mart to load sales order and sales revenue.
  • Studied the existing PL/SQL-based data warehouse and worked on PL/SQL packages, stored procedures, triggers, and functions.
  • Used extraction, transformation, and loading processes to transfer the data to the target database.
  • Developed initial-load programs to migrate commercial databases from Oracle data marts to the Teradata warehouse, as well as an ETL framework to supply continuous engineering and manufacturing updates to the data warehouse (Oracle, Teradata, ODBC).
  • Used various transformations like Aggregator, Expression, Lookup, Rank, Update Strategy, Stored procedure and Sequence Generator.
  • Worked with Heterogeneous targets in Power Center 6.1/7.1
  • Worked with different types of partition techniques like key range, pass through, Round Robin and Hash partitioning
  • Knowledge of slowly changing dimension tables and fact tables
  • Involved in Performance tuning for better performance
  • Responsible for automation of ETL processes using Autosys.
  • Worked with various business owners, SMEs, and work groups to design and develop operational processes.
  • Responsible for creating workflows and worklets. Created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.
  • Wrote shell scripts to run the PL/SQL procedures and wrote the PL/SQL program units for extracting, transforming, and loading data (see the sketch after this list).
  • Wrote Pre-session and Post-session shell scripts for dropping, creating indexes for tables, Email tasks and various other applications.
  • Created different types of reports, such as master/detail, cross-tab, and chart (for trend analysis), using Business Objects.
  • Involved in Unit, System integration, User Acceptance Testing of Mappings
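
A minimal sketch of a shell script that runs one of the PL/SQL load procedures mentioned above; the package, procedure, parameter, and connection names are hypothetical.

    #!/bin/ksh
    # Hypothetical wrapper that executes a PL/SQL load procedure for a run date.
    # Package, procedure, and connection details are placeholders.
    RUN_DT=${1:?usage: run_sales_load.sh YYYYMMDD}

    sqlplus -s "${ORA_USER}/${ORA_PWD}@${ORA_TNS}" <<EOF
    WHENEVER SQLERROR EXIT SQL.SQLCODE
    SET SERVEROUTPUT ON
    EXEC dw_etl_pkg.load_sales_fact(p_run_date => TO_DATE('${RUN_DT}', 'YYYYMMDD'))
    EXIT
    EOF

    rc=$?
    if [ $rc -ne 0 ]; then
        echo "load_sales_fact failed (rc=$rc)" >&2
        exit $rc
    fi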

Environment: Informatica PowerCenter 7.x, Oracle 9i/8i, DB2, Teradata, Teradata SQL Assistant, Perl, Shell Scripting, SQL, PL/SQL, TOAD, Cognos, Sun Solaris UNIX, Windows XP

Confidential, MI Jan 02 to Jun 03
Data Warehousing Consultant

Pfizer is a research-based global pharmaceutical company, dedicated to developing new, safe medicines to prevent and treat the world's most serious diseases. The work involved gathering a wide variety of information, including data from the company's sales force automation system, historical transaction data, and external market and competitor information from third-party data providers. This data warehouse is used to deliver reports and information to sales and marketing management.

Responsibilities:

  • Involved in the upgrade of repositories from PowerCenter 5.1 to 6.1.
  • Worked on Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
  • Migrated Informatica mappings and sessions to production as per the specified instructions.
  • Involved in designing the procedures for moving data from the various source systems into the data warehousing system. The data was standardized to store the various business units in tables.
  • Studied the existing PL/SQL-based data warehouse and worked on PL/SQL packages, stored procedures, journals/shadow tables, triggers, and functions.
  • Used extraction, transformation, and loading processes to transfer the data to the target database.
  • Used various transformations like Aggregator, Expression, Lookup, Rank, Update Strategy, Stored procedure and Sequence Generator.
  • Worked with Heterogeneous targets in Power Center 6.1.
  • Worked with different types of partition techniques like key range, pass through, Round Robin and Hash partitioning
  • Knowledge of slowly changing dimension tables and fact tables.
  • Involved in Performance tuning for better performance
  • Creating and Running Sessions & Batches
  • Responsible for creating workflows and worklets. Created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.
  • Wrote the shell scripts to process the PL/SQL procedures and wrote the PL/SQL program units for data extracting, transforming and loading.
  • Wrote Perl and Korn shell scripts for extracting data from different sources, such as flat files and Oracle databases, and for transforming data.
  • Wrote Pre-session and Post-session shell scripts for dropping, creating indexes for tables, Email tasks and various other applications.
  • Responsible for automation of ETL processes using crontab (see the sketch after this list).
  • Created different types of reports, such as master/detail, cross-tab, and chart (for trend analysis), using Business Objects.
  • Involved in Unit, System integration, User Acceptance Testing of Mappings
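
A minimal sketch of the kind of crontab entries used for the automation bullet above; the scripts, schedules, and log paths are placeholders.

    # Hypothetical crontab entries (installed with crontab -e); illustrative only.
    # Nightly warehouse load, weekdays at 01:30
    30 1 * * 1-5 /apps/etl/bin/run_nightly_sales_load.sh >> /apps/etl/logs/nightly_sales_load.log 2>&1
    # Weekly refresh of third-party competitor data, Sundays at 03:00
    0 3 * * 0 /apps/etl/bin/run_weekly_competitor_refresh.sh >> /apps/etl/logs/weekly_competitor_refresh.log 2>&1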

Environment: Informatica PowerCenter/ Powermart 5.x/6.x, Informatica Power Plug/Metadata Exchange, Oracle 9i/8i, Teradata, Shell Scripting, SQL, PL/SQL, TOAD, Business Objects 5.1 and Sun Solaris UNIX, Windows NT

Confidential, OH Mar 01 to Dec 01
ETL Developer

National City is a leading investment and wealth management banking company involved in financial services. The project involved building a data warehouse and reporting off of it. The data warehouse consolidated data from the bank's locations across the country. Data from different sources was brought into an Oracle database using an ETL tool.

Responsibilities:

  • Worked extensively on Informatica Power Center 5.1/4.7 client tools (Repository manager, Designer, Server Manager).
  • Involved in analyzing source systems and designing the processes for extracting, transforming, and loading data into the data warehousing system.
  • Standardized data to store various Business Units in tables. Summaries for Orders, Quotes and Customers were stored in Summary Tables for Analysis.
  • Used transformations such as Aggregators, connected and unconnected Lookups, Filters, Sequence Generators, and many more.
  • Extensively used ETL to load data from flat file systems and Sybase into the Oracle database.
  • Used Mapping Parameters and variables and passed the value of parameters from a parameter file to filter the data.
  • Used shell scripting to create indicator files for event-driven sessions (see the sketch after this list).
  • Stored data about Sales Orders, Quotes and Purchase Orders into data warehouse.
  • Applied performance tuning logic to optimize session performance
  • Created and ran sessions and batches using Server Manager to load the data into the target database.
  • Responsible for automation of ETL processes using Crontab.
  • Created several reports using Cognos Impromptu.
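
A minimal sketch of the indicator-file pattern referenced above for event-driven sessions: once the source extract has arrived and is non-empty, the script creates the indicator file the session waits on. File names and paths are placeholders.

    #!/bin/ksh
    # Hypothetical indicator-file step for an event-driven session; the session
    # is configured to wait for the ".done" file. Paths are illustrative only.
    SRC=/data/in/orders_extract.dat
    IND=/data/in/orders_extract.done

    if [ -s "$SRC" ]; then
        touch "$IND"
        echo "Indicator file $IND created"
    else
        echo "Source file $SRC missing or empty; indicator not created" >&2
        exit 1
    fi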

Environment: Informatica Power Center 5.1/4.7, Cognos BI Tools, Sybase, Oracle 8i, Perl, Korn Shell, SAP R/3, PL/SQL, ETL, SQL*Loader, Windows NT 4.0, HP-UNIX.
