
ETL Developer Resume


Summary:

  • Energetic, composed, motivated ETL developer with a passion for innovation, learning and technology.
  • 6 years of strong experience in ETL methodology for data transformation using Informatica PowerCenter 7.x–9.x and ITSM.
  • Technical expertise specializing in data warehousing and the full system development life cycle, including support.

Areas of Expertise:

  • Extensive experience with Informatica (9.x) applications. Designed and developed Workflows, Worklets, Mappings, Mapplets, Sessions, Tasks, and Transformations, and scheduled the Workflows and Sessions.
  • Interacted with end users and functional analysts to identify and develop Business Specification Documents (BSD) and transform them into technical requirements.
  • Strong experience with Informatica tools – Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, and the Informatica Repository.
  • Designed and developed complex mappings to move data from multiple sources into common target areas such as Data Marts and the Data Warehouse, using Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Normalizer, Sequence Generator, and Update Strategy transformations with varied transformation logic in Informatica.
  • Strong hands-on experience in extracting data from various source systems, ranging from mainframe sources such as DB2, flat files, and VSAM files to RDBMSs such as Oracle, SQL Server, and Teradata.
  • Extensively used the Slowly Changing Dimension (SCD) technique in insurance applications. Expertise in OLTP/OLAP system study, analysis, E-R modeling, and developing dimensional models using star-schema and snowflake-schema techniques in relational, dimensional, and multidimensional modeling.
  • Optimized mappings by creating reusable transformations and Mapplets. Performed debugging and performance tuning of sources, targets, mappings, transformations, and sessions.
  • Worked with the Cognos reporting tool for business reporting, dashboard setup, and end-to-end testing.
  • Exposure to the overall SDLC, including requirement gathering, data modeling, development, testing, debugging, deployment, documentation, and production support.
  • Experienced in task automation using UNIX scripts, job scheduling, and communicating with the Informatica server using pmcmd. Extensively used Autosys for job monitoring and scheduling.
  • Excellent analytical, problem solving and communication skills with ability to interact with individuals at all levels.
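A minimal sketch of the pmcmd-driven job automation noted above, assuming hypothetical service, folder, and workflow names (the -pv flag makes pmcmd read the password from the named environment variable rather than the command line):

```python
import subprocess  # needed only if the command is actually executed

def build_pmcmd_start(service, domain, user, folder, workflow):
    """Assemble a pmcmd 'startworkflow' command line.
    Service, domain, folder, and workflow names here are hypothetical."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service, "-d", domain,
        "-u", user, "-pv", "PM_PASSWORD",  # password taken from env var
        "-f", folder, "-wait", workflow,   # -wait blocks until completion
    ]

cmd = build_pmcmd_start("IS_PROD", "Domain_Prod", "etl_user", "CLAIMS", "wf_daily_load")
print(" ".join(cmd))
# On a host with PowerCenter installed, this would submit the workflow:
# subprocess.run(cmd, check=True)
```

A scheduler such as Autosys would typically run a wrapper like this and treat the exit code of pmcmd -wait as the job's success or failure.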

Technical Skills:

ETL Tools: Informatica PowerCenter 9.x/8.x
Database: Oracle 10g/9i/8i, SQL Server, DB2 and Teradata
DB Tools: SQL*Plus, TOAD, SQL*Loader
Environment: Windows XP/2000, UNIX and OS/390
Languages: SQL, PL/SQL, UNIX shell scripting
Reporting Tools: Cognos, MS Office Suite
Drafting Tools: Microsoft Visio and Erwin
Other Tools/Utilities: HP Quality Center, Autosys, AQT, Lotus Notes, Managing IS and CAFM (File Master)

Professional Experience:

Confidential, Webster, MA Sep 2010 - Present
Senior ETL Developer

Commerce Insurance Company provides Personal and Commercial P&C Insurance in Massachusetts and New Hampshire. Their core product lines include Personal Auto, Homeowners, and Commercial Auto insurance. In Massachusetts, Commerce Insurance is the leading provider of both private passenger and homeowners insurance and is the second leading provider of Commercial Auto insurance.

Responsibilities:

  • Performed analysis of source systems and business requirements and identified business rules.
  • Responsible for development, support, and maintenance of ETL (Extract, Transform, Load) processes using Informatica PowerCenter.
  • Worked on complex mappings, mapplets, and workflows to meet business needs, ensuring transformations were reusable to avoid duplication.
  • Extensively used ETL to transfer and extract data from source files (Flat files and DB2) and load the data into the target database.
  • Documented Mapping and Transformation details, user requirements, implementation plan and schedule.
  • Extensively used Autosys for job scheduling and monitoring.
  • Designed and developed efficient error-handling methods and implemented them throughout the mappings. Responsible for data quality analysis to determine cleansing requirements.
  • Worked with several facets of the Informatica PowerCenter tool – Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Developer. Developed Informatica mappings for better performance.
  • Responsible for performance tuning at the mapping, session, source, and target levels for Slowly Changing Dimension Type 1 and Type 2 data loads.
  • Configured sessions using Workflow Manager with multiple partitions on source data to improve performance. Understood business needs and implemented them in a functional database design.
  • Prepared unit/system test plans and test cases for the developed mappings.
  • Responsible for team members’ work assignment and tracking.

Tools and Environment: Informatica PowerCenter, Oracle 9i, Flat files, PL/SQL, DB2, SQL, Erwin, TOAD 7.0, Windows XP, Managing IS, Lotus Notes, OS/390 and Autosys

Confidential, Bernardsville, NJ Apr 2009 - Aug 2010
ETL Developer

NJSI is a large New Jersey-based insurance provider focused on Home and Auto insurance; it has grown to provide insurance to more than 700,000 policyholders living in New Jersey.

Responsibilities:

  • Analyzed the source system and involved in designing the ETL data load.
  • Developed/designed Informatica mappings by translating the business requirements.
  • Worked with various transformations such as Lookup, Joiner, Sorter, Aggregator, Router, Rank, and Source Qualifier to create complex mappings.
  • Involved in performance tuning of the Informatica mappings using components such as parameter files and round-robin and key-range partitioning to ensure source and target bottlenecks were removed.
  • Implemented documentation standards and practices to make mappings easier to maintain.
  • Wrote, executed, and performance-tuned SQL queries for data analysis and profiling. Extracted business rules and implemented business logic to extract and load SQL Server data using T-SQL.
  • Worked with Teradata utilities such as FastLoad and MultiLoad.
  • Involved in automating the retail prepaid system process. Created packages and dependencies for the processes.
  • Identified common issues in Cognos and published them on the NJSI Wiki page. Established dashboards and business reports.
  • Supported, maintained, and enhanced the Claim warehouse application and developed new interfaces, documenting them on the Wiki page.
  • Used Autosys for scheduling various data cleansing scripts and loading processes; maintained the batch processes using UNIX Scripts.
  • Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
  • Tuned the mappings by removing the Source/Target bottlenecks and Expressions to improve the throughput of the data loads.
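The key-range partitioning mentioned above can be sketched in plain Python; Informatica applies the same idea inside a partitioned session, and the column name and range bounds here are purely illustrative:

```python
def key_range_partition(rows, key, bounds):
    """Assign each row to a partition by comparing its key to sorted range
    bounds, mimicking Informatica's key-range partitioning (illustrative)."""
    parts = [[] for _ in range(len(bounds) + 1)]  # n bounds -> n+1 partitions
    for row in rows:
        idx = sum(1 for b in bounds if row[key] >= b)  # count bounds passed
        parts[idx].append(row)
    return parts

rows = [{"policy_id": 120}, {"policy_id": 480}, {"policy_id": 950}]
parts = key_range_partition(rows, "policy_id", [300, 700])
print([len(p) for p in parts])  # each of the three partitions gets one row
```

Splitting the source rows by key range lets several session partitions read and transform disjoint slices in parallel, which is how the source-side bottlenecks above were relieved.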

Tools and Environment: Informatica PowerCenter, Oracle 10g, PL/SQL, MS SQL Server, Cognos, Autosys, and Quality Center

Confidential, Hartford, CT Jun 2008 - Feb 2009
ETL Developer

This project defines the mid-level platform design for the Core Banking System (CBS) with regard to the changes needed to migrate Commercial & Offshore accounts from CAP to CBS. The document complements the End to End Project Design document and shows the developing design in greater detail at the platform level. It serves as an overview of the development of application software on CBS and IBM WebSphere DataStage (Ascential).

Responsibilities:

  • Responsible for development, support, and maintenance of ETL (Extract, Transform, Load) processes using Informatica PowerCenter.
  • Developed mappings and workflows to generate staging files.
  • Developed various transformations such as Source Qualifier, Sorter, Joiner, Update Strategy, Lookup, Expression, and Sequence Generator for loading data into target tables.
  • Created multiple Mapplets, Workflows, Tasks, and database connections using Workflow Manager.
  • Created sessions and batches to move data at specific intervals and on demand using Server Manager.
  • Responsible for creating and scheduling sessions.
  • Recovered failed sessions and batches.
  • Extracted data from Oracle, DB2, CSV, and flat files.
  • Implemented performance tuning techniques by identifying and resolving bottlenecks in sources, targets, transformations, mappings, and sessions, based on an understanding of the functional requirements.
  • Designed the dimension model of the OLAP data marts in Erwin.
  • Prepared documents for test data loading.

Tools and Environment: Informatica PowerCenter 7.1, Oracle 9i, Erwin, TOAD, DB2 and Flat files (Host – VSAM / PS file)

Confidential, MA Sept 2007 – May 2008
Data Warehouse Developer

The Health Department of Pennsylvania is currently implementing a major initiative, Pennsylvania's National Electronic Disease Surveillance System (PA-NEDSS). It is an online public health disease reporting and case management system for the Pennsylvania Department of Health (DOH). PA-NEDSS seeks to provide a single, integrated Web-based application. The Enterprise Data Warehouse is built for PA to generate reports for prior approval of the budget.

Responsibilities:

  • Performed data analysis to extract useful data, find patterns and regularities in the sources, and develop conclusions.
  • Interacted with functional and end users to gather requirements for the core reporting system, understand the features users expected from the ETL and reporting system, and successfully implement the business logic.
  • Created the dimensional model (star schema) of the data warehouse and used Erwin to design the business process, grain, dimensions, and measured facts.
  • Extracted data from flat files and other RDBMS databases into the staging area and populated it into the Data Warehouse.
  • Developed a number of complex Informatica mapplets and reusable transformations to implement the business logic and load the data incrementally. Created table mappings to load Fact and Dimension tables.
  • Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Router transformations.
  • Used PowerCenter Server Manager/Workflow Manager for session management, database connection management, and scheduling of jobs to be run in the batch process using the Control-M scheduling tool.
  • Migrated mappings, sessions, and workflows from Development to testing and then to Production environments.
  • Created multiple Type 2 mappings in the Customer mart for both Dimension as well as Fact tables, implementing both date based and flag based versioning logic.
  • Performed scheduled loading and clean-ups, and monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
  • Worked with Session Logs, Informatica Debugger, and Performance Logs for error handling the workflows and session failures.
  • Identified the flow of information, analyzed the existing systems, evaluated alternatives, and chose the most appropriate one.
  • Worked with tools such as TOAD to write queries and generate results.
  • Used SQL Override in Source qualifier to customize SQL and filter data according to requirement.
  • Wrote PRE and POST SQL commands in session properties to manage constraints, which improved performance, and wrote SQL queries and PL/SQL procedures to perform database operations according to business requirements.
  • Performed Pushdown Optimization to increase read and write throughput.
  • Performed end-to-end testing (unit and system integration testing) as part of the SDLC process, and performance tuning at the mapping, session, source, and target levels.
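The date-based and flag-based Type 2 versioning described above can be sketched as follows; the dimension is modeled as a list of dicts, and the column names (is_current, effective_start, effective_end) are hypothetical stand-ins for the real mart's layout:

```python
from datetime import date

def scd2_apply(dim, incoming, key, attrs, today):
    """Apply one incoming record to a Type 2 dimension.
    If tracked attributes changed, expire the current version (flag + end
    date) and insert a new one; otherwise leave the dimension untouched."""
    current = next((r for r in dim if r[key] == incoming[key] and r["is_current"]), None)
    if current and all(current[a] == incoming[a] for a in attrs):
        return dim  # no attribute change: no new version
    if current:
        current["is_current"] = False     # flag-based expiry
        current["effective_end"] = today  # date-based expiry
    dim.append({**incoming, "is_current": True,
                "effective_start": today, "effective_end": None})
    return dim

dim = scd2_apply([], {"cust_id": 1, "city": "Webster"}, "cust_id", ["city"], date(2011, 1, 1))
dim = scd2_apply(dim, {"cust_id": 1, "city": "Boston"}, "cust_id", ["city"], date(2011, 6, 1))
print(len(dim), dim[0]["is_current"], dim[1]["is_current"])  # 2 False True
```

In the actual mappings this logic was expressed with Lookup and Update Strategy transformations, but the versioning rule is the same: one current row per business key, with history preserved in expired rows.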

Tools and Environment: Informatica PowerCenter 8.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Windows XP, Oracle 10g, TOAD, PL/SQL Developer, SQL Plus.

Confidential, India May 2006 – Aug 2007
Data Warehouse Developer

Aviva is the world's sixth-largest insurance company, providing customers with insurance, savings, and investment products. It is the UK's largest insurer and one of Europe's leading providers of life and general insurance.

Responsibilities:

  • Analyzed flat-file relationships and the existing systems; met with end users and business units to define the requirements.
  • Documented user requirements, translated requirements into system solutions, and developed the implementation plan and schedule.
  • Developed data Mappings between source systems and warehouse components.
  • Developed ETL procedures to ensure conformity, compliance with standards, and lack of redundancy, and translated business rules and functional requirements into ETL procedures.
  • Developed and tested all the backend programs, Informatica mappings and update processes.
  • Created and Monitored Batches and Sessions using Informatica PowerCenter Server.
  • Responsible for tuning ETL procedures and star schemas to optimize load and query performance.
  • Extensively worked on performance tuning of programs, ETL procedures, and processes.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
  • Performed analysis and resolution of Help Desk tickets and maintenance for assigned applications.

Tools and Environment: Informatica PowerMart 7.1, Oracle, PL/SQL, Windows, Remedy, Synergy.

Academic Credentials:

  • Bachelor’s degree with honors
