
Sr. ETL Developer / Data Warehouse Developer / Informatica Lead Resume

Carmel, IN

SUMMARY

  • 9 years of experience in the IT industry as an Informatica (ETL) Developer, with a focus on Data Warehousing and Business Intelligence.
  • Strong experience in integration of various data sources like Oracle, PeopleSoft, Teradata, MS SQL Server, DB2, Netezza, Mainframe and Flat Files.
  • Experience in all phases of the Data warehouse life cycle involving Analysis, Design, Development and Testing of Data warehouses using ETL Logic.
  • Experience in architecture design of Extract, Transform, Load environment using Informatica Power Mart and Power Center.
  • Extensively worked on Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator and Union Transformation.
  • Developed Slowly Changing Dimension Mappings of type I, type II (version, flag and time stamp) and type III.
  • Experience in advanced programming for data transformation (Java, C++, C).
  • Hands-on experience in developing web-based applications in Java using Servlets, JSPs and EJB.
  • Knowledge of HTML, XML, DTD and XML Schema.
  • Hands-on experience using Data Services/Data Integrator to implement data integration solutions.
  • Experience in debugging, error handling and performance tuning of sources, targets, mappings and sessions with the help of error logs generated by Informatica server.
  • Excellent hands-on experience in writing, testing and implementing PL/SQL Stored Procedures, Functions, Triggers, Views, Materialized Views, Packages and DTS, and in performance tuning of SQL queries.
  • Worked with Oracle Stored Procedures and experienced in loading data into Data Warehouse/Data Marts using Informatica, SQL*Loader, Export/Import utilities.
  • Strong experience in UNIX shell scripting. Created scripts for calling the pmcmd command, file watching, FTP and file archiving; a sketch follows this list.
  • Strong practical understanding of Star Schema and Snowflake Schema Methodology, using Data Modeling tool Erwin 4.0/4.2.
  • Involved in Technical Documentation, Unit test, Integration test and writing the test plan.
  • Highly adaptive team player with a proven ability to work in fast-paced environments and excellent communication skills.
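
For illustration, a minimal sketch of the kind of pmcmd/file-watcher wrapper script described above. All service, domain, folder, workflow and path names are hypothetical placeholders, and INFA_USER/INFA_PWD are assumed to be exported in the environment:

    #!/bin/ksh
    # file_watcher.sh - wait for a trigger file, then start an Informatica workflow.
    # All host, folder and workflow names below are illustrative placeholders.

    TRG_FILE=/data/inbound/sales_feed.trg
    MAX_TRIES=60

    i=0
    while [ ! -f "$TRG_FILE" ]; do
        i=`expr $i + 1`
        if [ $i -gt $MAX_TRIES ]; then
            echo "Trigger file not found after $MAX_TRIES checks" >&2
            exit 1
        fi
        sleep 60                         # poll once a minute
    done

    # Start the workflow and wait for it to complete (-wait)
    pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" -p "$INFA_PWD" \
          -f SALES_FLDR -wait wf_load_sales
    rc=$?

    [ $rc -eq 0 ] || echo "wf_load_sales failed with return code $rc" >&2
    exit $rc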

TECHNICAL SKILLS

Tools Used: Informatica PowerCenter 9.6.1/9.5.1/9.1/8.x, Power Exchange, Autosys, TOAD, SQL Developer.

Databases: Oracle 11g/10g/9i, Teradata 13.x, MS SQL Server 2008/2012, DB2, Netezza 7.0

Data Modeling Tool: Erwin 3.5/4.0/4.2.

Languages & Scripting: SQL, PL/SQL and UNIX Shell Script

Operating Systems: Windows NT/2000/2003/XP/7, UNIX.

PROFESSIONAL EXPERIENCE

Confidential, Carmel, IN

Sr. ETL Developer / Data Warehouse Developer / Informatica Lead

Responsibilities:

  • Interacted with Subject Matter Experts (SMEs) to gather business requirements for the application through one-on-one interviews, and developed the business rules for data cleansing.
  • Extracted data from various sources (Flat Files, Oracle, XML, SQL Server, IBM Mainframe) using different kinds of transformations like Router, Sorter, Stored Procedure, Source Qualifier, Joiner, Aggregator, Expression, XML, Connected and Unconnected Lookups, Sequence Generator, Union and Update Strategy to load data into target tables.
  • Worked on Data Cleansing, Data conversion and process implementation.
  • Worked on Incremental Aggregations for improving performance in Mappings and Sessions.
  • Worked with pre-/post-session commands to unzip source files and generate a list file for indirect source file types; a sketch follows this list.
  • Worked with Informatica Power Exchange for loading/retrieving data from mainframe systems.
  • Worked with Informatica CDC to capture the changes.
  • Designed the Process Control Table that would maintain the status of all the CDC jobs and thereby drive the load of Derived Master Tables.
  • Worked with static and dynamic memory caches for better throughput of sessions containing Lookup, Joiner, Aggregator and Rank transformations.
  • Extracted data from XML sources and loaded it into Flat Files, Oracle and Teradata.
  • Used SCD Type 1 (discard previous versions of dimensions in the target table) and SCD Type 2 (maintain history) mappings to update slowly changing dimension tables.
  • Extensively used Power Connect to import data from different External Sources.
  • Developed Stored Procedures, Functions, Triggers, Indexes and Views.
  • Redesigned some of the mappings in the system to meet new functionality.
  • Extensively used PL/SQL and TOAD for creating views, stored procedures and indexes on tables for performance tuning.
  • Worked on batch processing, scheduling and shell scripting.
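
As referenced above, a minimal sketch of a pre-session command that unzips the day's files and builds the list file for an indirect source; directory and file names are hypothetical:

    #!/bin/ksh
    # pre_session.sh - unzip the incoming extracts and build the indirect
    # (file-list) source for the session. Paths and names are illustrative.

    IN_DIR=/data/inbound/claims
    LIST_FILE=$IN_DIR/claims_filelist.txt

    gunzip -f $IN_DIR/claims_*.dat.gz || exit 1

    # An indirect source is just a text file listing one data file per line;
    # the session reads this list instead of a single flat file.
    ls $IN_DIR/claims_*.dat > $LIST_FILE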

Environment: Informatica Power Center 9.6.1, Power Exchange, Oracle 11g, Flat Files, Microsoft SQL Server, Autosys, UNIX, PuTTY

Confidential, Jacksonville, FL

Sr. Informatica / Data warehouse Developer

Responsibilities:

  • Involved in Full Life Cycle Development of building a Data Warehouse.
  • Developed Complex transformations, Mapplets using Informatica to Extract, Transform and load data into Data marts, Enterprise Data warehouse (EDW) and Operational data store (ODS).
  • Extracted data from various source systems like Oracle, SQL Server, .CSV and flat files, and loaded it into the relational data warehouse.
  • Designed and developed complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy and Stored Procedure. Created sessions and sequential and concurrent batches for proper execution of mappings using Server Manager.
  • Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.
  • Created and scheduled Sessions and Jobs to run on demand, at a set time or only once using Workflow Manager.
  • Modified the existing Worklets and workflows to accommodate the new sessions and mappings. Created breakpoints in the Designer and worked with the Debugger.
  • Worked on Data Cleansing, Data conversion and process implementation.
  • Included data from different sources like Oracle Stored Procedures and Personal data files in the same report.
  • Migrated new and changed Informatica objects across environments using folder copy and Deployment Group methods.
  • Executed sessions, both sequential and concurrent, for efficient execution of mappings, and used other tasks such as Event Wait, Event Raise, Email, Command and pre/post SQL.
  • Responsible for issue resolution; worked closely with the team to perform data quality analysis and monitoring of data loads.
  • Partially involved in writing the UNIX shell scripts that trigger the workflows in a particular order as part of the daily warehouse load, as sketched below.
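
A minimal sketch of the kind of driver script mentioned in the last bullet, running dependent workflows in order and stopping on the first failure; the service, folder and workflow names are hypothetical:

    #!/bin/ksh
    # run_nightly.sh - run dependent workflows in order; abort on first failure.
    # Service, folder and workflow names are illustrative placeholders.

    for wf in wf_stage_load wf_dim_load wf_fact_load
    do
        pmcmd startworkflow -sv INT_SVC -d DOMAIN_PRD -u "$INFA_USER" -p "$INFA_PWD" \
              -f EDW_FLDR -wait $wf
        if [ $? -ne 0 ]; then
            echo "$wf failed; aborting nightly load" >&2
            exit 1
        fi
    done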

Environment: Informatica Power Center 9.5.1 (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Worklets), Oracle 11g, PL/SQL, Microsoft SQL Server, PeopleSoft 9.2, Flat Files, SQL*Plus, Control M, Autosys, UNIX, PuTTY, WinSCP.

Confidential, Pittsburgh, PA

Sr. Informatica Developer

Responsibilities:

  • Working as part of the ESDM (Enterprise Systems and Data Management) team to build Confidential healthcare EDW projects that integrate data from different source systems.
  • Created and analyzed Process Work Flows, Scope & Functional Specifications, and was responsible for preparing Software Requirement Specifications (SRS) and Functional Specification Documents (FSD).
  • Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.
  • Responsible for requirement gathering, design, development, testing and implementation of customized applications. Built the ETL framework (Informatica, Oracle, UNIX and Autosys) for the project.
  • Worked in Informatica B2B Data Transformation.
  • Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.
  • Traced the flow of data in PowerCenter from sources to targets using Metadata Manager.
  • Performed profiling and analysis on existing data, identified root causes of data inaccuracies through table lineage and impact analysis, and made recommendations for internal and external data quality audits.
  • Analyzed data and created detailed reports utilizing proprietary tools and third-party technologies to identify and resolve data quality issues, ensure quality assurance and drive data quality initiatives.
  • Developed Packages and procedures for Extraction, Transformation and Loading processes.
  • Improved the performance of ETL process by fine tuning queries, creating appropriate indexes and using temp tables.
  • Used Metadata Manager for validating, promoting, importing and exporting repositories from the development environment to the testing environment.
  • Resolved inconsistent and duplicate data to support strategic goals with Multidomain MDM.
  • Used Autosys and cron jobs in the UNIX environment to schedule routine tasks; an illustrative Autosys JIL definition follows this list.
  • Used the Control M scheduling tool to schedule jobs.
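
For illustration, a minimal Autosys JIL definition of the sort used to schedule such jobs (loaded with: jil < edw_daily_load.jil); the job, machine, owner and script names are hypothetical:

    /* edw_daily_load.jil - illustrative command-job definition */
    insert_job: edw_daily_load
    job_type: c
    command: /app/scripts/run_nightly.sh
    machine: etlhost01
    owner: infaprd@etlhost01
    start_times: "02:00"
    alarm_if_fail: 1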

Environment: Informatica Power Center 9.1, IDQ, Metadata Manager, Netezza, Oracle 10g/9i, PL/SQL, DB2, Flat Files, Autosys, Toad 10.5, UNIX, PuTTY

Confidential, Baltimore, MD

ETL Developer

Responsibilities:

  • Designed, developed, implemented and maintained Informatica PowerCenter and IDQ 8.6.1 applications for the matching and merging process.
  • Integrated Informatica applications as Mapplets within PowerCenter 8.6 Mappings.
  • Used Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.
  • Successfully created complex Informatica mappings to load data in to the data mart and monitored them.
  • The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator.
  • Defined Target Load Order Plan and Constraint based loading for loading data correctly into different Target Tables.
  • Integrated workflows and created Autosys jobs for scheduling.
  • Administered user privileges, groups and folders, including creation, update and deletion.
  • Migrated new and changed Informatica objects across environments using Folder-to-Folder and Deployment Group methods.
  • Performed unit testing on the given tasks after performing the appropriate mappings.
  • Worked on SQL tools like TOAD to run SQL Queries to validate the data.
  • Created PL/SQL procedures to populate base data mart aggregation structure.
  • Used pmcmd/pmrep commands to run Informatica from the back end.
  • Developed batch jobs using UNIX shell scripts to automate loading, pushing and pulling data between different servers; a sketch follows this list.
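
A minimal sketch of the push/pull batch job described in the last bullet; hosts, users and paths are hypothetical placeholders:

    #!/bin/ksh
    # push_extracts.sh - send the day's extract files to the downstream server
    # and keep a dated local copy. Hosts, users and paths are illustrative.

    OUT_DIR=/data/outbound
    ARCH_DIR=/data/archive
    STAMP=`date +%Y%m%d`

    for f in $OUT_DIR/*.dat
    do
        [ -f "$f" ] || continue
        scp "$f" etluser@target-host:/data/inbound/ || exit 1
        mv "$f" $ARCH_DIR/`basename $f`.$STAMP      # dated local copy
    done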

Environment: Informatica Power Center 8.6.1, Autosys, SQL Server 2005/2008, Teradata, Oracle 10g, UNIX Shell Scripting, Erwin 4.5, Microsoft Visio, SQL*Plus, TOAD, BO XI 3.1.

Confidential, Charlotte, NC

Sr. ETL/Informatica Developer

Responsibilities:

  • Interacted with business analysts and gathered requirements based on changing needs. Incorporated the identified factors into Informatica mappings to build the Data Marts.
  • Developed a standard ETL framework to enable the reusability of similar logic across the board.
  • Involved in System Documentation of Dataflow and methodology.
  • Assisted in designing Logical/Physical Data Models, forward/reverse engineering using Erwin 4.0.
  • Developed mappings to extract data from SQL Server, Oracle, Flat files and load into Data Mart using the Power Center.
  • Created complex Informatica mappings involving intricate transformation rules to populate the dimension and fact tables.
  • Created Autosys batches (jobs, scheduling tables) to load the tables according to sourcing dependencies and business SLAs, and created a common UNIX shell script to run Informatica workflows.
  • Developed common routine mappings and made use of mapping variables & mapping parameters.
  • Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Mart.
  • Developed Slowly Changing Dimensions of Type I and Type II (flag, date); the equivalent set logic is sketched after this list.
  • Used mapplets in mappings, thereby saving valuable design time and effort.
  • Used Informatica Workflow Manager to create, schedule, execute and monitor sessions, worklets and workflows.
  • Wrote procedures and queries to retrieve data from the DWH and implemented them in the Data Mart.
  • Worked on creating Views & Materialized views from the database perspective.
  • Used shell scripts in UNIX to execute the workflow.
  • Responsible for submitting DBA requests, following up with DBAs and Informatica administrators, creating Remedy tickets, handling all sign-offs for production, updating all changes with new versions using StarTeam, and creating test cases using Quality Center.
  • Wrote SQL queries, Triggers and Stored Procedures to apply and maintain the business rules.
  • Troubleshot databases, workflows, mappings, sources and targets to find bottlenecks and improve performance.
  • Created indexes and primary keys, and performed other performance tuning at the database level.
  • Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
  • Created testing metrics using MS-Excel.
  • Implemented various performance tuning techniques on sources, targets, mappings, workflows and databases.
  • Involved in generating reports from Data Mart using Cognos.
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Performed Configuration Management to Migrate Informatica mappings/sessions /workflows from Development to Test and to production environment.
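
As referenced in the SCD bullet above, the equivalent set logic of a Type II (flag, date) load, shown as plain Oracle SQL for clarity; in PowerCenter this is implemented with Lookup and Update Strategy transformations, and all table, column and sequence names here are hypothetical:

    -- 1. Expire the current row when a tracked attribute has changed
    UPDATE dim_customer d
       SET d.current_flag = 'N',
           d.effective_end_dt = TRUNC(SYSDATE) - 1
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND s.address <> d.address);

    -- 2. Insert a new current version for changed and brand-new customers
    INSERT INTO dim_customer
          (customer_key, customer_id, address, current_flag,
           effective_start_dt, effective_end_dt)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, 'Y',
           TRUNC(SYSDATE), TO_DATE('9999-12-31', 'YYYY-MM-DD')
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');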

Environment: Informatica Power Center 8.1, Autosys, Teradata, SQL Server 2000/2005/2008, Oracle 10g, PL/SQL, UNIX Shell Scripts, Cognos, Erwin, Business Objects 6.5.

Confidential, Columbus, IN

ETL/Informatica Developer

Responsibilities:

  • Participated in the design team and user requirement gathering meetings and documentation.
  • Involved in developing and testing multiple projects with Sprint Communications.
  • Worked on Informatica Power Center 8.6 to transform the data for Postpaid, Prepaid and Flex Pay; various kinds of transformations were used to implement simple and complex business logic.
  • Worked with various transformations and Configured Workflows, Worklets, and Sessions.
  • Involved in performance tuning at source, target, mapping, session and system levels.
  • Involved in preparing detailed Business Analysis documents, ETL design documents, unit test plans for the mappings.
  • Involved in writing UNIX shell scripts to archive the source files; a sketch follows this list.
  • Involved in the entire SDLC (Software Development Life Cycle) process that includes Implementation, testing, deployment, documentation, training and maintenance.
  • Involved in designing and developing the data acquisition process for the data warehouse including the initial load and subsequent refreshes.
  • Involved in Unit and Integration testing of the Informatica mappings and created Unit Test Documentation.
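
A minimal sketch of the source-file archiving script referenced above; the paths and retention window are hypothetical:

    #!/bin/ksh
    # archive_src.sh - compress processed source files into the archive area
    # and purge old archives. Paths and retention are illustrative.

    SRC_DIR=/data/inbound/processed
    ARCH_DIR=/data/archive
    STAMP=`date +%Y%m%d%H%M`

    for f in $SRC_DIR/*.dat
    do
        [ -f "$f" ] || continue
        gzip -c "$f" > $ARCH_DIR/`basename $f`.$STAMP.gz && rm -f "$f"
    done

    # purge archives older than 30 days
    find $ARCH_DIR -name '*.gz' -mtime +30 -exec rm -f {} \;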

Environment: Informatica Power Center 7.1, MS SQL Server 2005, MS SQL Server Reporting Services, MS SQL Server Integration Services, Oracle 10g, SQL*Plus, Flat Files, BO XI/R2, Microsoft Excel, TOAD 8.6, PL/SQL, SQL*Loader.

Confidential

ETL/Informatica Developer

Responsibilities:

  • Used various transformations like Filter, Router, Sequence Generator, Look Ups, Update Strategy, Joiner, Source Qualifier, Expression, Sorter, and Aggregator.
  • Used Mapping Variables, Mapping Parameters and Parameter Files for capturing delta loads; an illustrative parameter file follows this list.
  • Worked with various tasks like Session, E-Mail and Command within Workflows and Worklets.
  • Worked with the Informatica Scheduler for scheduling the delta loads and master loads.
  • Worked with aggregate functions like AVG, MIN, MAX, FIRST, LAST and COUNT in the Aggregator transformation.
  • Worked with various re-usable tasks, workflows, Worklets, mapplets, and re-usable transformations.
  • Performance tuning of the process at the mapping level, session level, source level, and the target level.
  • Extensively worked with various Look up Caches like Static, Dynamic, Persistent, and Shared Caches.
  • Worked extensively with the business intelligence team to incorporate any changes they needed in the delivered files.
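
As referenced in the delta-load bullet above, an illustrative parameter file of the kind used to drive such loads; the folder, workflow, session, connection and parameter names are all hypothetical. The mapping's Source Qualifier filter would compare the source change-date column against $$LAST_EXTRACT_DT, and a post-load step would update the value for the next run:

    [EDW_FLDR.WF:wf_delta_load.ST:s_m_load_orders]
    $$LAST_EXTRACT_DT=01/15/2016 00:00:00
    $DBConnection_Src=ORA_SRC_DEV
    $InputFile_Orders=/data/inbound/orders.dat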

Environment: Informatica Power Center 7.1/6.2 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SQL Server, CA Autosys, Oracle 9i, Business Objects, UNIX.
