
Sr. ETL/Informatica Developer Resume


TN

SUMMARY

  • 8+ years of IT experience in Design, Development, Testing and Implementation of business application systems for Pharmaceutical, Financial, Banking, and Utility Sectors.
  • Experience in the data mart development life cycle, performing ETL procedures using Informatica PowerCenter Client tools - Mapping Designer, Repository Manager, Workflow Manager/Monitor - and Server tools - Informatica Server, Repository Server Manager.
  • Experience in analyzing, managing, and building various Data Warehouse systems, collaborating with Business Analysts and Technical Teams.
  • Proficient in all phases of the Software Development Life Cycle (SDLC) - Requirement Gathering, Analysis, Design, System Implementation, Performance Tuning, Testing and Support.
  • Experience in integrating data sources like Oracle, DB2, Flat files, Sybase, and SQL Server into the staging area; experienced with Ralph Kimball methodologies and in Dimensional Modelling using Star and Snowflake Schemas, identifying Facts and Dimensions, and Physical and Logical Data Modelling using Erwin.
  • Developed complex mappings using Informatica transformation logic like Unconnected and Connected Lookups, Source Qualifier, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter and Sequence Generator.
  • Database experience includes Oracle 11g/10g, SQL, PL/SQL, Stored Procedures, Packages, TOAD, Constraints, Triggers, Indexes, Views, Database Links, Export/Import Utilities.
  • Experienced using various scheduling and automation tools such as Informatica Scheduler, Autosys, and Oracle Scheduler.
  • Involved in various ETL activities, including designing, developing, implementing, and supporting Data Warehouses using ETL tools.
  • Experienced in developing Slowly Changing Dimensions Types 1, 2, and 3 using Informatica PowerCenter.
  • Worked on Informatica SAP BAPI/IDoc transformations, Web Services transformations, and security/certificate/mutual-authentication configuration using Informatica 8.6.
  • Worked in UNIX environments, writing shell scripts for Informatica pre- and post-session actions.
  • Experience with the ETL process, including gathering user requirements and preparing detailed Source-to-Target mapping documents.
  • Excellent interpersonal and communication skills; self-motivated and experienced in working with teams, business people, and developers across multiple disciplines.
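The Slowly Changing Dimension work mentioned above (Type 2 in particular) can be sketched outside Informatica as a minimal Python routine; the dimension layout, key, and attribute names here are hypothetical, and this is an illustration of the technique rather than the actual mapping logic:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended expiry date marking the current row

def apply_scd2(dim, key, attrs, load_date):
    """Type 2 SCD with time stamping: expire the current row for `key`
    and insert a new version whenever any tracked attribute changed."""
    current = next((r for r in dim
                    if r["key"] == key and r["end_date"] == HIGH_DATE), None)
    if current is not None and current["attrs"] == attrs:
        return dim  # nothing changed: keep the current version as-is
    if current is not None:
        current["end_date"] = load_date  # close out the old version
    dim.append({"key": key, "attrs": attrs,
                "start_date": load_date, "end_date": HIGH_DATE})
    return dim
```

In PowerCenter the same effect is achieved with a Lookup on the dimension, an Expression comparing tracked columns, and an Update Strategy routing expired and new rows.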

TECHNICAL SKILLS

Operating Systems and Environment: Mac OS X, Windows Vista/XP, UNIX, Sun Solaris, Windows Server/Client OS

ETL Tools: Informatica PowerCenter, Informatica Server, PowerMart, Informatica PowerAnalyzer, PowerExchange, Informatica PowerConnect, Informatica Designer, Workflow Manager, Workflow Monitor

Databases: Oracle, MS SQL Server, DB2 v8.1, Teradata, MS Access, Sybase 12.x/11.x, SAP HANA

DB Tools: TOAD 8.6, SQL*loader, SQL*Plus

Data Modelling tools: Ralph Kimball Methodology, Oracle Designer, Visio, Star Join Schema Modelling, Snowflake Modelling, FACT and Dimensions Tables, Physical and Logical Data Modelling

Business Intelligence Tools: Cognos, Business Objects XI r2/6.x/5.x, Crystal Reports 8.0, MicroStrategy 7i, MS Access Reports

Languages: SQL, PL/SQL, UNIX, Shell scripts, JAVA, C++, HTML

Scheduling Tools: Autosys, Control-M

PROFESSIONAL EXPERIENCE

Confidential, TN

Sr. ETL/Informatica Developer

Responsibilities:

  • Designed high-level ETL workflows and wrote the technical design documentation (TDD) before developing ETL components to load DB2 from Flat Files, Oracle, and DB2 systems, building a Type 2 EDW using Change Data Capture.
  • Created stored procedures and views based on project needs.
  • Developed and coded the real-time and batch-mode loads.
  • Developed standard framework to handle restart ability, auditing, notification alerts during the ETL load process.
  • Created shortcuts for reusable source/target definitions, Reusable Transformations, mapplets in Shared folder.
  • Involved in performance tuning and optimization of mappings to manage very large volumes of data.
  • Prepared technical design/specifications for data Extraction, Transformation and Loading.
  • Worked on Informatica Utilities Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Developed complex ETL mappings on the Informatica 10.x platform as part of the Risk Data integration efforts.
  • Implemented SCD Type 1 and SCD Type 2 for loading data into data warehouse dimension tables.
  • Implemented error handling for invalid and rejected rows by loading them into error tables.
  • Implemented the Change Data Capture process using Informatica PowerExchange.
  • Extensively worked on the batch framework used to schedule all Informatica jobs.
  • Developed complex mappings such as Slowly Changing Dimensions Type II-Time stamping in the Mapping Designer.
  • Used various transformations like Stored Procedure, Connected and Unconnected lookups, Update Strategy, Filter transformation, Joiner transformations to implement complex business logic.
  • Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.
  • Used Variables and Parameters in the mappings to pass the values between mappings and sessions.
  • Used Teradata utilities (FastLoad, MultiLoad, TPump) to load data.
  • Implemented restart strategy and error handling techniques to recover failed sessions.
  • Used Unix Shell Scripts to automate pre-session and post-session processes.
  • Did performance tuning to improve Data Extraction, Data process and Load time.
  • Extensively Worked on Star Schema, Snowflake Schema, Data Modelling, Logical and Physical Model, Data Elements, and Source to Target Mappings.
  • Designed presentations based on the test cases and obtained UAT signoffs
  • Documented test scenarios as a part of Unit testing before requesting for migration to higher environment levels and handled production deployments
  • Recorded defects as a part of Defect tracker during SIT and UAT
  • Identified performance bottlenecks and suggested improvements.
  • Performed unit testing for the jobs developed to ensure they meet the requirements.
  • Provided performance tuning and physical and logical database design support in projects for Teradata systems.
  • Worked with BI and BO teams to observe how reports are affected by a change to the corporate data model.
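The error-handling pattern described above (routing invalid and rejected rows into error tables) can be sketched as a small Python function; the validation rule (required, non-empty fields) and the column names are hypothetical stand-ins for the project's actual rules:

```python
def route_rows(rows, required):
    """Split incoming rows: valid rows continue to the target load,
    rows missing a required field go to the error table with a reason."""
    good, errors = [], []
    for row in rows:
        missing = [f for f in required if not row.get(f)]
        if missing:
            # Rejected row: keep the original data plus an error reason
            errors.append({**row, "error_reason": f"missing: {', '.join(missing)}"})
        else:
            good.append(row)
    return good, errors
```

In the Informatica mappings this corresponds to a Router transformation with a valid group feeding the target and a default group feeding the error table.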

Environment: Informatica 10.1.1, 9.5, Oracle, XML, SQL Server 2008, Web Services, DB2 Mainframe, HDFS, Tidal (Scheduler), Teradata, Cognos, Remedy (Ticketing tool), GitHub, HP QC (Testing tool), Type 2 EDW.

Confidential - McLean, VA

Sr. ETL/Informatica Developer

Responsibilities:

  • Involved as the primary on-site ETL Developer during the analysis, planning, design, development, and implementation stages of projects using IBM WebSphere software (QualityStage v8.1, Web Service, Information Analyzer, ProfileStage, WISD of IIS 8.0.1).
  • Prepared Data Mapping Documents and Design the ETL jobs based on the DMD with required Tables in the Dev Environment.
  • Active participation in decision making and QA meetings and regularly interacted with the Business Analysts &development team to gain a better understanding of the Business Process, Requirements & Design.
  • Used DataStage as an ETL tool to extract data from source systems and load the data into the ORACLE database.
  • Designed and Developed Data Stage Jobs to Extract data from heterogeneous sources, applied transform logics to extracted data and Loaded into Data Warehouse Databases.
  • Created Informatica maps using various transformations like SAP BAPI/RFC, SAP IDOCs transformations, Web services consumer, XML, HTTP transformation, Source Qualifier, Expression, Look up, Stored procedure, Aggregate, Update Strategy, Joiner, Union, Filter and Router.
  • Extensively worked with Join, Look up (Normal and Sparse) and Merge stages.
  • Extensively worked with sequential file, dataset, file set and look up file set stages.
  • Extensively used Parallel Stages like Row Generator, Column Generator, Head, and Peek for development and de-bugging purposes.
  • Used the Data Stage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions on ad hoc or scheduled basis.
  • Developed complex stored procedures using input/output parameters, cursors, views, and triggers, and complex queries using temp tables and joins.
  • Converted complex job designs into separate job segments executed through the job sequencer for better performance and easier maintenance.
  • Created job sequences.
  • Maintained Data Warehouse by loading dimensions and facts as part of project. Also worked for different enhancements in FACT tables.
  • Created shell script to run data stage jobs from UNIX and then schedule this script to run data stage jobs through scheduling tool.
  • Designed, developed, and exposed SAP BAPI objects; developed and deployed SAP interfaces with external enterprise systems.
  • Developed mappings in Informatica using BAPI and ABAP function calls in SAP.
  • Analyzed performance and monitored workloads for capacity planning.
  • Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.
  • Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.
  • Participated in weekly status meetings.
  • Developed Test Plan that included the scope of the release, entrance and exit criteria and overall test strategy. Created detailed Test Cases and Test sets and executed them manually.
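The job-sequencer approach above (splitting complex designs into segments and executing them through a sequencer) amounts to running jobs in dependency order. A minimal Python sketch of that ordering, with hypothetical job names and a hypothetical dependency map:

```python
def run_order(deps):
    """Return a run order in which every job runs after its dependencies.
    `deps` maps each job name to the list of jobs it depends on."""
    order, seen = [], set()

    def visit(job):
        if job in seen:
            return
        seen.add(job)
        for d in deps.get(job, []):
            visit(d)          # schedule dependencies first
        order.append(job)     # then the job itself

    for job in deps:
        visit(job)
    return order
```

A real DataStage sequence adds triggers, restart points, and failure branches on top of this basic ordering.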

Environment: IBM Web Sphere Data Stage 8.1, SAP 1.0, BAPI, Web Services, Quality Stage 8.1, (Designer, Director, Manager), Microsoft Visio, IBM AIX 4.2/4.1 IBM DB2 Database, SQL Server, IBM DB2, Teradata, ORACLE 11G, Query man, Unix, Windows.

Confidential - San Francisco, CA

ETL/Informatica Developer

Responsibilities:

  • Extensively Worked with Business Users to gather, verify and validate various business requirements.
  • Identified various source systems, connectivity, tables to ensure data availability to start the ETL process.
  • Worked as a Data Modeler, creating the data model for the warehouse and contributing to the ODS and Data Mart data models.
  • Worked as Data analyst to analyze the source systems data.
  • Created Design Documents for source to target mappings.
  • Developed mappings to send files daily to AWS.
  • Used UNIX scripting to apply rules on the raw data within AWS.
  • Used Redshift within AWS.
  • Created Complex mappings using Unconnected and Connected Lookup, Aggregator and Router transformations for populating target table in efficient manner.
  • Created stored procedures to use Oracle-generated sequence numbers in mappings instead of using the Informatica Sequence Generator.
  • Created complex Mappings and implemented Slowly Changing Dimensions (Type 1, Type 2 and Type 3) for data loads.
  • Created complex Mappings to implement data cleansing on the source data.
  • Used Mapping Variables, Mapping Parameters and Session Parameters to increase the re-usability of the Mapping.
  • Created source to target mappings, edit rules and validation, transformations, and business rules. Analyzed client requirements and designed the ETL Informatica mapping.
  • Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.
  • Validated and tested the mappings using Informatica Debugger, Session Logs and Workflow Logs.
  • Created detail Unit test plans and performed error checking and testing of the ETL procedures using SQL queries, filtering out the missing rows into flat files at the mapping level.
  • Used the UltraEdit tool and UNIX commands to create, access, and maintain the session parameter files, data files, and scripts on the server.
  • Used CUCUMBER automated test tool to automate the unit tests for Informatica ETL.
  • Followed and automated the Acceptance Test Driven Development (ATDD) and Test-Driven Development (TDD) for unit tests for Informatica ETL.
  • Scheduled the ETLs using ESP scheduler.
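The session parameter files mentioned above follow an INI-like layout: `[section]` headers (in PowerCenter, typically folder/workflow/session-scoped) followed by `NAME=VALUE` lines. A minimal Python reader for that simplified format, with hypothetical section and parameter names:

```python
def parse_param_file(text):
    """Parse an INI-style parameter file: [section] headers followed by
    NAME=VALUE lines; blank lines and #/; comments are skipped."""
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or line.startswith(";"):
            continue
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]
            params.setdefault(section, {})
        elif "=" in line and section is not None:
            name, _, value = line.partition("=")
            params[section][name.strip()] = value.strip()
    return params
```

This is only a sketch of the file shape; PowerCenter itself resolves these values into mapping variables and session parameters at run time.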

Environment: Informatica Power Center 9.6.1, Oracle 11g, SAP Systems, Tableau 9.2, AWS, Redshift, DB2, Flat files, SQL, putty, UltraEdit-32, shell scripting, Toad, Quest Central, UNIX scripting, Windows NT

Confidential - Reston, VA

Informatica Developer

Responsibilities:

  • Project involved the design and development of Data Warehousing project for the improvement of Financial System.
  • Responsible for creating and running SQL scripts for DDL and DML operations on the Oracle database.
  • Experienced in data migration, importing data from one system to another.
  • Analyzed business requirements and worked closely with the various application teams and business analyst to develop ETL documents.
  • Worked with various sources like Oracle 10g, Teradata, and MS SQL Server 2005 to extract the data.
  • Used the Remote Function Call (RFC) as the SAP interface for communication between systems.
  • Widely used Informatica PowerCenter 8.6.1 - Source Analyzer, Warehouse designer, Mapping designer, Transformation Developer and Informatica Work Flow Manager.
  • Mappings included transformations like Lookup, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator, and Update Strategy, used extensively to implement business logic.
  • Tuned Informatica session performance by increasing block size to buffer large data and using a large commit interval for both source and target.
  • Developed Mapplets and used them in different Mappings.
  • Created the migration document for the Informatica Repository to migrate mappings across the Development, Testing, and Production repositories.
  • Used UNIX System to monitor Informatica Repository services and Integration Services.
  • Created on-demand sessions and batches to move data at specific defined time intervals.

Environment: Informatica Power Center 8.6.1, Oracle 10g, SAP Interfaces, Teradata, SQL*Plus, PL/SQL, OBIEE, Toad, UNIX (Sun Solaris), Data Modelling, Dimensional Modelling, Flat files, UNIX Shell scripting

Confidential

Informatica Developer

Responsibilities:

  • Assisted in preparing design/specifications for data Extraction, Transformation and Loading.
  • Developed Informatica mappings, enabling the extract, transport and loading of the data into target tables.
  • Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager.
  • Prepared reusable transformations to load data from operational data source to Data Warehouse.
  • Wrote complex SQL Queries involving multiple tables with joins.
  • Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Used debugger, session logs and workflow logs to test the mapping and fixed the bugs.
  • Analyzed the dependencies between the jobs and scheduled them accordingly using the Work Scheduler.
  • Improved the performance of the mappings, sessions using various optimization techniques.

Environment: Informatica 8.1, OBIEE, Erwin, Oracle 10g, SQL Server 2008, Flat files, SQL, putty, UltraEdit-32, shell Programming, Toad, SQL Developer, UNIX scripting, Windows NT.
