
Lead ETL/Informatica Developer Resume


New York, NY

SUMMARY:

  • 8.5+ years of IT experience in the design, development, administration, and implementation of Data Warehouses & Data Marts using Informatica products 9.x/8.x/7.x/6.x as the ETL tool, along with Informatica Metadata Manager, SQL, and PL/SQL.
  • Expert in integrating various Operational Data Sources (ODS) into multiple Relational Database Management Systems (RDBMS) such as Oracle, SQL Server, Teradata, DB2, and Sybase
  • Strong experience in Data Warehousing and ETL using Informatica PowerCenter 9.6/8.6.1/7.1, Power Exchange 9.1/8.6, Oracle 11g/10g/9i, Teradata 13/12/V2R6 and Erwin 7/4.5.
  • 3 years of experience in real-time data warehouse development using change data capture (CDC) tools like Informatica PowerExchange 9.1/8.6/8.1.
  • Experience in Data Warehouse/Data Mart Development Life Cycle using Dimensional modeling of STAR, SNOWFLAKE schema, OLAP, ROLAP, MOLAP, Fact and Dimension tables, and Logical & Physical data modeling using ERWIN 7.5/4.2 and MS Visio.
  • Business Intelligence experience using OBIEE 11g/10g, Business Objects XI R2, and MS Access
  • Designed & developed custom ETL processes to populate data warehouse & data marts
  • Strong experience in the Analysis, design, development, testing & Implementation of Business Intelligence solutions using Data Warehouse/Mart Design, ETL, OLAP, BI Client/Server applications
  • Expert in Informatica 9.x installation, configuration, debugging, tuning and administration including Systems implementation, operations and its optimization as Informatica admin
  • Experienced in the development, maintenance, and implementation of Enterprise Data Warehouses (EDW) and Data Marts with Star and Snowflake schemas.
  • Expert in Data Extraction, Transformation, Loading from data sources and targets like Teradata, Oracle, SQL Server, XML, Flat files, DB2, COBOL and VSAM files etc.
  • Experience in Performance tuning of SQL queries and mainframe applications.
  • Involved in Data Extraction from Oracle, Flat files, Mainframe files using Informatica.
  • Developed various mappings to load data from various sources (mainframe files, flat files, Oracle) using transformations like Source Qualifier, Normalizer, Joiner, Aggregator, Lookup, Router, Update Strategy, Filter and Expression to store the data in different target tables.
  • Expert in writing Stored Procedures, Triggers, and Packages using tools like TOAD, SSMS & PL/SQL Developer
  • Expert in using Informatica Designer for developing mappings, mapplets/Transformations, workflows and performing complex mappings based on user specifications/requirements
  • Expert in Operational and Dimensional Modeling, ETL (Extract, Transform and Load) processes, OLAP, OLTP, dashboard designs and various other technologies
  • Strong experience in Extraction, Transformation and Loading (ETL) data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manger), Power Exchange, Power Connect as ETL tool on Oracle, DB2 and SQL Server Databases.
  • Highly experienced in working and managing projects in Onsite-Offshore models. Proficient in leading team and meeting client expectations/requirements
  • Experienced with Netezza Database integration with Informatica and load process with Netezza Bulk load utilities like Netezza Bulk reader and Bulk writer.
  • Expert in migrating Informatica mappings/Workflows/sessions to Production Repository
  • Excellent understanding / knowledge of Hadoop architecture and various components such as HDFS, Job Tracker, Task Tracker, NameNode, Data Node, Resource Manager, Node Manager etc
  • Experience on IBM Infosphere (DataStage, Quality Stage) / Information Analyzer 8.1.
  • Experience on Apache Hadoop technologies like HDFS, MapReduce framework, Sqoop.
  • Expert in using Informatica’s various tools like Power Center 9.1/9.6, Power Exchange, B2B data exchange including Source Analyzer, Target & Mapping Designer, Workflow Manager etc
  • Designed and Prepared Functional specification documents, Technical specification documents, Mapping Documents for Source to Target mapping with ETL transformation rules.
  • Expert in fine-tuning SQL queries, mappings & sessions to improve the performance
  • Hands on experience in installing, configuring, and using Hadoop ecosystem components like Hadoop, MapReduce, HDFS, Sqoop, VSQL
  • Well versed with all phases of the SDLC (Software Development Life Cycle) process with respect to development, deployment, maintenance & enhancements, especially Agile and Scrum methodologies
  • Experienced in implementing Oracle BI Apps in all phases like implementing mappings, Designing ETL, metadata management using DAC, Creating Oracle DW, OBIEE reports & Dashboards
  • Strong experience in Scheduling jobs using UNIX Shell Scripts and Unix Crontab
  • Excellent analytical, problem solving & communication skills including leading & managing a team
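Crontab scheduling of the kind listed in the summary usually amounts to a few entries like the configuration fragment below; every path, schedule, and name here is an illustrative placeholder, not taken from an actual project.

```
# Illustrative crontab entries (all paths and names are placeholders).
# 1:30 AM daily: run the nightly warehouse load wrapper, capturing output.
30 1 * * * /opt/infa/scripts/run_nightly.sh >> /var/log/etl/nightly.log 2>&1
# Sundays 4 AM: purge session logs older than 30 days.
0 4 * * 0 find /opt/infa/sesslogs -name '*.log' -mtime +30 -delete
```

The wrapper script in the first entry would typically issue a pmcmd startworkflow call, with the integration service, domain, folder, and workflow names supplied per environment.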

TECHNICAL SKILLS:

ETL/DW Tools: Informatica Power Center 9.x/8.x/7.x, Informatica Power Exchange 9.x/8.x, Informatica B2B Data Exchange, Metadata Manager, TOAD, SSMS

Reporting Tools: Business Objects XIR2/6.5 (Supervisor, Business Objects), Cognos, SSRS

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, FACT and Dimensional Tables, Physical and Logical Data Modeling

Programming Languages: UNIX Shell Scripts (Bourne, C Shell, Korn Shell)

Databases: Oracle, Oracle Apps, MS SQL Server, Netezza, DB2; also PL/SQL and flat-file sources

Servers: Oracle 9iAS, Apache, Tomcat, MS Site Server, WebSphere

PROFESSIONAL EXPERIENCE:

Confidential, New York, NY

Lead ETL/Informatica Developer

Responsibilities:-

  • Using Informatica Power Center 9.6 to extract data from sources such as ODS and XML files, then transform, convert & load it into the staging database and from staging to Oracle servers/files; also built mappings that loaded tables directly from various sources into Oracle databases
  • Responsible for Optimization of mappings, sessions, PL/SQL queries using Informatica, TOAD
  • Leading team on Alternative approach for PL/SQL, mapping creation of 100+ tables, Pre & Post Migration to load data to data warehouse (DW) in Dev, Test, QA & Prod Environments
  • Involved in Designing and implementing Fact & Dimensional tables using Relational and Dimensional Modeling techniques like Star Schema, Snowflake Schema for RDBMS
  • Developing UNIX shell scripts to run batch jobs & automate workflows
  • Extracting XML data from sources such as the TIBCO messaging system, files & databases using the XML Parser transformation and loading it into the DWH
  • Responsible for designing Informatica trigger processes to initiate and monitor Informatica events
  • Using the Informatica Data Quality (IDQ) tool for data profiling & data validation based on the given business rules & converting the rules into reusable Mapplets for use in Power Center
  • Working with data modeler in designing Conceptual, Logical and Physical data models making use of ERwin for relational OLTP systems
  • Created and Configured Workflows, Worklets, and Sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager
  • Developed initial & incremental data conversions, validations and loading in Informatica using the Update Strategy transformation to load data into Oracle databases. Designed and implemented Data Mart and operational databases
  • Developed framework for metadata management
  • Designed and Implemented Slowly Changing Dimension (SCD) Type 1 and Type 2 for inserting and updating Target tables in EDW for maintaining the history
  • Performed complete Informatica migrations/upgrade from v9.1 to v9.6
  • Writing Netezza-to-Oracle shell scripts to load tables required by QA tools from Netezza into Oracle
  • Leading onshore and offshore teams to identify and resolve various issues on a day-to-day basis
  • Worked with ETL, reporting and analytics tools - Informatica, Datastage, Sagent, OLAP tools, Business Objects, Cognos, Microstrategy etc
  • Imported metadata from different sources such as relational databases, XML sources and Impromptu catalogs into Framework Manager
  • Used Windows PowerShell to automate server administration and application deployment
  • Responsible for various Requirement Gathering like B2B, Analysis & leading End user Meetings
  • Performing and leading typical admin jobs such as migrations, FTP requests, performance tuning, installation of patches and hotfixes, and tool upgrades
  • Designed and Developed the Netezza SQL Scripts as per the Customer requirements.
  • Using the Unstructured Data option (PDF files, spreadsheets, Word documents, legacy formats, and print streams) to obtain normalized data via the MFT console in Informatica B2B Data Exchange
  • Standardizing the formats of irregular data coming from upstream systems into a more meaningful dataset for analysis using transformations such as Standardizer, Labeler, and Match
  • Extensively worked on Informatica Lookup, stored procedure and update transformations
  • Creating PL/SQL Stored Procedures, Functions, Triggers & Packages to implement business logics
  • Implemented exception handling using autonomous transactions (PRAGMA AUTONOMOUS_TRANSACTION) and locks, and used savepoints, commits & rollbacks to maintain transactional consistency & database integrity.
  • Created & scheduled sessions/batch processes on demand using Informatica Server Manager
  • Tuning of the mappings to ensure data load was completed by SLA. Working with the Cognos developers to create customized BI reports to meet user requirements using Cognos Query Studio
  • Assisted in designing data model structure and E-R modeling with all the related entities and relationship with each entities based on the rules provided by the end users using ERwin.
  • Developed number of Informatica mappings, mapplets and reusable transformations for the product line profitability systems to facilitate daily, monthly loading of Data.
  • Extensively used SQL transformation, Source Qualifier, Joiner (for heterogeneous sources), Lookup, Filter, Aggregator and Update Strategy transformations to transform data into the ODS tables in Teradata and then into the base tables.
  • Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and data conversions and validations based on design specifications
  • Involved in Unit Testing, User Acceptance testing to check if data loads into targets are accurate
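Load-accuracy checks of the kind described in the last two bullets are often scripted in plain UNIX shell. The sketch below compares a source flat file's row count against the rows-loaded figure recorded in a load log; the file paths and the log's "Rows loaded:" wording are illustrative assumptions, not from a real project.

```shell
#!/bin/sh
# Reconcile source row count vs. rows reported loaded by the target.
# All paths and the "Rows loaded:" log format are illustrative.
SRC_FILE=/tmp/demo_src.dat
LOAD_LOG=/tmp/demo_load.log

# Stand-in data for a real extract file and its load log.
printf 'row1\nrow2\nrow3\n' > "$SRC_FILE"
echo "Rows loaded: 3" > "$LOAD_LOG"

src_cnt=$(wc -l < "$SRC_FILE")
tgt_cnt=$(sed -n 's/.*Rows loaded: \([0-9][0-9]*\).*/\1/p' "$LOAD_LOG")

if [ "$src_cnt" -eq "$tgt_cnt" ]; then
    echo "RECONCILED: $src_cnt rows"
else
    echo "MISMATCH: source=$src_cnt target=$tgt_cnt" >&2
fi
```

In practice the same pattern is pointed at the session log or a control table instead of a hand-written log line.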

Environment: Informatica Power Center 9.6/9.1, Informatica Power Exchange 9.6/9.1, Oracle 10.0.1, PL/SQL developer, SQL Server 2008/2014, Aginity workbench for Netezza, B2B Data Exchange, TOAD, Teradata, SSMS, Netezza 4.2, Putty, UNIX Shell Script, Cognos, IDQ tool

Confidential, New York, NY

Sr. ETL/Informatica Developer

Responsibilities:-

  • Created ETL transformations like Lookup, Joiner, Rank and Source Qualifier Transformations in the Informatica v9 designer for multiple RDBMS
  • Populated data from OLTP application database to Enterprise data warehouse using Informatica
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Developed robust mappings using various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union in the Informatica Designer
  • Worked with business users & RADAR reporting teams to gather detailed reporting requirements
  • Prepared Functional Design Specifications and Technical Design Specifications for BO reports and Technical design specification for new Universe development
  • Involved in all phases of SDLC from requirement gathering, design, development, testing (Agile methodology), Production, user training and support for production environment
  • Configured various big data workflows to run on top of Hadoop, comprising heterogeneous jobs like VSQL, Sqoop and MapReduce
  • Created deployment document for migration of code from one environment to another
  • Created Configuration files with XML documents to support the packages in various environments
  • Modified existing mappings for enhancements of new business requirements and Created mapplets to use them in different mappings to load data to data mart and EDW
  • Team lead for the design & creation of a Data Mart as part of the Enterprise Data Warehouse. Data Marts used the Ralph Kimball methodology: five star schemas with conformed dimensions & fact tables
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts
  • Extensively used Informatica Data Explorer (IDE) & Informatica Data Quality (IDQ) profiling tools
  • Loaded data in to the Teradata tables using Teradata Utilities BTEQ, Fast Load, Multi Load etc
  • Developed Mappings for Type 1 and Type 2 Slowly Changing Dimensions (SCD)
  • Exported the Universe to the Repository to make resources available to users for ad-hoc report needs using BI/BO reporting tools.
  • Developed a new Universe holding tables from various functional areas such as Rating, Quality, and Finance to support a new business area with entirely different functionality.
  • Created complex reports presenting Rating information by region per year, using cascading and user objects such as measure objects to build summarized reports in Business Objects
  • Designed, developed & managed Universes in BO Designer to generate Ageing, Corrective Action and similar reports for QA users, making use of prompts, customized LOVs, and multiple data providers
  • Coordinated with offshore and onshore teams on various exercises including report deliveries

Environment: Informatica Power Center 9.5, Informatica Power Exchange, Data Analyzer 8.1, Erwin, Business Objects XI R2/R3/R4.1,TOAD, Teradata, Oracle 9i, PL/SQL, Hadoop, Hive, WinSCP, JSON files

Confidential, Dublin, OH

ETL/Informatica Developer

Responsibilities:-

  • Developed and supported the Extraction, Transformation, and load process (ETL) for data migration using Informatica Power Center and Exchange
  • Extensively used joins, triggers, stored procedures and functions to interact with the backend ERP database using PL/SQL, TOAD and EBS
  • Used Normalizer Transformation to split a single source row into multiple target rows by eliminating redundancy and inconsistent dependencies.
  • Extracted XML data from different sources such as messaging system, ODS, flat files, JSON files, HIPAA 4010 transactions & databases using XML Parser Transformation
  • Performed match/merge and ran match rules to check their effectiveness on processed data.
  • Designed and developed Informatica packages, designed stored procedures, configuration files, tables, views, and functions.
  • Created the labels/deployment groups for code migration.
  • Worked extensively on SQL, BTEQ Scripts and UNIX shell scripts
  • Configured match rule set property by enabling search using rules as per Business Rules
  • Migrated Informatica mappings to SQL Server Integration Services packages to transform data from SQL Server 2000 to MS SQL Server 2005.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, DB2, ODS & Flat Files
  • Worked on Dimension/Fact tables to implement the business rules and get required results. Developed Re-usable Transformations and Re-Usable Mapplets to load data in data mart
  • Used various transformations like Lookup, Filter, Normalizer, Joiner, Aggregator, Expression, Update strategy, Sequence generator, XML Generator, router, SCD etc. in the mappings.
  • Worked on the XML Parser transformation to read the XSD file, build the source definition, and read the XML source file accordingly.
  • Used Netezza SQL to maintain the ETL frameworks and methodologies in use at the company, and accessed the Netezza environment to implement ETL solutions
  • Involved in loading data into Netezza from legacy systems and flat files using scripting on UNIX. Used the nzsql & nzload utilities of Netezza
  • Responsible for Unit testing and Integration testing of mappings and workflows, also responsible for Performance Tuning at the Source, Target, Mapping and Session Level.
  • Used Rational ClearCase to control versions of all files & folders (check-out, check-in).
  • Provided excellent customer service to the internal functional team by pro-actively following up with the issues on hand (through detailed emails and by setting up short meetings).
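The UNIX-scripted Netezza loads mentioned above usually pair a cleansing step with an nzload call. In the sketch below the awk cleansing runs as-is, while the nzload command (which needs the Netezza client and a live database) is left commented out; every file, database, and table name is a placeholder.

```shell
#!/bin/sh
# Pre-load cleansing for a Netezza flat-file load, scripted on UNIX.
# All names are placeholders; nzload itself is shown commented out.
RAW=/tmp/legacy_extract.dat
CLEAN=/tmp/legacy_clean.dat

# Stand-in for a legacy extract: pipe-delimited, with a blank line
# and trailing spaces that a strict loader would reject.
printf 'A|1  \n\nB|2\n' > "$RAW"

# Drop empty lines; trim trailing whitespace from every field.
awk -F'|' 'NF > 0 { for (i = 1; i <= NF; i++) gsub(/[ \t]+$/, "", $i); print }' OFS='|' "$RAW" > "$CLEAN"

# nzload -db EDW -t STG_LEGACY -df "$CLEAN" -delim '|'   # placeholder names
```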

Environment: Informatica Power Center 9.5/9.1, Informatica Power Exchange 9.5/9.1, TOAD, Teradata, SQL Server, Oracle 11g, Shell Scripts, UNIX, JSON files, IDQ tool, Auto-sys scheduling tool, Oracle E-Business Suite (EBS)

Confidential, Reston, VA

ETL/Informatica Developer

Responsibilities:-

  • Worked with team to understand the business requirements and designed, transformed & loaded data into data warehouse (DW) using ETL tool Informatica Power Center and Exchange
  • Created new mappings & updated old according to changes in Business logic using Informatica
  • Used Informatica client tools - Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, and Transformation Developer for defining Source & Target definitions and coded the process of data flow from source system to data warehouse.
  • Analyzed business requirements, performed source system analysis, and prepared mappings and technical design documents from source to target for data conversions, validation & loading.
  • Validated data using Informatica Data Explorer (IDE) & performed a Proof of Concept for Informatica Data Quality (IDQ)
  • Developed complex mappings using transformations like Lookup, Joiner, Sorter, Rank, Source Qualifier, Router, Union, Aggregator, Filter, and Expression in the Power Center Designer.
  • Analyzed existing database schemas and designed star schema models to support users' reporting needs and requirements to develop a Data Mart from Oracle tables
  • Developed initial & incremental data conversions, validations and loading in Informatica using the Update Strategy transformation to load data into Oracle databases
  • Addressed performance issues on ETL jobs, views, stored procedures, Reporting & Ad-hoc SQL.
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, DB2 and Flat Files.
  • Formulated the QA plan for black box testing of the application including Functional, Regression, Integration, Systems and User Acceptance Testing.
  • ETL was performed using Informatica Power Center to build the data Mart as required
  • Involved in SQL Query tuning & provided tuning recommendations to ERP using Oracle EBS tools
  • Extensively worked with various re-usable components like tasks, workflows, Worklets, Mapplets and transformations to load data to data mart and EDW
  • Developed Informatica workflows & sessions associated with mappings using Workflow Manager
  • Worked with session logs, Informatica Debugger, and Performance logs for error handling to fix workflows and session failures.
  • Created UNIX Shell Scripts for batch scheduling and loading process for database used for ERP
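A batch-scheduling wrapper like the ones described in the last bullet commonly guards against overlapping runs with a directory-based lock and timestamps each step into a log. This is a generic sketch under those assumptions; the paths are placeholders and the real load command is only indicated by a comment.

```shell
#!/bin/sh
# Skeleton batch wrapper: a mkdir-based lock prevents overlapping
# runs; each step is timestamped into a log. Paths are placeholders.
LOCK=/tmp/etl_batch.lock
LOG=/tmp/etl_batch.log
: > "$LOG"
rmdir "$LOCK" 2>/dev/null || true   # clear any stale lock from a prior demo run

log() { echo "$(date '+%Y-%m-%d %H:%M:%S') $*" >> "$LOG"; }

if mkdir "$LOCK" 2>/dev/null; then
    trap 'rmdir "$LOCK"' EXIT
    log "batch started"
    # The real job would run here, e.g. a pmcmd or SQL*Loader call.
    log "batch finished"
else
    log "previous batch still running - skipping this run"
fi
```

mkdir is used for the lock because it is atomic on POSIX systems, so two cron-fired instances cannot both acquire it.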

Environment: Informatica Power Center 8.6.1, Power Exchange, Oracle 10g, DB2, PL/SQL, Teradata, TOAD, SQL*Loader, Oracle E-Business Suite (EBS), SQL Server 2005 and Windows Server 2003.

Confidential, New York, NY

Informatica Developer

Responsibilities:-

  • Designed and created mappings using tools like Source Analyzer, Warehouse, Mapping & Mapplet Designer, Transformation Developer, Informatica Repository Manager and Workflow Manager.
  • Developed mappings to load data to Fact and Dimension tables of EDW, for Type 1, Type 2 SCDs and Incremental loading including unit testing of the mappings.
  • Involved in various phases of the project from design to development, integration and acceptance
  • Extensively used various transformations like Source Qualifier, Aggregator, Filter, Joiner, Sorter, Lookup, Update Strategy, Router, Sequence Generator etc. and used transformation language like transformation expression, constants, system variables, data format strings etc.
  • Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions
  • Involved in performance tuning of mappings to tune the data load & SQLs for better performance
  • Debugged maps using Debugger and Transformation's verbose data.
  • Used Informatica PowerCenter’s Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, Mapplets, and Reusable Transformations
  • Analyzed business requirements, performed source system analysis, and prepared the technical design document and source-to-target data mapping document; profiled data using Informatica Data Explorer (IDE) & performed a Proof of Concept for Informatica Data Quality (IDQ)
  • Performance tuning of mappings, transformations and sessions to optimize session performance.
  • Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.
  • Created shared folders, local and global shortcuts to reuse metadata.
  • Resolved performance issues for huge volume of data & increased the performance significantly
  • Scheduled and monitored workflows by use of Workflow Manager and Workflow Monitor.
  • Worked on Dimension/Fact tables to implement the business rules and get required results. Developed Re-usable Transformations and Re-Usable Mapplets.
  • Defined parameters, variables and parameter files for flexible mappings/sessions.
  • Prepared various kinds of documents like Test case document, mapping design document.
  • Involved in the migration of code from development to QA environment and then to production.
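The parameter files mentioned above follow PowerCenter's sectioned text format: a header names the folder, workflow, and session, mapping parameters are prefixed with $$, and built-in or session variables with $. All folder, workflow, session, and parameter names below are placeholders for illustration.

```
[DWH_FOLDER.WF:wf_daily_load.ST:s_m_load_customers]
$$LoadDate=2014-01-31
$$SourceSystem=ORDERS
$DBConnection_Source=ORA_SRC_DEV
$PMSessionLogFile=s_m_load_customers.log
```

One file can carry multiple such sections, so a single parameter file can drive a whole workflow across environments; the session is then pointed at it via its parameter filename property.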

Environment: Informatica Power Center 8.1 (Source Analyzer, Warehouse Designer, Transformations, Mapplet & Mapping Designer, Workflow Manager), Worklets, SQL, MS SQL 2005, Windows NT, IDQ

Confidential, Bloomfield, CT

ETL Developer

Responsibilities:-

  • Analyzed business requirements, technical specification, source repositories and physical data models for ETL mapping and process flow.
  • Helped to create detailed Technical specifications for Data Warehouse and ETL processes.
  • Tuned Informatica mappings and SQL queries to improve the execution time by applying suitable Partitioning mechanisms and tuning individual transformations inside the mapping.
  • Worked on Informatica - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Created various sessions, tasks like event raise and event wait from OLTP data sources
  • Designed mappings to load data from HIPAA files into the database per healthcare industry standards
  • Worked on the requirements gathering and analysis for loading data into ODS
  • Extracted/loaded data from/into diverse source/target systems like Oracle, XML, EDW & Flat Files
  • Used most of the transformations such as the Source Qualifier, Expression, Aggregator, Filter, Connected and unconnected Lookups, Joiner, update strategy and stored procedure.
  • Developed mappings to load Fact and Dimension tables, SCD Type I and SCD Type II dimensions and Incremental loading and unit tested the mappings.
  • Created new and enhanced existing stored-procedure SQL used for semantic views and load procedures for materialized views.
  • Addressed performance issues on ETL jobs, views, stored procedures, Reporting & Ad-hoc SQL.
  • Involved in creating different types of reports including OLAP, Drill Down and Summary in BO.
  • Performed Performance Tuning of SQL, ETL & other processes to optimize session performance
  • Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer to load data to EDW.

Environment: Informatica Power Center 8.1, Oracle 9i, PL/SQL Developer, SQL*Plus, UNIX Shell Script
