
Sr. ETL Informatica Developer Resume

Little Rock, AR

PROFESSIONAL SUMMARY:

  • 7+ years of experience with all phases of the SDLC, including Analysis, Design, Development and Implementation of operational database systems (OLTP) and data warehouse systems (OLAP). The past 7 years were spent primarily on data warehouse projects using Informatica PowerCenter, Oracle, SQL Server, Business Objects and Cognos on UNIX and Windows platforms.
  • Experience in database modeling, analysis, design, development, data conversion using ETL/Informatica and Performance tuning and implementation of several database and data warehousing applications.
  • Extensive ETL experience using Informatica PowerCenter/PowerMart 9.6.1/9.1/8.6.1/8.5/7.1/6.2 (Designer, Workflow Manager, Workflow Monitor and Server Manager) and PowerExchange.
  • Extensive experience in implementing data cleanup procedures, transformation scripts, triggers, stored procedures and Metadata Manager, and in executing test plans for loading data successfully into the targets.
  • Extensively used B2B Data Transformation Studio to convert data from unstructured to structured form using Parsers, Mappers and Streamers.
  • Experience in Data Modeling (Logical and Physical Design for distributed databases, Reverse - engineering and Forward-engineering using Erwin).
  • Expertise in working with Oracle stored programs, packages, cursors, triggers, database links, snapshots, roles, privileges, tables, constraints, views, indexes, sequences, synonyms, dynamic SQL and SQL*Loader in a distributed environment.
  • Worked on Power Exchange for change data capture (CDC).
  • Extensive experience in integrating Informatica Data Quality (IDQ) with Informatica PowerCenter. Experienced in ODI upgrades from 10g to 11g; tested ODI 10g and 11g running in parallel.
  • Involved in extracting source data from SAP, DB2, Flat Files, VSAM files on different systems and populated target data warehouse on NCR Teradata.
  • Defined Target Load Order Plan for loading data into different Target Tables.
  • Utilized Teradata Fast Load, Multi Load options for loading data to Teradata database.
  • Used Teradata Tools and Utilities like BTEQ/ODBC/JDBC/OLE.
  • Developed SQL code for data validation and data computation processes on the source DB2 transaction system and on the target warehouse using Teradata.
  • Implemented performance tuning techniques at application, database and system levels.
  • Expertise in Designing Fact Tables, Dimension Tables, Summary Tables.
  • Strong knowledge in OLAP Systems, Kimball and Inmon methodology and models, Dimensional modeling using Star schema and Snowflake schema.
  • Experience in OLAP/BI tools such as Business Objects, Cognos and SSRS.
  • Experience in UNIX Shell scripting for automating the Jobs.
  • Significant testing experience, including designing test cases, UAT testing and designing test data; effective in cross-functional and global environments, managing multiple tasks and assignments concurrently.
  • Highly organized and detail-oriented professional with strong business and technical skills.

TECHNICAL SKILLS:

ETL Tools: Informatica 9.6.1/9.1/8.6.1/8.5/7.1/6.2 (PowerMart/PowerCenter), Informatica PowerExchange, Informatica PowerConnect, IDQ, IDE, DataStage 7.5.x, B2B Data Transformation Studio.

Reporting Tools: Business Objects XI R2/R1/6.5/6.1/5.1, Cognos ReportNet, Cognos 8, SSRS (SQL Server Reporting Services), SAS.

Databases: Oracle 10g/9i/8i/8.0/7.x, IBM DB2 UDB 8.0/7.0, Teradata V2R5/V2R6, MS SQL Server 2005/2000/7.0/6.0, Sybase, MS Access.

Database utilities: SQL*Plus, stored procedures, functions, exception handling.

Data Modeling tool/Methodology: MS Visio, Erwin 4.x/3.x, Ralph Kimball Methodology, Bill Inmon Methodology, Star Schema, Snowflake Schema, Extended Star Schema, Physical and Logical Modeling.

Languages: C, C++, JAVA, Visual Basic, T-SQL, PL/SQL, XML

Operating Systems: HP-UX, UNIX, IBM AIX 4.3/4.2, Sun Solaris 9/8/7, Windows 2003/2000/NT.

PROFESSIONAL EXPERIENCE:

Confidential, Little Rock, AR

Sr ETL Informatica Developer

Responsibilities:

  • Coordinated with Business Analysts to understand the business requirements and implemented the same into a functional Data warehouse design.
  • Interpreted logical and physical data models for Business users to determine common data definitions.
  • Developed ETL technical specs, Visio for ETL process flow and ETL load plan, ETL execution plan, Test cases, Test scripts etc.
  • Worked with various Informatica client tools like Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Workflow Manager.
  • Created mappings using different transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
  • Developed various ODI interfaces to load data from Flat file & Relational Sources to Oracle DataMart.
  • Implemented the Informatica Web Services Consumer transformation to FTP data from one server to other servers.
  • Used PowerExchange to import copybook definitions and to row-test the data from data files.
  • Created procedures, functions and database triggers to enforce business rules using PL/SQL.
  • Implemented Performance tuning in Mappings and Sessions by identifying the bottlenecks and Implemented effective transformation Logic.
  • Involved in tuning the database for better performance by analyzing the table, adding Hints and by Query Tuning methods.
  • Utilized Teradata Fast Load, Multi Load options for loading data to Teradata database
  • Used Teradata Tools and Utilities like BTEQ/ODBC/JDBC/OLE
  • Identified and defined changes and enhancements to existing applications.
  • Used ODI Designer for importing tables from database, reverse engineering, to develop projects, and release scenarios.
  • Created custom plans for product name discrepancy checks using IDQ and incorporated the plans as Mapplets into PowerCenter.
  • Developed UNIX shell scripts for scheduling workflows.
  • Handled ETL-related activities by interacting with other teams on a regular basis.
  • Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
  • Developed and tested all the backend programs, Error Handling Strategies and update processes.
  • Involved in Unit Testing and User Acceptance Testing to verify that the data extracted from different source systems was loaded into the targets accurately, according to user requirements.
  • Involved in production support activities including monitoring loads, resolving exceptions and correcting data problems.
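The shell-script workflow scheduling described above is commonly done by wrapping Informatica's pmcmd utility and checking its exit status. A minimal sketch of that pattern; the domain, service, folder and workflow names (DEV_Domain, IS_Dev, SalesDW, wf_load_sales) and the PM_USER/PM_PASS environment-variable names are hypothetical, and PMCMD defaults to a dry-run echo so the script can be read and run without an Informatica client installed:

```shell
#!/bin/sh
# Sketch of a workflow-launch wrapper. All names below are placeholders,
# not taken from this resume. PMCMD defaults to 'echo pmcmd' (dry run)
# so no Informatica client is required to exercise the script.
PMCMD="${PMCMD:-echo pmcmd}"

run_workflow() {
    folder="$1"
    workflow="$2"
    # -uv/-pv name environment variables holding the credentials
    $PMCMD startworkflow -sv IS_Dev -d DEV_Domain \
        -uv PM_USER -pv PM_PASS -f "$folder" -wait "$workflow"
    status=$?
    if [ "$status" -ne 0 ]; then
        echo "ERROR: workflow $workflow failed with exit code $status" >&2
        return "$status"
    fi
    echo "OK: workflow $workflow completed"
}

run_workflow SalesDW wf_load_sales
```

In a real scheduler (Tidal, cron) the wrapper's nonzero exit code is what drives alerting and reruns.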

Environment: Informatica PowerCenter 9.6.1/9.1.0, IDQ, UNIX, Informatica B2B Data Exchange, Oracle 11g, UNIX Shell Scripts, Tidal, SQL, PL/SQL, Flat files, DB2, Teradata.

Confidential, NC

Sr ETL Informatica Developer

Responsibilities:

  • Coordinated with users/analysts to understand the business of the application, gathered requirements and captured them in technical design documents.
  • Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
  • Involved in developing the technical documents from functional specifications.
  • Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Designed complex Informatica mappings with transformations such as Aggregator, Expression, Joiner, Filter, Source Qualifier, XML, Union, connected/unconnected Lookup, Sequence Generator, Update Strategy, Stored Procedure and Router transformations to transform and load data from Mainframe files, JMS Queues, Flat Files and Oracle sources into the Oracle warehouse.
  • Developed complex Interfaces and packages using ODI and migrated the new generated version to test and prod environments.
  • Used ODI as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
  • Created and Configured Workflows, Worklets, and Sessions to transport the data to target using Informatica Workflow Manager. Also created various tasks such as command task, email task, timer tasks, Event wait and Event Raise tasks in the workflow manager.
  • Involved in creating Reusable Transformations, Mapping Parameters and Mapping Variables.
  • Worked on Teradata insert, Update, Upsert and Dynamic SQL to transform and Load Summary tables built on Core Data Warehouse using BTEQ.
  • Worked on error handling in Teradata SQL scripts using the BTEQ ERRORCODE and ACTIVITYCOUNT variables.
  • Developed UNIX shell scripts for scheduling workflows by using pmcmd.
  • Involved in tuning the database for better performance by analyzing the table, adding Hints and by Query Tuning methods.
  • Tuned performance of Informatica sessions for large data files by increasing block size, data cache size and sequence buffer length.
  • Developed Scripts to automate the Data Load processes to target Data warehouse.
  • Implemented Informatica IDQ for Data Cleansing and Data Standardization.
  • Implemented the Schedules for the Historical Load run and Incremental Load runs.
  • Designed reference data and data quality rules using IDQ and was involved in cleansing data in the Informatica Data Quality 9.1 environment.
  • Tested, gathered statistics and monitored run times to identify and fix performance issues.
  • Migrated mappings, sessions and workflows from the Development to the Test environment.
  • Involved in Performance testing the mappings and workflows in production staging before installing in the Production environment.

Environment: Informatica PowerCenter 9.1/8.6.1, Business Objects XI 3.1, Crystal Reports 2008, Tidal, Oracle 11g, MS SQL Server 2008/2005, Erwin 7.3, IDQ, Windows NT, IBM AIX.

Confidential, STAMFORD, CT

ETL Informatica developer

Responsibilities:

  • Performed major role in understanding the business requirements and designing and loading the data into data warehouse (ETL).
  • Used ETL (Informatica) to load data from source to Data Warehouse.
  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer - to define source and target definitions and code the data flow from the source systems to the data warehouse.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter and Source Qualifier.
  • Used Informatica repository manager to backup and migrate metadata in development, test and production environments.
  • Worked with mapping wizards for slowly changing dimensions and slowly growing dimensions by using Informatica mapping designer.
  • Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status.
  • Created complex Informatica mappings, reusable objects of Mapplets depending on client requirements
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Used techniques like source query tuning, single pass reading and caching lookups to achieve optimized performance in the existing sessions.
  • Used CDC for moving data from Source to Target.
  • Identified the data in the source system that has changed since the last extraction with the help of CDC.
  • Used UNIX Shell scripting for automation of the process, invoking PL/SQL procedures, and Informatica sessions.
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source to target mappings/source views, and to verify data in target tables.
  • Designed and Developed Oracle PL/SQL procedures, performed Data Import/Export, Data Conversions and Data Cleansing operations.
  • Used Autosys for automating Batches and Session.
  • Involved in writing shell scripts on Unix (AIX) for Informatica ETL tool to run the Sessions.
  • Involved in generating reports from DM layer using Business Objects.
  • Involved in tracking, reviewing and analyzing defects.

Environment: Informatica 8.6.1, Business Objects XI 3.1, Crystal Reports 2008, Oracle 11g, MS SQL Server 2008/2005, Erwin 7.3, Windows NT, IBM AIX.

Confidential

Data Warehouse Developer

Responsibilities:

  • Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Analyzed Flat File relationships and the existing systems; met with end users and business units to define requirements.
  • Documented user requirements, translated requirements into system solutions, and developed implementation plans and schedules.
  • Developed data Mappings between source systems and warehouse components.
  • Developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy, and translated business rules and functional requirements into ETL procedures.
  • Developed and tested all the backend programs, Informatica mappings and update processes.
  • Created and Monitored Batches and Sessions using Informatica PowerCenter Server.
  • Responsible for tuning ETL procedures and Star schemas to optimize load and query performance.
  • Extensively worked on performance tuning of programs, ETL procedures and processes.
  • Wrote documentation describing program development, logic, coding, testing, changes and corrections.
  • Performed analysis and resolution of Help Desk tickets and maintenance for assigned applications.

Environment: Informatica PowerMart 7.1, Oracle, PL/SQL, Windows, Remedy, Synergy.
