
ETL/Informatica Developer Resume


Lexington, KY

SUMMARY

  • Over 8 years of professional IT experience in Analysis, Design, Development, Testing and Implementation of various Data Warehousing and Software Applications.
  • Proven experience with IDQ Developer and Data Analyst tool installation and configuration.
  • Used IDQ for data analysis, cleansing, matching, exception handling, reporting & monitoring capabilities.
  • Designed and implemented data quality and business rules using IDQ and PowerCenter workflows, mappings, mapplets, exception tables, ad-hoc reporting, and data quality scorecards.
  • Worked on Informatica MDM Hub components such as Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter & Data Modeling.
  • Over 6 years of extensive experience on Informatica Power Center with strong business understanding and knowledge of Extraction, Transformation and Loading of data from heterogeneous source systems like Flat files, Excel, XML, Oracle, Sybase, SQL Server.
  • Experienced in logical and physical data modeling of staging and warehouse environments using data modeling tools like Erwin, Rational Rose and Oracle Designer.
  • Extensively involved in ETL Data warehousing using Informatica Power Center 7.x/8.x/9 Designer tools like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager and Workflow Monitor.
  • Extensively worked on transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator and Stored Procedure transformations.
  • Hands-on experience in identifying and resolving performance bottlenecks at various levels such as sources, mappings and sessions. Hands-on experience in optimizing SQL scripts and improving warehouse load performance.
  • Very strong in writing SQL joins (including multi-table joins), nested queries and unions.
  • Experience in scheduling the batch jobs using workflow monitor, ESP, Tivoli, Autosys and FTP methods.
  • Worked with Dimensional Data warehouses in Star and Snowflake Schemas, created slowly changing dimensions (SCD) Type1/2/3 dimension mappings.
  • Experience in Optimizing the Performance of SQL scripts and Oracle database/application tuning.
  • Excellent in coding using SQL, SQL*Plus, PL/SQL, Procedures/Functions, Triggers and Packages.
  • Developed Batch Jobs using UNIX Shell scripts to automate the process of loading, pushing and pulling data from different servers.
  • Good experience in performing and supporting Unit testing, System Integration testing (SIT), UAT and production support for issues raised by application users.
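
As one illustration of the SCD Type 2 work listed above, the expire-and-insert pattern can be sketched in plain Oracle SQL. This is a minimal, hypothetical example; the table, column and sequence names (dim_customer, stg_customer, rate_plan) are invented for illustration, and in PowerCenter itself this logic is usually built with a Lookup plus an Update Strategy transformation rather than hand-written SQL:

```sql
-- SCD Type 2, step 1: expire the current dimension row when a tracked attribute changes
UPDATE dim_customer d
   SET d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.cust_id    = d.cust_id
                  AND s.rate_plan <> d.rate_plan);

-- Step 2: insert a new current version for each changed or brand-new customer
INSERT INTO dim_customer
       (cust_key, cust_id, rate_plan, eff_start_date, eff_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.rate_plan, SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.cust_id      = s.cust_id
                      AND d.current_flag = 'Y');
```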

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.5, 9.0.1, 8.x, 7.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server, Administration Console), PowerExchange / PowerConnect, Informatica Data Quality (IDQ) / MDM

BI Tools: Cognos, QlikView, Business Objects

RDBMS: Oracle 11g/10g/9i, SQL Server 2000/2005/2008, Teradata

Operating System: UNIX, IBM AIX 4.3/4.2, Windows NT/2000/XP/7

Job Scheduling Tools: Tivoli 7.x, Autosys 4.5

Languages: SQL, PL/SQL, XML, C

Modeling Tools: ER Studio, Erwin 4.1, UML

Other Tools: MS-Office, MS-Access, Adobe Illustrator, Adobe Photoshop.

PROFESSIONAL EXPERIENCE

Confidential, Lexington, KY

ETL/ Informatica Developer

Responsibilities:

  • Worked with the business analysts and DBA to gather business requirements to be translated into design considerations.
  • This also provided solid expertise with tool migration techniques.
  • Identified and tracked the slowly changing dimensions to capture the changes, in heterogeneous sources and determined the hierarchies in dimensions.
  • Sped up development by using reusable transformations and worklets, further lowering errors and improving workflow performance. Developed schedules to automate the update process and Informatica batches/sessions.
  • Extracted data from Oracle, MS SQL Server and loaded them into Oracle.
  • Worked with a team focused mainly on capturing changes in customer plans.
  • Also generated notices for customers whose contracts would expire within three months, giving them better options when choosing rate plans based on their usage. This was achieved through extensive use of Lookup transformations and data analysis.
  • Tested data quality using IDQ.
  • Worked with Source Analyzer, Warehouse Designer, Transformation designer, Mapping designer and Workflow Manager.
  • Developed data Mappings between source systems and warehouse components.
  • Created various Transformations like Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy, Stored Procedures, and Router etc.
  • Organized data in order to capture the basic nature of the business environment, providing a stable view of the data associated with the core business processes.
  • Monitored sessions that were scheduled, running, completed or failed.
  • Wrote SQL queries to retrieve data from various tables and to test the database.
  • Worked on Documentation to describe program development, logic, coding, testing, changes and corrections.
  • Participated in Enhancements meeting to distinguish between bugs and enhancements.
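
The data quality and database testing described above typically comes down to a handful of SQL checks. A minimal sketch, with illustrative table names (stg_customer, src_customer, dim_customer are assumptions, not names from the actual project):

```sql
-- Duplicate check: natural keys that appear more than once in staging
SELECT cust_id, COUNT(*) AS dup_cnt
  FROM stg_customer
 GROUP BY cust_id
HAVING COUNT(*) > 1;

-- Row-count reconciliation between source and warehouse
SELECT (SELECT COUNT(*) FROM src_customer) AS src_cnt,
       (SELECT COUNT(*) FROM dim_customer) AS tgt_cnt
  FROM dual;
```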

Environment: Informatica PowerCenter 9.5, IDQ, Windows, Oracle 10g, Flat Files, TOAD, SQL, PL/SQL, UNIX.

Confidential, Frankfort, KY

ETL/ Informatica Developer

Responsibilities:

  • Involved in analysis, acquiring requirements from the Business Requirements Document (BRD) and functional/technical specifications, and in development, deployment and testing alongside Business Analysts, UAT, TQA and developers.
  • Created IDD/MDM configurations per customer needs.
  • Converted the business requirements into technical specifications for ETL process by populating the fact and dimension tables of data warehouse.
  • Created and executed SQL scripts in oracle to test the data flow from OLTP systems to data warehouse.
  • Designed and assisted Informatica ETL developers in creating new aggregates.
  • Created Informatica mappings for initial load and daily updates.
  • Developed several mappings (Source Qualifier, Lookup, Filter, Joiner, Aggregator, Sequence Generator, Expression, Router, Normalizer, and Update Strategy) to load data from multiple sources to the data warehouse.
  • Created sessions, workflows and database connections using Informatica Workflow Manager.
  • Worked with pre and post session SQL commands to drop and recreate the indexes on data warehouse using source qualifier transformation of Informatica Power center.
  • Involved in troubleshooting the load failure cases, including database problems.
  • Handled Full load and refresh load via staging tables in the ETL Layer.
  • Involved in Design and Data Modeling using Star schema.
  • Configured environments for full ETL and incremental data loads from OLTP to OLAP data sources using DAC and Informatica routines.
  • Configured Informatica source and target database connections in DAC.
  • Configured new customized mappings and workflows in DAC for new customized subject areas.
  • Configured DAC to run nightly data loads.
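
The pre- and post-session SQL pattern mentioned above (dropping and rebuilding warehouse indexes around a bulk load) might look roughly like this; the index and table names (idx_fact_sales_cust_key, fact_sales) are placeholders, not the actual project objects:

```sql
-- Pre-session SQL (set on the session / Source Qualifier properties):
-- drop the index so the bulk load is not slowed by index maintenance
DROP INDEX idx_fact_sales_cust_key;

-- ... the session loads FACT_SALES here ...

-- Post-session SQL: rebuild the index once the load has completed
CREATE INDEX idx_fact_sales_cust_key
    ON fact_sales (customer_key);
```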

Environment: Informatica 9.1, Oracle 11g, MS SQL, UNIX, Windows, TOAD, SQL, PL/SQL

Confidential, GA

ETL/Informatica Developer

Responsibilities:

  • Gathered user Requirements and designed Source to Target data load specifications based on Business rules.
  • Analyzed the functional specs provided by the data architect and created technical specs documents for all the mappings.
  • Developed slowly changing dimension Type 2 mappings to maintain the history of the data.
  • Developed several mappings in Informatica using a variety of PowerCenter transformations, Mapping Parameters, Mapping Variables and Parameter files in Mapping Designer.
  • Created reusable transformations and mapplets and used them in mappings.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors that occur while loading.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Implemented performance tuning logic on Sources, Targets, mappings, and sessions to provide maximum efficiency.
  • Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad and Informatica.
  • Used Informatica Version Control for checking in all versions of the objects used in creating the mappings and workflows and to keep track of the changes in the development, testing and production environments.
  • Implemented various loads like Daily Loads, Weekly Loads, and Quarterly Loads using Incremental Loading Strategy
  • Created Workflows with worklets, event wait, email and command tasks using Workflow Manager and monitored them in Workflow Monitor.
  • Created various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process. Maintained the batch processes using Unix Shell Scripts.
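
As one illustration of the Teradata utility work listed above, a minimal BTEQ import script might look roughly like this. This is a hedged sketch: the logon string, file path, and staging table (tdprod, /data/inbound/customers.csv, stg_customer) are hypothetical, and high-volume loads would normally go through FastLoad or MultiLoad instead:

```sql
.LOGON tdprod/etl_user,etl_password;
DATABASE edw_stage;

/* Load a comma-delimited flat file into a staging table */
.IMPORT VARTEXT ',' FILE = /data/inbound/customers.csv;
.QUIET ON
.REPEAT *
USING (cust_id VARCHAR(10), cust_name VARCHAR(50))
INSERT INTO stg_customer (cust_id, cust_name)
VALUES (:cust_id, :cust_name);

.LOGOFF;
.QUIT;
```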

Environment: Informatica PowerCenter 8.6.1, Oracle 10g/11g, SQL, PL/SQL, UNIX, Flat files, XML, MS Access, SQL Server 2008, Teradata, Erwin, Business Objects.

Confidential, Wilmington, DE

Informatica Developer

Responsibilities:

  • Designed and Developed Informatica mapping and sessions to create flat files and metadata.
  • Developed PL/SQL Stored Procedures, Packages.
  • Developed Database triggers for implementing Business logic.
  • Involved in Informatica Administration activities such as creating & managing Informatica users, groups and privileges.
  • Involved in taking backup of the repository, restoring repository etc.
  • Tuned the performance of Mappings, Sessions, relational source and targets.
  • Interfaced Trillium to cleanse, de-duplicate and identify household Customers.
  • Prepared Software engineering quality documents related to design, development, testing as per CMM standards.
  • Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
  • Worked with Business Analyst and Analyzed specifications and identified source data needs to be moved to data warehouse, Participated in the Design Team and user requirement gathering meetings.
  • Worked closely with distributors to identify problem areas, analyzing their data and existing system limitations.
  • Used workflow monitor to monitor the jobs, reviewed error logs that were generated for each session, and rectified any cause of failure.
  • Created documentation (run book, Technical Design Document and Low-Level Design Document) for supporting the application, using Microsoft Word and Visio.
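
A small example of the kind of PL/SQL trigger used to implement business/audit logic, as described above. The table and column names (customer, rate_plan, customer_plan_audit) are invented for illustration:

```sql
CREATE OR REPLACE TRIGGER trg_customer_plan_audit
AFTER UPDATE OF rate_plan ON customer
FOR EACH ROW
BEGIN
  -- Record every rate-plan change for downstream auditing
  INSERT INTO customer_plan_audit (cust_id, old_plan, new_plan, changed_on)
  VALUES (:OLD.cust_id, :OLD.rate_plan, :NEW.rate_plan, SYSDATE);
END;
/
```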

Environment: Informatica PowerCenter 7.x/6.x, IDQ, Teradata, SQL Server, Windows NT, UNIX Shell Scripts.

Confidential

ETL Tester

Responsibilities:

  • Developed Test Plans, Test Cases, and Test Scripts for SIT and support for UAT tests.
  • Performed integration, regression, database and functional testing of important modules in the application, mapping them against the client requirements.
  • Involved in discussions with development team.
  • Reporting the test activity progress and testing status to the test lead.
  • Worked with the ETL group to understand mappings for dimensions and facts.
  • Worked with Data Warehousing components such as the component model and ETL tools.
  • Queried databases with SQL to obtain sample data for billing verification.
  • Loaded data to different databases using SQL scripts and maintained a repository for data loading scripts.
  • Involved in extensive DATA validation using SQL queries and back-end testing.
  • Worked on SQL scripts to load data in the tables.
  • Extensive experience in ETL/data warehouse back-end testing and BI report testing.
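
Back-end data validation of the kind listed above often reduces to set-difference queries between source and target. A sketch with illustrative table names (src_orders and dwh_fact_orders are assumptions for the example):

```sql
-- Rows present in the source but missing or different in the warehouse
SELECT cust_id, order_amt FROM src_orders
MINUS
SELECT cust_id, order_amt FROM dwh_fact_orders;

-- The reverse direction catches unexpected extra rows in the target
SELECT cust_id, order_amt FROM dwh_fact_orders
MINUS
SELECT cust_id, order_amt FROM src_orders;
```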

Environment: Windows XP, SQL*Plus, Oracle 9i, TOAD, MS Office Suite, SQL Navigator.
