
Informatica Developer Resume

MD

Summary

  • IT professional with over six and a half years of progressive experience in requirements analysis, application development and testing, with a comprehensive background in database development and data warehousing.
  • Over 6 years of design and development of ETL methodology supporting data transformations and processing in a corporate-wide ETL solution using Informatica Power Center 6.x/7.x/8.x
  • Knowledge of data warehousing techniques, Star / Snowflake schema (Ralph Kimball methodologies), data quality, ETL, OLAP and Report delivery methodologies
  • Exposure to Retail, Finance, Telecommunication and Health care business sectors with regard to data warehouse development.
  • Thorough knowledge of the Data Mart Development Life Cycle. Performed all phases of development, including Extraction, Transformation and Loading (ETL) of data from various sources into data warehouses and data marts using Power Center (Repository Manager, Designer, Server Manager, Workflow Manager, Workflow Monitor).
  • Integrated heterogeneous Data sources using ETL
  • Extensive experience in developing re-usable Transformations, Mappings and Mapplets.
  • Used Informatica for Migrating data from various OLTP servers/databases.
  • Developed transformation logic to cleanse the source data of inconsistencies during the source to stage loading.
  • Working knowledge of Unix, shell scripting and scheduling cron jobs.
  • Extensive experience in Relational Database Systems like Oracle, SQL Server, design and database development experience with SQL, PL/SQL, SQL PLUS, TOAD, Stored procedures, triggers.
  • Proven ability to implement technology based solutions for business problems
  • Strong analytical, problem-solving, organizational, communication, learning and team skills
  • Actively involved in Performance Tuning, product support on various platforms. SQL Tuning and creation of indexes for faster database access and better query performance
  • Troubleshot production issues while transferring data from sources to targets
  • Superior communication, decision-making, organizational and customer-service skills.

SKILLS:

  • Data Warehousing: Informatica (Power Center 6.2/7.1/8.1), Erwin
  • RDBMS: Oracle 10g/9i/8i/8.0/7.x, MS SQL Server, TOAD
  • Programming: SQL, PL/SQL, Unix Shell Scripting
  • OS: Windows 95/98/2000/XP, Sun Solaris (UNIX), MS-DOS
  • Methodologies: ETL, OLAP, Complete Software Development Life Cycle
  • Data Modeling: Conceptual / Logical / Physical / Dimensional, Star / Snowflake

Professional Experience

Confidential, MD April 08 – Present
Informatica Developer

Description:
nVision is a modernization and major update of the NIH Data Warehouse (DW). nVision offers the NIH business community significant new business intelligence and reporting technologies. The goal of nVision is to provide NIH decision makers, managers and staff easy Web-based access to integrated corporate data from NIH enterprise systems. nVision supports the NIH initiative to provide NIH users easier access to enterprise systems via the NIH Portal.
nVision is the successor to the NIH Data Warehouse and is the reporting solution for the NIH Business System (NBS). For the past few years, new nVision business areas have been developed based on the deployment of NBS functions. These business areas include travel, budget and finance, acquisition and supply, property, and service and supply fund. Eventually all of the Data Warehouse business areas will be updated and unified within nVision, and ultimately nVision will provide reporting capability for all of the current NIH Data Warehouse business areas.

Responsibilities:

  • Performed gap analysis of the existing commercialized version and the new Federalized version of Informatica.
  • Customized the existing mapping provided by the Federalized version of Informatica depending upon the additional NIH requirements.
  • Created new mappings to load the NIH specific Dimension tables using different transformations like Source Qualifier, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Joiner, Filter and Router transformations.
  • Created mappings using the TYPE II logic to implement Slowly Changing Dimensions.
  • Created sessions, worklets and workflows to run the mappings in the desired order.
  • Performed Unit testing for individual mappings and Integration testing for the whole module.
  • Scheduled the workflows to run in an automated manner using the PMCMD commands and shell scripts.
  • Performed performance tuning to fine-tune the mappings so that load times could be decreased.
  • Implemented the Federalized version of Informatica in production and participated in post implementation support.
  • Actively participated in debugging various production issues.
  • Documented the entire process. The documents included the mapping document, unit testing document and system testing document among others.
  • Extensively worked on Informatica tool for Extraction, Transformation and Loading.
  • Involved in the development of Informatica mappings and tuned existing mappings for better performance. Worked closely with the DBA’s and end users.
  • Involved in designing and modifications of existing Fact and Dimension tables.
  • Created Informatica mappings with PL/SQL procedures and functions and various transformations such as Filter, Lookup, Stored Procedure, Joiner, Update Strategy, Expression and Aggregator to build business rules.
  • Used Repository manager to add Repository, User groups, Users and managed users by setting up their privileges and profile.
  • Configured sessions to preserve existing records and handle updates.
  • Extensively worked on the Database Triggers, Stored Procedures, Functions and Database Constraints.
  • Created update strategy and stored procedure transformations to populate targets based on business requirements.
  • Implemented Slowly Changing dimension type2 methodology for accessing the full history of accounts and transaction information.
  • Built in mapping variables/parameters and created parameter files to allow flexible runs of sessions/mappings based on changing variable values
  • Developed stored procedures on Oracle 9i databases to impart business logic to calculate balances, generating invoices, processing payments into mappings and trigger pre/post session activities through Unix shell scripts.
  • Designed complex mappings involving target load order and constraint based loading.
  • Improved session performance by batch processing and monitored sessions using Workflow Manager.
  • Involved in enhancements and maintenance activities of the data warehouse including performance tuning, rewriting of stored procedures for code enhancements, creating aggregate tables, and modifying target tables.
  • Executed sessions, sequential and concurrent batches for proper execution of mappings and performed event based scheduling.
  • Scheduled several Cron Jobs for running the sessions.
  • Created documentation for different mappings and generated reports as per requirements.
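The pmcmd-based scheduling mentioned above can be sketched as a small wrapper script. This is a minimal illustration, not the actual project script: the service, domain, user, folder and workflow names are all hypothetical, and the command is echoed as a dry run rather than executed, since pmcmd is only available on the Informatica server.

```shell
#!/bin/sh
# Sketch of a wrapper for launching an Informatica workflow via pmcmd.
# All identifiers below (IS_NVISION, Domain_nih, etl_user, folder and
# workflow names) are hypothetical placeholders.
start_workflow() {
    folder=$1
    workflow=$2
    # Echo the pmcmd invocation instead of running it (dry run).
    echo "pmcmd startworkflow -sv IS_NVISION -d Domain_nih" \
         "-u etl_user -f $folder -wait $workflow"
}

start_workflow NIH_DIMENSIONS wf_load_dim_tables
```

A cron entry such as `0 2 * * * /opt/etl/start_workflow.sh` (path hypothetical) would then run the load nightly, which is how cron scheduling and pmcmd typically fit together.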

Environment:
Windows XP Professional, UNIX, Informatica 7.1.2, Oracle 9i, PL/SQL, TOAD 8.0

Confidential, GA May 06 – Mar 08

ETL Developer

Description:
The project is to provide a central repository of Subscriber Information, Billing Information and vendor Information. The data is extracted, and transformed from the source, which is the company’s OLTP system (AS400) and loaded into a centralized data warehouse for various strategic business-reporting purposes.

Responsibilities:

  • Analyzed specifications and identified source data that needed to be moved to the data warehouse; participated in design team and user requirement gathering meetings
  • Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system.
  • Participated in the analysis of development environment of extraction process and development architecture of ETL process.
  • Redesigned schema from transaction snapshot based to periodic snapshot schema for the risk management team to monitor monthly sales activities
  • Coordinated with customer in finding the sources and targets for data conversion.
  • Provided technical support to customers in the data conversion process.
  • Involved in the preparation of documentation for ETL standards, procedures and naming conventions.
  • Extracted data from flat files, performed mappings based on company requirements and loaded it into Oracle tables.
  • Extensively used Informatica functions LTRIM, RTRIM, IIF, DECODE, ISNULL, TO_DATE, DATE_COMPARE in Transformations.
  • Tuned Informatica session for large data files by increasing data cache size, sequence buffer length and target based commit interval in order to increase the performance.
  • Scheduled workflows using pmcmd commands.
  • Provided production support and performed development, modification and maintenance of programs using shell scripts; provided on-call support for batch jobs running on UNIX.
  • Created and Scheduled Sessions and Batch Process based on demand, run on time, run only once using Informatica workflow Manager.
  • Tested the target data against the source system tables by writing the PL/SQL procedures.
  • Interacted with various business users, source contacts, DBAs and system administrators on production issues.
  • Solved production issues and data discards during workflow runs.
  • Debugged to check the errors in mapping.
  • Used SQL Loader Utility for moving data between source systems.
  • Involved in writing shell scripts for automating pre-session, post-session processes and batch execution at required frequency using power center server manager.
  • Created effective Test data and developed thorough Unit test cases to ensure successful execution of the data loading processes.
  • Responsible for documentation, version control of schema and version release.
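The pre-/post-session shell scripting described above can be sketched as a small function of the kind typically triggered by a post-session command: archive the processed flat file so the next run starts clean. This is an illustrative sketch only; the directory and file names are hypothetical.

```shell
#!/bin/sh
# Sketch of a post-session step: move a processed flat file into an
# archive directory with a timestamp suffix. Paths are hypothetical.
archive_file() {
    src_dir=$1
    arch_dir=$2
    file=$3
    mkdir -p "$arch_dir"
    if [ -f "$src_dir/$file" ]; then
        stamp=$(date +%Y%m%d%H%M%S)
        mv "$src_dir/$file" "$arch_dir/$file.$stamp"
        echo "archived $file at $stamp"
    else
        echo "no $file to archive" >&2
        return 1
    fi
}
```

In a PowerCenter session this would be wired in as a post-session success command, e.g. `archive_file /data/inbound /data/archive subscribers.dat` (again, hypothetical paths), so the source directory only ever holds files not yet loaded.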

Environment:
Informatica Power Center 6.2/7.1/7.1.1, Designer, Informatica Repository Manager, PL/SQL, Sun Solaris-Unix, Oracle 9i, Win 2000, TOAD

Confidential, TX Sep 05 – Apr 06
ETL Developer

Description:
Metro PCS is a regional wireless carrier catering to customers in southeastern cities. The company wishes to increase its customer base by offering discounts on handsets, based on analysis of customer demographics, handset buying patterns and profitability. To this end, the company wants to add data to its existing customer information data warehouse (CIDWH) to support this analysis. Sales details captured at the point-of-sale (POS) systems are extracted, transformed and loaded into the CIDWH using the Informatica ETL tool to provide business users with the data they need.

Responsibilities:

  • Performed data analysis of the source data coming from point of sales systems (POS) and legacy systems.
  • Developed approach paper for the project after gathering the requirements from business users.
  • Developed mappings using Informatica 6.2 designer to extract data from the source databases and flat files into oracle staging area.
  • Developed transformation logic to cleanse the source data of inconsistencies during the source to stage loading.
  • Developed re-usable transformations, mappings and mapplets conforming to the business rules.
  • Implemented the business rules and logic by using Expression, Look up, Sequence Generator, Aggregator, Joiner, Router and Update Strategy transformations.
  • Created session tasks, worklets and workflows to execute the mappings.
  • Performed both unit as well as system integration testing.
  • Involved in the migration of the project from development to production.
  • Involved in monitoring the workflows and in optimizing the load times.
  • Extensively used Informatica functions LTRIM, RTRIM, IIF, DECODE, ISNULL, TO_DATE in Transformations.
  • Tuned Informatica session for large data files by increasing data cache size, sequence buffer length and target based commit interval in order to increase the performance.
  • Scheduled workflows using pmcmd commands.
  • Extensively used Informatica debugger to validate mappings and to gain troubleshooting information about data and error conditions.
  • Performed Performance tuning to ensure optimal session performance.
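The cleansing functions named above (LTRIM, RTRIM, IIF, DECODE, ISNULL, TO_DATE) are typically combined in the output ports of an Expression transformation. The snippet below is a schematic sketch of such port expressions; the port names are illustrative, not taken from the actual project.

```
-- Hypothetical output-port expressions in an Expression transformation
OUT_CUST_NAME: LTRIM(RTRIM(IN_CUST_NAME))
OUT_STATUS:    IIF(ISNULL(IN_STATUS), 'UNKNOWN', IN_STATUS)
OUT_REGION:    DECODE(IN_REGION_CD, 'SE', 'Southeast', 'SW', 'Southwest', 'Other')
OUT_SALE_DT:   TO_DATE(IN_SALE_DT, 'MM/DD/YYYY')
```

Trimming and null-defaulting at the staging layer in this way keeps inconsistencies out of the warehouse before dimension and fact loads run.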

Environment:
Informatica Power Center 6.1, Oracle 9i, PL/SQL, SQL, MS Access, UNIX, Win XP, Flat files, Business Objects 5.1 (Designer, Supervisor, Reports)

Academic Qualifications:
B.E (Computer Science)
