
Informatica Developer Resume


Blvd, Detroit

PROFESSIONAL SUMMARY

  • 7+ years of Information Technology experience as an Informatica Developer in Data warehousing and Business Intelligence with emphasis on Requirement analysis, Data analysis, Application Design, Development, Implementation, Testing, Maintenance and Documentation.
  • Extensive hands-on experience using Informatica Power Center 8.x/7.x/6.x (Repository Manager, Mapping Designer, Workflow Manager, Mapplet Designer, Workflow Monitor, Source Analyzer, Transformations Designer, Warehouse Designer).
  • Experience in the following domains: Health Care, Financial, Investment/Banking, Communication, and Health Insurance.
  • Expert-level data integration skills using Informatica Power Center to design, develop, implement, and optimize ETL workflows and transformations that move data from multiple sources, including flat files and RDBMS tables, into the Operational Data Store (ODS), Data Warehouse and Data Marts.
  • Extensively worked on ETL mappings and on the analysis and documentation of OLAP report requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Designed and developed complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Stored Procedure, Update Strategy, and SCD Type 1, Type 2 and Type 3.
  • Experience using the Debugger to validate mappings and gain troubleshooting information about data and error conditions.
  • Good understanding of all phases of the SDLC (System Development Life Cycle), including Planning, Analysis, Design, Implementation and Maintenance.
  • Expertise in performance tuning of the overall data warehouse: database, ETL and OLAP tuning, coordinating with DBAs and UNIX administrators for better performance.
  • Worked extensively with Dimensional modeling, Star Schema and Snowflake modeling, Physical and Logical data modeling using Erwin, Data migration, Data cleansing, Data profiling, Data masking and ETL processes for data warehouses.
  • Extensive work experience with database programming in Oracle 10g/9i/8i/7 and MS SQL Server 2000/2005.
  • Experience in writing UNIX shell scripts and automating entire processes using UNIX shell scripting.
  • Extensive knowledge of PL/SQL, SQL programming.
  • Worked with Oracle Utilities such as SQL*Loader.
  • Experience supporting offshore teams and the on-site project delivery process.
  • Strong analysis and design skills, with an ability to work with business partners to understand complex functional needs, leading the creation and maintenance of technical specification documentation.
  • Excellent analytical, critical thinking, and creative problem-solving skills, with the ability to quickly master new applications.
  • Proven communication skills, both written and oral.
  • Enthusiastic team player; goal-oriented, independent, and abreast of the latest technology and literature.

Technical Skills

  • ETL Tools: Informatica Power Center 8.x, 7.x, 6.x.
  • Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, MS Access.
  • Programming/GUI: SQL, PL/SQL, T-SQL, C, C++, UNIX Shell Scripting.
  • Operating Systems: UNIX, Sun Solaris 8, Windows 98/NT/XP/2000/2003, IBM AIX.
  • Tools: TOAD, Visio, SQL*Loader, Rapid SQL, Oracle SQL Developer, SQL Navigator, PUTTY, Control-M, Autosys, Mercury Quality Center and Business Objects.

PROFESSIONAL EXPERIENCE

Client: Confidential, Detroit. Jul'10 - Till Date
Position: Sr. Informatica Developer.

Description:
The Blue Cross Blue Shield Association (BCBSA) is a federation of 39 separate health insurance organizations and companies in the United States. BCBS of Michigan is one of the largest independent licensees of BCBSA. It has about 4.3 million members, which makes it the largest single-state Blues Plan.

Responsibilities:

  • Worked with the team and lead developers, interfaced with business analysts, and coordinated with management to understand the end-user experience.
  • Interacted with various business users on the Medical Voucher System and Facets sides, gathered the business requirements and translated them into technical specifications.
  • Documented business requirements, discussed issues to be resolved and translated user input into ETL design documents.
  • Worked with data analysts to implement Informatica mappings and workflows, shell scripts and stored procedures to meet business requirements.
  • Involved in ETL process from development to testing and production environments.
  • Guided the testing team (three testers) for CVP to perform end to end testing.
  • Developed test cases and mapped them to various business and user requirements.
  • Loaded the test cases in Mercury Quality Center (MQC) and guided the team through executing these test cases and logging defects in MQC.
  • Extracted data from various sources like Oracle and flat files and loaded into the target Oracle database.
  • Created mappings using various transformations like Joiner, Filter, Aggregator, Lookup, Stored Procedure, Router, Sorter, Rank, Expression, Normalizer and Update Strategy.
  • Developed Informatica mappings to generate the FVR (Facets Voucher Record, claims processed for the current date), which contains the EOBs (Explanation of Benefits), Notices of Payment (NOPs) and checks issued for the current voucher date.
  • Loaded the TDS (Tactical Data Store) with the derived voucher fields from the MVS (Medical Vouchering System) reverse-flow flat file.
  • Involved in performance tuning of the ETL process by addressing various performance issues at the extraction and transformation stages.
  • Extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings and sessions.
  • Created and Executed workflows and Worklets using Workflow Manager to load the data into the Target Database.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.
  • Developed Session Parameter files for the workflows.
  • Extensively participated in System/Integration/Performance testing.
  • Analyzed the source fields and wrote SQL queries for field-to-field validation by referring to the source-to-target mapping document.
  • Developed test cases for business and user requirements to identify claims (Institutional, Professional, Subscriber-paid, etc.) and wrote SQL queries to validate the data in the source and target databases against the source-to-target mapping document.
  • Involved in regular discussions with the Facets team to enter test data.
  • Provided test data as per the test data requirements provided by the Medical Vouchering System team.
  • Extensively used Mercury Quality Center to load test cases, execute them and log defects found in system testing.
  • Ran JCLs regularly to kick off jobs on the mainframe.
  • Extensively worked with UNIX shell scripting (Korn shell) to validate and verify the data in the flat files sent to MVS; a minimal sketch of this kind of validation follows this list.
  • Responsible for ETL process under development, test and production environments.
  • Handled Production issues and monitored Informatica workflows in production.
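
For illustration only, a minimal sketch of the kind of Korn shell validation referenced above might look like the following; the directory, file name and trailer layout are assumed placeholders, not the actual CVP artifacts.

```sh
#!/bin/ksh
# Minimal sketch: validate a pipe-delimited voucher extract before it is
# handed off to MVS. The directory, file name and trailer layout (a last
# line of the form T|<record_count>) are hypothetical.
DIR=/data/outbound/mvs
FILE=$DIR/fvr_extract.dat

# File must exist and be non-empty
if [[ ! -s $FILE ]]; then
    echo "ERROR: $FILE is missing or empty" >&2
    exit 1
fi

# Detail count must match the count carried in the trailer record
detail_cnt=$(grep -cv '^T|' "$FILE")
trailer_cnt=$(tail -1 "$FILE" | awk -F'|' '{print $2}')
if [[ $detail_cnt -ne $trailer_cnt ]]; then
    echo "ERROR: detail count $detail_cnt does not match trailer count $trailer_cnt" >&2
    exit 2
fi

# Reject the file if any record is missing the voucher id in field 1
bad_cnt=$(awk -F'|' '$1 == "" { n++ } END { print n + 0 }' "$FILE")
if [[ $bad_cnt -gt 0 ]]; then
    echo "ERROR: $bad_cnt records are missing a voucher id" >&2
    exit 3
fi

echo "Validation passed: $detail_cnt detail records in ${FILE##*/}"
```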

Environment: Informatica Power Center 8.6.1, Oracle 11g/10g, MS Access, Facets 4.51, TOAD 9.0, SQL Developer, Autosys, JCL, PL/SQL, UNIX (server), Mercury Quality Center, Windows XP Professional.

Client: Confidential, Water Street, NY. Dec'09 - Jun'10
Position: Sr. Informatica Developer.

Description:
Standard & Poor's (S&P) is a United States-based financial services company. It is a division of The McGraw-Hill Companies that publishes financial research and analysis on stocks and bonds. It is well known for its stock market indexes: the US-based S&P 500, the Australian S&P/ASX 200, the Canadian S&P/TSX, the Italian S&P/MIB and India's S&P CNX Nifty.

Responsibilities:

  • Involved with the business analyst and the ETL team to gather Business requirements.
  • Identified the workflows, worklets and Mappings to be enhanced and the new mappings to be created.
  • Created and modified the mappings, worklets and workflows which are affected by the new source modifications.
  • Supported the offshore team.
  • Generated XML files, performed Power Center administration and migrated ETL code from Development to Test and on to Production using both XML Import/Export and the Repository Manager copy/paste utility.
  • Created mappings for both the full load and the incremental load of the data.
  • Developed new forms, XML Publisher Reports and discoverer reports.
  • Handled VLDF files (variable-length delimited flat files) based on a business case.
  • Extensively used Expression, Joiner, Lookup, Aggregator, Update strategy, filter Transformations etc. in various mappings.
  • Mostly utilized the SQL override and Lookup SQL override for filtering the data according to the requirement.
  • Used IDE for data analysis, Data Migration and understanding the different patterns of the data.
  • Enhanced the production environment by modifying the existing mappings using check-in and check-out, and then moved the changes to production.
  • Incorporated SQL tuning recommendations for data retrieval by using indexing strategy and using hints.
  • Unit testing was done on each mapping separately for validating the performance and verifying the data.
  • Used Informatica dataflow partitioning to load large data files.
  • Improved the performance of mappings that had data-loading issues.
  • Worked in the development and testing environments and then migrated the workflows to the production environment.
  • Created labels, wrote queries in the repository manager for the deployment folder.
  • Provided test queries for the QA team for the validation of the data.
  • Designed restart/recovery logic and defined commit points for high-volume loads.
  • Involved in writing UNIX shell scripts, PL/SQL procedures, and pre-session and post-session scripts; a sample post-session script is sketched after this list.
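
As a rough illustration of the post-session scripting mentioned in the last item, the sketch below archives a processed source file and writes a simple audit entry; the paths, file-name argument and log format are hypothetical.

```sh
#!/bin/ksh
# Hypothetical post-session command: archive the processed source file and
# append an audit entry. Paths and the file-name argument are illustrative.
SRC_DIR=/data/inbound/sp
ARCH_DIR=/data/archive/sp
AUDIT_LOG=/data/logs/load_audit.log
SRC_FILE=$1                       # file name passed in by the session
STAMP=$(date +%Y%m%d%H%M%S)

if [[ -f $SRC_DIR/$SRC_FILE ]]; then
    mv "$SRC_DIR/$SRC_FILE" "$ARCH_DIR/${SRC_FILE}.${STAMP}"
    echo "$STAMP|$SRC_FILE|archived" >> "$AUDIT_LOG"
else
    echo "$STAMP|$SRC_FILE|missing" >> "$AUDIT_LOG"
    exit 1
fi
```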

Environment: Informatica Power Center 8.5, Oracle 10g, UNIX server, XML, T-SQL, TOAD 8.6, SQL Navigator.

Client:Confidential, Cleveland, Ohio. Jun'09 - Nov'09
Position: ETL Developer.

Description:
AmTrust Bank was founded in 1889 in Cleveland, Ohio, where it served customers for 118 years. It has grown from a local savings and loan to having a national presence in retail banking, mortgage and construction lending, investment and insurance services, and indirect auto lending.

Responsibilities:

  • Involved in extraction of Data from source systems, transformation in the stage layer and finally loading into Marts.
  • Created Views for the source systems tables in the Bulk layer.
  • Used debugger to validate the mappings and gain troubleshooting information about data and error conditions.
  • Worked with the Joiner transformation using normal join, master outer join, detail outer join and full outer join. Implemented Slowly Changing Dimensions Type 1 and Type 2 for change data capture; the Type 2 pattern is sketched after this list.
  • Implemented Expression, Lookup, Sequence Generator, Source Qualifier, Rank and Aggregator transformations to load the data from sources to targets. Extensively worked with active transformations such as Filter, Sorter, Aggregator, Router and Joiner.
  • Extensively worked with passive transformations such as Expression, Lookup and Sequence Generator. Created complex mappings using unconnected and connected Lookup transformations.
  • Expertise in writing SQL, PL/SQL, Stored Procedures, Triggers and Packages in Data Warehouse environments that employ Oracle and DB2.
  • Loaded transformed data into relational tables and finally into the fact tables.
  • Created triggers, PL/SQL procedures, packages and sequences for the mappings.
  • Created snapshots for the transactional tables in the distributed databases, and created triggers, procedures and functions for back-end development.
  • Performed error handling of sessions using the terse, normal, verbose initialization and verbose data tracing levels, and handled errors using session logs.
  • Monitored data quality, resolved problem data issues and ensured timely updates.
  • Ran scripts through UNIX shell programs as part of batch scheduling.
  • Migrated the code to the test and production environments.
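
The Type 1/Type 2 handling above was built with Lookup and Update Strategy transformations inside the mappings; purely as an illustration of the Type 2 pattern, the sqlplus sketch below shows roughly equivalent SQL (the dimension, staging table, columns and sequence are hypothetical).

```sh
#!/bin/ksh
# Rough SQL equivalent of the Type 2 logic built in the mappings: expire the
# current row when a tracked attribute changes, then insert a new current row.
# The dimension, staging table, columns and sequence are hypothetical.
sqlplus -s $ORA_USER/$ORA_PWD@$ORA_SID <<'EOF'
WHENEVER SQLERROR EXIT FAILURE

-- Close out current dimension rows whose tracked attributes have changed
UPDATE dim_customer d
   SET d.eff_end_dt  = TRUNC(SYSDATE) - 1,
       d.current_flg = 'N'
 WHERE d.current_flg = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.cust_name <> d.cust_name OR s.cust_addr <> d.cust_addr));

-- Insert a new current row for changed and brand-new customers
INSERT INTO dim_customer
       (customer_key, customer_id, cust_name, cust_addr,
        eff_start_dt, eff_end_dt, current_flg)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.cust_name, s.cust_addr,
       TRUNC(SYSDATE), TO_DATE('9999-12-31', 'YYYY-MM-DD'), 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flg = 'Y');

COMMIT;
EOF
```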

Environment: Informatica Power Center 8.1.1, Oracle 10g, SQL*Loader, TOAD, PL/SQL, SQL Developer, UNIX (server), PUTTY.

Client: Confidential, Houston, TX. Oct'08 - Apr'09
Position: ETL Developer.

Description:
Confidential is a privately owned subsidiary of Cox Enterprises headquartered in Atlanta, GA, and provides digital cable television and telecommunication services in the United States. It is the third-largest cable television provider in the United States, serving more than 6.2 million customers.

Responsibilities:

  • Configured and administered Informatica Power Center.
  • Extensively worked with Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Developed transformation logic and designed various Complex Mappings and Mapplets using the Designer.
  • Developed complex mappings to implement Slowly Changing Dimensions (SCD).
  • Worked with the Lookup, Aggregator, Expression, Router, Filter, Update Strategy, Stored Procedure and Joiner Transformations.
  • Configured and ran the Debugger from within the Mapping Designer to troubleshoot the mapping before the normal run of the workflow.
  • Used Workflow Manager for Workflow and Session Management, database connection management and scheduling of jobs to be run in the batch process.
  • Created various Autosys entries for scheduling various data cleansing scripts and loading.
  • Worked with the pmcmd command-line program to communicate with the Informatica server to start, stop and schedule workflows; see the wrapper sketch after this list.
  • Performed SQL tuning using explain plan.
  • Extensively used SQL and PL/SQL scripts and worked in both UNIX and Windows environments.
  • Generated simple reports from the data marts using Business Objects.
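
A shell wrapper around pmcmd of the kind described above could look roughly like this; the integration service, domain, folder and workflow names are placeholders, and option usage may vary slightly by PowerCenter version.

```sh
#!/bin/ksh
# Illustrative wrapper: start a PowerCenter workflow with pmcmd and fail the
# scheduled job if the workflow fails. All connection values are placeholders.
INFA_USER=etl_user
INFA_PWD=$PM_PASSWORD             # assume the password is exported securely
INT_SERVICE=IS_DEV
INFA_DOMAIN=Domain_Dev
FOLDER=COX_EDW
WORKFLOW=wf_daily_load

pmcmd startworkflow -sv "$INT_SERVICE" -d "$INFA_DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f "$FOLDER" -wait "$WORKFLOW"
rc=$?

if [[ $rc -ne 0 ]]; then
    echo "Workflow $WORKFLOW failed with return code $rc" >&2
    exit $rc
fi
echo "Workflow $WORKFLOW completed successfully"
```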

Environment: Informatica Power Center 8.1.1, Windows NT, Oracle 8i/9i, Business Objects 6.1, Erwin, SQL Server 2000, UNIX (Solaris).

Client: Confidential, and Trust, MA. Aug'07 - Aug'08
Position: Oracle DB/Informatica Developer

Description:
This was a custodian bank and the principal operating subsidiary of Investors Financial Services Corp. The project involved building a data warehouse; its primary aim was to integrate the data sources from the existing systems and to provide analysis for making business-related decisions.

Responsibilities:

  • Involved in Designing and developing multi-dimensional model (Star Schema and Snowflake Schema) using Erwin.
  • Scheduling meetings with the business users to gather the business analysis reporting requirements.
  • Designed and developed the ETL process.
  • Developed Mappings and reusable Transformations and Mapplets.
  • Extensively worked with Informatica transformations and with SQL overrides using Oracle functions.
  • Used Informatica Server Manager to create database connections, sessions and batches to run the mappings.
  • Used various performance tuning techniques to improve the load.
  • Involved in writing of Triggers, Functions, and Packages.
  • Converted SQL procedures and SQL*Loader scripts to Informatica mappings; a sample SQL*Loader invocation of the kind that was converted is sketched after this list.
  • Scheduling the batches and sessions for daily loads.
  • Automation of FTP process to and from source systems.
  • Involved in writing UNIX shell scripts, PL/SQL procedures, and pre-session and post-session scripts.
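
As an illustration of the kind of legacy SQL*Loader script that was converted to Informatica mappings, a typical shell invocation with a generated control file might look like this; the staging table, columns and file layout are hypothetical.

```sh
#!/bin/ksh
# Hypothetical legacy-style load: write a control file and invoke sqlldr.
# The staging table, columns and file layout are illustrative only.
CTL_FILE=/tmp/positions.ctl

cat > "$CTL_FILE" <<'EOF'
LOAD DATA
INFILE '/data/inbound/positions.dat'
APPEND
INTO TABLE stg_positions
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  account_id,
  security_id,
  position_qty,
  as_of_date DATE "YYYYMMDD"
)
EOF

sqlldr userid=$ORA_USER/$ORA_PWD@$ORA_SID control=$CTL_FILE \
       log=/tmp/positions.log bad=/tmp/positions.bad errors=50
rc=$?
if [[ $rc -ne 0 ]]; then
    echo "sqlldr failed with return code $rc" >&2
    exit $rc
fi
```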

Environment: Informatica Power Center 7.1, Oracle 9i, Sun Solaris, Windows NT/2000, SQL*Loader, TOAD.

Client: Confidential, Oakland, CA. Jan'07 - Jul'07
Position: Informatica Developer.

Description:
Providian Financial Corporation was one of the leading credit card issuers in the United States, headquartered in San Francisco, California, and had more than 10 million cardholders.

Responsibilities:

  • Analyzed the source data coming from various databases and files, and worked with business users and developers to develop the data model.
  • Partnered with Business Users and DW Designers to understand the processes of Development, Methodology, and then implement the ideas in Development accordingly.
  • Worked with Data modeler in developing STAR Schemas and Snowflake schemas.
  • Extracted data from Oracle, flat file and Excel file sources and used complex Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router and Update Strategy transformations to extract and load data into the target systems.
  • Created reusable email alerts, events, tasks, sessions, reusable worklets and workflows in Workflow Manager.
  • Scheduled the workflows at specified frequency according to the business requirements and monitored the workflows using Workflow Monitor.
  • Fixing invalid Mappings, Debugging the mappings in designer, Unit and Integration Testing of Informatica Sessions, Worklets, and Workflows.
  • Extensively used TOAD for source and target database activities.
  • Involved in the development and testing of individual data marts, Informatica mappings and update processes.
  • Involved in writing UNIX shell scripts for Informatica ETL tool to run the Sessions.

Environment: Informatica Power Center 7.1, Oracle 9i, SQL, PL/SQL, TOAD, UNIX Shell Programming, UNIX server, and Windows NT.

Client:Confidential, India. Dec'03 - Oct'06
Position: ETL Developer.

Responsibilities:

  • Performed analysis, requirements gathering, functional/technical specification, development, deployment and testing.
  • Involved in the Design, Development, Testing phases of Data warehouse.
  • Designed the logical and physical database layout using Erwin and was involved in design and data modeling using a star schema.
  • Created Informatica mappings for initial load and daily updates.
  • Involved in fixing invalid mappings, testing of stored procedures and functions, unit and integration testing of Informatica Mappings, Sessions, Workflows and the target data.
  • Developed several mappings to load data from multiple sources to data warehouse.
  • Developed and tested stored procedures, functions and packages in PL/SQL for data ETL, and performed data conversions.
  • Involved in troubleshooting the load failure cases, database problems.
  • Extensively used mapping parameters, mapping variables and parameter files; a sample parameter file is sketched after this list.
  • Installed and configured Informatica Metadata Reporter.
  • Generated reports using Metadata Reporter.
  • Customized Informatica Metadata Reporter.
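
The parameter files mentioned above follow the standard PowerCenter layout of a [Folder.WF:workflow.ST:session] header followed by connection and mapping parameters; as a rough illustration, the sketch below generates one for a hypothetical daily workflow (folder, workflow, session and parameter names are placeholders).

```sh
#!/bin/ksh
# Illustrative generation of a session parameter file for a daily run.
# Folder, workflow, session and parameter names are placeholders; the
# backslashes keep the literal $ / $$ prefixes that PowerCenter expects.
PARAM_FILE=/infa/params/wf_daily_sales.par
RUN_DATE=$(date +%Y-%m-%d)

cat > "$PARAM_FILE" <<EOF
[EDW_FOLDER.WF:wf_daily_sales.ST:s_m_load_sales]
\$DBConnection_Source=ORA_SRC_DEV
\$DBConnection_Target=ORA_DWH_DEV
\$\$RUN_DATE=$RUN_DATE
\$\$LOAD_TYPE=INCREMENTAL
EOF

echo "Wrote parameter file $PARAM_FILE for run date $RUN_DATE"
```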

Environment: Informatica Power Center 6.x, Oracle 8i, Erwin, UNIX Shell Scripts, TOAD, SQL*Loader, Windows NT.
