Sr Informatica Developer Resume
Appleton, WI
SUMMARY:
- 8 years of experience in analysis, development, and implementation of business applications, including design and development of ETL processes using Informatica PowerCenter, across the Health Care and Financial sectors.
- Experienced ETL Developer designing and implementing Data Mart / Data Warehouse applications using Informatica PowerCenter 9.x/8.x/7.x/6.x (Designer, Workflow Manager, Workflow Monitor, and Repository Manager).
- Expertise in Master Data Management (MDM) concepts and methodologies, with the ability to apply this knowledge in building MDM solutions.
- Performed the land process to load data into the landing tables of the MDM Hub, using external batch processing for the initial data load into the Hub Store.
- Produced documentation for all phases: analysis, design, development, testing, and maintenance.
- Developed mappings for loading the MDM Hub.
- Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, and Queryman); a BTEQ wrapper sketch follows this list.
- Proficient in writing stored procedures, triggers, views, and indexes using PL/SQL and Transact-SQL to enforce referential integrity on SQL Server and Oracle databases; implemented DBMS best practices (see the PL/SQL sketch after this list).
- Good knowledge of Teradata Parallel Transporter (TPT).
- Designed and developed complex mappings and mapplets using a wide range of transformations: reusable transformations, unconnected/connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, Normalizer, XML, and External Procedure.
- Involved in complete SDLC (Software Development Life Cycle).
- Extensively worked with the Unicenter WLM and Control-M job scheduling tools.
- Expertise in Control-M utilities such as ctmfw, ctmcontb, ctmpsm, and ctmorder for scheduling jobs as required.
- Experienced in integrating applications and services into a data integration framework.
- Extensively worked on Customer & Supplier match analysis and Address standardization using IDQ transformations.
- Experienced in developing and maintaining Data Warehouse applications using Informatica PowerCenter with optimal design solutions.
- Extensively used Informatica Repository Manager and Workflow Monitor.
- Knowledge in developing reports using Business Intelligence tools like Business Objects, Tableau, and Cognos.
- Extensively used Informatica PowerCenter 9.6/9.5/9.1, Informatica Data Quality (IDQ) 9.6.1/9.5/9.1 as ETL tool for extracting, transforming and loading data from various source data inputs to various targets, in batch and real time.
- Extensive knowledge of data warehouse approaches, top-down (Inmon) and bottom-up (Kimball), and of Star and Snowflake schema methodologies.
- Followed Agile methodology, Scrum process.
- Extensive experience as a Teradata developer using utilities such as TPT (Teradata Parallel Transporter), FastLoad, MultiLoad, TPump, and BTEQ.
- Created profiles from existing sources using Informatica Data Explorer (IDE) and IDQ and shared them with business analysts to support defining business strategies, such as assigning match scores for different criteria.
- Good knowledge of the SDLC (software development life cycle) and solid experience with unit and integration testing.
- Prepared high-level design specifications for ETL coding and mapping standards.
- Experience in Performance tuning of targets, sources, mappings and sessions.
- Experience in integration of various data sources like Teradata, Oracle, SQL server, flat files, and XML files.
- Experience in the Ralph Kimball methodology, logical and physical modeling, dimensional data modeling, Star and Snowflake schemas, and FACT and Dimension tables.
- Expertise in OLTP/OLAP system study and in developing database schemas (Star and Snowflake) for relational and dimensional models, including slowly changing dimensions (SCDs).
- Performed ETL procedures to load data from different sources into data marts and the data warehouse using PowerCenter.
- Strong expertise in relational database systems such as Oracle 8i/9i/10g, SQL Server 2000/2005, and MS Access; design and database development using SQL, PL/SQL, SQL*Plus, TOAD, and SQL*Loader.
- Good knowledge of the Cognos reporting tool.
- Experience with the Informatica DAC tool for customizing data warehouses and creating tasks and task groups.
- ETL experience developing new mappings and tuning existing ones for better performance using Informatica PowerCenter, in line with business rules.
- Experience in development and design of ETL methodology supporting data transformation and processing in a corporate-wide ETL solution using Informatica PowerCenter 9.x/8.6, Informatica Data Integration Hub (DIH), and SSIS.
- Experience with UNIX commands, VI editor and writing Shell scripts.
- Developed unit test plans and involved in system testing.
- Expertise in writing VB Scripts for automating several processes.
- Outstanding communication and interpersonal skills, the ability to learn quickly, good analytical reasoning, and quick adaptation to new technologies and tools.
- Expert knowledge of SQL, T-SQL, DTS/SSIS, and Analysis Services.
- Data modeling experience using ERwin 4.5/4.0/3.5: Star/Snowflake schemas, FACT and Dimension tables, and physical and logical data modeling.
- Strong experience coding with SQL, SQL*Plus, and PL/SQL procedures, functions, and triggers.
- Experience in the Development of Data Marts and Data Warehouse Using Oracle 9i/10g.
- Good understanding of DW architecture and data warehousing concepts.
- Sound experience with RDBMS concepts and strong exposure to client-server and web application development.
- Good understanding of Teradata utilities such as FastLoad and MultiLoad.
- Hands on experience in tuning mappings, identifying and resolving performance bottlenecks in various levels like target, source, mapping and session.
- Experience in SQL query optimization using hints, indexes, and explain plans.
- Expertise in managing post-production issues; delivered all assignments and projects within specified timelines.
- Excellent communication and presentation skills for analyzing user requirements.
- Team player with strong interpersonal and problem-solving skills; able to work both in a team and independently.
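Illustrative only: a minimal sketch of the shell-wrapped BTEQ load scripts referenced above. The logon, database, and table names are hypothetical placeholders, not details from any engagement.

    #!/bin/ksh
    # Hypothetical BTEQ wrapper: stages a daily claim feed and passes the
    # BTEQ return code back to the scheduler (e.g., Control-M).
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_password;

    DELETE FROM stg.claim_daily;

    INSERT INTO stg.claim_daily (claim_id, member_id, claim_amt, load_dt)
    SELECT claim_id, member_id, claim_amt, CURRENT_DATE
    FROM   landing.claim_feed;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF
    exit $?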
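In the same spirit, a hedged sketch of the kind of PL/SQL routine described above, compiled through a shell-invoked SQL*Plus session; the members table and upsert_member procedure are invented for illustration.

    #!/bin/ksh
    # Compiles a simple upsert procedure; the connect string and schema
    # objects are illustrative assumptions.
    sqlplus -s etl_user/etl_password@ORCL <<'EOF'
    CREATE OR REPLACE PROCEDURE upsert_member (
        p_member_id IN members.member_id%TYPE,
        p_name      IN members.member_name%TYPE
    ) AS
    BEGIN
        UPDATE members
           SET member_name = p_name
         WHERE member_id = p_member_id;
        IF SQL%ROWCOUNT = 0 THEN
            INSERT INTO members (member_id, member_name)
            VALUES (p_member_id, p_name);
        END IF;
        COMMIT;
    END upsert_member;
    /
    SHOW ERRORS
    EXIT
    EOF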
TECHNICAL SKILLS:
ETL Tools: Informatica 9.x/8.x/7.x/6.x (Power Center), Informatica Data Explorer (IDE), Informatica Data Quality (IDQ) 9.6.1/9.5, DIH
Databases: Teradata 13/12, Oracle 8i/9i/10g/11g
Query Tools: SQL, PL/SQL, SQL*Plus, and TOAD
Scheduling Tools: Control-M 7/8, Unicenter WLM (Workload Manager), Informatica Scheduler
OLAP/Reporting Tools: Cognos
Languages: UNIX/Linux/DOS scripting, Shell scripting, Java, PL/SQL, T-SQL, SQL*Plus, Visual Basic 6.0.
Operating Systems: Windows Server 2003/2008/2008 R2, Windows 7/Vista/XP/Millennium/98, UNIX, Linux
PROFESSIONAL EXPERIENCE:
Confidential, Appleton, WI
Sr Informatica Developer
Responsibilities:
- Translated the business point of view into code using Informatica PowerCenter Designer.
- Extracted data from various source systems that include Oracle, Sybase, Flat files and Web Services.
- Worked with Informatica Designer tools (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer) and client tools such as Workflow Manager and Workflow Monitor to create and monitor batches.
- Developed complex mappings in Informatica using transformations such as Source Qualifier, Lookup, Expression, Aggregator, Joiner, Router, Sequence Generator, Update Strategy, and Web Service Consumer.
- Developed ETL mappings to extract claim information from a source system on DB2 and land the data in a staging area.
- Strong experience developing an enterprise-level data integration framework and its packages using PowerCenter, Netezza, and SQL.
- Developed and maintained the data integration framework and its ETL/ELT packages using data integration technologies including Informatica, Netezza, and SSIS; assisted in the design of the framework as needed.
- Installed, updated, maintained, monitored, and supported data integration technologies (Informatica, Netezza, and SSIS).
- Applied Master Data Management concepts and methodologies in building the MDM solution.
- Performed the land process to load data into the landing tables of the MDM Hub, using external batch processing for the initial data load into the Hub Store.
- The Claim Data Mart (CDM) is used to store, manage, and deliver access to claims transactions for reporting and analysis.
- Wrote T-SQL stored procedures and batch files to automate ETL jobs on MS SQL Server 2008 databases; wrote batch jobs and stored procedures to automate manual steps.
- Created application schemas (databases, tables, views, and indexes) and implemented SQL and T-SQL statements, stored procedures, and UDFs (user-defined functions) according to project business requirements.
- Performed unit testing, integration testing, and other validation of the data integration solution.
- Experienced in integrating applications and services into the data integration framework.
- Worked with business SMEs on developing the business rules for cleansing. Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.
- Performed data profiling and applied data quality rules to data sets per business requirements.
- Worked extensively with IDQ transformations such as Standardizer, Labeler, Parser, Address Validator, Match, Key Generator, and Association.
- Worked with the DQA so the business team could handle exception records, and implemented postal address validation for the supplier data sets.
- Extensively worked on Informatica Power Center with Web Services.
- Developed reusable transformation objects such as mapplets to avoid duplication of metadata and reduce development time.
- Responsible for optimization of SQL queries, T-SQL and SSIS Packages.
- Performed performance tuning at both the mapping and session levels to increase throughput, along with SQL query optimization.
- Implemented Pushdown Optimization to reduce the burden on the Integration service and thereby increase the performance.
- Developed a number of complex Informatica mappings, mapplets, and reusable transformations for the Claim Profitability Systems to facilitate daily, monthly, and yearly data loads.
- Worked on Informatica B2B, Data Integration Hub (DIH), JMS Queue.
- Configured the Repository and Integration Services, Web Services Hub, Model Repository, Data Integration, Content Management, and Monitoring services in all environments.
- Used SQL*Loader to load data from flat files into relational tables (a control-file sketch follows this list).
- Worked with Defect Tracking Tool - Quality Center to track the issues.
- Performed Informatica code migrations; tested, debugged, documented, and deployed programs for ongoing maintenance.
- Extensively tested the system end to end to ensure the quality of the adjustments made to accommodate the source system upgrade.
- Worked on bug fixes in Informatica mappings to produce the correct output.
- Performed unit testing, SIT (System Integration Test) and UAT (User Acceptance Test) to check the data quality.
- Automated and scheduled UNIX shell scripts and Informatica sessions and batches using the Control-M scheduling tool.
- Responsible for creating stored procedures and optimizing them using Transact-SQL programming.
- Provided post-production support, enhancements, and performance tuning.
- Documented all phases: analysis, design, development, testing, and maintenance.
- Developed mappings for loading the MDM Hub.
- Worked with Informatica team members to design, document, and configure the Informatica MDM Hub to support loading, cleansing, matching, merging, and publication of MDM data.
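A minimal SQL*Loader sketch for the flat-file loads mentioned above; the control-file layout, staging table, and connect string are assumptions for illustration.

    #!/bin/ksh
    # Hypothetical SQL*Loader run: pipe-delimited claim extract into a
    # staging table, with a bad file and an error threshold.
    cat > claim_stg.ctl <<'EOF'
    LOAD DATA
    INFILE 'claim_extract.dat'
    APPEND INTO TABLE stg_claims
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (claim_id, member_id, service_dt DATE 'YYYY-MM-DD', claim_amt)
    EOF

    sqlldr userid=etl_user/etl_password@ORCL control=claim_stg.ctl \
           log=claim_stg.log bad=claim_stg.bad errors=100
    exit $?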
Environment: Informatica PowerCenter 8.6/7.1.4, Microsoft Visio, Oracle 10g, Sybase, flat files, Web Services, XML, Crystal Reports, SQL/PL-SQL, SQL*Loader, UNIX AIX, WinSCP, Embarcadero Rapid SQL, IDQ, Rational ClearCase, Quality Center, Control-M.
Confidential, Buffalo, NY
Informatica Developer
Responsibilities:
- Developed new business applications and enhanced existing ones using Informatica PowerCenter, Oracle SQL, and shell scripting.
- Wrote detailed descriptions of user needs, program functions, and the steps required to develop and deploy applications to live environments.
- Applied Master Data Management concepts and methodologies in building MDM solutions.
- Prepared technical designs, test case documentation, data load strategies, and operational strategies for all data feeds.
- Worked extensively on the ICD-9 and ICD-10 implementation, mapping data in both directions (ICD-9 to ICD-10 and ICD-10 to ICD-9) at the source and target levels.
- Troubleshot production problems.
- Developed the data warehouse target schema (database, tables, views, and indexes) and implemented SQL and T-SQL statements, stored procedures, and UDFs (user-defined functions) according to project requirements; a UDF sketch follows this list.
- Responsible for data integration using Business Objects and for generating reports.
- Good exposure to OBIEE reporting; used Cognos as the data integrator.
- Worked with business analysts to identify, study, and understand requirements and translated them into ETL code during the requirements analysis phase.
- Created high-level and detailed designs during the design phase.
- Expertise in business model development with dimensions, hierarchies, measures, partitioning, aggregation rules, time series, and cache management.
- Extensively worked on ETL mappings and on the analysis and documentation of OLAP report requirements; solid understanding of OLAP concepts and challenges, especially with large data sets.
- Strong knowledge of entity-relationship concepts, fact and dimension tables, slowly changing dimensions, and dimensional modeling (Star and Snowflake schemas).
- Experience in integrating various data sources such as Oracle, DB2, SQL Server, and MS Access, plus non-relational sources such as flat files, into the staging area.
- Experience in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
- Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
- Performed the land process to load data into the landing tables of the MDM Hub, using external batch processing for the initial data load into the Hub Store.
- Wrote T-SQL stored procedures and batch files to automate ETL jobs on MS SQL Server 2008 databases; wrote batch jobs and stored procedures to automate manual steps.
- Created application schemas (databases, tables, views, and indexes) and implemented SQL and T-SQL statements, stored procedures, and UDFs (user-defined functions) according to project business requirements.
- Responsible for optimization of SQL queries, T-SQL, and SSIS packages.
- Involved in creating, monitoring, modifying, and communicating the project plan to other team members.
- Worked with Informatica team members to design, document, and configure the Informatica MDM Hub to support loading, cleansing, matching, merging, and publication of MDM data.
- Hands-on experience with the Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) tools for data analysis, data profiling, and data governance.
- Worked with the Informatica Data Quality toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
- Tuned mappings using PowerCenter Designer, applying different logic to achieve maximum efficiency and performance.
- Experienced in UNIX work environment, file transfers, job scheduling and error handling.
- Extensively worked on developing and debugging Informatica mappings, mapplets, sessions and workflows.
- Responsible for verifying accuracy of data, testing methods, maintenance and support of the data warehouse.
- Used the Siebel Analytics Administration tool to develop and enhance metadata.
- Worked on Performance Tuning, identifying and resolving performance Bottlenecks in various levels like sources, targets, mappings and sessions.
- Involved in unit and system testing to verify that data loaded into the targets was accurate.
- Experience in support and knowledge transfer to the production team.
- Proficient in interacting with business users, conducting client meetings during the requirements analysis phase.
- Highly motivated and goal-oriented individual with a strong background in SDLC Project Management and Resource Planning using AGILE methodologies.
- Extensive functional and technical exposure. Experience working on high-visibility projects.
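A sketch of the kind of user-defined function referenced above, deployed through sqlcmd; the server, database, credentials, and age-at-service function are hypothetical.

    #!/bin/ksh
    # Deploys a scalar T-SQL UDF; names and logic are illustrative only.
    sqlcmd -S sqlprod -d claims_dw -U etl_user -P "$SQLPWD" <<'EOF'
    IF OBJECT_ID('dbo.fn_age_at_service', 'FN') IS NOT NULL
        DROP FUNCTION dbo.fn_age_at_service;
    GO
    CREATE FUNCTION dbo.fn_age_at_service (@dob DATE, @service_dt DATE)
    RETURNS INT
    AS
    BEGIN
        -- Whole years between birth date and date of service
        RETURN DATEDIFF(YEAR, @dob, @service_dt)
             - CASE WHEN DATEADD(YEAR, DATEDIFF(YEAR, @dob, @service_dt), @dob)
                         > @service_dt
                    THEN 1 ELSE 0 END;
    END
    GO
    EOF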
Environment: Informatica 9.x, Informatica Data Quality (IDQ) 9.0.1/9.5.1/9.6.1, Oracle 11g, SQL Server, T-SQL, Tidal, PL/SQL, SQL*Plus, SQL*Loader, XML, Siebel Analytics 7.8, Windows XP Professional, FTP, MS Excel, MS Access.
Confidential, Waltham, MA
Sr. Informatica Developer
Responsibilities:
- Analyzed business documents and created software engineering requirement specification.
- Helped business analyst in design, development and implementation of the Enterprise Data Warehouse and Data Marts.
- Interacted with the business users, analysts for requirements, developed conceptual and logical data models using ERWIN tool.
- Extracted data from the various source systems that include Oracle, SQL Server, DB2 and flat files.
- Created the mapping specification document and walked the team through it.
- Developed several mappings in Informatica using transformations such as Source Qualifier, Expression, Aggregator, Joiner, Lookup, Sequence Generator, and Update Strategy to populate data to the target systems.
- Developed user defined functions, reusable transformations and mapplets and made use of them in several mappings to streamline the mappings.
- Implemented Incremental Aggregation to capture only the new records from the source, which increases the performance.
- Improved the mapping performance by overriding the default SQL queries.
- Developed mappings using Type 2 slowly changing dimensions to keep track of historical data; a Type-2 SQL sketch follows this list.
- Created sessions and batches in the workflow manager tool and monitored the status using the workflow monitor tool.
- Extensively used various performance tuning techniques to improve the session performance.
- Implemented Pushdown Optimization to reduce the burden on the Integration service and thereby increase the performance.
- Ran parallel sessions in concurrent batches, reducing data load time.
- Partitioned sessions to improve performance by creating multiple connections to the source and target systems.
- Tracked the defects and wrote Test Cases.
- Used debugger to test the data flow and fixed the mappings.
- Extensively used PL/SQL stored procedures to build the business rules and wrote the shell scripts which automated the activities.
- Performed code reviews with peers and created unit test plan document.
- Created test cases for Unit test, System Integration test and UAT to check the data quality.
- Successfully upgraded Informatica 8.0 to 8.6 and responsible for validating the objects in the new version of Informatica.
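A hedged SQL sketch of the Type 2 pattern those mappings implement: expire the current dimension row when a tracked attribute changes, then insert the new version. The dim_provider and stg_provider objects are invented for illustration.

    #!/bin/ksh
    # Type-2 SCD load; all schema objects are illustrative assumptions.
    sqlplus -s etl_user/etl_password@DWPROD <<'EOF'
    -- Close out current rows whose tracked attributes changed
    UPDATE dim_provider d
       SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_provider s
                    WHERE s.provider_id = d.provider_id
                      AND s.specialty  <> d.specialty);

    -- Insert a new current version for changed and brand-new providers
    INSERT INTO dim_provider
           (provider_key, provider_id, specialty,
            eff_start_dt, eff_end_dt, current_flag)
    SELECT dim_provider_seq.NEXTVAL, s.provider_id, s.specialty,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_provider s
     WHERE NOT EXISTS (SELECT 1 FROM dim_provider d
                        WHERE d.provider_id  = s.provider_id
                          AND d.current_flag = 'Y');
    COMMIT;
    EXIT
    EOF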
Environment: Informatica PowerCenter 8.6/8.0, PowerExchange, ERwin 7.2, Oracle 10g/9i, flat files, SQL Server 2008, DB2, Teradata V2R5, UNIX AIX, SQL, PL/SQL, T-SQL, WinSCP, AutoSys, Rational ClearQuest, Rational ClearCase, Business Objects XI.
Confidential, Jersey City, NJ
Informatica Developer
Responsibilities:
- Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.
- Used Informatica Data Explorer (IDE) to profile source data, create scorecards and track the progress of data cleansing activities.
- Created explain plans for long-running queries and worked with DBAs to identify and resolve bottlenecks by adding appropriate indexes; an explain-plan sketch follows this list.
- Served as the Informatica PowerCenter and IDQ developer on a data warehouse initiative, responsible for requirements gathering, preparing the mapping document, and designing the ETL flow.
- Extensively worked on Power Center 9.1 Designer client tools like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Worked on Teradata, Oracle 11g, and AS/400 databases.
- Developed and executed load scripts using the Teradata client utilities MultiLoad, FastLoad, and BTEQ.
- Extensively worked on Transformations like Lookup, Joiner, SQL and Source Qualifier Transformations in the Informatica Designer.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables efficiently.
- Worked extensively in Teradata SQL Assistant to analyze existing data and implemented new business rules to handle various source data anomalies.
- Worked on performance tuning of the ETL processes. Optimized/tuned mappings for better performance and efficiency.
- Defined the target load order plan to load data correctly into the different target tables.
- Used Informatica debugger to test the data flow and fix the mappings.
- Loaded data to and from Flat files and databases like Oracle, Teradata and DB2.
- Modified existing and developed new ETL programs, transformations, indexes, data staging areas, summary tables, and data quality routines based upon redesign activities.
- Migrated mappings, sessions, workflows, and mapplets from one environment to another.
- Worked on UNIX shell scripting and called several shell scripts using the Command task in Workflow Manager.
- Developed UNIX shell scripts to archive files after extracting and loading data to the warehouse; an archive-script sketch follows this list.
- Used Informatica PowerExchange 9.1 for Change Data Capture (CDC).
- Used the Workflow Monitor to track scheduled, running, completed, and failed sessions, and debugged mappings for failed sessions.
- Involved in unit and integration testing of Informatica sessions, batches, and the target data.
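An illustrative explain-plan session of the sort used in the tuning work above; the query, bind variable, and index hint are placeholders.

    #!/bin/ksh
    # Capture the optimizer plan for a slow query under a candidate index.
    sqlplus -s etl_user/etl_password@ORCL <<'EOF'
    VARIABLE member_id NUMBER
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(c idx_claims_member) */ c.claim_id, c.claim_amt
      FROM claims c
     WHERE c.member_id = :member_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
    EXIT
    EOF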
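And a small sketch of the post-load archive scripts mentioned above; the directory layout and 90-day retention are assumptions.

    #!/bin/ksh
    # After a successful load, compress processed extracts into a dated
    # archive directory and purge old archives. Paths are placeholders.
    SRC_DIR=/data/etl/inbound
    ARC_DIR=/data/etl/archive/$(date +%Y%m%d)

    mkdir -p "$ARC_DIR" || exit 1
    for f in "$SRC_DIR"/*.dat; do
        [ -f "$f" ] || continue            # nothing to archive
        gzip "$f" && mv "${f}.gz" "$ARC_DIR"/
    done

    # Purge archives older than 90 days
    find /data/etl/archive -type f -mtime +90 -exec rm -f {} \;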
Environment: Informatica PowerCenter 9.1, PowerExchange 9.1, Oracle 11g, Teradata 13, Windows 7, SQL*Plus, TOAD, AS/400, UNIX.
Confidential, Rancho Cordova, CA
Informatica Developer
Responsibilities:
- Involved in requirement analysis, ETL design and development for extracting data from the heterogeneous source systems like MS SQL Server, Oracle, flat files, XML files and loading into Staging and Enterprise Data Vault.
- Responsible for converting Functional Requirements into Technical Specifications.
- Identified facts and dimensions from the source system and business requirements to be used for the data warehouse.
- Estimated and planned development work using Agile software development.
- Involved in analyzing ICD-9 and ICD-10 for data mapping in both directions (ICD-9 to ICD-10 and ICD-10 to ICD-9) at the source and target levels.
- Identified and eliminated duplicates in data sets through IDQ 8.6.1 components (Edit Distance, Jaro Distance, and Mixed Field Matcher), enabling a single view of customers and helping control mailing-list costs by preventing duplicate mailings.
- Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Involved in analyzing different modules of the Facets system and EDI interfaces to understand the source system and source data.
- Imported data from various sources: flat files, SQL Server, and Oracle.
- Worked on the enterprise data warehouse methodology (landing, staging, publishing, and promoting code to test regions).
- Used the Hierarchies tool to configure entity base objects, entity types, relationship base objects, relationship types, profiles, and put and display packages, and used the entity types as subject areas in IDD.
- Worked extensively on ETL mappings for the Data Vault HUB, SATELLITE, LINK, LSAT, HLINK, and RSAT tables; a hub-load sketch follows this list.
- Developed Mappings for loading MDM Hub.
- Involved in unit testing and user acceptance testing to verify that data extracted from the different source systems loaded into the targets according to user requirements.
- Involved in Performance Tuning in Informatica for source, transformation, targets, mapping and session.
- Implemented Slowly Changing Dimensions Type-1, Type-2 approach for loading the target tables.
- Designed various tasks using Informatica workflow manager like session, command, email, event raise, event wait etc.
- Good knowledge on the Enterprise Data Vault architecture.
- Created various data marts from data warehouse and generated reports using Cognos.
- Imported metadata from different sources such as Relational Databases, XML Sources.
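A minimal sketch of the HUB-load pattern behind those mappings: insert only business keys not already present in the hub. The hub_member and stg_member objects, the sequence, and the record source code are hypothetical.

    #!/bin/ksh
    # Data Vault hub load: new business keys only, with load metadata.
    sqlplus -s etl_user/etl_password@EDV <<'EOF'
    INSERT INTO hub_member (member_hkey, member_bk, load_dts, record_src)
    SELECT member_hkey_seq.NEXTVAL, s.member_bk, SYSTIMESTAMP, 'FACETS'
      FROM (SELECT DISTINCT member_bk FROM stg_member) s
     WHERE NOT EXISTS (SELECT 1 FROM hub_member h
                        WHERE h.member_bk = s.member_bk);
    COMMIT;
    EXIT
    EOF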
Environment: Informatica 9.5/9.5.1, Oracle 11g, SQL Server, Cognos 8.1/8.2/8.4/10.1, Embarcadero ER/Studio, Informatica IDQ Analyst tool, IDE, Metadata Manager, HP Quality Center.
Confidential, Philadelphia, PA
ETL Developer
Responsibilities:
- Coordinated with the client-side IT team and business users.
- Coordinated with Business Users for requirement gathering, business analysis to understand the business requirement and to prepare Interface Functional Specification (IFS) and Interface Design Document (IDD) to code ETL Mappings for new requirement changes.
- Extracted data from various heterogeneous sources such as Oracle, SQL Server, DB2, Netezza, MS Access, and flat files.
- Involved in testing stored procedures and functions and in unit and integration testing of Informatica sessions, batches, and the target data.
- Involved in data cleansing operations prior to loading the data into staging from flat files.
- Responsible for support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center 9.1.
- Involved in promoting the folders from Development Environment to Test Environment and from Test Environment to Production Environment.
- Used Informatica Power Center 9.1 for Extraction, Transformation and Loading data from source systems into the target data base.
- Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, Update Strategy, Joiner and Stored procedure transformations.
- Designed and configured ETL mappings using Informatica tool from the Siebel application.
- Developed standards and procedures for transformation of data as it moves from source systems to the Teradata Database.
- Used loader utilities including SQL*Loader and the Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad).
- Involved in performance tuning of the ETL processes.
- Developed PL/SQL procedures to create and drop tables and indexes for performance as part of pre- and post-session management; a sketch follows this list.
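A hedged sketch of those pre/post-session routines; the index, table, and connect string are invented, and a real implementation might prefer ALTER INDEX ... UNUSABLE/REBUILD over a literal drop and re-create.

    #!/bin/ksh
    # Called from Informatica pre-/post-session command tasks with "pre"
    # or "post"; drops the fact-table index before a bulk load and
    # re-creates it afterwards. All names are illustrative.
    ACTION=$1
    sqlplus -s etl_user/etl_password@ORCL <<EOF
    BEGIN
        IF '$ACTION' = 'pre' THEN
            EXECUTE IMMEDIATE 'DROP INDEX idx_fact_claim_dt';
        ELSE
            EXECUTE IMMEDIATE
                'CREATE INDEX idx_fact_claim_dt ON fact_claim (service_dt)';
        END IF;
    END;
    /
    EXIT
    EOF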
Environment: Informatica 9.1, Oracle 11g, Teradata.
Confidential, Hartford, CT
Informatica Developer
Responsibilities:
- Involved in analyzing source systems and designing the processes for Extracting Transforming and Loading the data.
- Worked closely with Business Analysts and SMEs to understand clients' business requirements and change requests and implemented them with my team.
- Worked in Scrum Environment and worked as Scrum lead.
- Used various transformations such as Source Qualifier, Expression, Lookup, Sequence Generator, Aggregator, Update Strategy, and Joiner while migrating data from heterogeneous sources such as Oracle, SQL Server, and flat files to Oracle.
- Developed mappings to bring in data from various sources across Staging, ODS to Reporting.
- Modified mappings to add change data capture capability for sources whose data grows rapidly.
- Worked extensively on bug fixes in existing mappings and on performance tuning with best-practice techniques, bringing existing objects in line with project standards.
- Created mapping variables and parameters for incremental loading; an override sketch follows this list.
- Handled Slowly Changing Dimensions (Type I, Type II, Type III) based on the business requirements.
- Involved in performance tuning by optimizing the sources, targets, mappings and sessions and eliminating bottlenecks.
- Validated HIPAA EDI transactions, including 837 (Health Care Claims or Encounters), 835 (Health Care Claim Payment/Remittance), 270/271 (Eligibility Request/Response), and 834 (Enrollment/Disenrollment in a Health Plan), by developing mappings.
- Created and monitored sessions and workflows for daily extract jobs using Informatica PowerCenter Workflow Manager and Workflow Monitor.
- Deployed objects from developer folders in development across two test environments and production.
- Created Folders for New users and managed their privileges.
- Experience with insurance claims processing procedures and knowledge of basic adjudication and insurance business rules.
- Documented changes and development related to the project.
- Assisted in production support at the Tier 2 level.
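A sketch of the incremental pattern behind the mapping-variable bullet above: a Source Qualifier SQL override filtered on a mapping variable, kept in a versioned file. The claims table, columns, and $$LAST_EXTRACT_DT variable are illustrative; the variable would be advanced each run (for example with SetMaxVariable) so the next session extracts only new rows.

    #!/bin/ksh
    # Writes the hypothetical Source Qualifier override to a versioned
    # file; $$LAST_EXTRACT_DT is an Informatica mapping variable, not a
    # shell variable (the quoted heredoc keeps it literal).
    cat > sq_claims_incremental.sql <<'EOF'
    SELECT c.claim_id,
           c.member_id,
           c.claim_amt,
           c.updated_dt
      FROM claims c
     WHERE c.updated_dt > TO_DATE('$$LAST_EXTRACT_DT', 'MM/DD/YYYY HH24:MI:SS')
    EOF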
Environment: Informatica Power Center 9.1, Oracle 11g/10g, SQL Server 2008/2005, HP Quality Center, Flat files, PL/SQL, Unix, BO XI 3.1.