Sr. Informatica Developer Resume


Charlotte, NC

SUMMARY:

  • Over 9 years of IT experience in Data Modeling, OLAP, ETL Design, Development, Programming, Testing, Performance Tuning, Implementation, Troubleshooting and Error Handling in Data Warehousing and Application Development, including gathering Business User Requirements.
  • Strong Data warehousing experience specializing in RDBMS, ETL Concepts.
  • Expert in the ETL tool Informatica, with extensive experience working with the various Power Center transformations using the Designer tool.
  • Experience in integrating various data sources with multiple relational databases like Oracle, SQL Server, DB2, Sybase, Teradata, Netezza and MS Access, and worked on integrating data from flat files, COBOL files and XML files.
  • Experienced in writing, testing and implementation of Stored Procedures, Triggers, Functions and Packages at database level and form level using PL/SQL.
  • Extensive Data modeling experience in Star Schema/Snowflake schema design, Fact and Dimension tables and Physical and Logical data modeling.
  • Expertise in complete lifecycle implementation of BI with Star/Snowflake schemas using the Ralph Kimball and Bill Inmon methodologies.
  • Experience in Study, Design, Analysis, Development and Implementation of Data Warehousing Concepts.
  • Experience in reporting and analysis using the Business Objects tool.
  • Created and maintained tables in Netezza Performance Server, generating statistics and choosing distribution keys efficiently.
  • Have knowledge of Informatica MDM hub and execution of Informatica MDM hub processes.
  • Have designed and developed efficient Auditing and Error handling methodologies for Data warehousing implementations.
  • Expert with SQL; extensively involved in writing scripts for data analysis and for manual and automated testing.
  • Experienced in integrating various data sources, supporting large databases and troubleshooting problems.
  • Good experience in various Industry verticals like Pharmacy, Health Care, Finance, Insurance and Marketing with appropriate DW/ETL implementations.
  • Developed UNIX Shell Scripts for automating the PL/SQL scripts.
  • Involved in Full Life Cycle Implementation of Data warehouses.
  • Expertise in Coding, Review and Unit testing of Informatica mappings.
  • Good experience in Unix Shell Scripting and ETL Process Automation using Shell Programming and Informatica.
  • Solid experience in performance tuning at both the database and Informatica levels.
  • Knowledge in OBIEE Reporting Tool.
  • Motivated to take independent responsibility as well as to contribute as a productive team member. Strong team-building and mentoring skills and excellent team leadership capability.
  • Strong logical and analytical reasoning skills, excellent management skills, and excellent communication with good listening, presentation and interpersonal skills.

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.6/9.1/8.6/8.1/7.1, Informatica Power Exchange, Informatica MDM, Informatica Data Quality, IBM WebSphere DataStage 8.5.

Business Intelligence: Business Objects XI R3/6.0/5.1/5.0, Cognos 8 BI/7.0, OBIEE 10.1.3.x

RDBMS: Oracle 11g/10g/9i/8i/7.3, Netezza 4.6, MS SQL Server 2014/2005, Sybase IQ, DB2 UDB 7.1, MySQL, MS Access, Teradata, HDFS, Hive.

Data Modeling: Dimensional Data Modeling, Star Schema and Snowflake Modeling, Fact and Dimension tables, Physical and Logical Data Modeling.

Languages: SQL, PL/SQL, Unix Shell Scripting, Syncsort DMExpress, Transact SQL, Visual Basic 6.0/5.0, HTML, DHTML, XML, C, C++, Java.

Modeling Tools: Erwin 4.1, MS Visio

Operating System: Windows 2000, UNIX, Solaris/AIX.

Scheduling Tool: Autosys, Control-M, Tidal, Informatica Scheduling

PROFESSIONAL EXPERIENCE:

Confidential, Charlotte, NC

Sr. Informatica Developer

Responsibilities:

  • Created Detailed Design Document by analyzing the project Requirements Definition Documents.
  • Involved in the Analysis and the design of Logical and Physical Data Model for ETL mapping and the process flow diagrams for the Business Intelligence Data warehouse (BID).
  • Designed and developed the Informatica mappings to extract data from various Systems of Record (SOR) spanning different lines of business (LOB), transform the data and load it to the target warehouse for reporting to downstream consumers.
  • Created mappings to implement type-2 slowly changing dimensions.
  • Helped establish structural conformance and data conformance.
  • Used constraint-based loading and target load ordering to load multiple target tables with PK-FK relationships in the same mapping.
  • Used parameter files to override mapping parameters, mapping variables, workflow variables, application connection object names, relational connection names, $Param session parameters and FTP session parameters.
  • Used both the Debugger and the Test Load option at the workflow level.
  • Created SQL script for testing the output from the target table based on the business requirement document.
  • Set up batches and sessions to schedule the loads at the required frequency using Power Center Workflow Manager, PMCMD and scheduling tools.
  • Coordinated with Tech lead and Architect for any DDL errors and updates.
  • Created technical specification documents by analyzing the business requirement and functional requirement documents.
  • Studied session log files, thread statistics, performance statistics files and reject files to understand the root cause of data rejection and to pinpoint performance bottlenecks.
  • Performed performance tuning on mappings. Improved performance of sessions with high update volume by disabling and re-enabling indexes around the session run and creating indexes on update and lookup fields.
  • Used Pushdown optimization both on the Source and the target side to push transformation logic to the database to improve performance.
  • Devised Re-usability and reduced redundancy of code by creating shortcuts, Mapplets, Reusable Transformations, Reusable Sessions and Worklets.
  • Designed and developed mappings using Aggregator, Joiner, Normalizer, Rank, Sequence Generator, SQL, Transaction Control, connected and unconnected Lookup, Stored Procedure (source/target pre- and post-load), Update Strategy, Union and XML transformations.
  • Created Test cases for Unit Test, System Integration Test and UAT to check the data quality and data Integrity.
  • Conducted code review meetings and created implementation plans during the development phase.
  • Provided Projects related Technical documentation for the Team and updated in Glimmer.
  • Used JIRA for defect logging and tracking the defects.
  • Used FTP and external loader connections in sessions to bulk-load data into Oracle tables.
  • Involved in Informatica Object Migrations and script migrations from development to other environments such as SIT and UAT.
  • Have knowledge of the Hadoop Distributed File System (HDFS), the MapReduce framework and Hive.
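
The PMCMD-based scheduling described above can be sketched as a small shell wrapper. This is a minimal sketch: the service, domain, user and workflow names are placeholders, not details from this project.

```shell
#!/bin/sh
# Hypothetical wrapper around Informatica's pmcmd CLI: builds the
# startworkflow command line for a given folder/workflow. The service,
# domain and user defaults below are illustrative placeholders.
INFA_SERVICE="${INFA_SERVICE:-IS_DEV}"
INFA_DOMAIN="${INFA_DOMAIN:-Domain_Dev}"
INFA_USER="${INFA_USER:-etl_user}"

build_startworkflow_cmd() {
  folder="$1"; workflow="$2"
  # -pv tells pmcmd to read the password from the INFA_PWD env variable
  printf 'pmcmd startworkflow -sv %s -d %s -u %s -pv INFA_PWD -f %s -wait %s\n' \
    "$INFA_SERVICE" "$INFA_DOMAIN" "$INFA_USER" "$folder" "$workflow"
}

# Print (or hand to a scheduler) the command for a nightly workflow
build_startworkflow_cmd "FIN_DW" "wf_load_dim_customer"
```

A scheduler entry (Autosys, Control-M, cron) would then invoke the emitted command, with `-wait` making the exit code reflect workflow success or failure.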

Environment: Informatica Power Center 9.6.1, Oracle 11g, UNIX AIX, SQL Developer, Business Objects and Microsoft Excel, Awetest tool, MS Visio, JIRA

Confidential, Des Moines, IA

Sr. Informatica Developer

Responsibilities:

  • Created technical specification documents by analyzing the business requirement and functional requirement documents.
  • Converted IBM DataStage jobs into Informatica mappings.
  • Created mappings to implement type-2 slowly changing dimensions.
  • Designed mappings and workflows to load the data into Dimension and Fact tables like BCBSA Host Mbr Dim, BCBSA Host Mbr Addr Dim, BCBSA Host Clm Line Fact, BCBSA Home Mbr Attribution Fact.
  • Devised re-usability and reduced redundancy of code by creating shortcuts, Mapplets and reusable transformations. Developed and implemented Informatica parameter files to filter the daily and weekly data from the source system.
  • Created Power Exchange Data maps based on the copybooks to get the VSAM Mainframe source files into oracle staging tables.
  • Extensive use of Teradata load and export utilities like FastLoad, MultiLoad, FastExport and TPump.
  • Configured power center sessions to use Teradata parallel transporter to load into Teradata target tables.
  • Used ALM for defects tracking and Serena Dimensions for migrating the code.
  • Involved in Informatica Object Migrations and script migrations from development to other environments. Created Informatica labels to migrate Informatica objects.
  • Used External Loader connections in sessions to Load Bulk data into the Sybase IQ target tables.
  • Used Parameter files to override mapping parameter, mapping Variables, Workflow Variables, Relational Connection Names and $Param-Session Parameters.
  • Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
  • Experienced in developing mappings according to business rules, migrating code to QA, and following naming conventions and mapping design standards, with sound knowledge of data warehouse and PL/SQL concepts.
  • Creating Test case documents for Unit Test, Integration Test to check the data quality and data Integrity.
  • Used PMCMD command to start, stop, and ping server from UNIX and created UNIX Shell scripts to automate the activities.
  • Based on the logic, used various transformation like Source Qualifier, Normalizer, Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator, Joiner transformation in the mapping.
  • Created Tasks, Workflows and Worklets using Workflow Manager.
  • Created Oracle Stored Procedure to implement complex business logic for good performance and called from Informatica using Stored Procedure transformation.
  • Automated load run on Informatica sessions through UNIX ksh scripts and implemented pre and post-session scripts, also automated load successful and failure notification through email.
  • Developed Informatica parameter files to filter the daily data from the source system.
  • Developed UNIX shell scripts and config scripts for controlling job flow and scheduling jobs.
  • Adjusted data, index cache sizes for cached transformations like Aggregator and Sorter, Adjusted DTM buffer sizes to set the optimum buffer block size for high Performance target loading by the transformation thread and also Adjusted Line Sequential Buffer lengths for Flat-File Loads to achieve optimal performance.
  • Studied session log files, thread statistics, performance statistics files and reject files to understand the root cause of data rejection and to pinpoint performance bottlenecks.
  • Coordinated with DBAs to create Oracle views and stored procedures to implement complex business logic and execute DDL.
  • Coordinated with Tech lead and Architect for any DDL errors and updates.
  • Extensively used ETL to load data from wide range of source such as flat files, XML etc.
  • Worked on XML Source files to generate PAD (Patient’s Admission and Discharge), CM (Case Management) and DM (Disease Management) transaction files based on the transaction type under BLUE SQUARE Program.
  • Use XML SPY to modify the XSD’s.
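
The automated success/failure email notification mentioned above can be sketched as a small ksh-style helper. The load name, wording and mail command are illustrative, not taken from the project.

```shell
#!/bin/sh
# Hypothetical post-load notification helper: composes the e-mail
# subject line from the load name and the workflow's exit status.
notify_subject() {
  load_name="$1"; rc="$2"
  if [ "$rc" -eq 0 ]; then
    echo "SUCCESS: $load_name load completed"
  else
    echo "FAILURE: $load_name load failed (rc=$rc)"
  fi
}

# Typical use after pmcmd returns (mail address is a placeholder):
#   mailx -s "$(notify_subject nightly_claims $?)" etl-team@example.com < run.log
notify_subject nightly_claims 0
```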

Environment: Informatica Power Center 9.6.1, Informatica Power Exchange, Sybase IQ, Teradata, SQL Server 2014, XML SPY, Serena Dimensions, WinSQL, UNIX Solaris/AIX, WinSCP, Microsoft SharePoint, Business Objects, MS Visio, UltraEdit, PuTTY, IBM WebSphere DataStage 8.5.

Confidential, Brea, CA

Informatica Developer

Responsibilities:

  • Responsible for developing mappings, sessions, workflows and workflow tasks based on the user requirement using Informatica Power Center.
  • Worked on Informatica Power Center tool - Source Analyzer, Warehouse designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager.
  • Extensively used ETL to load data from wide range of source such as flat files, SQL server and Oracle.
  • Expert in dealing with flat files, relational tables, archiving the session and workflow log files, FTP the flat files between remote system and UNIX server.
  • Extensively used PL/SQL Procedures/Functions to build business rules.
  • The project involved extracting data from various sources, transforming the data from these files before loading the data into target (warehouse) Oracle tables.
  • Assisted the reporting Confidential in modifying the SQL queries used for building reports in Cognos and other reporting tools.
  • Based on the logic, used various transformation like Source Qualifier, Normalizer, Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator, Joiner, Java, input, output transformation in the mapping.
  • Informatica Designer tools were used to design the source definition, target definition and transformations to build mappings.
  • Have knowledge of Agile programming methodologies such as test-first development.
  • Designed and developed complex aggregate, join and lookup transformation rules (business rules) to generate consolidated data identified by dimensions using the Informatica Power Center tool.
  • Created the mappings using transformations such as the Source qualifier, Aggregator, Expression, Lookup, Router, Filter, java and Update Strategy.
  • Set up batches and sessions to schedule the loads at the required frequency using Power Center Workflow Manager, PMCMD and scheduling tools.
  • Extensively worked in the performance tuning of programs, ETL procedures and processes.
  • Worked with NZLoad to load flat file data into Netezza Tables.
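
Loading flat files with NZLoad, as in the last bullet, typically means assembling an nzload command per staging table. A minimal sketch, with the database, table and file names as placeholders; the exact flags should be checked against the Netezza release in use.

```shell
#!/bin/sh
# Hypothetical helper assembling an nzload invocation for a
# pipe-delimited flat file. All names here are illustrative.
build_nzload_cmd() {
  db="$1"; table="$2"; datafile="$3"
  printf 'nzload -db %s -t %s -df %s -delim "|" -lf %s.log -bf %s.bad\n' \
    "$db" "$table" "$datafile" "$table" "$table"
}

build_nzload_cmd EDW STG_ORDERS /data/in/orders.dat
```

The log (`-lf`) and bad-record (`-bf`) files give the reject visibility that reconciliation scripts usually check after the load.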

Environment: Informatica Power Center 9.1.0/8.6.1, Informatica Data Quality, Netezza 4.6, Erwin, Oracle 10g, SQL Server 2008, PL/SQL, MS Visio, SQL Developer, UNIX Solaris/AIX, WinSCP, Business Objects XI R3, Microsoft SharePoint, Control-M.

Confidential, Dallas, TX

Informatica Developer

Responsibilities:

  • Designed and developed efficient error-handling and performance-improvement methods and implemented them throughout the mappings.
  • Co-ordinate with the Architectural team and come up with possible solutions.
  • Have knowledge of the flow of data through the Informatica MDM hub via a series of processes (land, stage, load, match, consolidate and distribute).
  • Built a campaign management data warehouse for VCA Animal Hospitals, which involved creating dimensions like Patient, Client, Clinic, Breed, Product, Segment and Campaign, and facts like Order Fct, Campaign Response Fct and Cmt Disp Fct.
  • Assisted the production support team with any load/performance issues in the existing processes.
  • Cleansed the source data, extracted and transformed data with business rules, and built reusable components such as Mapplets, Reusable transformations and sessions etc.
  • Developed complex Informatica mappings to load the data from various sources using different transformations like source qualifier, connected and unconnected look up, update Strategy, expression, aggregator, joiner, filter, normalizer, rank and router transformations.
  • Developed and Implemented Informatica parameter files to filter the daily data from the source system.
  • Ensured that ETL standards were followed across the data warehouse.
  • Used Informatica debugging techniques to debug the mappings and used session log files to trace errors that occur while loading.
  • Performed Unit Testing and Integration Testing on the mappings.
  • Involved in designing Audit process and Reconciliation processes for the sessions loaded as part of mapping design.
  • Responsible for Creating workflows and worklets. Created Session, Event, Command, Control Decision and Email tasks in Workflow Manager.
  • Tuned performance of Informatica sessions by Studying Session Log Files to understand the root cause of performance deadlocks and used different performance techniques like sorting, caching data, session partitions etc.
  • Developed Slowly Changing Dimension mappings for Type 1 and Type 2 SCDs to implement the slowly changing dimension logic.
  • Wrote Pre-session and Post-session shell scripts for dropping, creating indexes for tables, Email tasks and various other applications.
  • Created Materialized views for summary tables for better query performance.
  • Migrated the Informatica objects to QA, production environment by working with the Informatica administrator.
  • Prepared Detail design documentation thoroughly for production support department to use as hand guide for future production runs before code gets migrated.
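
The pre-/post-session index handling described above can be sketched as helpers that emit the DDL a `sqlplus -s` heredoc would run around a bulk load. The index and table names below are hypothetical.

```shell
#!/bin/sh
# Hypothetical pre-/post-session helpers: emit the DDL executed
# before and after a high-volume load. Object names are placeholders.
drop_index_sql()   { printf 'DROP INDEX %s;\n' "$1"; }
create_index_sql() { printf 'CREATE INDEX %s ON %s (%s);\n' "$1" "$2" "$3"; }

# Pre-session: drop the index; post-session: rebuild it, e.g.
#   sqlplus -s "$DB_CONN" <<EOF
#   $(drop_index_sql idx_order_fct_dt)
#   EOF
drop_index_sql   idx_order_fct_dt
create_index_sql idx_order_fct_dt order_fct order_dt
```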

Environment: Informatica Power Center 8.6.1, Informatica MDM hub, Informatica Data Quality, Oracle Exadata, SQL Server 2008, PL/SQL, SQL Developer, UNIX Solaris/AIX, Ultra edit, Business Objects XI R3, Microsoft SharePoint, Tidal.

Confidential, Bloomington, IL

Informatica Developer

Responsibilities:

  • Responsible for creating Business Design Documents, Source to target Matrix, Unit Test Cases, Performance tuning steps.
  • Load Production and Production Re-Forecasting Application Model with data for system testing.
  • Budget and Salary Estimate Yearly Maintenance, testing for Budget Year and Plan Year.
  • Developed mappings for translations, recodes and black-box processing.
  • Responsible for publishing data from the EPM Bank applications to the Atomic Layer and then moved to the Data Access/Integration Layer for Reporting purposes.
  • Responsible for publishing data from EPM Bank applications and supplied back to sources - Bancware to other Bank Applications - Production Re-Forecast with Production data.
  • Developing mappings and Designing the Business High Level Data Process Documents.
  • Responsible for production support team maintenance: creating scripts, parm files and paths for the workflow jobs.
  • Handling of CR’s (Change Requests) and enhancements for existing application and followed change management process.
  • Understanding the business requirement specifications provided by the client.
  • Involved in Data Quality, Data profiling, Data Cleansing and metadata management.
  • Performing impact analysis and handling ETL change requests based on the business rules.
  • Created design specifications for ETL coding and mapping standards.
  • Created complex mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, dynamic Lookup and Router transformations to extract, transform and load data to the staging area.
  • Created Oracle Stored Procedure to implement complex business logic for good performance and called from Informatica using Stored Procedure Transformation.
  • Used PMCMD command to start, stop and ping server from UNIX and created UNIX Shell scripts to automate the process.
  • Developed Informatica parameter files to filter the daily data from the source system. Created mappings in the designer to implement Type 2 SCD.
  • Scheduled Informatica workflows using the Control-M scheduler.
  • Fine-tuned the mappings by analyzing data flow and studied the session log reader, writer, transformation threads to find the bottlenecks. Sorted the data to increase the throughput of sessions containing Lookup, Joiner and aggregator transformations.
  • Building mappings to load data from and into various Database DM.
  • Created mappings that implement error-handling logic to create error/OK flags and an error message depending on the source data and the lookup outputs.
  • Extensive Experience in working under challenging environments.
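
Parameter files that filter the daily data, as used above, are usually regenerated by a small shell script before each run. A minimal sketch; the folder, workflow, session, connection and parameter names are placeholders.

```shell
#!/bin/sh
# Hypothetical generator for a daily Informatica parameter file.
# The [folder.WF:workflow.ST:session] header and the parameter
# names below are illustrative only.
write_param_file() {
  outfile="$1"; load_date="$2"
  {
    echo '[FOLDER_DW.WF:wf_daily_load.ST:s_m_stage_orders]'
    echo "\$\$LOAD_DATE=$load_date"
    echo '$DBConnection_Src=REL_SRC_ORA'
  } > "$outfile"
}

# Regenerate the file with today's date before starting the workflow
write_param_file daily.parm "$(date +%m/%d/%Y)"
```

The session then references `daily.parm` as its parameter file, so `$$LOAD_DATE` drives the source-qualifier filter for the incremental load.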

Environment: Informatica Power Center 8.6.1, Informatica Data Quality, UNIX AIX 5.3, MS SQL Server 2008, WinSCP, IBM DB2 v9, Flat Files and XML files, Control Center, IBM Cognos Cubes, Control-M.

Confidential, Bridgeport, CT

ETL Consultant

Responsibilities:

  • Coordinating with the client and gathering the user requirements.
  • Designed and developed ETL process using Informatica tool.
  • According to the business logic created various transformations like Source Qualifier, Normalizer, Lookup, Stored Procedure, Sequence Generator, Router, Filter, Aggregator, Joiner, Expression, and Update Strategy.
  • Applied caching optimization techniques in Aggregator, Lookup and Joiner transformations.
  • Implemented OBIEE Applications including Informatica, DAC and OBIEE platform.
  • Created reusable transformations and used in various mappings.
  • Responsible for migrating the folders or mappings and sessions from development to production environment.
  • Responsible for scheduling the workflow based on the nightly load.
  • Used various debugging techniques to debug the mappings for Performance tuning.
  • Expertise in installation, configuration, as well as implementation of OBIEE.
  • Developed mapping to implement type 2 slowly changing dimensions.
  • Developed Informatica parameter files to filter the daily source data.
  • Created Oracle Stored Procedure to implement complex business logic for good performance and called from Informatica using Stored Procedure transformation.
  • Used various Oracle Index techniques like B*tree, bitmap index to improve the query performance and created scripts to update the table statistics for better explain plan.
  • Created Materialized views for summary tables for better query performance.
  • Creating Test cases for Unit Test, System Integration Test.
  • Created tables, table relationships, indexes and keys.
  • Used PMCMD command to start, stop and ping server from UNIX and created UNIX Shell scripts to automate the process.
  • Created UNIX shell scripts and called as pre session and post session commands.
  • Developed the multidimensional reports in Business Objects by translating the business validations. Generated different kinds of charts in Business Objects.
  • Created Master/Detail and Cross Tab reports to view the data in an analytical way.
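
The table-statistics scripts mentioned above can be sketched as a helper that emits the Oracle DBMS_STATS call a nightly job would feed to sqlplus. The schema and table names are hypothetical.

```shell
#!/bin/sh
# Hypothetical statistics-refresh helper: emits the DBMS_STATS call
# used to keep the optimizer's explain plans accurate after loads.
gather_stats_sql() {
  printf "EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => '%s', tabname => '%s');\n" "$1" "$2"
}

# One line per freshly loaded table, piped into sqlplus by the caller
gather_stats_sql DWH ORDER_FCT
```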

Environment: Informatica Power Center 8.1, Oracle 10g/9i, OBIEE, Flat Files, SQL Server 2000, Mainframe DB2, COBOL, TOAD, Quest Central for DB2, MS Visio, Netezza, Windows 2000, UNIX AIX 5.1, Business Objects 6.5, SQL, PL/SQL, Trillium 6.5.

Confidential, Wilmington, DE

ETL Informatica Developer

Responsibilities:

  • Worked with the Business Analysts for requirements gathering, business analysis, testing, and project coordination.
  • Extracted data from various sources such as IMS Data, DB2 and Oracle.
  • Extensively worked with IMS Data to validate Sample History module data.
  • Designed several mappings to extract the data from flat file and relational sources. Performed reporting and analysis using the Business Objects tool.
  • Worked on Web Intelligence to access and report data through the internet.
  • Used various transformations like unconnected lookup, connected lookup, aggregator, rank, joiner and stored procedure. Created Pipeline partitioning to improve Session performance using round robin partitioning.
  • Wrote SQL Scripts for the reporting requirements and to meet the Unit Test Requirements.
  • Created and Scheduled Sessions & Worklets using workflow Manager to load the data into the Target Database.
  • Extensively involved in the reconciliation and recovery processes for capturing incremental changes in the source systems and updating the staging area and data warehouse respectively. Debugged the mappings developed by other developers.
  • Tuned the performance of the mapping to load the data quicker.
  • Worked with Materialized Views as Target to enable easy reporting.
  • Developed Design documents and knowledge center for the modules developed.

Environment: Informatica Power Center 8.1/7.1, Business Objects 6.5, Oracle 9i, SQL*Plus, Toad, Windows 2000, SQL Server 2000, PL/SQL, ERWIN 3.
