Sr. Informatica Developer Resume Profile

Greenville, SC

PROFESSIONAL SUMMARY:

  • Six years of strong Data Warehousing experience using Informatica PowerCenter 9.1/8.6 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), along with Ab-Initio 2.15 graphs and BTEQ scripts.
  • Extensive experience in Banking, Financial, Auto Finance and Retail domains.
  • Worked on databases like Oracle 11g/10g, SQL Server 2005/2008 and Teradata V13.10/12.
  • Good command of developing Mappings, Mapplets, Sessions, Workflows, Worklets and Tasks using Informatica Designer and Workflow Manager.
  • Developed complex mappings using transformation logic such as Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Union and Update Strategy, along with Mapplets and Worklets.
  • Experienced in error handling and troubleshooting using various log files.
  • Implemented the SCD Type 2 load methodology to track and maintain historical data.
  • Experienced in understanding the business and creating Mapping Specifications.
  • Able to write complex SQL for ETL jobs and to analyze data from various source systems.
  • Comprehensive knowledge of Dimensional Data Modeling like Star Schema, Snowflake Schemas, Facts and Dimension Tables, Physical and Logical Data Models.
  • Experienced in the complete implementation of an Auto data warehouse.
  • Worked on UNIX shell scripts.
  • Experience in Performance Tuning of sources, targets, mappings, transformations and sessions, by implementing various techniques like partitioning techniques and identifying performance bottlenecks.
  • Good exposure to Development, Testing, Debugging, Implementation, Documentation and Production support.
  • Expertise in Unit Testing, Integration Testing and Data Validation for developed Informatica mappings.
  • Good working experience with the Trillium data cleansing tool, including writing business rules for cleansing.
  • Worked with the scheduling tools Autosys and Control-M.
  • Experience in creation of Ab-Initio V2.15 graphs and UNIX wrappers.
  • Strong Team building and mentoring skills and excellent Team leadership capability.
  • Developed effective working relationships with client teams to understand support requirements and effectively manage client expectations.
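
The SCD Type 2 work mentioned above follows an "expire and insert" pattern. As a minimal sketch of that pattern only (all table, column and file names here are hypothetical, and real loads would run through Informatica or BTEQ rather than a hand-written script), a small wrapper can generate the two statements involved:

```shell
#!/bin/sh
# Hypothetical sketch: emit the two SQL statements of an SCD Type 2
# "expire and insert" load. All table/column names are illustrative.
cat > scd2_load.sql <<'EOF'
-- Step 1: expire the current dimension row when a tracked attribute changed
UPDATE dim_customer t
SET    eff_end_dt   = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  t.current_flag = 'Y'
  AND  EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = t.customer_id
               AND    s.address    <> t.address);

-- Step 2: insert a new "current" version for changed or brand-new customers
INSERT INTO dim_customer
       (customer_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer t
  ON   t.customer_id = s.customer_id
  AND  t.current_flag = 'Y'
WHERE  t.customer_id IS NULL
   OR  s.address <> t.address;
EOF
echo "generated scd2_load.sql"
```

The effective-date columns plus the current-row flag preserve the full history of each attribute change, which is what distinguishes Type 2 from a Type 1 overwrite.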

SUMMARY OF TECHNICAL SKILLS

ETL Tool

Informatica PowerCenter 9.x/8.x (Designer, Workflow Manager, Workflow Monitor).

Ab-Initio V2.15.

Databases

Oracle 11g/10g/9i, MS SQL 2005/2008, Teradata V13.10/12.

Scheduling tools

Autosys, Control-M.

Programming Language

PL/SQL, T-SQL.

Data Profiling Tool

Trillium Software.

Environment

UNIX, Windows 7/Vista/XP/2000.

Other Tools

SQL Plus, Toad, SQL Navigator, Putty, MS-Office.

WORK EXPERIENCE:

Confidential

Role: Sr. Informatica Developer/Team Lead

Project Description:

Currently, customer information at Confidential resides in different systems, so it is not possible to obtain a comprehensive view of the customer, and it is difficult to understand each customer's relationship with Confidential. Implementation of this project consolidates data from multiple sources across Confidential and provides a single source of customer data. The objective of the Data Migration Program is to perform a one-time load of data from different banners such as Stop & Shop, Giant Carlisle, Giant Landover and Peapod, and from other sources such as A Plus, CDW, COMMX and Copient, apply data transformations in Informatica (ETL), and generate pipe-delimited files. Since the data comes from different sources, it is joined using Informatica mappings and then profiled and cleansed using Trillium. Once the data is cleansed, further transformations are applied and the final pipe-delimited files are produced for downstream loading.

Responsibilities:

  • Interacted with business people to gather the business requirements and translated them into technical specifications.
  • Prepared technical design documents and mapping documents based on functional/business requirements from the business users.
  • Coordinated with the different source-system users to get the data in the required format.
  • Performed data profiling and cleansing using Trillium.
  • Applied various business rules to the data in Trillium, as specified by the business.
  • Extensively worked on Informatica Power Center Designer, Workflow Manager and Workflow Monitor.
  • Designed ETL processes using Informatica PowerCenter 9.1 to extract, transform and load data from input sources such as flat files.
  • Created complex mappings using various transformations such as Joiner, Expression, Lookup (Connected/Unconnected), Aggregator, Filter, Update Strategy and Sequence Generator to implement the user requirements.
  • Created reusable transformations and Mapplets in the Designer, using the Transformation Developer and Mapplet Designer, according to the business requirements.
  • Used session parameters and mapping variables/parameters, and created parameter files to allow flexible workflow runs based on changing variable values.
  • Worked with the Autosys scheduler to run Informatica sessions.
  • Performed Unit Testing and Integration Testing of Mappings and Workflows.
  • Migrated jobs across Development, QA and Production environments.
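
A PowerCenter parameter file of the kind mentioned above pairs workflow/session scopes with variable values. The folder, workflow, session, variable and connection names below are purely illustrative:

```
[Global]
$$LOAD_DATE=2014-06-30

[DM_MIGRATION.WF:wf_customer_load.ST:s_m_load_customer]
$$SRC_BANNER=STOP_SHOP
$InputFile_cust=/data/inbound/customer_20140630.dat
$DBConnection_tgt=ORA_DWH_TGT
```

Keeping source-system names, file paths and connections in such a file lets the same workflow run against each banner's data without code changes.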

Environment: Informatica PowerCenter 9.1, Trillium, UNIX 5.3, Windows XP, Autosys.

Confidential

Role: Sr. Informatica Developer/Team Lead

The Confidential project aims to support and improve Confidential's revenue-enhancement and expense-reduction opportunities by interpreting complex banking business requirements and applying database design methodologies, including star schema, normalization and selective denormalization, to design a fit-for-purpose data warehouse architecture. EDI will achieve this by building new capabilities that normalize and streamline data collection and distribution from source to target systems. The project involves code changes and code fixes, coordination with the testing team, defect fixes, production deployment and warranty support. It involved extracting data from multiple source systems, performing various transformations and loading the data into the data warehouse using Informatica. Once the data is loaded into the data warehouse, it is extracted by the business to generate reports.

Responsibilities:

  • Involved in analysis of the functional side of the project, interacting with functional experts to design and write technical specifications.
  • Involved in creation of detailed design document based on technical specifications.
  • Extracted and transformed data from various sources such as Oracle, MS SQL Server, Teradata and flat files, and loaded it into the target data warehouse.
  • Created complex mappings using various transformations such as Joiner, Expression, Lookup (Connected/Unconnected), Aggregator, Filter, Update Strategy and Sequence Generator to implement the user requirements.
  • Utilized Expressions, Aggregator, Union, Joiner, Router, Look up, Filter, Source Qualifier and Update Strategy transformations to implement business logic.
  • Implemented SCD Type 1 and Type 2.
  • Involved in the code changes and enhancements of the existing projects.
  • Performance Tuning of the mappings to handle increasing data volume.
  • Involved in the migration of jobs to different environments (SIT, UAT, PROD).
  • Worked with Autosys Scheduler to run the Informatica session on a daily/weekly/monthly basis.
  • Performed Unit Testing and Integration Testing of Mappings and Workflows.
  • Involved in production support activities, including resolution of tickets, maintenance requests and change requests.
  • Managed and led the onshore-offshore interaction.
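
The daily/weekly/monthly Autosys scheduling mentioned above is typically defined in JIL. A minimal sketch of a daily command job follows; the job, machine, owner and script names are illustrative, not taken from the project:

```
insert_job: dwh_daily_load   job_type: c
command: /opt/etl/scripts/run_wf_daily_load.sh
machine: etlprod01
owner: etladm
start_times: "02:00"
days_of_week: mo,tu,we,th,fr
std_out_file: /var/log/etl/dwh_daily_load.out
std_err_file: /var/log/etl/dwh_daily_load.err
alarm_if_fail: 1
```

The `alarm_if_fail` attribute is what ties a failed Informatica session back to the production-support ticket flow described above.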

Environment: Informatica PowerCenter 9.1, Oracle 11g, SQL, Teradata V13, UNIX 5.3, Windows XP, Autosys.

Confidential

Role: Sr. Informatica Developer/Team Lead

Project Description:

Confidential is a Netherlands-based trade credit insurer. The objective of the project is to extract risk-related data from multiple source systems and lines of business and load it into the data warehouse, enabling the business to calculate SCR/MCR figures. The project covers the complete software development life cycle (SDLC): requirements gathering, architecture design, low-level design, coding, coordination with the testing team, defect fixes, production deployment and warranty support. Once the data is loaded into the data warehouse, it is extracted by the business via a semantic layer (Oracle views).

Responsibilities:

  • Involved in business requirement gathering with Business Analyst.
  • Extensively worked on Informatica Power Center Designer, Workflow Manager and Workflow Monitor.
  • Created complex mappings using various transformations such as Joiner, Expression, Lookup (Connected/Unconnected), Aggregator, Filter, Update Strategy and Sequence Generator to implement the user requirements.
  • Involved in the extraction, transformation and loading of data from various sources into the dimension and fact tables in the data warehouse.
  • Developed complex mappings using various transformations such as Source Qualifier, Expression, Filter, Aggregator, Lookup, Update Strategy, Sequence Generator, Joiner, Router and Normalizer.
  • Created Reusable Transformations and Mapplets in the designer using transformation developer and Mapplet designer according to the business requirements.
  • Extracted data from different source systems such as flat files and the Symphony database.
  • Created the model to perform data validation.
  • Worked extensively on Oracle and Informatica.
  • Worked on query optimization.
  • Involved in performance tuning of scripts using pushdown optimization.
  • Identified and tracked the slowly changing dimension tables.
  • Performed Unit Testing and Integration Testing of Mappings and Workflows.
  • Migrated jobs across Development, QA and Production environments.

Environment: Informatica PowerCenter 9.1/8.6, ETL, UNIX, Oracle 11g, Windows XP.

Role: Data Steward

Project Description:

The project involves the design and development of data management processes (Data Quality and Metadata) for the Capital One Auto Finance data warehouse application covering the loan portfolio. The business objective is to migrate the existing data warehouse from the SQL Server platform to the Teradata platform while removing data inconsistencies and maintaining data quality and metadata. The new data warehouse provides an analytical and decision-making environment that supports business intelligence and reporting for end users. This integration will help build synergies across different lines of business within Capital One as an enterprise and hence reduce the cost of ownership and maintenance of the data.

Responsibilities:

  • Involved in business requirement gathering with business analysts.
  • Interacted with business users and generated reports per their requirements.
  • Worked as Data Steward for Capital One Auto Finance (COAF), managing the Data Quality and Metadata processes.
  • Worked on the creation of metadata for the entire Auto Data Warehouse (ADW), understanding the data model and defining data verification rules so the business could validate that the new ADW contains no inconsistencies and maintains data quality.
  • Analyzed PL/SQL code to understand the data models, create the metadata, resolve Data Quality defects, and monitor data movement from source systems to the ADW.
  • Understood the business requirements and performed Business Intent Testing.
  • Wrote complex queries to validate both the source (SQL Server 2005/2008) and Teradata V13 for UAT.
  • Performed business testing of 250 test cases provided by business users for Business Intent Testing, based on the different subject areas of the Auto Data Warehouse.
  • Managed and led the onshore and offshore COAF DEP3 project.
  • Analyzed the data models and performed Data Quality Testing on them.
  • Created the metadata according to the data model of the COAF DWH.
  • Interacted directly with end users for Business Intent data.
  • Published a reusable artifact, the Metadata Training Deck, which explains the metadata in detail.
  • Worked on Query optimization.
  • Extracted data from SQL Server, performed data quality checks per current business rules to remove inconsistencies, and loaded the data into the ADW (Auto Data Warehouse) on the Teradata platform.
  • Created and reviewed the high-level design document.
  • Involved in unit testing and created the test case document.
  • Provided high-level estimates and weekly status reports to the client.
  • Highly appreciated by Client Managers for completion of project within the Client timelines and budget.
  • Awarded Star Award by the Client for exceptional performance.

Environment: Oracle 9i, SQL2008/2005, Teradata V13, Windows XP.

Role: Data Steward

Project Description:

Confidential: The objective of the Level 3 Production Support project was to provide resolution for job failures caused by source data issues or required code changes. Changes were implemented directly in the production environment according to ticket severity, and change requests and maintenance requests for the code were also handled. Resolution of defects and data issues was also part of the Level 3 project. The project supports more than 300 Capital One Finance applications.

Responsibilities:

  • Worked extensively with Ab-Initio V2.15, UNIX, and Teradata utilities such as BTEQ.
  • Created/modified graphs using various components such as Merge, Rollup, Scan, Reformat, Normalize and Dedup Sorted to implement the requirements.
  • Performed requirements analysis and design, and coded Ab-Initio graphs and UNIX wrappers.
  • Worked end to end, from design through development, unit testing, system testing and delivery.
  • Wrote SQL scripts for ETL development in the Teradata environment using BTEQ, involving complex business rules and joins across multiple source and target tables.
  • Used the Enterprise Meta Environment (EME) repository, including for version control.
  • Used Control-M for scheduling.
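
The UNIX wrappers mentioned above generally follow a "run BTEQ, check the return code, log" shape. A minimal sketch under stated assumptions: all file, logon and job names are placeholders, and `BTEQ_CMD` defaults to `cat` here only so the sketch runs without a Teradata client; a real wrapper would use `bteq`.

```shell
#!/bin/sh
# Hypothetical BTEQ wrapper sketch. BTEQ_CMD defaults to cat so this runs
# without a Teradata client; a real run would export BTEQ_CMD=bteq.
BTEQ_CMD="${BTEQ_CMD:-cat}"

# Illustrative BTEQ script: placeholder logon, run a SQL file, quit on error.
cat > daily_load.btq <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.RUN FILE = daily_load.sql;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# Run the script, capture all output, and propagate a nonzero return code
# so the scheduler (Control-M/Autosys) can alert on failure.
"$BTEQ_CMD" < daily_load.btq > daily_load.log 2>&1
RC=$?
if [ "$RC" -ne 0 ]; then
    echo "daily_load failed with rc=$RC; see daily_load.log" >&2
    exit "$RC"
fi
echo "daily_load completed"
```

Propagating BTEQ's return code out of the wrapper is what lets Control-M distinguish a clean run from a failed one.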

Environment: Ab-initio V2.15, Teradata V12/V13, CTRL-M, UNIX.
