- Around 7 years of extensive experience in data migration, with expertise in Informatica PowerCenter.
- Designed and developed complex mappings and mapplets using various transformations, including reusable transformations, Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure and Normalizer.
- Involved in designing and implementing Data Mart/Data Warehouse applications using Informatica 9.x/10.x (Designer, Workflow Manager, Workflow Monitor and Repository Manager).
- Good knowledge of the SDLC (software development life cycle) and good experience with unit testing and integration testing.
- Experience in Performance tuning of targets, sources, mappings and sessions.
- Strong expertise in relational database systems such as Oracle 8i/9i/10g, SQL Server 2000/2005/2014/2017 and MS Access, including design and database development using SQL, PL/SQL and SQL*Plus.
- Experience in using SQL, PL/SQL, dynamic SQL, stored procedures/functions, triggers and packages, complex joins, correlated sub-queries, aggregate functions, analytic functions, views, materialized views, indexing and partitioning, and in performance tuning using explain plans.
- Experience with the Ralph Kimball methodology, logical modelling, physical modelling, dimensional data modelling, star and snowflake schemas, fact and dimension tables, and transforming unstructured data into well-formed XML.
- Experienced in configuring PowerExchange components (Listener, Agent, Logger and ECCR) on the mainframe server.
- Experience in extracting data from DB2 databases and VSAM files on the mainframe server using Informatica PowerExchange.
- Experienced in creating data maps for VSAM file extraction by establishing the connection from the Windows Azure server to the mainframe server.
- Experienced in creating registration groups in Informatica PowerExchange and extracting data from DB2 for CDC implementation.
- Experience with high volume datasets from various sources like Oracle, Text Files, DB2 AS400, VSAM, and Mainframe Flat Files.
- ETL experience in developing mappings and tuning existing mappings for better performance in Informatica PowerCenter and PowerExchange, per the business rules.
- SQL tuning and index creation for faster database access and better query performance in Informatica, using explain plans, SQL hints and indexes on the required columns.
- Expertise in writing DDL, DML, DCL, and TCL commands.
- Ability to achieve project goals within constraints such as scope, timing and budget; allocated work, managed resource planning, and monitored deliverables day to day to meet business clients' expectations.
- Outstanding communication and interpersonal skills, ability to learn quickly, good analytical reasoning, and quick adaptability to new technologies and tools.
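The indexing-plus-explain-plan tuning approach described above can be sketched outside Oracle/SQL Server with a minimal SQLite example; the table, column and index names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, i % 100, i * 1.5) for i in range(1000)])

# Without an index, the filter scans the whole table.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 42"
).fetchall()

# Index the filtered column, as an explain plan would suggest, then re-check the plan.
cur.execute("CREATE INDEX ix_orders_customer ON orders (customer_id)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 42"
).fetchall()

print(plan_before[-1][-1])  # e.g. a full SCAN of orders
print(plan_after[-1][-1])   # e.g. a SEARCH using ix_orders_customer
```

The same workflow (inspect plan, add index on the filtered column, confirm the plan changed) carries over to Oracle explain plans and SQL Server execution plans.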
Primary Skills: Informatica PowerCenter 9.x/10.x, Informatica PowerExchange 10.x
SDLC Methodologies: Waterfall, Agile
Operating Systems: Windows 2000/XP/7/8/10, Windows Server 2012 R2
Database: MS SQL Server 2017/2014, Oracle 11g/10g/9i, Mainframe VSAM, DB2, PL/SQL
Confidential, Plymouth, MN
Environment: MS SQL Server 2014/2016, DB2, VSAM, Oracle, Windows 2012 R2 SP1, Windows 10
- Gathered requirements through table-level and column-level mappings, learning the client's data element fields and their data structures.
- Wrote the transformation rules based on the column-level mapping discussions.
- Installation and configuration of Informatica Power Exchange and Power Center on the Conversion/mainframe servers.
- Created the staging Database with Legacy tables and structures in MS SQL Server.
- Involved in PowerExchange installation on the mainframe server and helped the client access the RUNLIB and BINLIB libraries.
- Sent the PowerExchange files to the mainframe server using the Windows installation wizard and FTPed the parameter files.
- Configured parameters in the dbmover.cfg file for the Power Exchange Listener.
- Extensively worked on data extraction of VSAM files directly from Mainframe server and through binary files as well.
- Experienced in creating data maps using COBOL copybooks for VSAM extraction.
- Experienced in creating the registration groups and extracting the data from DB2 tables from mainframe server for CDC implementation.
- Used PowerExchange to create and maintain data maps for each file and to connect PowerCenter to the mainframe.
- Used PowerExchange to source copybook definitions and then row-test the data from data files.
- Worked on EBCDIC files using PowerExchange and loaded the data to SQL Server tables.
- Extensively worked with SCD Type-I, Type-II and Type-III dimensions and data warehousing Change Data Capture (CDC).
- Worked on integrating data from Flat files like fixed width /delimited and Mainframe COBOL files.
- Extracted the Client’s VSAM, DB2, Oracle and SQL server data to our SQL server staging Database by creating data maps in Power Exchange.
- Experience in extracting VSAM binary files and VSAM data directly from the mainframe server by establishing a site-to-site connection, and DB2 data using DB2 Connect in Informatica PowerCenter.
- Development of transformation rules as per the column level mapping document using various transformation techniques in Informatica.
- Good experience in writing complex SQL queries using joins at the Informatica Source Qualifier level to improve the performance of mappings over huge data sets.
- Expert in using various transformations like Joiner, Aggregator, Expression, Router, Look up, Union in Informatica.
- Experience in performance tuning of Informatica mappings at both session and workflow level.
- Experience in creating the incremental load mappings to capture the change data in the target tables.
- Experience in designing SCD type 1, type 2 and type 3 mappings.
- Involved in data cleansing to ensure accurate data is loaded into the target.
- Validated data by writing SQL queries after the data was converted into the target tables.
- Scrambled and encrypted identified PII columns in the database to ensure data security.
- Identifying and resolving the defects raised by the clients during UAT.
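The SCD Type 2 handling mentioned above can be illustrated with a minimal Python sketch of the expire-and-insert pattern (outside Informatica); the customer records, column names and helper function are hypothetical:

```python
from datetime import date

def apply_scd2(dimension, incoming, today):
    """Type 2 logic: expire the current row on change and insert a new version."""
    for row in incoming:
        current = next((d for d in dimension
                        if d["customer_id"] == row["customer_id"] and d["is_current"]), None)
        if current is None:
            # brand-new key: insert the first version
            dimension.append({**row, "valid_from": today, "valid_to": None, "is_current": True})
        elif current["city"] != row["city"]:
            current["valid_to"] = today      # close out the old version
            current["is_current"] = False
            dimension.append({**row, "valid_from": today, "valid_to": None, "is_current": True})
        # unchanged rows are left untouched

dim = []
apply_scd2(dim, [{"customer_id": 1, "city": "Plymouth"}], date(2020, 1, 1))
apply_scd2(dim, [{"customer_id": 1, "city": "Minneapolis"}], date(2020, 6, 1))
print(len(dim))  # 2: one expired version, one current version
```

Type 1 would instead overwrite `city` in place, and Type 3 would keep the prior value in a dedicated "previous" column; in PowerCenter the same branching is typically done with Lookup, Expression and Update Strategy transformations.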
Confidential, New York, NY
Environment: MS SQL Server, Oracle 11g/10g, and LOTUS NOTES
- Worked and coordinated with Data Architects, Business Analysts & users to understand business and functional needs and implement the same into an ETL design document.
- Worked on Source Analyzer, Mapping & Mapplet Designer and Transformations, Informatica Repository Manager, Workflow Manager and Workflow Monitor.
- Developed mappings with transformations such as Filter, Joiner, Sequence Generator and Aggregator, and performed query overrides in the Lookup transformation as needed to improve mapping performance.
- Understood the data structure and loaded the data into each client's region every week after the application build releases.
- Performed data cleansing of any unstructured data.
- Wrote several complex SQL queries for validation, supporting the client's UAT testing.
- Performance tuning of Informatica mappings at mapping, session and workflow level.
- Experienced in building and using several reusable mapplets.
- Scheduling the weekly jobs and monitoring of the jobs at the session and workflow level.
- Used Error handling strategy for trapping errors in a mapping and sending errors to an error table.
- Fixing the defects/bugs raised by client’s UAT testers.
- Experience in production support and Front-end application support to the users.
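The error-handling strategy above (trap per-row errors and route them to an error table rather than fail the load) can be sketched in Python; the row fields and function name are hypothetical:

```python
def load_with_error_handling(rows, target, error_table):
    """Route rows that fail validation to an error table instead of aborting the load."""
    for row in rows:
        try:
            amount = float(row["amount"])        # conversion errors are trapped per row
            if amount < 0:
                raise ValueError("negative amount")
            target.append({"id": row["id"], "amount": amount})
        except (ValueError, KeyError) as exc:
            # capture the failing row and the reason, mirroring an ETL error table
            error_table.append({"row": row, "error": str(exc)})

target, errors = [], []
load_with_error_handling(
    [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "oops"}, {"id": 3, "amount": "-1"}],
    target, errors)
print(len(target), len(errors))  # 1 2
```

The key property is that one bad row never stops the session; the error table preserves enough context to reprocess rejected rows later.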
Confidential, Louisville, CO
Environment: MS SQL Server, Oracle 11g/10g
- Interacted with Business Analysts and Users for gathering and analyzing the Business Reports Requirements.
- Closely worked with ETL team to configure Informatica connections, source DB connection and target DB connection in DAC.
- Worked closely with the Project Manager and Data Architect. Assisted Data Architect in design by doing source data analysis, rectifying the requirement documents, creating source to target mappings.
- Worked with power center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Extensively used Informatica transformations such as Source Qualifier, Rank, SQL, Router, Filter, Lookup, Joiner, Aggregator, Normalizer and Sorter, along with their transformation properties.
- Developed ETL mappings, transformations using Informatica Power center.
- Responsible for developing the ETL logic and the data maps for loading the tables.
- Designed mappings to populate tables via one-time and incremental loads.
- Worked on Informatica tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Created and scheduled sessions and jobs to run on demand, on schedule, or only once, and monitored workflows and sessions using Workflow Monitor.
- Used Informatica reusability at various levels of development.
- Created, launched & scheduled sessions.
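The one-time versus incremental load pattern above can be sketched as a high-water-mark load in Python; the row shape, timestamps and function name are hypothetical:

```python
def incremental_load(source_rows, target, last_loaded_at):
    """Load only rows changed since the previous run (high-water-mark incremental load)."""
    new_rows = [r for r in source_rows if r["updated_at"] > last_loaded_at]
    target.extend(new_rows)
    # advance the watermark for the next scheduled run
    return max((r["updated_at"] for r in new_rows), default=last_loaded_at)

source = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 25}, {"id": 3, "updated_at": 40}]
target = []
watermark = incremental_load(source, target, last_loaded_at=0)          # one-time (full) load
source.append({"id": 4, "updated_at": 55})
watermark = incremental_load(source, target, last_loaded_at=watermark)  # picks up only id 4
print(len(target), watermark)  # 4 55
```

In practice the watermark is persisted between runs (e.g. in a control table or mapping variable) so each scheduled session resumes where the last one stopped.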
Environment: MS SQL Server, Oracle 11g/10g
- Gathered requirements, then designed and developed ETL solutions as required.
- Understanding the requirement and Data model to develop the mappings accordingly.
- Created high-level design and low-level design documents.
- Designed and developed Informatica mappings from scratch to load data from the source system into the staging and warehouse systems.
- Developed many complex Full/Incremental Informatica Objects (Workflows / Sessions, Mappings/ Mapplets) with various transformations.
- Wrote ETL code and SQL queries to check the desired results.
- Delivering the code on time as per schedule.
- Provided technical leadership to the team in order to produce system designs for the framework and code that are scalable, robust, reusable and flexible.
- Involved with the Data Modelling team and provided suggestions in creating the data model.
- Provided inputs to the Project Manager for project planning, including robust estimates for development tasks.
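The "SQL queries to check the desired results" step above amounts to reconciling the warehouse against the source after a load. A minimal sketch using SQLite (table names and figures are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE wh_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", [(i, i * 2.0) for i in range(5)])
cur.executemany("INSERT INTO wh_orders VALUES (?, ?)", [(i, i * 2.0) for i in range(5)])

# Reconcile row counts and amount totals between source and warehouse after the load.
src_count, src_total = cur.execute("SELECT COUNT(*), SUM(amount) FROM src_orders").fetchone()
wh_count, wh_total = cur.execute("SELECT COUNT(*), SUM(amount) FROM wh_orders").fetchone()

assert src_count == wh_count, "row counts diverge"
assert src_total == wh_total, "amount totals diverge"
print("counts and totals match:", src_count, src_total)
```

Count and sum reconciliation is the cheapest check; column-level comparisons (e.g. a MINUS/EXCEPT between source and target projections) catch value-level drift the totals miss.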