- Over 7 years of experience as an Informatica/ETL expert in data warehouse development, testing, deployment, maintenance, and production support of Data Warehousing/Business Intelligence ETL solutions.
- Proficient in using Informatica Power Center (9.6x/9.5/8.6) for developing data warehouse loads, with work experience focused on data integration per client requirements.
- Highly proficient in integrating data across multiple databases including Teradata, Oracle, MySQL, SQL Server, DB2, and Mainframe, as well as flat files (delimited and fixed-width).
- Strong in database programming using PL/SQL and SQL with creation of packages, stored procedures, functions, triggers, materialized views, cursors and performance tuning of SQL.
- Experience in data analysis, data validation, data cleansing, data verification, and identifying data mismatches. Experienced in writing test cases, test scripts, and test plans, and in executing test cases, reporting, and documenting test results.
- Experience in UNIX environment, file transfers and job scheduling.
- Experience in working with sales operations team for analyzing and optimizing data from Third party vendors.
- Expertise in Informatica Data Quality (IDQ) tool installation, as well as standardization, address validation, and deduplication.
- Extensive experience in Data warehousing tools Informatica Power Center using different modules like Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Expertise in installing and configuring Informatica MDM Hub Console, Hub Store, Cleanse and Match Server, Address Doctor, and Informatica Power Center applications.
- Extensively worked on integrating various data sources - Oracle, SQL Server, DB2, Teradata, Netezza, Sybase, XML files and Flat files.
- Experience in UNIX shell scripting, Perl scripting and automation of ETL Processes.
- Worked in production support, resolving the production job failures, interacting with the operations support group for resuming the failed jobs.
- Experience in Dimensional Modeling (Erwin), Star/Snowflake schemas, fact/dimension tables, business process analysis, production support, data cleansing, data analysis, and performance tuning of sources, targets, mappings, sessions, and SQL.
- Strong in scheduling ETL loads using utilities like Control-M, Tivoli, and Autosys.
- Strong in implementing data profiling, creating scorecards, creating tables, and documenting data quality metrics/dimensions like accuracy, completeness, duplication, validity, and consistency.
- Developed complex Mappings in Informatica using various transformations like Source Qualifier, Joiner, Aggregator, Update Strategy, Rank, Router, Java, Lookup - Connected & Unconnected, Sequence Generator, Filter, Sorter, Stored Procedure transformation etc.
- Proficient in data warehousing techniques for Slowly Changing Dimensions (SCD), surrogate key assignment, normalization and de-normalization, cleansing, and performance optimization, along with Change Data Capture (CDC).
- Experienced in working with both Waterfall and Agile methodologies.
- Good experience in performing and supporting Unit testing, System Integration testing (SIT), UAT.
- Experience in offshore and onsite coordination.
- Able to work independently and collaborate proactively & cross functionally within a team.
- Good team player with ability to solve problems, organize, prioritize multiple tasks, with excellent analytical, communication and presentation skills.
ETL Tools: Informatica Power Center, Informatica Power Exchange, Informatica Data Quality, Informatica Data Director, Informatica Metadata Manager, Informatica B2B
Databases: Oracle, SQL Server, DB2, MySQL, MS Access
Data Modeling: Erwin, Teradata
Reporting Tools: OBIEE, Business Objects
Operating Systems: LINUX, Sun Solaris, Windows, IBM AIX
Languages: SQL, PL/SQL, T-SQL, Shell Scripting
Scheduling Tools: Autosys, Control-M
Confidential, Quincy, MA
- Responsible for Business Analysis and Requirements Collection.
- Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Parsed high-level design specification to simple ETL coding and mapping standards.
- Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
- Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
- Created mapping documents to outline data flow from sources to targets.
- Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
- Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
- Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Developed mapping parameters and variables to support SQL override.
- Created mapplets to use them in different mappings.
- Developed mappings to load into staging tables and then to Dimensions and Facts.
- Used existing ETL standards to develop these mappings.
- Worked on different tasks in workflows like Session, Event-Raise, Event-Wait, Decision, E-mail, Command, Worklet, Assignment, and Timer, and on scheduling of workflows.
- Created sessions and configured workflows to extract data from various sources, transform the data, and load it into the data warehouse.
- Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Extensively used SQL*Loader to load data from flat files into database tables in Oracle.
- Modified existing mappings for enhancements of new business requirements.
- Used Debugger to test the mappings and fixed the bugs.
- Wrote UNIX shell scripts and pmcmd commands for FTP of files from a remote server and for backup of the repository and folders.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
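The pmcmd-based job control described above can be sketched as a minimal shell wrapper. All names (domain, integration service, folder, workflow) are hypothetical placeholders, and the command is echoed as a dry run rather than executed:

```shell
#!/bin/sh
# Hypothetical names -- a real wrapper would read these from the environment.
DOMAIN="Domain_ETL"          # Informatica domain
INT_SVC="IS_ETL"             # integration service
FOLDER="DW_LOADS"            # repository folder
WORKFLOW="wf_daily_load"     # workflow to start

# Build the pmcmd invocation; -wait blocks until the workflow completes.
CMD="pmcmd startworkflow -sv $INT_SVC -d $DOMAIN -f $FOLDER -wait $WORKFLOW"
echo "$CMD"

# In a real script, the exit status of pmcmd (0 = success) would drive
# job-control logic, e.g.:
#   if ! $CMD; then echo "workflow failed" >&2; exit 1; fi
```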
Environment: Informatica Power Center 8.6.1, Workflow Manager, Workflow Monitor, Informatica Power Connect / Power Exchange, Data Analyzer 8.1, PL/SQL, Oracle 10g/9i, Erwin, Autosys, SQL Server 2005, Sybase, UNIX AIX, Toad 9.0, Cognos 8.
Confidential, Minneapolis, MN
- Coordinated with Business Users for requirement gathering, business analysis to understand the business requirement and to prepare Technical Specification documents (TSD) to code ETL Mappings for new requirement changes.
- Developed strategies for implementing data profiling, data quality, data cleansing, and ETL metadata.
- Analyzed sources, requirements, and the existing OLTP system, and identified the required dimensions and facts from the database.
- Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, Load) processes using Informatica Power Center.
- Set up batches and sessions to schedule the loads at required frequency using Power Center Workflow manager and accessing Mainframe DB2 systems.
- Developed various Mappings, Mapplets, and Transformations for data marts and Data warehouse.
- Involved in creation of Data Warehouse database (Physical Model, Logical Model) using Erwin data modeling tool.
- Worked on OBIEE Answers to create the reports as per the client requirements and integrated them into the Dashboards.
- Extensively worked on Autosys to schedule the jobs for loading data.
- Worked on Power Exchange for change data capture (CDC).
- Executed DML, DDL, and DCL commands as SQL queries.
- Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for pre- and post-session management and performance.
- Designed and Developed Oracle PL/SQL and UNIX Shell Scripts, Data Import/Export.
- Performed analysis and code development using Agile methodology.
- Used mapping parameters and variables for pulling incremental loads from sources.
- Identified and fixed bottlenecks and tuned the mappings and sessions to improve performance. Tuned both the ETL processes and the databases.
- Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
- Involved in the design, development and implementation of the Enterprise Data Warehousing (EDW) process.
- Provided data warehouse expertise including data modeling, Extract, Transform and Load (ETL) analysis, design and development.
- Hands-on Experience in working with Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets to extract, transform and load data.
- Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
- Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
- Worked with various Active and Passive transformations like Source Qualifier, Sorter, Aggregator, Filter, Union, and Router Transformations, Sequence Generator and Update Strategy Transformations.
- Handled versioning and dependencies in Informatica.
- Developed schedules to automate the update processes and Informatica sessions and batches.
- Resolving technical and design issues.
- Developed data transformation processes, maintain and update loading processes.
- Developed and implemented the UNIX shell scripts for the start and stop procedures of the sessions.
- Used UNIX shell scripts to run the batches.
- Developed standards and procedures to support quality development and testing of data warehouse processes.
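Incremental loads driven by mapping parameters, as described above, typically rely on a parameter file generated before each run. A minimal sketch follows; the folder, workflow, session, and parameter names are hypothetical, and the last-run timestamp would normally come from a control table rather than a literal:

```shell
#!/bin/sh
# Hypothetical last-extract watermark; a real script would query a control
# table for the previous successful run's timestamp.
LAST_RUN="2015-06-01 00:00:00"
PARAM_FILE="wf_incr_load.par"

# Write the Informatica parameter file. The [folder.WF:workflow.ST:session]
# header scopes the $$ mapping parameter to one session.
printf '[DW_LOADS.WF:wf_incr_load.ST:s_m_incr_load]\n$$LAST_EXTRACT_DATE=%s\n' \
    "$LAST_RUN" > "$PARAM_FILE"

cat "$PARAM_FILE"
```

The mapping's source qualifier SQL override would then filter on `$$LAST_EXTRACT_DATE` to pull only rows changed since the previous load.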
Environment: Informatica Power Center, Oracle, MS SQL Server, UNIX(Sun Solaris/AIX), Data Marts, Erwin Data Modeler, Agile Methodology, Teradata, FTP, MS-Excel, Ms-Access, UNIX Shell Scripting, Data Modeling, PL/SQL, Autosys
Confidential, Cleveland, OH
- Used update strategy to effectively migrate data from source to target.
- Designed ETL Process using Informatica to load data from Oracle, Flat Files to target Oracle Data Warehouse.
- Interacted with the business community and database administrators to identify the business requirements and data realities.
- Created various Transformations as per the business logic like Source Qualifier, Normalizer, Lookup, Stored Procedure, Sequence Generator, Router, Filter, Aggregator, Joiner, Expression and Update Strategy.
- Wrote stored procedures, functions, Packages and used in many Forms and Reports.
- Wrote database triggers for automatically updating tables and views.
- Involved in the Performance Improvement project.
- Involved in designing of testing plan (Unit Testing and System Testing).
- Tested scripts by running workflows and assisted in debugging the failed sessions.
- Improved workflow performance by shifting filters as close as possible to the source and selecting tables with fewer rows as the master during joins.
- Used persistent caches whenever data from workflows were to be retained.
- Used Connected and Unconnected lookups whenever appropriate, along with the use of appropriate caches.
- Performed maintenance, including managing space, removing bad files, removing cache files, and monitoring services.
- Set up Permissions for Groups and Users in all Development Environments.
- Involved in team meetings and provided status reports to the project manager.
Environment: Informatica Power Center, Oracle, Flat Files, OBIEE, PL/SQL, UNIX, Autosys, ERWIN, TOAD, UNIX Shell Scripting.
Confidential, San Antonio, TX
- Worked with business analyst to understand business requirement in order to transform business requirements into effective technology solutions. Also involved in robust teamwork with Managers, Architects and Data Modelers to understand the business process and functional requirements.
- Created high-level, low-level design documents, ETL specification documents, data model document and test document.
- To implement business rules, created robust mappings, mapplets, and reusable transformations using Informatica Power Center and its different transformations, including Joiner, Lookup, Rank, Filter, Router, Expression, Aggregator, Sequence Generator, Sorter, and Update Strategy.
- Improved and enhanced various jobs across different cycles by creating reusable and common job techniques, and extensively used run-time parameters to pass in job-related values.
- Worked with connected and unconnected look-up and configured the same for implementing complex logic.
- Worked on various Salesforce.com objects like Accounts, Contacts, Cases, Opportunity.
- Worked on performance tuning by identifying the bottlenecks in Sources, Targets, and Mapping. Enhanced Performance on Informatica sessions using large data files by using partitions.
- Actively participated in providing technical proposals for upgrading existing ETL and OBIEE code at client locations (in order to make use of advanced features of newer Informatica versions).
- Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.
- Created UNIX shell scripts for ETL jobs, session log cleanup, and dynamic parameters, and maintained shell scripts for data conversion.
- Managed the integrity of jobs by checking code in and out of the StarTeam version control tool.
- Maintaining daily batch cycle and providing 24-hour Production support.
- Developed Business Objects reports, such as universes and Crystal Reports, for business validation.
- Worked with the offshore team to oversee development activity and reviewed code to make sure it conformed to the standard programming practices at Confidential.
- Good working experience with Tableau dashboards and extensive use of Tableau features like data blending, extracts, parameters, filters, calculations, context filters, hierarchies, actions, and maps.
- Built a number of UNIX shell scripts for PL/SQL programs and scheduled them on Control-M.
Environment: Informatica PowerCenter, PowerExchange, Oracle, Teradata, SQL/PLSQL, OBIEE, Salesforce, SQL, UNIX/LINUX, Control M, Putty, Rally.