- Over 8 years of progressive hands-on experience in analysis, ETL processes, and the design and development of enterprise-level data warehouse architectures, including designing, coding, testing, and integrating ETL solutions.
- Experience in dimensional data modelling techniques, Slowly Changing Dimensions (SCD), the Software Development Life Cycle (SDLC) (requirement analysis, design, development, and testing), and data warehouse concepts: Star Schema/Snowflake modelling, fact and dimension tables, and physical and logical data modelling.
- Experienced in integrating various data sources such as Oracle 11g/10g/9i/8i, MS SQL Server, XML files, Teradata, Netezza, Sybase, DB2, flat files, Salesforce, and mainframe sources into staging areas and different target databases.
- Expertise in the Informatica ETL tool, with extensive experience in Power Center client tools including Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Extensively worked with complex mappings using various transformations such as Filter, Joiner, Router, Source Qualifier, Expression, Union, Unconnected/Connected Lookup, Aggregator, Stored Procedure, XML Parser, Normalizer, Sequence Generator, Update Strategy, Reusable Transformations, and User Defined Functions.
- Good knowledge of and hands-on experience in OBIEE 10g, and in OBIEE 11g integrated with BI Publisher.
- Extensively worked on relational database systems such as Oracle 11g/10g/9i/8i, MS SQL Server, and Teradata, and source files such as flat files, XML files, and COBOL files.
- Excellent background in implementing business applications and in using RDBMS and OOP concepts.
- Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring.
- Experienced in mapping techniques for Type 1, Type 2, and Type 3 Slowly Changing Dimensions.
- Strong experience in SQL, PL/SQL, Tables, Database Links, Materialized Views, Synonyms, Sequences, Stored Procedures, Functions, Packages, Triggers, Joins, Unions, Cursors, Collections, and Indexes in Oracle
- Sound knowledge of Linux/UNIX and shell scripting; experience with command-line utilities such as pmcmd to execute workflows in non-Windows environments.
- Hands-on experience with tools such as Address Doctor, which is used for address validation.
- Proficiency in working with Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad, Teradata Administrator, SQL Assistant, PMON, Visual Explain).
- Implemented change data capture (Confidential) using Informatica PowerExchange to load data from the Clarity DB to the Teradata warehouse.
- Experience in integrating various data sources with multiple relational databases such as DB2, Oracle, SQL Server, MS Access, and Teradata, as well as flat files, XML files, and other sources such as Salesforce.
- Extensive experience using database tools such as SQL*Plus, SQL Developer, Autosys, and TOAD.
- Proficient in Oracle Tools and Utilities such as TOAD and SQL*Loader.
- Identified and fixed bottlenecks and tuned the complex Informatica mappings for better Performance.
- Excellent analytical, problem solving, technical, project management, training, and presentation skills.
ETL Tools: Informatica Power Center 10.1/9.6.1/9.5/9.1/8.6, Informatica Cloud, Informatica PowerExchange 5.1/4.7/1.7, Power Analyzer 3.5, Informatica PowerConnect, Master Data Management (MDM), Informatica Data Quality (IDQ), Informatica Data Services (IDS) 9.6.1, DataStage
Databases: Oracle 12c/11g/10g/9i/8i/8.0, Teradata 14.1, DB2 UDB 8.1, MS SQL Server 2008/2005, SQL Server Management Studio (2012), Netezza 4.0, and Sybase ASE 12.5.3/15 (DB Artisan)
Operating Systems: UNIX (Sun-Solaris, HP-UX), Linux, Windows NT/XP/Vista, MSDOS
Languages: SQL, SQL*Plus, PL/SQL, T-SQL, Perl Scripting, UNIX Shell Scripting
Data Modelling: Dimensional Data Modelling, Star Schema Modelling, Snowflake Modelling, FACT, Dimensions, Relational Modelling, Physical and Logical Data Modelling, and ER Diagrams.
Tools: Erwin, Tortoise SVN, CA Scheduling Tool, ESP, Tidal, Tivoli Job Scheduler.
Other Tools: SQL Navigator, SQL for DB2, Quest Toad for Oracle, Toad for Data Analyst, SQL Developer, Autosys, Telnet, MS SharePoint, MS Excel, MS Access, Mercury Quality Center, JIRA, SSIS
Methodologies: Agile, Waterfall.
Confidential, Jersey City, NJ
Sr. Informatica Data Quality (IDQ) Developer
- Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
- Developed ETL programs using Informatica to implement the business requirements.
- Designed reference data and data quality rules using IDQ and was involved in cleansing the data in the Informatica Data Quality 9.1 environment.
- Created mappings in Informatica Data Quality (IDQ) using Parser, Standardizer and Address Validator Transformations.
- Developed several complex IDQ mappings using a variety of Power Center transformations, Mapping Parameters, Mapping Variables, Mapplets, and Parameter files in Mapping Designer using Informatica Power Center.
- Profiled the data using Informatica Data Explorer (IDE) and performed Proof of Concept for Informatica Data Quality (IDQ).
- Designed, documented, and configured the Informatica MDM Hub to support loading and cleansing of data.
- Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM, and maintained the master data using Informatica MDM.
- Used Informatica Power Center to load data from different data sources such as XML, flat files, Oracle, Teradata, and Salesforce.
- Imported the IDQ address-standardization mappings into Informatica Designer as Mapplets.
- Used relational SQL wherever possible to minimize the data transfer over the network.
- Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
- Integrated Address Doctor with the Address Validator transformation to cleanse addresses.
- Exported the Mapplets from IDQ into Informatica Power Center for use in various mappings implementing Address Doctor.
- Wrote UNIX shell scripts for Informatica pre-session and post-session tasks, and Autosys scripts for scheduling the jobs (workflows).
- Involved in creating UNIX shell scripts for DataStage job and Informatica workflow execution.
- Actively involved in the exception handling process, using the IDQ Exception transformation after loading the data into MDM and notifying the Data Stewards of all exceptions.
- Imported the mappings developed in data quality (IDQ) to Informatica designer.
- Worked on Confidential (Change Data Capture) to implement SCD (Slowly Changing Dimensions) Type 1 and Type 2.
- Used Autosys for scheduling the Informatica workflows and performed testing using the Autosys scheduling tool.
- Converted all the jobs scheduled in Maestro to the Autosys scheduler as per the requirements.
- Provided knowledge transfer to the end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings.
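The parameter files mentioned above group mapping variables and connection overrides per session. A minimal sketch of one such file follows; the folder, workflow, session, connection, and path names are hypothetical placeholders, not values from any actual project:

```
[Global]
$$LOAD_DATE=2016-01-01
[DWH_FOLDER.WF:wf_daily_load.ST:s_m_load_customers]
$DBConnection_Src=Oracle_Src_Conn
$DBConnection_Tgt=Teradata_Tgt_Conn
$InputFile_Cust=/data/inbound/customers.dat
```

The `[Global]` section applies to every session, while a `[folder.WF:workflow.ST:session]` heading scopes values to one session; `$$` names are mapping parameters/variables and `$DBConnection...` names are relational connection variables.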
Environment: Informatica Power Center 10.1/9.6.1, Data Quality 9.6.1, UNIX, SQL, IDE, Confidential, MDM, Linux, Perl, PL/SQL, Netezza, Teradata, Oracle 11g/10g, Microsoft SQL Server 2008, and Microsoft Visual Studio.
Confidential, Dallas, TX
Sr. Informatica / MDM Developer
- Developed the ETL components as well as Oracle procedures, functions, and triggers.
- Defined trust and validation rules for the base tables and created PL/SQL procedures to load data from source tables to staging tables.
- Created Oracle PL/SQL Cursors, Triggers, Functions and Packages
- Created, executed, and managed ETL processes using Oracle Data Integrator (ODI) and customized ODI Knowledge Modules such as Loading Knowledge Modules and Integration Knowledge Modules.
- Involved in implementing the Land Process of loading the customer Data Set into Informatica MDM from various source systems.
- Involved in installing and configuring the Informatica MDM Hub Console, Hub Store, Cleanse and Match Server, Address Doctor, and Informatica Power Center applications.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Performed land process to load data into landing tables of MDM Hub using external batch processing for initial data load in hub store.
- Worked with IDD (Informatica Data Director), a data governance application for the Informatica MDM Hub that enables business users to effectively create, manage, consume, and monitor master data.
- Experienced in creating IDQ mappings using Labeler, Standardizer, Address Validator transformations with Informatica Developer and migrated to Informatica Power Center.
- Worked on IDQ parsing, IDQ standardization, matching, and IDQ web services.
- Imported the mappings developed in data quality (IDQ) to Informatica designer.
- Worked on the Informatica Analyst tool (IDQ) to generate scorecard reports for data issues.
- Responsible for using Data Integration Hub (DIH) to create topics and applications that publish and subscribe data.
- Designed and developed SSIS packages using various Control Flow and Data Flow items to transform and load data from various databases.
- Configured checkpoints, package logging, error logging, and event handling in SSIS to redirect error rows and fix errors.
- Worked on Teradata Utilities like Fast-Load, Multi-Load & Fast-Export.
- Created Teradata external loader connections such as MLoad Upsert, MLoad Update, and FastLoad while loading data into target tables in the Teradata database.
- Created scripts in Teradata to load data in multiple layers; designed and developed FastLoad for the stage load and MultiLoad for the OLAP load, and worked with TPT for the reporting table load.
- Extensively tested the Address Doctor files and updated them with the new monthly release files from the Informatica Address Doctor site.
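As an illustration of the loader work described above, a minimal FastLoad control script for a staging load might look like the following sketch. The server, credentials, table, column, and file names are hypothetical, and the script only runs against a live Teradata system:

```
LOGON tdprod/etl_user,password;
DATABASE stg;

SET RECORD VARTEXT "|";

DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(60))
FILE = /data/inbound/customers.dat;

BEGIN LOADING stg.customer_stg
  ERRORFILES stg.customer_err1, stg.customer_err2;

INSERT INTO stg.customer_stg (cust_id, cust_name)
VALUES (:cust_id, :cust_name);

END LOADING;
LOGOFF;
```

FastLoad requires an empty target table and two error tables; rejected rows land in the error tables for later inspection, which pairs naturally with the exception handling work described in this resume.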
Environment: Informatica Power Center 9.6.1, MDM, UNIX, Oracle, Linux, Perl, Shell, IDQ, IDS, PL/SQL, Tivoli, Oracle 11g/10g, Teradata 14.0.
Confidential, Richardson, TX
ETL / Teradata Developer
- Documented high- and low-level design specifications for source-to-target mapping, based on the transformation rules.
- Documented technical requirements for the ETL process and design documents for each source; designed, developed, and supported the extraction, transformation, and load (ETL) process for data migration.
- Uploaded data from the operational source system (Oracle 8i) to Teradata.
- Used the Teradata utilities FastLoad, MultiLoad, FastExport, and TPump, and created batch jobs using BTEQ.
- Worked in Teradata SQL Assistant, querying the source/target tables to validate the BTEQ scripts, and imported metadata from Teradata tables.
- Wrote Teradata BTEQ scripts as well as Informatica mappings using TPT to load data from staging to base.
- Fine-tuned Teradata BTEQ scripts as necessary using explain plans and by collecting statistics.
- Very good knowledge of the FACETS tool and the healthcare domain; worked on various modules such as Subscriber/Member, Groups, Enrollment, Claims, Billing, Accounting, Provider, MTM, and Utilization Management.
- Good experience with FACETS CTP (Claims Test Pro) and FACETS testing.
- Used IDQ's standardized plans for addresses and names clean ups.
- Worked on IDQ file configuration Confidential user's machines and resolved the issues, and used IDQ to complete initial data profiling and remove duplicate data.
- Designed reference data and data quality rules using IDQ and was involved in cleansing the data in the Informatica Data Quality (IDQ) environment.
- Created and used the Normalizer Transformation to normalize the flat files in the source data.
- Worked on Maestro job scheduling and UNIX scripting.
- Developed UNIX shell scripts using the pmcmd utility to start and stop sessions and batches and to schedule workflows.
- Involved in migrating the ETL Code to different environments from Dev to UAT and then to Production with ETL Admins.
- Experience working with the reporting team in building the collection layer for reporting purposes.
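The pmcmd-driven shell scripts mentioned above can be sketched roughly as follows. The `INFA_*` and `REPO_*` variables are hypothetical placeholders, and a `DRY_RUN` switch is included so the generated commands can be previewed without a live Informatica server:

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper; INFA_* and REPO_* values are placeholders.
PMCMD=${PMCMD:-pmcmd}
DRY_RUN=${DRY_RUN:-0}

run_pmcmd() {
    # In dry-run mode, print the command instead of executing it.
    if [ "$DRY_RUN" = "1" ]; then
        echo "$PMCMD $*"
    else
        "$PMCMD" "$@"
    fi
}

# Start a workflow and wait for it to finish: start_workflow <folder> <workflow>
start_workflow() {
    run_pmcmd startworkflow -sv "$INFA_IS" -d "$INFA_DOMAIN" \
        -u "$REPO_USER" -p "$REPO_PASS" -f "$1" -wait "$2"
}

# Stop a running workflow: stop_workflow <folder> <workflow>
stop_workflow() {
    run_pmcmd stopworkflow -sv "$INFA_IS" -d "$INFA_DOMAIN" \
        -u "$REPO_USER" -p "$REPO_PASS" -f "$1" "$2"
}
```

A scheduler job would then call, for example, `start_workflow DWH_FOLDER wf_daily_load`; using `-wait` makes the scheduler see the workflow's real exit status.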
Environment: Informatica Power Center 9.6.1/9.5.1, IDQ, Oracle 11g/10g, MySQL, Teradata 13.10/12, flat files, UNIX, Windows.
Confidential, Kansas City, KS
ETL / Informatica Developer
- Designed the dimensional model and data load process using SCD Type 2 for quarterly membership reporting.
- Derived the dimensions and facts for the given data and loaded them on a regular interval as per the business requirement.
- Extracted data from multiple sources such as Oracle, XML, and Flat Files and loaded the transformed data into targets in Oracle, Flat Files.
- Wrote Shell Scripts for Data loading and DDL Scripts.
- Designed and coded the automated balancing process for the feeds that go out from the data warehouse.
- Implemented the automated balancing and control process, enabling audit, balance, and control for the ETL code.
- Improved database access performance by tuning DB access methods: creating partitions, using SQL hints, and using proper indexes.
- Integrated all the jobs using complex mappings, including Mapplets and workflows, with Informatica Power Center Designer and Workflow Manager.
- Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets that provide reusability in mappings.
- Analyzed the impact and the changes required to incorporate the standards into the existing data warehousing design.
- Followed the PDLC process to move code across environments through proper approvals and source-control environments; performed source control using SCM.
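A balancing check like the one described above can be sketched as a small shell function that compares an outbound feed's row count against the count the load reported. The feed file and the loaded-count argument are illustrative assumptions:

```shell
#!/bin/sh
# Sketch of an audit/balance check: compare the row count of the outbound
# feed file with the row count reported by the load process.
balance_check() {   # usage: balance_check <feed_file> <loaded_count>
    feed_count=$(wc -l < "$1")
    if [ "$feed_count" -eq "$2" ]; then
        echo "BALANCED: $feed_count rows"
        return 0
    else
        echo "OUT OF BALANCE: feed=$feed_count loaded=$2" >&2
        return 1
    fi
}
```

Returning a non-zero status on a mismatch lets the scheduler fail the job and hold the feed, which is the control behaviour the bullets above describe.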
Environment: Informatica Power Center 9.1/8.5, Power Exchange, UNIX, Oracle 10g, SQL Server 2008, SQL Assistant, DB2.
ETL / DWH Developer
- Involved in analysis, design, development, test data preparation, unit and integration testing, and preparation of test cases and test results.
- Coordinated with the client, business, and ETL teams on development.
- Developed batch jobs and extraction programs using COBOL, JCL, VSAM datasets, and FTP to load Informatica tables.
- Involved in full project life cycle - from analysis to production implementation and support with emphasis on identifying the source and source data validation, developing logic, and transformation as per the requirement and creating mappings and loading the data into BI database.
- Based on the business requirements created Functional design documents and Technical design specification documents for ETL Process.
- Developing code to extract, transform, and load (ETL) data from inbound flat files and various databases into various outbound files using complex business logic.
- Used most of the common transformations, such as Source Qualifier, Aggregator, Filter, Expression, Unconnected and Connected Lookups, and Update Strategy.
- Created automated shell scripts to transfer files among servers using FTP, SFTP protocols and download files.
- Expertise in creating control files to define job dependencies and for scheduling using Informatica.
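The automated file-transfer scripts mentioned above can be sketched as follows: a helper builds an sftp batch file locally, which is then run non-interactively. The host, user, and path names in the usage comment are hypothetical:

```shell
#!/bin/sh
# Sketch: build an sftp batch file to push extract files downstream.
# make_sftp_batch <batchfile> <remote_dir> <file...>
make_sftp_batch() {
    batch=$1; remote=$2; shift 2
    : > "$batch"                      # truncate/create the batch file
    echo "cd $remote" >> "$batch"     # change to the target directory
    for f in "$@"; do
        echo "put $f" >> "$batch"     # queue one upload per file
    done
    echo "bye" >> "$batch"            # close the session
}

# The batch file would then be run non-interactively, for example:
#   sftp -b nightly.batch etluser@downstream-host
```

Driving sftp from a generated batch file keeps the transfer scriptable and lets the same helper serve many feeds.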
Environment: Informatica Power Center 8.5/8.1, ETL, Business Objects, Oracle 10g/9i/8i, PL/SQL.