- 6 years of IT experience in planning, analysis, and data warehousing, combined with data analysis, data modeling, business requirements analysis, and business intelligence across domains including Insurance, Healthcare, Financial, and Retail, with a strong conceptual background in database development and data warehousing.
- Around 5 years of ETL experience developing data warehousing and real-time environments using the Informatica tool suite (Designer, Workflow Manager, Workflow Monitor, Repository Manager), providing solutions for domains such as Insurance, Healthcare, Financial, and Retail.
- Strong experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI, and client/server applications.
- Highly proficient in the development, implementation, administration, and support of ETL processes for large-scale data warehouses using Informatica PowerCenter 10.2.0/9.6/9.5/9.1/9.0/8.x.
- Involved in all phases of the SDLC (Software Development Life Cycle), from analysis and design through development, testing, implementation, and maintenance, with timely delivery against aggressive deadlines.
- Proficient in the integration of various data sources with multiple relational databases such as Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata, and Flat Files into the staging area, ODS, Data Warehouse, and Data Mart.
- Superior SQL skills with the ability to write and interpret complex SQL statements; also skilled in SQL optimization, ETL debugging, and performance tuning.
- Slowly Changing Dimension management, including Type 1, 2, 3, and hybrid Type 3, as well as de-normalization, cleansing, conversion, aggregation, and performance optimization.
- Extensive experience in developing Informatica mappings/mapplets using various transformations for extraction, transformation, and loading of data from multiple sources to the Data Warehouse, and in creating workflows with worklets and tasks and scheduling the workflows.
- Expertise in using Informatica client tools - Designer, Repository Manager, Repository Server, Administration Console, Mapplets, Mappings, Workflow Manager, Workflow Monitor, re-usable transformations, shortcuts, and import and export utilities.
- Experience in Data Warehouse development working with Extraction/Transformation/Loading using Informatica Power Mart/Power Center with flat files, Oracle, SQL Server.
- Experience working on data quality tools such as Informatica IDQ 9.1.
- Experience in Relational Modeling and Dimensional Data Modeling using Star and Snowflake schemas, denormalization, normalization, and aggregations.
- Experience working in multi-terabytes data warehouse using Databases like Oracle 11g/10g/9i, MS Access 2000/2002, XML, IBM UDB DB2 8.2, SQL Server 2008, MS Excel and Flat files.
- Very strong in SQL and PL/SQL, extensive hands on experience in creation of database tables, triggers, sequences, functions, procedures, packages, and SQL performance-tuning.
- Good understanding of ETL/Informatica standards and best practices, Slowly Changing Dimensions (SCD1, SCD2, and SCD3), and Enterprise Data Warehouse concepts.
- Experience in test coordination, writing test cases, executing test scripts, and logging defects in Quality Center (QC).
- Worked closely with ETL developers and other leads during development and support of BI applications.
- Experience with Data Extraction, Transformation, and Loading (ETL) from disparate data sources.
- Created tables, packages, mappings, batch jobs, roles, and users in Informatica MDM Hub.
- Worked extensively in various kinds of queries such as Sub-Queries, Correlated Sub-Queries, and Union Queries for Query tuning.
- Extensively worked on data migration, data cleansing, and data staging of operational sources using ETL processes, and provided data mining features for data warehouses.
- Hands-on experience using query tools such as TOAD, PL/SQL Developer, Teradata SQL Assistant, and Queryman.
- Developed UNIX scripts for dynamic generation of Files & for FTP/SFTP transmission.
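The SCD Type 2 handling noted above follows a standard expire-and-insert pattern. A minimal sketch is below; the table, columns, and data are hypothetical, and sqlite3 stands in for the warehouse database:

```python
import sqlite3

# SCD Type 2 sketch: when a tracked attribute changes, expire the current
# dimension row and insert a new current version with its own surrogate key.
# Table and column names are hypothetical; sqlite3 stands in for Oracle/Teradata.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        sk INTEGER PRIMARY KEY AUTOINCREMENT,
        customer_id TEXT,          -- natural key
        city TEXT,                 -- tracked attribute
        eff_date TEXT,
        end_date TEXT,             -- NULL while the row is current
        is_current INTEGER
    )
""")

def apply_scd2(customer_id, city, load_date):
    cur = conn.execute(
        "SELECT sk, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,))
    row = cur.fetchone()
    if row and row[1] == city:
        return                      # no change: nothing to do
    if row:                         # attribute changed: expire old version
        conn.execute(
            "UPDATE dim_customer SET end_date = ?, is_current = 0 "
            "WHERE sk = ?", (load_date, row[0]))
    conn.execute(                   # insert the new current version
        "INSERT INTO dim_customer "
        "(customer_id, city, eff_date, end_date, is_current) "
        "VALUES (?, ?, ?, NULL, 1)", (customer_id, city, load_date))

apply_scd2("C001", "Boston", "2020-01-01")
apply_scd2("C001", "Chicago", "2020-06-01")   # change -> new version
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY sk").fetchall()
# rows -> [('Boston', 0), ('Chicago', 1)]
```

In an Informatica mapping, the same branching is typically done with a Lookup on the dimension, an Expression comparing attributes, and an Update Strategy routing rows to insert or update.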
Operating System: UNIX, Windows, MS-DOS
Language/Tools: SQL, PL/SQL, TSQL, C, C++
Scheduling Tools: Autosys, Control-M, Informatica Scheduler
ETL Tools: Informatica MDM, Informatica PowerCenter 10.x/9.x/8.x, Informatica Cloud, SSIS
Database: MS SQL Server, Oracle 8i/9i/10g/11g/12c, IBM DB2, Netezza, Teradata, PostgreSQL, Redshift
Scripting: Shell Scripting, Python
Data Modeling Tools: Microsoft Visio, ERWIN 9.3/7.5
Data Modeling: ER (OLTP) and Dimensional (Star, Snowflake schemas)
Data Profiling Tools: Informatica IDQ 10.0, 9.5.1
Tools & Utilities: TOAD, SQL Developer, SQL*Loader, Putty
Other Tools: Notepad++, Toad, SQL Navigator, Teradata SQL Assistant, Snaplogic, AWS
Defect Tracking Tools: ALM, Quality Center, JIRA
Reporting Tools: IBM Cognos, Tableau 9
- Involved in requirement gathering, business analysis, design, development, testing, and implementation of business rules.
- Involved in the entire SDLC (Software Development Life Cycle) process that includes implementation, testing, deployment, documentation, training and maintenance.
- Responsible for designing, developing, and testing the processes necessary to extract data from operational databases and transform and load it into the data warehouse using Informatica PowerCenter.
- Used Informatica PowerCenter 10.2 for extraction, loading and transformation (ETL) of data in the data mart.
- Designed and developed ETL mappings to extract data from flat files, SQL Server, and Oracle to load the data into the target database.
- Developed several complex mappings using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer.
- Extensively used ETL processes to load data from source systems such as SQL Server, flat files, and XML files into the target system, applying business logic in the transformation mappings to insert and update records during the load.
- Developed advanced PL/SQL packages, procedures, triggers, functions, indexes, and collections to implement business logic using SQL Navigator. Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
- Experience in Informatica upgrade testing, troubleshooting, and issue resolution.
- Created complex mappings in the Designer and monitored them. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, and Sequence Generator transformations.
- Worked on complex Source Qualifier queries, Pre, and Post SQL queries in the Target.
- Worked on different tasks in Workflow Manager such as Session, Event Raise, Event Wait, Decision, E-mail, Command, Worklet, Assignment, and Timer, and on scheduling of the workflows.
- Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
- Ran the workflows on a daily and weekly basis using Tidal Scheduling tool.
- Examined the workflow log files and assigned tickets to Informatica support based on the error.
- Experience in auto-adjudicated claims processing and in testing claims using the Facets tool.
- Performed operational support and maintenance of ETL bug fixes and defects.
- Maintained the target database in the production and testing environments.
- Supported migration of ETL code from Development to QA and QA to Production environments.
- Migration of code between the Environments and maintaining the code backups.
Environment: Informatica PowerCenter 10.2, PL/SQL, Flat files, Facets, XML, SQL Server 2014/2016, Microsoft Visual Studio 2014, SVN, Tidal, Microsoft Visio, JIRA.
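The daily incremental loads into staging described above typically key off a last-run timestamp kept in a control table. A minimal sketch, with hypothetical table names and sqlite3 standing in for the source and staging databases:

```python
import sqlite3

# Incremental staging load sketch: pull only rows changed since the last
# successful run, tracked in an ETL control table. All names and data
# here are hypothetical; sqlite3 stands in for SQL Server/Oracle.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL, updated_at TEXT);
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL, updated_at TEXT);
    CREATE TABLE etl_control (job TEXT PRIMARY KEY, last_run TEXT);
    INSERT INTO etl_control VALUES ('orders_load', '2021-01-01');
    INSERT INTO src_orders VALUES (1, 10.0, '2020-12-31');  -- already loaded
    INSERT INTO src_orders VALUES (2, 25.0, '2021-01-05');  -- new since last run
""")

def incremental_load(job, run_date):
    (last_run,) = conn.execute(
        "SELECT last_run FROM etl_control WHERE job = ?", (job,)).fetchone()
    conn.execute(                       # extract only the delta
        "INSERT INTO stg_orders "
        "SELECT order_id, amount, updated_at FROM src_orders "
        "WHERE updated_at > ?", (last_run,))
    conn.execute(                       # advance the watermark
        "UPDATE etl_control SET last_run = ? WHERE job = ?", (run_date, job))

incremental_load("orders_load", "2021-01-06")
staged = conn.execute("SELECT order_id FROM stg_orders").fetchall()
# staged -> [(2,)]
```

In PowerCenter the same watermark is usually held in a mapping variable or parameter file and applied as a source filter in the Source Qualifier.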
Confidential, New York
- Involved in creating Informatica mappings to load the data files into staging tables and then into the Data Warehouse system.
- Interacted with end users to gather business requirements and reporting needs, and was involved in preparing Business Requirement Documents (BRD) and Functional Specification Documents (FSD).
- Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Source, Target and Transformation objects.
- Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
- Created mapping documents to outline data flow from sources to targets.
- Extracted data from Flat files, AS400 sources, DB2, SQL server and loaded into Oracle staging and target tables applying business rules as per the requirement.
- Implemented SCD Type1 and SCD Type2 techniques to update slowly changing dimension tables.
- Design, document and configure the Informatica MDM Hub to support loading, matching, merging, and publication of MDM data.
- Used Debugger to test the mappings and fixed the bugs.
- Configured lookups, foreign-key relationships, queries/custom queries, query groups, and packages in MDM.
- Maintained stored definitions, transformation rules and targets definitions using Informatica repository Manager.
- Extensively worked on the Informatica Designer, Repository Manager, Repository Server, Workflow Manager/Server Manager, and Workflow Monitor.
- Developed Informatica mappings using Aggregator transformations, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Router transformations.
- Knowledge in Master Data Management concepts, Methodologies and ability to apply this knowledge in building MDM solutions.
- Created primary and secondary indexes and performed performance tuning of physical tables in the database using views.
- Involved in design and creation of Fact tables and Dimensions and developed mappings to load into staging tables.
- Created sessions and workflows to run in parallel, sequentially and nested.
- Hands-on experience with SQL*Loader to load .csv files into Oracle tables.
- Implemented performance tuning techniques by identifying and resolving bottlenecks in SQL queries, sources, targets, transformations, and mappings to improve performance.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
- End-to-end ETL development of the Data Mart and worked with Informatica Data Quality Analysis, data matching and data conversion to determine cleansing requirements. Designed and developed Informatica mappings for data loads.
- Created Informatica mappings with PL/SQL Procedures / Functions/triggers to build business rules to load data.
- Hands on experience on Data Profiling/IDQ tool and Worked on enhancements for stored procedure and IDQ web services.
- Designed and developed ETL strategies and mappings from source systems to target systems. ETL strategies were designed to cater initial load and incremental load.
- Developed unit test scripts and test cases for unit testing and system testing.
- Involved with remote offshore team, creating the requirement documents, verifying coding standards and conducting code reviews.
Environment: Informatica Power Center 9.6.1, Control-M, Oracle 11g, MDM, Toad, DB2, WinSCP, WinSQL, Quality Center, ERWIN, UNIX and TWS.
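The SQL*Loader-style .csv loads mentioned above boil down to parsing a delimited file and bulk-inserting it into a staging table. A minimal sketch of that pattern; the file content and table are hypothetical, and Python's csv module plus sqlite3 stand in for SQL*Loader and Oracle:

```python
import csv
import io
import sqlite3

# CSV-to-staging-table load sketch. In production this was SQL*Loader
# with a control file; here the delimited input is parsed and bulk
# inserted with executemany. All names and data are hypothetical.
csv_data = io.StringIO("emp_id,name,salary\n101,Alice,55000\n102,Bob,61000\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_employee (emp_id INTEGER, name TEXT, salary REAL)")

reader = csv.DictReader(csv_data)       # header row supplies column names
conn.executemany(
    "INSERT INTO stg_employee VALUES (:emp_id, :name, :salary)",
    list(reader))

count = conn.execute("SELECT COUNT(*) FROM stg_employee").fetchone()[0]
# count -> 2
```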
- Provided technical leadership and developed new business opportunities.
- Extracted data from an Oracle database and spreadsheets, staged it in a single place, and applied business logic to load it into the central Oracle database.
- Developed Complex mappings to feed Data warehouse & Data marts by extensively using Informatica Mappings, Mapplets, and Transformations like Lookup, Filter, Router, Expression, Aggregator, Joiner, Stored Procedure and Update Strategy.
- Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
- Created procedures to truncate data in the target before the session run.
- Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
- Used the PL/SQL procedures for Informatica mappings for truncating the data in target tables at run time.
- Experience with Teradata utility scripts such as BTEQ, FastLoad, MultiLoad, and FastExport to load data from various source systems into Teradata.
- Working knowledge of Oracle 11g, SQL Server, Teradata, Netezza, DB2, MySQL, and UNIX shell scripting.
- Troubleshoot problems by checking Sessions and Error Logs. Also used Debugger for complex problem troubleshooting.
- Experience in building, enhancing, and managing Universes, and in creating WebI, DeskI, complex, ad-hoc, and canned reports and charts using Business Objects.
- Experience in complex data analysis in Designer, used data from multiple data providers, performed extensive data analysis with complex queries, drill up, drill down, slice and dice.
- Strong hands-on experience in creating/enhancing Universes using BO Designer and in creating and maintaining reports using Business Objects, Web Intelligence, and Desktop Intelligence. Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Fine-tuned Informatica transformations and workflows for better performance and used Informatica Metadata Manager to show data lineage.
- Excellent knowledge in data analysis, data cleansing and data validation.
- Developed SSIS packages to load data into warehouse and benchmark the performance and supported multiple lines of business and IT initiatives.
- Collaborated with remote offshore team, creating the requirement documents, verifying coding standards and conducting code reviews.
Environment: Informatica Power Center 9.1, Power Exchange, DB2, Tivoli, SQL server 2005 / 2008, Oracle, Linux.
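The truncate-before-load procedures described in this role follow a simple pattern: clear the target table, then reload it within the same run. A minimal sketch with hypothetical names; sqlite3 (which uses DELETE rather than TRUNCATE) stands in for the Oracle PL/SQL procedure:

```python
import sqlite3

# Truncate-and-reload sketch, mirroring the PL/SQL procedures invoked
# before Informatica session runs to clear target tables. Table names
# and data are hypothetical; sqlite3 stands in for Oracle.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tgt_sales (sale_id INTEGER, amount REAL)")
conn.execute("INSERT INTO tgt_sales VALUES (1, 99.0)")  # stale data from prior run

def truncate_and_load(rows):
    conn.execute("DELETE FROM tgt_sales")   # TRUNCATE equivalent in sqlite
    conn.executemany("INSERT INTO tgt_sales VALUES (?, ?)", rows)

truncate_and_load([(2, 10.0), (3, 20.0)])
loaded = conn.execute(
    "SELECT sale_id FROM tgt_sales ORDER BY sale_id").fetchall()
# loaded -> [(2,), (3,)]
```

In PowerCenter this is usually configured either as a pre-SQL statement on the target or as a stored procedure call ahead of the load session.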