- 7+ years of IT experience in all phases of Data Warehousing and Business Intelligence, including Requirements Gathering & Analysis, Design, Development, Testing, and Production Support.
- Expertise in implementing complex business rules by creating robust Mappings, Mapplets, Sessions, and Workflows using Informatica Power Center.
- Extensive experience in Extraction, Transformation, and Loading (ETL) data from various data sources into Data Warehouse and Data Marts using Informatica Power Center tools (Repository Manager, Designer, Workflow Manager, Workflow Monitor).
- Experience in designing and developing complex mappings using transformations like Source Qualifier, Router, Filter, Expression, Sorter, Aggregator, Normalizer, Joiner, Sequence Generator, Connected and Unconnected Lookup, and Update Strategy.
- Extensive experience in Data Warehousing, ETL, Design, Development, Implementation, and Testing of Data Warehouse/Data Mart Systems.
- Experience working in IDQ (Informatica Developer Tools) to perform data cleansing, data matching, data conversion, exception handling, data profiling, reporting and monitoring.
- Used Address validator, Parse Transformation, Join Analysis Transformation to perform IDQ activities.
- Used IDQ to perform Unit testing and create Mapplets that are imported and used in PowerCenter Designer.
- Involved in tuning the Oracle PL/SQL queries by using Indexing, Hints and Parallel execution process.
- Proficient in the integration of various data sources, including multiple relational databases (Oracle, MS SQL Server, DB2, Greenplum, Teradata) as well as XML and Flat Files (fixed-width, delimited).
- Worked with PL/SQL Stored Procedures, Triggers, Cursors, Indexes, and Functions.
- Proficient in developing and maintaining various PL/SQL/database objects like packages, functions, stored procedures, triggers, tables, Views, Materialized Views, Indexes, Sequences, and partitions.
- Experience in developing SQL*Loader control programs and PL/SQL validation scripts for validating the data load from staging area to production.
- Strong Knowledge on Oracle Architecture and Database Design using Normalization and E/R diagrams.
- Excellent conceptual knowledge of Oracle 10g/11g, Data Warehouse Concepts, ETL, Data modeling, and Dimensional Modeling.
- Strong experience in writing UNIX Shell scripts, SQL Scripts for development, automation of ETL process, error handling, and auditing purposes.
- Created different types of validation and reconciliation scenarios using the data from different databases, including Oracle, Teradata, MSSQL, and MySQL.
- Good understanding of Data modeling concepts: ER Diagrams, UML, Use Cases, Normalization, and De-normalization of tables. Excellent analytical and problem-solving skills with a strong technical background and interpersonal skills.
- Experience in Scheduling Informatica sessions for automation of loads in Autosys, Control M, UC4.
- Expertise in understanding business processes and in writing Test Plans, Test Strategies, and Test Cases, and executing test scenarios as part of design.
- Experience working with offshore and onsite coordination.
- Flexible, enthusiastic and project-oriented team player with solid communication and leadership skills to develop a creative solution for challenging client needs.
- Able to work independently and collaborate proactively & cross-functionally within a team.
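The validation and reconciliation scenarios mentioned above can be sketched in Python; this is an illustrative sketch only, with hypothetical table names and counts (the actual checks were written as SQL/shell scripts against Oracle, Teradata, MSSQL, and MySQL):

```python
# Minimal source-to-target reconciliation sketch: compare row counts
# pulled from two databases and report any tables that disagree.
# Table names and counts below are illustrative, not from a real project.

def reconcile_counts(source_counts: dict, target_counts: dict) -> list:
    """Return (table, source_count, target_count) tuples that disagree."""
    mismatches = []
    for table, src in source_counts.items():
        tgt = target_counts.get(table)
        if tgt != src:
            mismatches.append((table, src, tgt))
    return mismatches

source = {"DIM_DEALER": 1042, "FACT_SALES": 98761}
target = {"DIM_DEALER": 1042, "FACT_SALES": 98750}
print(reconcile_counts(source, target))  # FACT_SALES counts differ
```

In practice the count dictionaries would be populated by `SELECT COUNT(*)` queries against each database before comparison.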
Data Warehousing Tools (ETL): Informatica Power Center 10.2, 10.1.1, 9.6.1, 9.5.1, 8.x, Informatica Data Quality 10.2, 9.6.1, 9.5.1, 8.x, Informatica Data Validation Option (DVO), Informatica Cloud, Informatica Analyst, Informatica PowerExchange, Informatica Metadata Manager
RDBMS: Amazon Redshift, Oracle 12c, 11g, DB2, Teradata, SQL Server, Netezza, PostgreSQL, MySQL
Reporting Tools: Power BI, Tableau, Business Objects, MicroStrategy
Languages: SQL, PL/SQL, UNIX Shell Scripting, Python, Scala (Apache Spark)
Operating Systems: Windows, UNIX, Linux, CentOS
DB Utilities: Aginity Workbench, TOAD, SQL*Loader, SQL Developer, SQL*Plus
Applications/Other Tools: AWS S3, AWS EC2, AWS SCT, AWS DMS, MS Office, Silectis Magpie (Data Lake), Salesforce, Workday, Composite Data Virtualization, Control-M, WinSCP, Autosys, Putty, JIRA, Confluence, ServiceNow, SharePoint, Cherwell Service Management, Nagios, Splunk.
Confidential, Centreville, VA
- Actively participated in requirement gatherings and analyzed Business Requirements with the Business SMEs, Project Managers, Users, Data Model Architects, DBAs and ETL leads to develop a data model for Dealer, Inventory, Finance, Franchise in Amazon Redshift.
- Migrated Used Car Listings (UCL) data from on-premises to the Amazon cloud, loading incremental data to the data lake and testing it using the Silectis Magpie Data Lake (AWS S3) for data analytics.
- Integrated the data warehouse design for the numerous sources feeding the warehouse, such as Billing (Bill Inmon), Finance, Sales, Call Details, Customer, Usage, and Product Integration (Confidential for Life, VHR, Used Car Listings).
- Involved in migration of the BI Data Warehouse from Actian Matrix (formerly Paraccel) to Redshift.
- Integrated data from the Salesforce system using the SnapLogic ETL tool to handle bulk loads into Redshift using the Bulk writer.
- Used the Bitbucket code versioning tool; loaded and retrieved SnapLogic assets based on commit ID using snaps.
- Introspected Salesforce data using the Composite Data Virtualization studio before loading the data into the Enterprise Data Warehouse.
- Designed and built the ETL processes to load core Confidential data from various source systems, performing data cleansing and transformations using various Informatica transformations like Aggregator, Update Strategy, Lookup, Joiner, Transaction Control, Mapplets, and reusable transformations.
- Implemented Incremental Data Loads for Daily loads using Informatica Power Center.
- Implemented SCD Type 2 with a flat-file target using the MD5 function, and loaded the Informatica output file into Enterprise Data Warehouse tables using Insert/Upsert scripts, which are cost efficient, improve ETL runtimes, and load data in bulk.
- Developed test cases and test plans; performed unit tests for new and/or modified ETL programs.
- Utilized Control-M to automate daily, weekly, monthly, and billing jobs in the FTP server, Informatica, SnapLogic, and the Data Lake.
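The MD5-based SCD Type 2 change detection used in the mappings above can be sketched in Python; the function names and tracked-column list here are illustrative, standing in for the MD5() port computed inside the actual Informatica mapping:

```python
import hashlib

def row_md5(row: dict, columns: list) -> str:
    """Concatenate the tracked attribute values and hash them, mimicking
    an MD5() expression port used for SCD2 change detection."""
    payload = "|".join(str(row.get(c, "")) for c in columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def needs_new_version(incoming: dict, stored_hash: str, columns: list) -> bool:
    """True when any tracked attribute changed, i.e. the current dimension
    row should be end-dated and a new SCD2 version inserted."""
    return row_md5(incoming, columns) != stored_hash
```

Comparing one stored hash instead of every column keeps the lookup narrow, which is the main reason the MD5 approach speeds up change detection on wide dimensions.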
Environment: Silectis Magpie (Data Lake), AWS S3, AWS EC2, Python, Apache Spark SQL, Informatica Power Center 10.2, Informatica Power Exchange, Informatica Metadata Manager, SnapLogic, Oracle 12c, AWS Redshift, MySQL, Flat files, JSON Files, Parquet Files, SOAP API, REST API, SAP, Composite Data Virtualization Studio, Workday, Salesforce, Toad for Oracle, Toad for MySQL, SAP Business Objects, MicroStrategy, Bitbucket, Box, Putty, FileZilla, Unix, JIRA, ServiceNow, Confluence, SharePoint.
Confidential, St. Louis, MO
- Coordinated with various business users, stakeholders, and SMEs for functional expertise, design and business test scenario reviews, UAT participation, and validation of data from multiple sources.
- Defined and modified standard design-pattern ETL frameworks, data model standards guidelines, and ETL best practices.
- Provided technical design leadership to this project to ensure the efficient use of offshore resources and the selection of appropriate ETL/CDC logic.
- Performed detailed data investigation and analysis of known data quality issues in related databases through SQL.
- Actively involved in Impact Analysis phase of the business requirement and design of the Informatica mappings.
- Used web service source to define the input message of a web service operation and represented the metadata for a web service SOAP request.
- Performed data validation, data profiling, data auditing and data cleansing activities to ensure high quality Business Objects report deliveries.
- Configured sessions for different situations, including incremental aggregation, pipeline partitioning, etc.
- Established new processes to bring in external datasets to Greenplum and make them available for larger reporting.
- Created effective Test Cases and performed Unit and Integration Testing to ensure the successful execution of data loading process.
- Documented Mappings, Transformations and Informatica sessions.
- Analyzed Session Log files in case the session failed to resolve errors in mapping or session configurations.
- Involved in designing the ETL testing strategies for functional, integration and system testing for Data warehouse implementation.
- Wrote UNIX shell scripts for file manipulation, FTP, and scheduling workflows in UC4.
- Coordinated with the offshore team on a daily basis to enable faster development.
Environment: Informatica Power Center 10.1, Informatica Power Exchange, Informatica Cloud Real Time, Informatica DVO, Oracle 12c, Greenplum, Flat files, SOAP, SAP, Workday, Toad, Business Objects, Get access, Box, Gnosis, Putty, Unix, UC4, SharePoint.
Confidential, St. Louis, MO
- Responsible for Business Analysis, Requirements Collection, and HLD creation.
- Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Designing & developing cleansing and standardization scenarios to cleanse Address, Telephone Numbers, Email ID using IDQ Cleanse Adapter.
- Created various data quality mappings in Informatica Data Quality tool and imported them into Informatica PowerCenter as mappings, Mapplets.
- Replicated Salesforce data to the data warehouse for advanced business analytics using Jitterbit Data Loader.
- Developed Cloud mappings to extract the data for different regions.
- Developed the audit activity for all the cloud mappings.
- Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
- Used various Teradata utilities such as FastLoad, MultiLoad, TPump, and BTEQ (to invoke these utilities and run queries).
- Developed PowerExchange CDC mappings to retrieve data from Oracle to staging tables in Oracle.
- Strong experience on Informatica Metadata Manager, involved in pulling metadata from multiple sources into Metadata Manager to run the end to end Data Lineage.
- Modified existing mappings for enhancements of new business requirements.
- Documenting the ETL mapping strategies and transformations involved in the extraction, transformation and loading process.
- Participated in weekly status meetings and conducting internal, external reviews as well as formal walk through among various teams and documenting the proceedings.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Defined Target Load Order Plan for loading data into Target Tables.
- Proficient in data warehousing techniques for data cleansing, Slowly Changing Dimensions (SCDs), and Change Data Capture (CDC).
- Developed many proofs of concept (POCs) related to Meld parameterization.
- Ensured timely execution of test cases in ALM on a daily basis and reporting defects, if found.
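The telephone-number standardization scenario described above can be sketched in Python; this is a simplified illustration of the kind of rule an IDQ cleanse mapplet applies, with a hypothetical US-format convention (the real mapplet used the IDQ Cleanse Adapter, not this code):

```python
import re

def standardize_phone(raw):
    """Normalize a US phone number to NNN-NNN-NNNN.
    Returns None when the input cannot yield 10 digits, so the row
    can be routed to an exception/bad-record path."""
    digits = re.sub(r"\D", "", raw or "")      # strip punctuation and spaces
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                     # drop leading country code
    if len(digits) != 10:
        return None                             # invalid -> exception handling
    return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"
```

Rows returning None would go to the exception-handling branch, mirroring how invalid records are split out during IDQ cleansing.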
Environment: Informatica Power Center 10.1.1, Informatica Cloud 10.1.1, Informatica Data Quality 10.1.1, Informatica MDM 10.1.1, Informatica Power Exchange, Informatica DVO, Power Connect, SQL Query Analyzer, SAP, DB2, Teradata, Netezza, SQL Server 2012, Toad, SQL Developer, Salesforce, Jitterbit Data Loader, Putty, SharePoint, Unix.
Confidential, Charlotte, NC
- Involved in production support and development projects as an Informatica developer, covering the full development life cycle including design, ETL strategy, troubleshooting, and reporting.
- Interaction with Business SME's on requirement gathering, understanding the business and functional requirements.
- Developed Informatica mappings, mapping configuration tasks, and task flows using Informatica Cloud Services (ICS).
- Worked on the installation and setup of ETL (Informatica Cloud) applications on Linux servers.
- Utilized Informatica Data Quality (IDQ) for data profiling, matching/removing duplicate data, and fixing bad data and NULL values.
- Used IDQ (Informatica Data Quality) as data Cleansing tool to create the IDQ Mapplets and to correct the data before loading into the Target tables.
- Monitored and cleansed data using Informatica Data Quality (IDQ) - source data profiling using Informatica Analyst and creating mappings in Informatica Developer to apply cleansing and business rules.
- Developed PowerExchange CDC/Real Time mappings to retrieve data from Oracle.
- Utilized Informatica Cloud's Real Time service and OData service to expose data services to consumers.
- Optimized/tuned mappings for better session performance and eliminated bottlenecks by analyzing session statistics and throughput.
- Performance tuned Informatica mappings by adopting explain plans, cutting down query costs using Oracle hints, changing the mapping designs.
- Tracking and fixing the defects being raised in previous releases.
- Created several Functions, Procedures, Packages, and Triggers according to the client's requirements in further enhancement projects.
- Tuned the performance of SQL statements using hints and indexes, and of procedures using EXPLAIN PLAN and SQL Trace; developed a set of views to fine-tune the performance tuning process.
- Developed complex SQL queries using various joins, and extensively used dynamic SQL in stored procedures and functions.
- Performed a POC for migrating from Oracle 11g to Amazon Redshift using the AWS Schema Conversion Tool (AWS SCT) and AWS Database Migration Service (AWS DMS).
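The profiling and duplicate-matching work described above can be sketched in Python; the function names and column handling are illustrative stand-ins for what Informatica Analyst/Developer produce during column profiling and matching, not the actual IDQ configuration:

```python
def profile_nulls(rows, columns):
    """First pass of column profiling: count NULL/empty values per column,
    the kind of summary a source-data profile surfaces before cleansing."""
    counts = {c: 0 for c in columns}
    for row in rows:
        for c in columns:
            v = row.get(c)
            if v is None or v == "":
                counts[c] += 1
    return counts

def find_duplicates(rows, key_columns):
    """Return the set of key tuples that occur more than once,
    i.e. candidates for the matching/merge step."""
    seen, dupes = {}, set()
    for row in rows:
        key = tuple(row.get(c) for c in key_columns)
        seen[key] = seen.get(key, 0) + 1
        if seen[key] > 1:
            dupes.add(key)
    return dupes
```

Profiling first, then matching on the chosen key columns, mirrors the order of the IDQ workflow: understand the quality of the source before deciding which rows to merge or reject.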
Environment: Informatica Power center 10.0,9.6.1, Informatica Cloud, Informatica Data Quality 10.0,9.6.1, Informatica Analyst 9.6.1,10.0, Informatica DVO, Informatica Power Exchange, SQL, PL/SQL, Oracle 11g, Redshift, AWS S3, AWS EC2, AWS SCT, AWS DMS, SQL Developer, Oracle Hints, Unix.
Confidential, Phoenix, AZ
- Involved in project planning and development, assigning tasks to various team members and tracking their activities.
- Used Informatica to extract, load and transform operations. This role participates in the design and the movement of data from the operational and external environments to the business intelligence environment.
- Designed and developed Informatica mappings to build business rules to load data.
- Extensively worked on Informatica Lookup, Aggregator, Expression, Stored Procedure and Update Transformations to implement complex rules and business logic.
- Developed complex Informatica ETL jobs to migrate the data from flat files to the database.
- Performed thorough data profiling to understand the quality of source data and to find data issues using Informatica Data Quality.
- Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ.
- Involved in massive data profiling using Informatica IDQ prior to data staging.
- Created IDQ Mapplets for address cleansing, telephone cleansing and SSN cleansing and used them as Informatica Mapplets in Power Center.
- Involved in supporting the application until it is completely stable in Production.
- Co-ordinating with offshore teams for the deliverables.
- Conducting reviews with Business SME's for the design, Code, Unit Test Plan, and cases.
- Extracted source definitions from databases like Oracle and Flat files into the Power center repository.
- Importing Source/Target tables from the respective databases and created reusable transformations and mappings using Designer Toolset of Informatica.
- Involved in the Performance Tuning & Preparation of unit test cases.
- Resolving if there are any issues in transformations and mappings.
- Involved in requirements analysis and design and also source system analysis to derive the business logic, validation of data mappings for enhancement releases.
- Coding Oracle PL/SQL Stored Procedures, Functions, and Packages.
- Analyzing the Defect Resolution document for implementation of the requirements.
- Developing database scripts for each release and deploying them using WorkBench.
- Involvement in code deployment and post production deployment activities.
- Carried out impact analysis for changes made in the source system and implemented the required changes in the target DW environment.
Environment: Informatica Power center 9.5.1, Informatica Data Quality 9.5.1, Oracle 11g, Teradata, Flat Files, Unix.
- Used Informatica 9.1 to extract, transform and load data from multiple input sources.
- Worked with Business Analyst to gather business requirements and Involved in weekly team meetings.
- Reviewed the business requirements and created an internal and external design for Informatica jobs.
- Involved in the transfer of data to Connects Promotion Manager.
- Worked with the Connect3 support team to maintain the Informatica jobs and Data Analyzer.
- Developed complex mappings using Informatica Power Center Designer to transform and load the data from various source systems like Flat files, SQL Server, Excel, DB2 and loading to Oracle target database.
- Extensively used various types of transformations such as Expression, Normalizer, Joiner, Update strategy, Lookup (Connected and Unconnected) to load the data.
- Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
- Devised extensive PL/SQL Stored Procedures and Triggers to populate the base tables on a daily basis and implement business rules and transformations.
- Performed Mapping Optimizations to ensure maximum Efficiency.
- Extensively used complex SQL queries to unit test the system, test the existing project, and validate the data in the data warehouse.
- Involved in Optimizing and Performance tuning logic on targets, sources, mappings, and sessions to increase the efficiency of the session and Scheduled Workflows using AutoSys.
- Implemented complex mapping such as Slowly Changing Dimensions (Type II) using Flag.
- Designed and developed UNIX shell scripts as part of the ETL process to compare control totals, automate the process of loading, pulling and pushing data from and to different servers.
- Involved in unit, system and end-to-end testing of the design.
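The control-total comparison performed by the shell scripts above can be sketched in Python; the function name, arguments, and tolerance are illustrative (the real checks were UNIX shell scripts comparing file trailer totals against load results):

```python
def check_control_totals(trailer_count, loaded_count,
                         trailer_amount, loaded_amount, tolerance=0.01):
    """Compare a source file's trailer-record totals against what the
    load actually wrote; an empty list means the load balanced."""
    issues = []
    if trailer_count != loaded_count:
        issues.append(
            f"row count mismatch: expected {trailer_count}, loaded {loaded_count}")
    if abs(trailer_amount - loaded_amount) > tolerance:
        issues.append(
            f"amount mismatch: expected {trailer_amount}, loaded {loaded_amount}")
    return issues
```

A non-empty result would fail the job and stop downstream workflows, the same gate the shell scripts enforced before pushing data to other servers.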
Environment: Informatica Power Center 9.1, Teradata, Oracle 10g, SQL Server, DB2, PL/SQL, AIX, Flat files, SQL*Loader, TOAD, UNIX
- Worked on Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor.
- Extracted the data from the flat files and other RDBMS databases into the staging area and populated onto data warehouse.
- Developed and documented data Mappings/Transformations and Informatica sessions.
- Responsible for definition, development, and testing of processes/programs necessary to extract data from the client's operational databases, transform and cleanse the data, and load it into Data Marts.
- Used the Informatica Designer, Source Analyzer, and Mapping Designer.
- Created complex mappings in Power Center Designer using Aggregator, Expression, Filter, Sequence Generator, look-up, Update Strategy, Rank, Joiner, and Stored procedure transformations.
- Used Informatica Power Center Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
- Implemented Type 1 and Type 2 slowly changing dimensions.
- Used the update strategy to effectively migrate data from source to target.
- Designed several mappings to extract the data from the Flat file sources and Relational Sources.
- Designed and developed complex aggregator, joiner, and lookup transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using the Informatica ETL (Power Center) tool.
- Maintained stored definitions, transformation rules and targets definitions using Informatica repository manager.
- Used Informatica as an ETL tool to extract data from Oracle manufacturing applications to Target system.
Environment: Informatica Power Center 9, Oracle 10g, DB2, PL/SQL, TOAD, SQL*Plus, SQL*Loader.