Data Integration Lead Resume
SUMMARY
- 16 years of IT experience that includes Product Evaluation, System Analysis, Data Analysis, System Design, Development, Integration Testing, Implementation, User Training, Database Administration, Maintenance, and Production Support.
- Expertise in Requirement Gathering, Data Analysis, Data Conversion, and Design, specializing in Data Warehousing. Worked in SDLC and Agile methodologies.
- 10 years in full-lifecycle ETL and Data Warehouse projects as Technical Lead for Data Warehouse/ETL/OLAP design and development, and in several cases as part of the Data Analyst and Architecture team.
- Extensive experience in ETL using Informatica PowerCenter and Talend.
- Built PoCs in the data virtualization tool Denodo.
- Built PoCs in newer technologies: the Snowflake cloud data warehouse and the Matillion and Fivetran ELT tools.
- Experience in Data Lake Architecture using Hadoop, Hive, Kafka, Talend.
- Well acquainted with Talend, TAC, Informatica Designer components, and OBIEE DAC.
- Worked as Data Analyst for the finance modules AP, AR, GL, and PO.
- Hands-on experience identifying and resolving performance bottlenecks at the source, mapping, and session levels, and optimizing SQL scripts to improve performance during ETL loads.
- Worked with dimensional data warehouses in star and snowflake schemas; created Slowly Changing Dimension (SCD) Type I/II/III mappings and fact loads (see the SCD Type II sketch after this list).
- Architected solutions and prepared HLD and LLD documents for mappings/jobs.
- Expertise in using SQL*Loader to load data from external files into Oracle databases.
- Developed batch jobs using UNIX shell scripts with pmcmd and sftp commands to automate loading, pushing, and pulling data between servers.
- Excellent command of SQL and PL/SQL, with strong programming experience creating materialized views, views, packages, procedures, functions, and triggers (a PL/SQL sketch also follows this list).
- Data warehouse and project domains include Finance, Surgical Robotics, Manufacturing, Insurance, Sales, Product Development, and Telecommunications.
- Excellent communication skills and a good team player with problem-solving capabilities; has worked on multiple large-scale projects.
- Managed onsite and offshore teams of up to 12 in development and maintenance.
- Prepared status reports, owned the data warehouse, and ensured SLA fulfillment.
- Onsite-offshore coordination.
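A minimal sketch of the SCD Type II pattern referenced in the summary above, written as Oracle SQL; the table, column, and sequence names (dim_customer, stg_customer, dim_customer_seq) are hypothetical and for illustration only:

    -- Step 1: expire the current dimension row when a tracked attribute changes
    -- (hypothetical tables/columns).
    UPDATE dim_customer d
       SET d.effective_end_dt = SYSDATE,
           d.current_flag     = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.customer_name <> d.customer_name OR s.region <> d.region));

    -- Step 2: insert a new current version for changed and brand-new customers;
    -- after Step 1, changed customers no longer have a current row, so they qualify here.
    INSERT INTO dim_customer
        (customer_key, customer_id, customer_name, region,
         effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.region,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');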
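A brief PL/SQL sketch of the procedure-and-audit style of work listed above; all names here (load_daily_sales, stg_sales, sales_fact, etl_audit) are hypothetical:

    -- Hypothetical loader procedure: moves one day's staged rows into the fact
    -- table and records a simple audit row so batch runs are traceable.
    CREATE OR REPLACE PROCEDURE load_daily_sales (p_load_date IN DATE) AS
      v_rows PLS_INTEGER;
    BEGIN
      INSERT INTO sales_fact (sale_id, sale_date, amount)
      SELECT sale_id, sale_date, amount
        FROM stg_sales
       WHERE sale_date = TRUNC(p_load_date);

      v_rows := SQL%ROWCOUNT;

      INSERT INTO etl_audit (proc_name, load_date, rows_loaded, run_ts)
      VALUES ('LOAD_DAILY_SALES', TRUNC(p_load_date), v_rows, SYSTIMESTAMP);

      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
    END load_daily_sales;
    /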
TECHNICAL/FUNCTIONAL SKILLS
Operating Systems: Windows, UNIX, Linux
Programming Languages: PL/SQL, Transact-SQL, ViewStar Scripts, R, ASP, ASP.NET, JavaScript, HTML, DHTML, VBScript
Databases: MS SQL Server 7.0/2000/2012/2016, Oracle 8i/9i/11g/12c, Sybase 11.0, Eloqua, PostgreSQL, MySQL, Snowflake
BI & DW Technologies: Star Schema /Dimensional Modeling for Data warehouse Processes, ETL Design, Strategy and Architecture, Snowflake, Denodo
Tools: Talend, Hive, Fivetran, Informatica (PowerCenter), Informatica Cloud, IDQ, Matillion, Oracle OBIEE, OBIA, DAC, Tableau, isql, JIRA, BCP and DTS for SQL Server, Toad, PL/SQL Developer, JSON, Erwin & Designer 2000, Visio, VMware, Remedy/JIRA for Production Support, MS Visual SourceSafe 6.0, SAP HANA, SOQL in Confidential.com
PROFESSIONAL EXPERIENCE
Confidential
Data Integration Lead
Responsibility: Architected the finance modules and the Confidential implementation; architected the Hadoop data lake.
- Worked on requirement gathering, design, and development; planned database estimation and accurate sizing.
- Coordinated between Business Analyst, DBA, and development teams.
- Designed and implemented complex data integration modules for Extract/Transform/Load (ETL) functions.
- Created data transformations for structured, semi-structured, and unstructured data.
- Designed complex reusable joblets for exception handling, CDC handling, and post-load statistics capture.
- Created reusable dynamic jobs for faster development.
- Created testing strategy templates and handled test preparation, test execution, issue resolution, and report generation to ensure that all aspects of the project complied with the business requirements.
- Prepared complex SQL queries to validate the data in both source and target databases (see the validation sketch after this list).
- Upgraded Talend from 5.6 to 6.4 to make use of the Big Data components for data ingestion.
- Optimized several ETL jobs and SQL queries to resolve performance bottlenecks caused by large data volumes.
- Performed TAC administration and TAC issue resolution.
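A minimal sketch of the source-to-target validation queries mentioned above; the tables (src_orders, dw_orders) and the src_db database link are hypothetical:

    -- Compare row counts and a column total between source and target for one period
    -- (hypothetical tables; src_db is an assumed DB link to the source system).
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(order_amount) AS amt_sum
      FROM src_orders@src_db
     WHERE order_date >= DATE '2017-01-01'
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(order_amount)
      FROM dw_orders
     WHERE order_date >= DATE '2017-01-01';

    -- List keys extracted from the source that never landed in the target.
    SELECT order_id FROM src_orders@src_db
    MINUS
    SELECT order_id FROM dw_orders;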
Environment: Talend 6.4, Oracle 12c, SQL Server 2016, Erwin, Hive, Hadoop, Kafka, Snowflake, JIRA, Confidential.com
Confidential
ETL Architect/Team Lead
Responsibility: Single point of contact for the entire application (Exigen, SALESX, PRIORITY Quote).
- Responsible for defining the key identifiers for each mapping/interface.
- Designed the quote manufacturing process for policies without quotes in the Exigen system for data rationalization, using PL/SQL extensively for this.
- Worked with data architects on the development of the target IDS data model.
- Standardized the landing-to-IDS ETL.
- Designed and implemented data integration modules for Extract/Transform/Load (ETL) functions.
- Used bulk loading, partition utilization, proper indexes, optimizer hints, and parallelism for performance tuning (see the tuning sketch after this list).
- Involved in test planning, test preparation, test execution, issue resolution, and report generation to ensure that all aspects of the project complied with the business requirements.
- Wrote complex data reconciliation SQL between the source system and the IDS model.
- Decoded highly complex SQL Server stored procedures and converted them into ETL.
- Coordinated between Business Analyst, DBA, and development teams.
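An illustrative Oracle SQL example of the hint and parallelism style of tuning noted above; the tables and the degree of parallelism are hypothetical (the real hints were driven by the actual query plans):

    -- Enable parallel DML, then do a direct-path, parallel insert from a large
    -- source table into staging; APPEND bypasses the buffer cache and minimizes
    -- undo, which speeds up bulk loads (hypothetical tables).
    ALTER SESSION ENABLE PARALLEL DML;

    INSERT /*+ APPEND PARALLEL(t, 8) */ INTO stage_policy t
    SELECT /*+ FULL(p) PARALLEL(p, 8) */
           p.policy_id, p.quote_id, p.premium_amt
      FROM src_policy p
     WHERE p.load_flag = 'N';

    COMMIT;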
Confidential, CA
Tech Lead/Data Analyst
Responsibility: Analyzed large, highly complex NetSuite robotics data and wrote complex SQL to gain insight into the raw data.
- Identified and quantified data issues within the Confidential environment with the help of stakeholders.
- Wrote complex data reconciliation queries between the NetSuite source system and Confidential using SOQL.
- Coordinated between Business Analyst, DBA, and development teams.
- Data cleansing and standardization.
- Assisted in SAP HANA Views Development.
- Validated SAP HANA data against the Confidential system; data mining.
- Designed and modeled the complex SAP HANA data model, its logical relationships, and its data structures.
- Converted highly complex SQL Server stored procedures to the SAP HANA environment.
- Involved in test planning, test preparation, test execution, issue resolution, and report generation to ensure that all aspects of the project complied with the business requirements.
- Assisted with development plans to resolve data issues.
- Used Agile methodology for iterative testing.
- Wrote data reconciliation SQL from NetSuite to Confidential and then to the SAP HANA system.
Environment: SAP HANA, SQL Server 2012, Informatica 9.6, Tableau
Confidential, CA
Consultant
Responsibility: Coordinated with the BSA for requirement gathering.
- Requirement Analysis.
- ETL Development.
Confidential, CA
Team Lead
Responsibility: Worked on requirement gathering, design, and data modeling; database estimation planning.
- Coordinated with the BU for requirement gathering; allocated tasks for the onsite/offshore team.
- Created Informatica mappings, sessions, worklets, and workflows; status reporting.
- Tuned stored procedure performance; changed OBIEE report source data mappings.
- Optimized the DTM pipeline for performance; tuned long-running SQL and mappings.
- Responsible for code migration, conducting tests, and running and monitoring the workflows.
Confidential, CA
ETL Architect
Responsibility: Performed day-to-day configuration, administration, architecture, and security for Informatica PowerCenter and Informatica Cloud based on requests and other initiatives.
- Served as the main communication hub between Confidential stakeholders and Informatica support for ad hoc issues and concerns.
- Monitored the Informatica Secure Agent.
Confidential, CA
ETL Architect/Team Lead
Responsibility: Continuous coordination with the business for requirement gathering; task allocation for the team, status reporting, production support, and admin support.
- Participated in data assessment: finding data sources, defining data elements, mapping data source flow, building hierarchy diagrams, and logical and physical data modeling.
- Informatica administration (repository backups, repository creation, user/group creation).
- Created reusable transformations and mapplets for use in multiple mappings; worked with almost all transformation types; tuned Informatica mappings to improve performance; created the hierarchy mappings.
- Wrote SQL to validate the integrity of the data after each successful ETL load.
- Provided emergency support for load failures, coordinating with the respective client during off hours to resolve them.
- Designed and implemented a migration plan from Informatica 8.6 to 9.5.
- Wrote ETL to load data from the AP, AR, GL, and RevRec modules.
- Used bulk loading and partition utilization to increase performance.
- Designed and developed pre-session and post-session routines for Informatica sessions to drop and recreate indexes and key constraints for bulk loading (see the sketch after this list).
- Worked as development lead and SPOC for all day-to-day offshore deliverables; worked on multiple modules simultaneously and ensured proper milestone delivery of each.
- Created an automated email alert in UNIX to notify the operations team of job failures.
- Prepared SQL queries to validate the data in both source and target databases.
- Verified record counts with DWH backend/reporting queries against source and target as an initial check.
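A sketch of the pre-/post-session SQL pattern described above; the table, index, and constraint names are hypothetical:

    -- Pre-session SQL: drop the index and disable the FK so the bulk load is not
    -- slowed by per-row index and constraint maintenance (hypothetical names).
    ALTER TABLE sales_fact DISABLE CONSTRAINT fk_sales_customer;
    DROP INDEX idx_sales_fact_custkey;

    -- Post-session SQL: recreate the index and re-enable the constraint after the load.
    CREATE INDEX idx_sales_fact_custkey ON sales_fact (customer_key);
    ALTER TABLE sales_fact ENABLE CONSTRAINT fk_sales_customer;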
Confidential
Team Lead
Responsibility: Continuous coordination with the business for requirement gathering; task allocation for the team, status reporting, production support, and admin support.
- Participated in data assessment: finding data sources, defining data elements, mapping data source flow, building hierarchy diagrams, and logical and physical data modeling.
- Informatica administration (repository backups, repository creation, user/group creation).
- Created reusable transformations and mapplets for use in multiple mappings; worked with almost all transformation types; tuned Informatica mappings to improve performance; created the hierarchy mappings.
- Wrote SQL to validate the integrity of the data after each successful ETL load.
- Provided emergency support for load failures, coordinating with the respective client during off hours to resolve them.
- Designed and implemented a migration plan from Informatica 8.6 to 9.5.
- Used bulk loading and partition utilization to increase performance.
- Designed and developed pre-session and post-session routines for Informatica sessions to drop and recreate indexes and key constraints for bulk loading.
- Monitored the server, cleaned folders by deleting unwanted files, and fixed issues in mappings/scripts.
- Worked as development lead and SPOC for all day-to-day offshore deliverables; worked on multiple modules simultaneously and ensured proper milestone delivery of each.
- Created an automated email alert in UNIX to notify the operations team of job failures.
- Prepared a checklist of jobs and job run statuses essential for the production cutover; maintained the production log, analysis, and abend history; troubleshot and fixed failed jobs.
- Prepared SQL queries to validate the data in both source and target databases.
- Verified record counts with DWH backend/reporting queries against source and target as an initial check.
Confidential
Team Lead
Responsibility
- Continuous coordination with onsite for work-related clarifications; created reusable transformations and mapplets for use in multiple mappings.
- Implemented CDC in SFDC; used the debugger to test the data flow and fix the mappings.
- Estimated, planned, and executed enhancements to the data mart.
- Cleansed and scrubbed the data into uniform data types and formats, then loaded it to the STAGE tables, then to the mart, and finally to the dimension/fact/aggregate tables.
- Validated the integrity of the data after each successful load.
- Responsible for code migration, conducting tests, and running and monitoring the workflows.