- 13 years of IT experience in total
- 12 years of extensive experience in designing, developing and implementing Data Warehouse Projects using Informatica, Oracle & Unix shell scripting
- 1 year of Informatica Administration & MDM experience
- 1 year of BI consulting experience in the financial services sector
- 2 years of Master Data Management consulting experience in the insurance services sector
- Good experience in planning and effort estimation
- Expertise in gathering business requirements and translating them into technical specification documents (HLD & LLD)
- Excellent interpersonal and communication skills; a quick learner and team player
Data Warehousing: Informatica PowerCenter 9.5.1/9.0.1/8.6.1/8.1.1/7.1.3/6.0/5.1, Informatica PowerExchange, Business Objects XI, Trillium, Informatica MDM Multidomain Edition 9.5.1
Databases: Oracle Exadata/10g/9i/8i, DB2, SQL Server, Salesforce.com, PostgreSQL
Languages: Unix shell scripting, SQL, PL/SQL
Operating Systems: HP-UX 11i, IBM AIX 5.1/5.2/5.3
Scheduling Tools: Tidal, Unicenter AutoSys, Maestro Scheduler
Senior ETL Consultant
- Acted as the technical lead for an ETL team composed of both internal and contractor developers, creating and maintaining business intelligence and data warehousing design principles using industry-leading practices
- Provided technical leadership and guidance to the development team on the design and development of highly complex and critical ETL architecture, applying leading industry practices
- Collaborated with project, architecture, and release teams, providing input on architectural design recommendations, driving standards, and planning and executing an effective transition to production operations
- Studied the existing source systems, analysed the business requirements, and prepared ETL specifications
- Explored the Salesforce.com application and created POC mappings to verify Salesforce integration using Informatica
- Designed an ETL framework to integrate data from various source systems such as PostgreSQL, DB2, and flat files into Salesforce.com
- Designed & implemented the following concepts:
- Automatic reprocessing of Salesforce rejections
- Data threshold governance
- Table-driven parallelization
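As an illustration of the table-driven parallelization concept, the following is a minimal shell sketch, not the actual implementation: the control-file layout, service and domain names (INT_SVC, DOM_DEV), folders, and workflow names are all assumed, and the pmcmd calls are echoed (dry run) rather than executed.

```shell
#!/bin/sh
# Table-driven parallelization sketch. In a real setup the workflow list
# would come from a control table (e.g. via sqlplus); a here-doc stands
# in for that extract. All names below are illustrative.
CTL=/tmp/wf_control.txt
cat > "$CTL" <<'EOF'
wf_load_customers|SALES
wf_load_policies|POLICY
EOF

while IFS='|' read -r wf folder; do
  # In a live run this command would be launched in the background with
  # '&', followed by a single 'wait' so the script blocks until every
  # workflow completes. Here it is only printed.
  echo "pmcmd startworkflow -sv INT_SVC -d DOM_DEV -f $folder -wait $wf"
done < "$CTL"
```

Adding a workflow then means adding a row to the control table, with no script change.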
- Configured the Web Services Consumer transformation to read data from the Workday system
- Analysed change requests and defects; conducted meetings with BAs, DBAs, and tech leads to finalise the design and implementation
- Hands-on experience in Informatica PowerCenter administration
- Installed Informatica 9.0.1 on the Unix platform
- Experience with Informatica PowerExchange, the pmcmd command line interface, and security (including native and LDAP security)
- Installed and configured the LDAP, Salesforce, and Web Services plugins with PowerCenter
- Worked extensively on the PowerCenter upgrade from 9.0.1 to 9.5.1 HF3
- Created deployment groups for code promotion
- Performed activities such as user creation and recycling/disabling services through the Informatica Admin Console
- Hands-on experience working with Informatica MDM
- Created data model elements and defined relationships and lookups within the data model using the Hub Console Schema tool
- Configured mappings that use functions and cleanse lists; set options for delta detection and raw data retention using the Mappings tool
- Configured exact-match, fuzzy-match, and merge processes using the Schema tool
- Configured batch jobs to execute the stage, load, match, and merge processes
- Analysed the ESM & CHORUS feeds thoroughly before loading the data into ODH
- Responsible for loading static data from ESM & CHORUS into ODH, enabling ODH to perform the necessary data mapping
- Actively participated in the analysis of the ISTAR & DOLPHIN data feeds via ODH
- Responsible for generating the position & margin feeds of the ISTAR & DOLPHIN systems from ODH and sending them to TDB
- Understood and analysed new and changing business requirements for adding new source systems, and their impact on the Confidential design; proposed enhancements and changes to the technical and business solution to meet the new requirements
- Estimated efforts accurately and worked actively with PMs to complete the project plan
- Involved in discussions with Confidential SMEs to finalise the design for integrating new source systems into the Confidential platform; managed the design of the taxonomy logic for a few source systems
- Coordinated with the offshore team and clarified their design and requirement queries; worked with Tidal administrators to implement complex scheduling designs
- Conducted meetings with QA to demonstrate the design for each source system; provided support to QA for SIT releases; responded promptly to users during UAT
- Actively monitored releases into higher environments; participated in all production release calls and resolved issues that arose during releases
- Involved in overall estimation and planning for the project
- Actively participated in all design discussions and prepared design documents (HLD & LLD) for certain load stages; trained offshore team members on the Exceed product and auto insurance business knowledge
- Shared all design and transformation-rules documents with the offshore team and guided them in building the necessary components
- Worked with offshore team members to prepare coding standards, ETL specification, and test case document templates
- Provided Informatica/Oracle/Unix/PowerExchange technical consultation to offshore team members and helped resolve key technical issues
- Assisted project business analysts by providing key design and data-mapping inputs for documenting the FSD
- Involved in QA test plan and test case review meetings; worked with the release management team to migrate components to the QA environment for QA testing
- Provided the necessary technical assistance in fixing QA defects
- Led a team of 8 members offshore and 1 member onsite
- Managed the end-to-end delivery of the UVE, Offer Mailing & MI streams, from requirement analysis through build and test
- Led a team of 5 people onsite and 5 people offshore; involved in the high-level design and architecture of the operational data store TPDB and the calculation engine TVDB application using Informatica, Oracle PL/SQL, and Business Objects
- Analysed the source system thoroughly by reviewing the existing design and data model documents, queried the database to capture data quality issues, and prepared the source system analysis document
- Participated in business requirements workshops, gathered thorough knowledge of the requirements, and prepared the requirement analysis document
- Worked actively with the project manager, data modeller, and designers to come up with the build estimate
- Provided knowledge transfer to the offshore team by sharing and explaining the necessary project-related documents
- Reviewed low-level design and test case documents as well as Informatica, Oracle, and Unix components
- Assisted offshore team members in clarifying Informatica, PL/SQL, and shell script related queries
- Involved in support for link testing, system testing, and performance testing
- Created a few Business Objects reports while working in the MI stream
- Worked as the technical leader for the ADMINISTRATOR stream; led a team of 5 people offshore
- Involved in the design of the ADMINISTRATOR & UAPS streams
- Identified the list of components and performed effort estimation
- Prepared the source system analysis and requirement analysis documents
- Assisted offshore team members with link testing and regression testing
- Involved in support for system testing and performance testing
- Assisted team members with the following activities as part of a Trillium code modification:
- Creating a new Trillium project and modifying the existing converter driver file
- Creating a new converter input DDL file as per the definition
- Developed a generic Unix shell script to process the files produced from the mainframe and to run all the mappings used in this project
- Created complex Informatica mappings that read COBOL source files and load the data into XML files
- Designed XML schemas used to create XML Source Qualifier transformations
- Automated all ETL processes through the Maestro scheduler
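A generic driver script of the kind described above might look roughly like this. It is a hedged sketch, not the original code: the naming convention (a feed file <feed>.dat mapped to a workflow wf_<feed>), the directories, and the pmcmd parameters are all assumptions, and the pmcmd call is printed rather than executed.

```shell
#!/bin/sh
# Sketch of a generic mainframe-feed driver: one script serves every
# feed by deriving the workflow name from the file name, then archiving
# the file after hand-off. All names and paths are illustrative.
INDIR=/tmp/mf_in; ARCH=/tmp/mf_arch
mkdir -p "$INDIR" "$ARCH"
touch "$INDIR/claims.dat" "$INDIR/policy.dat"   # stand-in mainframe drops

for f in "$INDIR"/*.dat; do
  feed=$(basename "$f" .dat)
  # Dry run: in a live script this pmcmd command would be executed.
  echo "pmcmd startworkflow -sv INT_SVC -d DOM -f MAINFRAME -wait wf_$feed"
  mv "$f" "$ARCH/${feed}_$(date +%Y%m%d).dat"   # archive after hand-off
done
```

The convention-over-configuration approach is what makes the script generic: a new feed needs only a new workflow, not a new script.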
- Designed and developed the ETL layer using Informatica for extracting adviser-firm-related information to and from DPDB
- Created various triggers in the DPDB database to capture any changes made through the Dipas front end
- Analysed the existing Unix and PL/SQL code in order to replace it with Informatica mappings
- Created reusable transformations and mapplets, and used the shared folder concept with shortcuts wherever possible to avoid redundancy
- Reviewed mappings, sessions, and workflows and logged all review comments
- Prepared and reviewed LLD and test case documents; prepared test data for component testing
- Used the Debugger with breakpoints to monitor data movement and troubleshoot the mappings
- Contributed to the technical architecture and high-level design for data extraction, cleansing, and integration, including reusable frameworks for Change Data Capture, matching and merging, load batch management, and exception handling
- Extracted, transformed, and loaded data into the staging area and data warehouse (Oracle) using Informatica mappings containing complex transformations
- Created PL/SQL stored procedures to be used in Informatica mappings
- Developed Unix shell scripts to pre-process the files
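A pre-processing script of this kind might look like the minimal sketch below; the file names, feed layout (HDR/TRL records with digit-keyed detail rows), and validation rules are assumed for illustration only.

```shell
#!/bin/sh
# Minimal pre-processing sketch: strip carriage returns, drop blank
# lines, and reject an empty feed before hand-off to the ETL tool.
# The sample feed and its layout are illustrative assumptions.
IN=/tmp/feed_in.dat
OUT=/tmp/feed_ready.dat
printf 'HDR|20240101\r\n1|abc\r\n\r\n2|def\r\nTRL|2\r\n' > "$IN"  # sample feed

tr -d '\r' < "$IN" | grep -v '^$' > "$OUT"

rows=$(grep -c '^[0-9]' "$OUT")                # count detail rows
[ "$rows" -gt 0 ] || { echo "ERROR: empty feed" >&2; exit 1; }
echo "pre-processed $rows detail rows"         # prints: pre-processed 2 detail rows
```

Failing fast here keeps malformed or empty feeds from ever reaching the Informatica source directory.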
- Worked with Informatica PowerConnect (now called PowerExchange): created data maps for bulk extraction and Changed Data Capture (CDC) using the Detail Navigator for ADABAS sources and VSAM files, and tested the data maps using the row test feature