
Data Warehousing Resume Profile


Profile:

  • 13 years of IT experience in total
  • 12 years of extensive experience in designing, developing and implementing data warehouse projects using Informatica, Oracle and Unix shell scripting
  • 1 year of Informatica Administration and MDM experience
  • 1 year of BI consulting experience in Financial services sector
  • 2 years of Master Data Management consulting experience in Insurance Services sector
  • Good experience in planning and effort estimation
  • Expertise in business requirement gathering and converting requirements into Technical Specification documents (HLD and LLD)
  • Excellent interpersonal and communication skills; a quick learner and team player

Technical Skills:

  • Data Warehousing:

Informatica PowerCenter 9.5.1/9.0.1/8.6.1/8.1.1/7.1.3/6.0/5.1, Informatica PowerExchange, Business Objects XI, Trillium, Informatica MDM Multidomain Edition 9.5.1

  • Database:

Oracle Exadata/10g/9i/8i, DB2, SQL Server, Salesforce.com, Postgres

  • Languages:

Unix shell scripting, SQL, PL/SQL

  • Operating Systems:

HP-UX 11i, IBM AIX 5.1/5.2/5.3

  • Scheduling Tools:

Tidal, Unicenter Autosys, Maestro Scheduler

Experience Summary:

  • Senior ETL Consultant in New York Technology Partners. Worked in the following client places during the tenure with NYTP:
  • Senior ETL Consultant in Teach for America, NYC, US from Mar 2012 till date
  • Technical Manager in Headstrong Services LLC from Feb 2011 to Feb 2012. Worked in the following client places during the tenure with Headstrong:
  • ETL Lead in MF Global, NYC, US from Feb 2011 to Nov 2011
  • ETL Lead in Barclays Capital, Jersey City, US from Jan 2012 to Mar 2012
  • ETL Architect in Wipro Technologies from Jan 2003 to Feb 2011. Worked in the following client places during the tenure with Wipro:
  • Technical Leader in Prudential, Reading, UK from Aug 2004 to Aug 2008
  • ETL Architect in The Hartford, Connecticut, US from Sep 2008 to Feb 2011
  • Associate Consultant in Mascot Systems Ltd from Mar 2001 to Jan 2003

Professional Experience:

Confidential

Responsibilities:

  • Acted as the technical lead for the ETL team, comprising both internal and contract developers, creating and maintaining business intelligence and data warehousing design principles using industry-leading practices
  • Provided technical leadership and guidance to the development team for the design and development of highly complex or critical ETL architecture, following leading industry practices
  • Collaborated with project, architecture and release teams, providing input on architectural design recommendations, driving standards, and planning and executing an effective transition to production operations
  • Studied the existing source systems, analysed the business requirements and prepared ETL specifications
  • Explored the Salesforce.com application and created POC mappings to verify Salesforce integration using Informatica
  • Designed an ETL framework to integrate data from various source systems such as Postgres, DB2 and flat files into Salesforce.com
  • Designed and implemented the following concepts:
  • Automatic reprocessing of Salesforce rejections
  • Data Threshold governance
  • Table-driven parallelization (a minimal control-table sketch appears after this list)
  • Configured a Web Services Consumer transformation to read data from the Workday system
  • Analyzed change requests and defects; conducted meetings with BAs, DBAs and tech leads to finalize the design and implementation
  • Hands-on experience in Informatica PowerCenter administration
  • Installed Informatica 9.0.1 on the Unix platform
  • Experience with Informatica PowerExchange, the pmcmd command-line interface, and security, including native and LDAP security
  • Installed and configured the LDAP and Salesforce web-service plug-ins with PowerCenter
  • Extensively worked on the PowerCenter upgrade from 9.0.1 to 9.5.1 HF3
  • Created deployment groups for code promotion
  • Performed activities such as user creation and recycling/disabling services through the Informatica Admin console
  • Hands-on experience working in Informatica MDM
  • Created data model elements and defined relationships and lookups within the data model using the Hub Console Schema tool
  • Configured mappings that use functions and cleanse lists, and set options for delta detection and raw data retention using the Mapping tool
  • Configured exact-match, fuzzy-match and merge processes using the Schema tool
  • Configured batch jobs to execute the stage, load, match and merge processes
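
As an illustration of the table-driven parallelization concept referenced above, the sketch below shows one minimal way such a framework can be driven from a control table. The table name etl_load_control, its columns and the bind variable :p_thread_id are hypothetical names used only for illustration; the actual framework design is not described in this resume.

    -- Hypothetical control table: each source object is assigned to a thread,
    -- and a given parallel session processes only the objects in its slot.
    CREATE TABLE etl_load_control (
        source_name   VARCHAR2(50)  NOT NULL,  -- e.g. 'POSTGRES', 'DB2', 'FILE'
        object_name   VARCHAR2(100) NOT NULL,  -- table or file to be loaded
        thread_id     NUMBER(2)     NOT NULL,  -- parallel slot (1..N)
        active_flag   CHAR(1) DEFAULT 'Y',     -- switch an object on or off
        last_run_date DATE
    );

    -- Each parallel session reads only its own work list, so the degree of
    -- parallelism is changed by updating rows rather than changing code.
    SELECT object_name
      FROM etl_load_control
     WHERE thread_id   = :p_thread_id
       AND active_flag = 'Y'
     ORDER BY object_name;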

Confidential

The Basel Committee on Banking Supervision has issued a new regulatory framework, known as Basel III, to replace the existing Basel II regulations. As part of the Basel III requirements, Eagle needs to source non-netted or gross long/short position data, or netted position data in the case of an automatic 'close out' or 'tear up' in the trade settlement process, along with margin data, from upstream systems for all centrally cleared products. TDB has been identified as one of the main upstream source systems feeding this data to Eagle. ODH is a strategic data store for exchange traded products and will be the single point of integration for exchange traded product data extraction. For the Basel III CCP project, ODH is responsible for providing GMI, all RANSYS regional instances (excluding the India business), ISTAR and DOLPHIN futures and options position and margin data to TDB, in order to meet the Basel III requirements.

Responsibilities:

  • Analysed the ESM CHORUS Feed thoroughly before loading the data into ODH
  • Responsible for loading static data from ESM CHORUS into ODH, which then allows ODH to do the necessary data mapping
  • Actively participated in the analysis of the ISTAR and DOLPHIN data feeds via ODH
  • Responsible for generating the position and margin feeds of the ISTAR and DOLPHIN systems from ODH and sending them to TDB

Confidential

Core Foundation Services (CFS) provides a single, global, consolidated source for MF Global business-critical data, enabling cross-system, cross-region reports with roll-ups, drill-downs and historical analysis. Currently CFS manages roughly 14 source systems. CAFE and Taxonomy are CFS enhancement projects: CAFE aims to add new source systems into CFS, and Taxonomy aims to implement the business taxonomy for each of the source systems available in CFS.

Responsibilities:

  • Understanding and analyzing new and changing business requirements for adding new source systems and their impact on the CFS design. Proposing enhancements and changes to the technical and business solution to meet the new requirements.
  • Estimated efforts accurately and worked actively with PMs to complete the project plan
  • Involved in discussions with CFS SMEs to finalise the design for integrating the new source systems into the CFS platform. Managed the design of the taxonomy logic for a few source systems
  • Coordinated with the offshore team and ensured their design and requirement queries were clarified. Worked with Tidal administrators to implement some complicated scheduling designs
  • Conducted meetings with QA to demonstrate the design for each source system. Provided support to QA for SIT releases and responded promptly to users during UAT
  • Actively monitored the releases into higher environments. Participated in all production release calls and cleared the issues that arose during the releases

Confidential

Responsibilities:

  • Involved in overall estimation and planning for the Project.
  • Actively participated in all design discussions and prepared design documents (HLD and LLD) for certain load stages. Trained offshore team members on the Exceed product and auto insurance business knowledge
  • Shared the design and transformation rules documents with the offshore team and guided them in building the necessary components. Worked with offshore team members to prepare coding standards and templates for ETL specification and test case documents. Provided Informatica/Oracle/Unix/PowerExchange technical consultation to offshore team members and helped resolve key technical issues
  • Assisted the project business analysts by providing key design and data mapping inputs for documenting the FSD. Involved in QA test plan and test case review meetings. Worked with the Release Management team to migrate the components to the QA environment for QA testing. Provided the necessary technical assistance in fixing QA defects
  • Led a team of 8 members offshore and 1 member onsite

Confidential

Responsibilities:

  • Managed the end-to-end delivery of the UVE, Offer Mailing and MI streams, from requirements analysis to build and test
  • Led a team of 5 people onsite and 5 people offshore. Involved in the high-level design and architecture of the operational data store (TPDB) and the calculation engine (TVDB) application using Informatica, Oracle PL/SQL and Business Objects
  • Analysed the source system thoroughly by going through the existing design and data model documents and querying the database to capture data quality issues, and prepared the Source System Analysis document
  • Participated in the business requirements workshop, gathered thorough knowledge of the requirements and then prepared the Requirement Analysis document
  • Actively worked with the project manager, data modeller and designers to come up with the build estimate
  • Provided knowledge transfer to the offshore team by sharing and explaining the necessary project-related documents
  • Reviewed low-level design and test case documents as well as Informatica, Oracle and Unix components
  • Assisted offshore team members in clarifying any Informatica, PL/SQL and shell script related queries
  • Involved in the support for link testing, system testing and performance testing
  • Created a few Business Objects reports while working in the MI stream

Confidential

Responsibilities:

  • Worked as the technical leader for the ADMINISTRATOR stream. Led a team of 5 people offshore.
  • Involved in the design of the ADMINISTRATOR and UAPS streams.
  • Identified the list of components and performed effort estimation.
  • Prepared the Source System Analysis and Requirement Analysis documents.
  • Assisted the offshore team members with link testing and regression testing.
  • Involved in the support for system testing and performance testing.
  • Assisted the team members in performing the following activities as part of a Trillium code modification:
  • Creating a new Trillium project and modifying the existing Converter driver file.
  • Creating a new Converter input DDL file as per the definition.

Confidential

Responsibilities:

  • Developed a generic Unix shell script to process the files produced by the mainframe and to run all the mappings used in this project.
  • Created complex Informatica mappings that read COBOL source files and load the data into XML files.
  • Designed XML schemas, which were used to create XML Source Qualifier transformations.
  • Automated all ETL processes through Maestro Scheduler.

Confidential

Responsibilities:

  • Designed and developed the ETL layer using Informatica for extracting adviser firm related information to and from DPDB.
  • Created various triggers in the DPDB database to capture any changes made through the Dipas front end (a minimal trigger sketch follows this list).
  • Analysed the existing Unix and PL/SQL code in order to replace it with Informatica mappings.
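
A minimal sketch of the kind of change-capture trigger described above is shown below; the table names adviser_firm and adviser_firm_audit and their columns are hypothetical stand-ins for the actual DPDB objects, which are not named in this resume.

    -- Audit trigger: record every insert, update or delete made through the front end
    CREATE OR REPLACE TRIGGER trg_adviser_firm_audit
    AFTER INSERT OR UPDATE OR DELETE ON adviser_firm
    FOR EACH ROW
    DECLARE
        v_action CHAR(1);
    BEGIN
        -- Identify which DML operation fired the trigger
        IF INSERTING THEN
            v_action := 'I';
        ELSIF UPDATING THEN
            v_action := 'U';
        ELSE
            v_action := 'D';
        END IF;

        -- Write a row to the audit table so downstream ETL can pick up the change
        INSERT INTO adviser_firm_audit (firm_id, action_cd, changed_by, changed_on)
        VALUES (NVL(:NEW.firm_id, :OLD.firm_id), v_action, USER, SYSDATE);
    END;
    /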

Confidential

Responsibilities:

  • Created Reusable Transformations, Mapplets, and made use of the Shared Folder Concept using shortcuts wherever possible to avoid redundancy.
  • Reviewed Mappings, Sessions and Workflows and logged all review comments.
  • Prepared and reviewed LLD and test case documents. Prepared test data for component testing.
  • Used Debugger by making use of Breakpoints to monitor data movement and troubleshoot the mappings.

Confidential

Responsibilities:

  • Contributed to the technical architecture and high-level design for data extraction, cleansing and integration, including reusable frameworks for change data capture, matching and merging, load batch management and exception handling.
  • Extracted, transformed and loaded data into the staging area and the Oracle data warehouse using Informatica mappings containing complex transformations.
  • Created PL/SQL stored procedures to be used in Informatica mappings (a minimal sketch follows this list).
  • Developed Unix shell scripts to pre-process the files.
  • Worked with Informatica PowerConnect (now called PowerExchange). Created data maps for bulk extraction and change data capture (CDC) using the Detail Navigator for ADABAS sources and VSAM files. Tested the data maps using the row test feature.
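
The sketch below illustrates the kind of PL/SQL stored procedure that can be called from an Informatica Stored Procedure transformation, as referenced above; the procedure, table and parameter names are hypothetical and do not reflect the actual project code.

    -- Row-level lookup procedure: the mapping passes an ID in and gets a status back
    CREATE OR REPLACE PROCEDURE get_policy_status (
        p_policy_id IN  NUMBER,
        p_status    OUT VARCHAR2
    ) AS
    BEGIN
        SELECT status_cd
          INTO p_status
          FROM policy
         WHERE policy_id = p_policy_id;
    EXCEPTION
        WHEN NO_DATA_FOUND THEN
            -- Return a default so the mapping can route the row to an error flow
            p_status := 'UNKNOWN';
    END get_policy_status;
    /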
