
Project Manager Resume Profile


USA

SUMMARY

  • Over 15 years of experience in the IT industry as an Architect and Project Lead in software analysis, design, development, maintenance, enhancement, testing, and production support of applications.
  • More than 15 years of ETL and data integration experience developing ETL mappings and scripts using Pentaho and Informatica Power Center 8.x/7.x/6.x/5.x.
  • Over 15 years of strong data warehousing experience using Informatica Power Center/Power Mart, Oracle 11g/10g/9i/8i/8.0/7.x, DB2, MS Access 7.0/97/2000, OLTP, OLAP, and Erwin 3.5.2/4.x.
  • More than 7 years of Business Objects reporting experience using XI R2 and XI R3.1, building complex universes and Webi reports.
  • Played a technical lead/manager role on multiple projects, managing both onshore and offshore resources and providing mentoring and guidance.
  • Designed and executed complex ETL processes using Informatica Power Center/Power Mart and Pentaho between Oracle, MS SQL Server, IDMS, and DB2 databases, flat files, and XML files.
  • Extensively worked on Dimensional modeling, Data migration, Data cleansing and Data Staging of operational sources using ETL processes and providing data mining features for data warehouses.
  • Expertise in OLAP and Reporting Systems using Crystal Reports, Cognos and Business Objects.
  • Strong working experience on Data Warehousing applications, directly responsible for the Extraction, Transformation and Loading of data from multiple sources into Data Warehouse.
  • Proven data design skills in data architecture, data modeling, and data analysis/definition.
  • Experience in ER and dimensional data modeling to deliver normalized ER and Star/Snowflake schemas.
  • Strong experience in Oracle 11g/10g/9i/8i/7.x database programming using PL/SQL stored procedures, functions, triggers, materialized views, cursors, and standard built-in Oracle packages in a UNIX environment.
  • Expert in UNIX shell scripting, using SED, GREP, and CUT utilities and job scheduling extensively (see the sketch following this summary).
  • Expertise in preparing report specifications and database designs to support reporting requirements.
  • Strong troubleshooting and problem-solving skills.
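
A minimal Korn shell sketch of the kind of file preparation described in the summary above; the paths, delimiter, and ACTIVE flag are hypothetical and for illustration only.

    #!/bin/ksh
    # Hypothetical example: clean a pipe-delimited extract before loading.
    SRC=/data/inbound/daily_extract.dat       # assumed input file
    OUT=/data/stage/daily_extract_clean.dat   # assumed output file

    # Drop the header row, strip carriage returns, keep only ACTIVE records,
    # and retain key columns 1, 3 and 5.
    sed '1d' "$SRC" | tr -d '\r' | grep '|ACTIVE|' | cut -d'|' -f1,3,5 > "$OUT"

    print "Cleaned $(wc -l < "$OUT") records into $OUT"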

TECHNICAL EXPERTISE:

  • Data Warehousing - ETL: Pentaho 4.2, Informatica Power Center/Power Mart 8.x/7.x/6.x/5.x, Informatica Power Exchange V5.x/8.x, Informatica Power Connect 4.1/5.1, MX2, Designer, and IDQ/IDE V8.x
  • Scripts: Korn-Shell, C-Shell
  • Programming Languages: COBOL II, JCL, REXX, PL/SQL, SQL, SAS
  • Tools: QMF, File-Aid, Abend-Aid, Playback, Hyperstation, Xpediter, InterTest, SPUFI, Endevor, Changeman, Easytrieve, DCLGEN, FTP, Strobe, Quality Center
  • Software: CICS, TSO/ISPF, IMS DB/DC, PL/I, MQ Series
  • Database: Oracle 11g/10g/9i/8i/8.0/7.0, DB2, DB2 UDB, VSAM, MS Access, SQL Server, Teradata V2R5/V2R4, Star Schema, Snowflake Schema, OLTP, Erwin 4.0/3.5.2/3.x
  • Reporting Tools: Business Objects 5.1/XI R2/XI R3, Cognos, Actuate, BEx Analyzer, and Web reporting
  • OS: OS/2, Windows, MVS/ESA, MVS/XA, OS/390, z/OS, HP-UX 11.0, Sun Solaris 5.8, IBM AIX
  • Industry Expertise: Financial, Health Care, Banking, and Insurance
  • ERP/SAP BW/BI: SAP BW/BI 3.5/BI 7.0, mySAP, SAP R/3

SYNOPSIS OF PROFESSIONAL EXPERIENCE:

Confidential

Senior Associate

PB LATAM Mosaic.

  • The SMART program delivers a more efficient sales reporting capability, including a database that provides timeliness, flexibility and granularity, accuracy and control, and sales metrics reporting and efficiency to Finance, Management, and the IM Sales Channels.
  • IM Sales Metrics and Reporting Tracker (SMART).
  • Created multiple Oracle 11g database views and stored procedures to expose the Pega tables to the Beacon datamart for reporting.
  • Created an ETL process, automated with Control-M, to extract a feed for the Beacon user certification process.
  • Developed a new universe and various Webi reports to support the Beacon Guidelines coding team.
  • Introduced agile methodologies to the team by assigning story points to every Jira item.
  • Played the Scrum Master role and ran the daily morning calls.
  • Managed a team of 8 onshore/offshore developers, providing direction and mentoring.
  • Created ad hoc Webi reports to support the operations team.
  • Maintained and enhanced the BO universe and Webi reports that feed the Ops Risk dashboard.
  • Beacon is the Guideline and Account Review workflow tool. A workflow is a sequence of procedural steps required to complete a request or case; each step is passed on to the appropriate team or role responsible for review and resolution of the case.
  • The Ops Risk Dashboard is a reporting platform that helps analyze and assess business processes to ensure their effectiveness and efficiency. The dashboard provides summary-level information on the following key operational risk components: CSA (Control Self Assessment), Errors (Risk Events), Business Resiliency, Privacy, AML/KYC (Anti-Money Laundering/Know Your Customer), NBIA (New Business Initiative Approval), IRC (Investment Review Committee), Control Heat Map, and Trending Heat Map.
  • AM Operational Risk reporting and Beacon.
  • Participated in the daily scrum call, provided daily status to the PMO/Scrum Master, and updated Jira and ALM with progress.
  • Played a key role in business discussions and design, providing input to enhance the Mosaic feed to support IPB Online enhancements.
  • Designed the reconciliation process in Pentaho to monitor breaks; Position and Transaction key fields are compared between the Mosaic extract and YMF (see the sketch following this list).
  • Managed 5 onshore/offshore developers and supported other projects: DOI, FRRS, and Counterparty feeds from YMF.
  • Picked up knowledge of the YMF Brazil accounting system and the Pentaho ETL tool used to extract from YMF, took over support of the project, closed all open defects, and delivered a successful release.
  • Joined the project at a crucial stage, after the developer who had worked on it left the firm with many open items and defects.
  • The purpose of this project is to land Brazil/Chile PB client data from the YMF system into Mosaic, allowing Brazil PB clients to view their account details, including positions, transactions, and balances, on IPB Online.
  • Joined the project during a crucial stage, ramped up quickly, and started coding the ETLs using Informatica Power Center 8.6.1.
  • Contributed improvements to the ETL standards and adopted a template to automate onboarding of all sources.
  • Developed a BO universe on XI R3.1 to slice and dice the data through ad hoc Webi reports.
  • Created multiple contexts in the universe to avoid looping.
  • Created multiple Webi reports to generate monthly, weekly, and daily war view reports.
  • Automated the BO reports using the BO SDK to display them in the front-end dashboard.
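
A simplified Korn shell sketch of the key-field comparison behind the Mosaic/YMF reconciliation noted above; the actual process was built in Pentaho, and the file names and key columns here are assumed for illustration.

    #!/bin/ksh
    # Compare position keys extracted from Mosaic and from YMF and
    # report the breaks on either side. Paths and key columns are assumed.
    MOSAIC=/data/recon/mosaic_positions.dat
    YMF=/data/recon/ymf_positions.dat

    # Build sorted key lists (account|security|position_date assumed in cols 1-3).
    cut -d'|' -f1-3 "$MOSAIC" | sort -u > /tmp/mosaic_keys.$$
    cut -d'|' -f1-3 "$YMF"    | sort -u > /tmp/ymf_keys.$$

    # Keys present in Mosaic but missing in YMF, and vice versa.
    comm -23 /tmp/mosaic_keys.$$ /tmp/ymf_keys.$$ > /tmp/breaks_mosaic_only.$$
    comm -13 /tmp/mosaic_keys.$$ /tmp/ymf_keys.$$ > /tmp/breaks_ymf_only.$$

    print "Mosaic-only breaks: $(wc -l < /tmp/breaks_mosaic_only.$$)"
    print "YMF-only breaks:    $(wc -l < /tmp/breaks_ymf_only.$$)"
    rm -f /tmp/mosaic_keys.$$ /tmp/ymf_keys.$$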

Pentaho, Informatica Power Center 8.6.1, Business Object XIr3.1, Oracle 11g, Sybase, SQL Server, Control M, Pega.6.x

Confidential

Lead Associate Business Intelligence

  • As a Lead Associate - Business Intelligence/Application Support in the Information Technology department, I provided support for Business Intelligence tools. The platform consists of Actuate, Informatica, Business Objects, Cognos, SAS, Composite, and Essbase/Hyperion.
  • Created complex mappings in Power Center Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
  • Responsible for migrating Informatica from V7.1 to V8.6 without an outage or impact to the business.
  • Managed a team of 10 onshore and offshore resources, mentoring them and providing direction to deliver projects on time and within the allocated budget.
  • Built the environments for Informatica Power Center/Power Exchange V8.6.1 and migrated all the WMUS applications from V7.1.1 without affecting the batch cycles.
  • Backed up and migrated the Informatica metadata repository from the shared Oracle instance to a dedicated instance using the UNIX command line.
  • Managed change control implementation and coordinated daily and monthly releases and reruns.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Automated the creation of project folders in Repository Manager and of groups/users with the proper privileges using shell scripting (see the sketch following this list).
  • Built the environments for Business Objects XI R2; migrated and converted about 170 reports from Crystal Reports 7 to Crystal Reports XI R2.
  • Created and maintained several custom reports for the client using Business Objects.
  • Installed and configured SAS 9.1.2 in the UNIX environment and configured the SAS metadata setup for SAS and Oracle database libraries.
  • Installed SAS/ACCESS to Oracle and Xythos 4.2 and configured them per the UBS standard.
  • Back-loaded one year of data into the SAS data mart using SAS Data Integration Studio ETL jobs.
  • Informatica Power Center V7.1.1/V8.6.1, Informatica Power Exchange V5.2/V7.1.1/V8.6.1, Business Objects XI R2/XI R3, SAS 9.1.2, Actuate V9, Essbase V9, UNIX Sun Solaris 10, Oracle 10g, SQL Server, Autosys.
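
A simplified Korn shell sketch of the kind of repository-setup automation noted above, using pmrep; the repository, domain, and folder names are placeholders, and the pmrep options shown are from memory and should be verified against the installed Informatica version.

    #!/bin/ksh
    # Create project folders listed in folders.lst (one name per line, assumed input).
    REPO=REP_DEV                       # assumed repository name
    DOMAIN=Domain_Dev                  # assumed domain name
    PMUSER=Administrator
    PMPASS=$(cat $HOME/.pmrep_pwd)     # assumed password file

    # Connect once, then create each folder.
    pmrep connect -r "$REPO" -d "$DOMAIN" -n "$PMUSER" -x "$PMPASS" || exit 1

    while read FOLDER
    do
        pmrep createfolder -n "$FOLDER"
        print "Requested creation of folder $FOLDER"
    done < folders.lst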

Confidential

Technical Consultant Lead.

About the Project:

The DIS vendor-based application performs commission calculation and distributes money via the Pay Period process. Producer, Contract, and Hierarchical data maintained in DIS are back-bridged both to legacy databases and to a large DB2 data delivery warehouse. This project includes developing a data warehouse from different data feeds and other operational data sources.

Responsibilities:

  • Played a major role in the Informatica upgrade to V8.5.1.
  • Analyzed the functional specs provided by the data architect and created technical spec documents for all the mappings.
  • Played a key role in designing the application that migrates existing data from relational sources to the corporate warehouse effectively using Informatica Power Center.
  • Documented metadata definitions.
  • Extensively used joins, triggers, stored procedures, and functions when interacting with the backend database using PL/SQL.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables efficiently.
  • Created mapplets and used them in different mappings.
  • Tested all the applications, transported the data to the target warehouse Oracle tables, scheduled and ran the extraction and load processes, and monitored sessions and batches using Informatica Server Manager.
  • Extensively used SQL*Loader and Informatica to extract, transform, and load data from MS SQL Server, flat files, and Oracle into Oracle.
  • Embedded PL/SQL and SQL*Plus were also used for several data extraction purposes.
  • Developed complex ETL (Extract/Transform/Load) procedures using SQL Server DTS and T-SQL packages to update/insert/delete data from multiple heterogeneous sources.
  • Tested all the business application rules with test/live data and automated as well as monitored the jobs using Informatica Server Manager.
  • Checked and tuned application performance.
  • Created UNIX shell scripts to automate the generation of parameter files for the various mappings and sessions and to execute SQL/PL-SQL commands that populate the week-end/month-end dates in the file (see the sketch following this list).
  • Analyzed various defects and provided the specs on the fixes or possible data issues.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Used the Debugger to troubleshoot the mappings.
  • Worked on evaluation of metadata interchange between the modeling tool (Erwin 4.1) and the ETL tool (Informatica).
  • Wrote UNIX shell scripts to move data from the various source systems to the data warehouse systems.
  • Wrote Maestro scripts to schedule the different workflows.
  • Informatica Power Center 8.5.1/7.1.5, Data Junction, Teradata V2R5, Business Objects 5.0/5i, Oracle 10g, SQL Navigator, PL/SQL, SQL*Loader, Windows NT, SQL, Korn-Shell, DB2, Tivoli Scheduler, Erwin 4.1, HP-UX, AIX 4.3.3, Shell Scripting.
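
A simplified Korn shell sketch of the parameter-file generation noted above; the folder, workflow, session, and connection names and the Oracle credentials are placeholders for illustration.

    #!/bin/ksh
    # Generate an Informatica parameter file with a month-end date pulled from Oracle.
    PARMFILE=/infa/parms/wf_load_sales.parm

    # Fetch the last day of the previous month via SQL*Plus.
    MONTH_END=$(sqlplus -s dw_user/dw_pass@DWDB <<'EOF'
    set heading off feedback off pagesize 0
    select to_char(last_day(add_months(sysdate,-1)),'YYYY-MM-DD') from dual;
    exit
    EOF
    )

    # Write the parameter file in the standard [Folder.WF:workflow.ST:session] layout.
    cat > "$PARMFILE" <<EOF
    [DW_SALES.WF:wf_load_sales.ST:s_m_load_sales_fact]
    \$\$MONTH_END_DATE=$MONTH_END
    \$DBConnection_Source=ORA_SRC
    \$DBConnection_Target=ORA_DW
    EOF

    print "Parameter file written to $PARMFILE"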

Confidential

After gaining very good experience at TCS, I joined Accenture. During my tenure with them I worked on a couple of projects.

Confidential

Responsibilities:

  • Joined the project after it had fallen behind schedule; took a problematic situation and produced a successful outcome despite an aggressive implementation schedule.
  • Involved in data modeling and design of the data warehouse using a star schema methodology with conformed, granular dimensions and fact tables.
  • Worked with the Informatica Power Center tools: Source Analyzer, Warehouse Designer, Mapping/Mapplet Designer, and Transformation Developer.
  • Created multiple Type 2 mappings for both Dimension as well as Fact tables.
  • Worked with the ETL tool Informatica Power Center 6.2.
  • Created, updated and maintained ETL technical documentation.
  • Created Type 1 and Type 3 mappings for isolation tables, which serve as the sources for the actual datamart tables.
  • Created Informatica mappings to load data using transformations such as Source Qualifier, Sorter, Aggregator, Expression, Joiner, Connected and Unconnected Lookup, Filter, Sequence Generator, Router, and Update Strategy.
  • Improved the performance on the Aggregate Layer mappings by introducing Temp tables and Flat Files.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables efficiently.
  • Wrote UNIX shell scripts to move the data from all systems to the data warehousing system; the data was standardized to store the various business units in tables.
  • Defined the target load order plan and constraint-based loading to load data correctly into the different target tables.
  • Created special data-loading techniques using the Informatica external loader and Teradata tools, providing efficient load times.
  • Used the pmcmd command to run Informatica workflows from the back end (see the sketch following this list).
  • Played a key role on the production support team, handling object migrations and workflow scheduling.
  • Used the Debugger extensively to test the mappings and fix bugs.
  • Informatica Power Center 6.2 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager, Workflow Monitor), Oracle 8i, SQL, PL/SQL, SQL*Loader, UNIX AIX, Korn-Shell, HP-UX, UNIX scripting, Visio, DB2, TOAD, Erwin 3.5/4.1.
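
A simplified Korn shell sketch of starting a workflow from the back end with pmcmd, as noted above; the service, folder, and workflow names are placeholders, and the flags follow the later service-based pmcmd syntax, so treat them as indicative only.

    #!/bin/ksh
    # Kick off a workflow and wait for it to finish, then check the return code.
    # PM_USER and PM_PASS are assumed to be exported in the environment.
    INTSVC=INT_SVC_DEV
    DOMAIN=Domain_Dev
    FOLDER=DW_SALES
    WORKFLOW=wf_load_sales

    pmcmd startworkflow -sv "$INTSVC" -d "$DOMAIN" \
          -u "$PM_USER" -p "$PM_PASS" \
          -f "$FOLDER" -wait "$WORKFLOW"
    RC=$?

    if [ $RC -ne 0 ]; then
        print "Workflow $WORKFLOW failed with return code $RC" >&2
        exit $RC
    fi
    print "Workflow $WORKFLOW completed successfully"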

Confidential

Responsibilities:

  • Coded most of the impacted COBOL/DB2 programs according to the standard guidelines.
  • Extensively used Easytrieve and REXX for conversion data analysis, and SPUFI, QMF, BMC, and DB2 utilities for DB2 database conversion activities.
  • Executed unit and regression testing and recorded test results.
  • Coded and maintained SAS programs that send weekly reports.
  • Resolved issues and ad hoc requests.

OS/390, VS COBOL, CICS, MQ Series, Changeman, SPUFI, QMF, File-Aid, Comparex, SAS, VSAM, Visio, MS Word, MS Excel, DB2, SQL, Easytrieve, HTML, ASP, JSP, JavaScript, Perl, VBScript, Oracle

Confidential

TCS is one of the top consulting companies, and during my tenure with them I worked on various projects using different technologies such as mainframe and Informatica Power Center.

Confidential

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL specifications.
  • Extensively interacted with users; involved in requirements gathering and prototyping, and prepared various documents such as the Interface Requirements Document, Customer Requirements Document, integration test plan, unit test plan, and release notes.
  • Designed and developed the ETL Mappings for the source systems data extractions, data transformations, data staging, movement and aggregation.
  • Worked on Informatica Power Center 5.1: used the Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer, Mapplet Designer, and Transformation Developer to map the sources to the targets.
  • Developed standard and re-usable mappings and mapplets using various transformations like expression, aggregator, joiner, source qualifier, router, lookup, and filter.
  • Used DB2 and BMC utilities to load, backup, copy, recover, reorganize and unload DB2 databases.
  • Responsible for maintenance and enhancement of various software applications using JCL, COBOL, CICS, DB2, DB2 stored procedures, and Endevor, and of web applications using HTML, JavaScript, JSP, and Perl.
  • Informatica Power Center 5.1 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager, Workflow Monitor), SQL, PL/SQL, DB2, Oracle 8i, TOAD, UNIX, ASP, and Perl.
