Team Lead Resume Profile
Professional Synopsis:
- 10 years of total experience managing and administering Informatica and Cognos Finance, involved in developing, deploying and troubleshooting Informatica and Cognos Finance applications. Also experienced in the design, development and support of Business Intelligence solutions, with Oracle as the primary skill and Business Objects and Cognos as secondary skills.
- Expertise with technologies including Informatica 9.5/9.1/8.x/7.1, SQL, PL/SQL, Oracle 9i/10g/11g, DB2 and Netezza.
- Experience in Application Design, Data Extraction, Data Acquisition, Data Mining, Development, Implementations and Testing of Data Warehousing and Database Business Systems.
- Good knowledge in Data Modeling using data modeling tools, Star Schema/Snow Flake Schema, FACT and Dimensions tables, Physical and Logical data modeling.
- Experience in Integration of various data sources like Oracle, SQL Server, Fixed Width and Delimited Flat Files, DB2, COBOL files and XML Files.
- Excellent understanding of Project Life Cycle and gathering requirements for ETL Development.
- Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Power Exchange, Power Mart, Power Analyzer, and Power Connect.
- Strong Data Warehousing ETL experience using Informatica Power Center 9.1/8.6/8.1/7.1.1.
- Extensively worked on Informatica Designer components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
- Strong experience on Workflow Manager Tools - Task Developer, Workflow and Worklet Designer.
- Hands-on experience building mappings with varied transformation logic, including Connected and Unconnected Lookups, Router, Aggregator, Joiner, Update Strategy, Java Transformations and Reusable Transformations.
- Created ETL mappings using Informatica Power Center to move data from multiple sources, such as Flat Files and Oracle, into common target areas such as Data Marts and the Data Warehouse.
- Extracted data from multiple operational sources for loading the staging area, Data Warehouse and Data Marts using CDC/SCD Type 2 loads.
- Good conceptual understanding and hands-on handling of the Repository Manager, Designer and Informatica Server Manager.
- Involved in the data analysis for source and target systems and good understanding of Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema.
- Extensive knowledge of Data Quality using Informatica.
- Involved in Data Profiling using Data Explorer.
- Experience in PL/SQL programming (Stored Procedures, Triggers, and Packages) using Oracle.
- Good knowledge of Transact-SQL (DDL, DML).
- Excellent analytical and logical programming skills, a good understanding at the conceptual level, and strong presentation and interpersonal skills, with a strong desire to achieve specified goals.
- Have clear understanding of Business Intelligence and Data Warehousing Concepts with emphasis on ETL and SDLC, quality analysis, change management, compliance and disaster recovery.
- Design, development and coding with Oracle 10g/9.2/8.x; experienced in writing complex queries, functions and cursors using PL/SQL programming.
- Design, development and coding with DB2 10.2.
- Coding with Netezza database 7.0.
- Strong experience on RDBMS concepts.
- Excellent understanding of the Oracle Architecture and Database Design.
- Expertise in loading data from Legacy Systems into Oracle databases using SQL Loader.
- Collaborated in developing ETL jobs using Data Integrator.
- Extensively worked on PL/SQL Object Types, Autonomous Transaction and Table Partitioning.
- Excellent knowledge of system health reviews, capacity planning, disaster recovery planning, etc.
- Worked on the upgrade of all GE Energy applications from Informatica 7.1 to 8.1 to 8.6.
- Highly experienced in preparing Project Estimates and Project Plans.
- Proficient in supporting Cognos reports.
- Hands on experience with designing and making changes to the BO Universe.
- Develop BO Reports.
- Preparation of documentation for BO Reports and Universes.
- Working knowledge of UNIX, Windows and UNIX Shell Scripting.
- Experience interacting with business users to analyze business process requirements and transform them into documented, designed and rolled-out deliverables.
- Excellent communication and social skills.
Roles Responsibilities:
- Requirements gathering and analysis, preparing design documentation, design reviews, development, testing and deployment of application enhancements, and project planning.
- Develop and execute change enhancements based on requirements, resolve existing problems, and improve application stability.
- Fix all issues and bugs that come up during the production phase. This may include helping end users understand the application and setting up the system environment to get the application up and running for them.
- Deploying the application in the test environment and initiating User Acceptance Testing (UAT). This includes fixing data variance issues and technical errors that come up during this phase.
- Regular interaction with GE Energy's Finance Analysts, UNIX Administrators, Database Administrators, Network Administrators and Application Users, all belonging to different departments and divisions of GE Energy, to ensure the smooth running of the applications.
- Prepare the cost estimates and get them approved by GE business for new on-boarding and enhancements.
- Parallel working on Support projects.
- Lead a team of 20 members across different locations, including the US, India, China and Malaysia.
- Actively participated in various Data Warehousing training programs and imparted training to new joiners.
- Work closely with GE IM business owners across different businesses.
- Point of Contact to auditing team for GE Energy Services business.
- Work with HR team and get resources mapped to the business unit.
- Pull monthly ticket trends across technologies and review them with the business.
Technical Experience:
- Define and maintain input forms, reports and the system hierarchy to support closing and estimate processes.
- Administration for Corporate reporting; implement requirements identified by the FP&A team.
- Assist in validation of field inputs for GE Energy consolidation reporting.
- System Health reviews, capacity planning, disaster recovery planning.
- Support different business cycles: Quarter Close, Estimate, Op Plan, Growth Play Book.
- Define and maintain submissions required for financial reporting.
- Worked on the upgrade of all GE ES business applications from Informatica 8.1 to 8.6.
- Tuned queries to reduce long-running session times.
- Extensively worked on Power Center Designer (Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer and Mapplet Designer).
- Currently undergoing training in Informatica with SAP as the source system, and building Cognos reports.
- Hands-on experience with the Precise tool to track the behavior of Informatica loads and queries at the database level.
- Worked on a critical development project using a variety of transformations.
- Supporting Cognos reports.
- Designing and making changes to the BO Universe.
- Developing BO Reports.
- Preparation of documentation for BO Reports and Universes.
- Involved in all phases of the SDLC (Software Development Life Cycle), from analysis, design, development and testing through implementation and maintenance, with timely delivery against aggressive deadlines.
Technical Skills:
Recognitions:
- Spotlight award for completing the most critical ES BI Consolidation project within the deadline, which helped keep the renewal of GAMS E1 under consideration.
- Pat On Back (POB) award from GE management for successfully executing projects.
- Associate of the Month (Mar-09, Aug-10, Jun-11, Feb-12), awarded by the MSat HR team.
Project Profiles:
Confidential
Role: Technical Lead
Modules: Customer Services
Project Description:
This project converts the database from DB2 to Netezza for all Stage and Data Warehouse tables while keeping the Data Mart tables in DB2.
Contribution:
- Worked on creating tables in the Netezza environment.
- Modified queries from DB2 to Netezza in Informatica.
- Tuned mappings and reduced the ETL load time window.
- Re-designed ETL logic for SCD1/SCD2 mappings.
- Resolved multibyte character issues in Netezza.
- Resolved various run-time issues.
- Unit and integration testing.
- Was part of all implementation and roll-back plans.
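The SCD1/SCD2 re-design mentioned above can be illustrated with a minimal sketch of the Type 2 pattern: when a tracked attribute changes, the current dimension row is expired and a new current version is inserted. This is not the project's actual ETL code; the function, column names (`eff_date`, `end_date`, `is_current`) and in-memory rows are hypothetical stand-ins for the real Netezza tables.

```python
from datetime import date

def apply_scd2(dim_rows, incoming, key, tracked_cols, today=None):
    """Minimal SCD Type 2 merge over lists of dict rows (illustrative only):
    expire changed rows, insert new versions, leave unchanged rows alone."""
    today = today or date.today()
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    for src in incoming:
        cur = current.get(src[key])
        if cur is None:
            # brand-new key: insert as the current version
            dim_rows.append({**src, "eff_date": today,
                             "end_date": None, "is_current": True})
        elif any(cur[c] != src[c] for c in tracked_cols):
            # a tracked attribute changed: expire the old row, add a new one
            cur["end_date"] = today
            cur["is_current"] = False
            dim_rows.append({**src, "eff_date": today,
                             "end_date": None, "is_current": True})
        # rows with no tracked change are left untouched (SCD1 would
        # instead overwrite the attribute in place, with no history)
    return dim_rows
```

In a real warehouse the same comparison is typically done set-based in SQL (or via Informatica Update Strategy transformations) rather than row by row in application code.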
Confidential
Role: Team Lead
Modules: Parts
Project Description:
The major goal of this project is to improve the design of the PARTS data warehouse in an effort to lower the cost of ownership and improve performance and availability. It involves designing performance improvements for the Extraction, Transformation and Load processes and for the data model, e.g. Staging, Fact-Dimension tables, etc.
Contribution:
- Implemented a common staging area per source system.
- Designed the incremental load strategy for daily loads.
- Tuned mappings and reduced the ETL load time window.
- Designed ETL logic for mappings.
- Implemented a parallel loading mechanism.
- Implemented a single workflow for both daily and weekly loads.
- Optimized the dimension structure and created the conformed dimensions.
- Extensively worked on Power Center Designer (Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer and Mapplet Designer).
- Created, built and scheduled Batches and Sessions using the Server Manager for better maintenance of mappings.
- Developed reusable Transformations and Aggregations, and created target mappings that contain business rules.
- Optimized/tuned mappings for better load performance and efficiency.
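An incremental load strategy for daily loads, as mentioned above, is commonly driven by a last-modified watermark: each run extracts only rows changed since the previous run and records a new high-water mark. A minimal sketch, with a hypothetical `updated_at` column standing in for whatever change-tracking column the real source exposes:

```python
def incremental_extract(source_rows, last_watermark):
    """Select only rows changed since the previous run, and return the
    new watermark so the next run picks up where this one stopped."""
    delta = [r for r in source_rows if r["updated_at"] > last_watermark]
    # if nothing changed, carry the old watermark forward unchanged
    new_watermark = max((r["updated_at"] for r in delta), default=last_watermark)
    return delta, new_watermark
```

The watermark would normally be persisted in a control table between runs; in Informatica the equivalent is typically a mapping parameter or variable refreshed at the end of each session.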
Confidential
Role: Team Lead
Modules: GEE HFT RRI
Environment: Windows Professional, Windows Server 2003; Languages: NA
Project Description:
- Confidential is the primary Financial Planning & Analysis reporting tool for consolidating GE Energy business results, plans, and forecasts and submitting the consolidated data to GE Corporate.
- The Cognos system currently holds 4 on-going production instances: GE Energy Financial Planning & Analysis (GEE FP&A), the Regional Reporting Instance (RRI) and the Headquarters Financial Tracker (HFT). The GEE FP&A instance is the primary instance, used to supply general ledger balances to the GE Corporate MARS system for financial statement roll-up and consolidation associated with the quarterly closing process. It is also the repository for planning data. The yearly Op Plan is created in Cognos and eventually uploaded to MARS, the corporate consolidation system; the Op Plan, or budget, is due in January every year. In addition to the yearly Op Plan, a number of other estimates/forecasts are created in Cognos and eventually interfaced to MARS. Satyam supports the finance applications and the relevant systems using an onsite/offshore support team.
Contribution:
- Create and maintain submissions for business cycles: Actuals, Estimate, Op Plan, GPB.
- Submitting the numbers to corporate.
- Re-org changes.
- Structure analysis on a monthly basis.
- Loading the data from the Ledger to Cognos.
- Unit testing / integration testing.
- Resolving different kinds of issues.
- Create new Reports and Input forms and maintain them.
Confidential
Role: Team Lead
Modules: All Energy Datamarts
Project Description:
- Confidential is a project to upgrade all Energy Services Data Marts and consolidate them into a single repository across all three environments (Dev/QA/Prod):
- Oracle: 10g to 11g
- Informatica: 8.1 to 8.6
- Cognos: 8.1 to 8.4
Contribution:
- All DB/INFA validations.
- Unit testing / integration testing.
- Resolving different kinds of run-time issues.
- Updating the daily status report across the business.
- Implemented the shutdown plan in the 8.1 prod environment.
Confidential
Role: Team Lead
Modules: ESGAP
Project Synopsis
The Chart of Accounts (Key Accounting Flexfield) segment hierarchies created in the Energy General Ledger (GL) will be used for creating financial reports in the Data Warehouse systems for Energy Services P&Ls (Instance Charlie, Galaxy, etc.). To achieve this, the GL hierarchies will be transferred from Energy GL to the P&L instances, and thereafter Jaros will pull the GL hierarchy details from the P&L instances. This covers the functional design for extracting GL hierarchies from Energy GL and loading them into custom tables in the P&L instances.
The interface program is designed and built to:
1. Extract all GL AFF segment values and hierarchies from Energy GL.
2. Load the extracted data initially into staging tables and then into custom tables in the P&L instance.
3. Send notifications in case of errors in transmission from Energy GL to the P&L instance.
Contribution:
- As a team member, was responsible for:
- Analysis of the BRDs provided by my IM.
- Creation of mappings per the business logic.
- Documentation.
- Monitoring the Trigger Table in the Energy GL instance at regular 30-minute intervals, checking for new record insertions that signal that Hierarchy Maintenance work is complete.
- If the Trigger table is updated with a new record, extracting all records from the Energy GL base tables listed above and detailed in the Business Rules and P&L Table Design and Mapping sections.
- Validating that the number of records is the same in the Energy GL tables and the staging tables of the P&L instance.
- In case of a loading error, the program should roll back and report an exception.
- In case of a loading error in the P&L GL, the next process of loading to Jaros should not happen.
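The monitor-extract-validate cycle described above can be sketched as a single polling step. This is an illustrative outline only: the function names and the in-memory "tables" are hypothetical stand-ins for the Energy GL trigger, base and staging tables, and the real process runs at 30-minute intervals with notifications on failure.

```python
def run_transfer_cycle(trigger_table, seen_ids, extract_fn, load_fn):
    """One polling cycle: if the trigger table has a new record, extract
    from the GL base tables, load to staging, and validate row counts.
    Raises on a count mismatch so the caller can roll back and notify."""
    new = [r for r in trigger_table if r["id"] not in seen_ids]
    if not new:
        return None  # nothing signalled yet; wait for the next cycle
    seen_ids.update(r["id"] for r in new)
    source_rows = extract_fn()           # pull GL hierarchy records
    loaded = load_fn(source_rows)        # load into staging tables
    if len(loaded) != len(source_rows):  # count validation step
        raise RuntimeError("record count mismatch: roll back and notify")
    return len(loaded)
```

The key design point mirrored from the text is that the count check gates the downstream step: only a cycle that loads and validates cleanly would allow the subsequent load to Jaros to proceed.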
Confidential
Role: Team Member / Team Lead
Modules: PARTS
Project Description:
The Confidential data warehouse accesses data from various sources, including the Parts ERP database, COSDOM, and trade sphere, to provide a single view of key Parts business information and metrics. The current phase of the project is to maintain a near real-time data warehouse. The Parts Interim Data Warehouse (IDW) is populated once a day with data from the Parts ERP system and then updated every two hours. The data from the Parts IDW is used in BO Reports, MS Access reports and other downstream applications. Informatica mappings and Business Objects reports are created and maintained to provide complete and accurate single-source, near real-time data for the critical fields of Quotes, Orders and Shipping to the ERP Parts business users.
Contribution:
- As a team member, was responsible for:
- Analysis of the specifications provided by the clients.
- Designing and developing Informatica mappings and Business Objects reports.
- Preparation of documentation for Informatica mappings and Business Objects reports, and executing the transition to post-production support for all the projects.
Confidential
Role: Team Member
Modules: CSDWH
Project Description:
The CSDWH data warehouse accesses data from various Alpha ERP sources, including AR, AP, PA, the Parts ERP database and COSDOM, to provide a single view of key business information and metrics. The current phase of the project is to develop a near real-time data warehouse. The Contractual Services data mart is populated once a day with data from the Alpha ERP system. The data from CSDWH is used in Cognos reports and other downstream applications. Informatica mappings and Cognos reports are created and maintained to provide complete and accurate single-source, near real-time information.
Contributions:
- Involved in the design, development and implementation of the Enterprise Data Warehouse (EDW) process and Data Mart. Used Informatica PowerCenter 8 for migrating data from various OLTP servers/databases.
- Developed standard and reusable mappings and mapplets using various transformations.
- Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner and Sequence Generator.
- Developed PL/SQL procedures for processing business logic in the database.
- Designed the incremental load strategy for daily loads.
- Created, updated and maintained ETL technical documentation.
Confidential
Role: Team Member
Environment: Windows Professional; Software: Informatica 7.1, Oracle 8i; Modules: Informatica
Project Description:
Confidential is the one-stop shop to report the transactional details of all Orders and Sales transactions that tie out to the monthly summary data recorded in Corporate. This helps financial users extract reports for Operational and Financial views based on P&L (Tier 1, 2, 3), Customer, Country, Region, Sales Person, and Transaction Type (Internal/External).
The Data Warehouse was designed and built to:
- Extract all Orders data from various ERPs and Legacy Systems.
- Cleanse the extracted data with a screen-door process and load it initially into staging tables.
- Load the data into the defined dimension and fact tables.
- Capture rejected data into Reject tables and report out on it.
Contributions:
- Involved in the data analysis for source and target systems and good understanding of Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema.
- In-depth knowledge of the entire Data Warehouse life cycle; expert in analysis, design, development and implementation of systems in Client-Server and Data Warehousing environments.
- Integration of various data sources like Oracle, SQL Server, Fixed Width and Delimited Flat Files, DB2, COBOL files and XML Files.
- Collaborated in design and development of data warehouses/data marts using Informatica Power Center/Power Mart.
- Designed and developed Informatica Mappings and sessions based on business user requirements and business rules.
- Extensively worked with Repository Manager, Designer, Workflow Manager and Workflow Monitor.
- Worked with the Lookup, Aggregator, Expression, Router, Filter, Update Strategy, and Stored Procedure transformations.
- Configured and ran the debugger from within the Mapping Designer to troubleshoot the mapping before the normal run of the workflow.
- Provided expertise in SQL, PL/SQL, stored procedures, and triggers.
Confidential
Role: Team Member
Project Description:
Confidential is a Data Warehouse reporting, analysis, and decision-support environment that allows users to view and analyze Human Resource information specific to Cisco. The initial source systems for this HR data were the Cisco Corporate Oracle HRMS instances. Data sources from other Corporate, Cisco, and local HR information systems were integrated into the HR Data Mart during future phases. The objectives of the project were to facilitate the comprehensive collection including data from Corporate Oracle HRMS instances, and eventually Payroll Systems, Benefit Systems and Contractor Systems and a globally consistent view of Cisco HR Data, as well as to provide for the delivery of HR data using a multigenerational process to ensure receipt of incremental benefits as quickly as possible.
Contributions:
- Involved in the design, development and implementation of the Enterprise Data Warehouse (EDW) process and Data Mart. Used Informatica PowerCenter 8 for migrating data from various OLTP servers/databases.
- Developed standard and reusable mappings and mapplets using various transformations.
- Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner and Sequence Generator.
- Developed PL/SQL procedures for processing business logic in the database.
- Designed the incremental load strategy for daily loads.
- Created, updated and maintained ETL technical documentation.