- 8+ years of IT industry experience, including 7+ years of extensive experience in Informatica, Data Warehousing, and Data Integration projects.
- Good experience in Hyperion Interactive Reporting, Hyperion Production Reporting, and the Hyperion Enterprise suite.
- In-depth knowledge of ETL testing and Informatica development; also worked in support.
- Strong hands-on experience with ETL testing and the Informatica PowerCenter tools: Designer, Workflow Manager, Workflow Monitor, Repository Manager, and the Data Validation Client.
- Well versed in ETL testing and Data Warehousing concepts, including ETL designs for loading star/snowflake schemas.
- Sound knowledge of Informatica transformations and components such as Workflow Manager, Repository Manager, Workflow Monitor, and the Data Validation Option (DVO).
- Good understanding of, and hands-on experience with, performance tuning of mappings and sessions in Informatica.
- Strong hands-on experience in Oracle 9i, 10g, 11g, and 12c with PL/SQL functions, procedures, triggers, and cursors.
- Advanced validation and testing of loaded data, with implementation knowledge of Informatica mapplets, SQL overrides, load types, and session partitioning.
- Transformation knowledge covering Lookup, Joiner, Aggregator, Sorter, Rank, Expression, Union, SQL Transformation, Source Qualifier, Filter, Router, Stored Procedure, XML, Normalizer, Sequence Generator, Transaction Control, and Update Strategy.
- Created systematic test plans and tests, and ran them with the Data Validation Option (DVO) whenever PowerCenter loaded a batch of data to the target.
- In-depth knowledge of the Provider module of Facets.
- Good experience in testing and correcting reusable and non-reusable tasks, as well as automated testing using Informatica's Data Validation Option client.
- Sound knowledge of SQL queries and Informatica integration with other databases.
- Very good automated testing and debugging skills for issues pertaining to Oracle and Informatica.
- Exceeded client expectations in resolving critical issues in Informatica transformations.
- Knowledge of SDLC and Change Management processes.
- Successfully implemented several projects following the Waterfall life cycle methodology.
- Completed several Agile life cycle courses suggested/provided by the implementation partner (TCS) and clients.
- Extensively performed Unit Testing, System Testing, Integration Testing, Functional Testing, and User Acceptance Testing (UAT).
- Sound experience in adhering to ETL standards and designing ETL processes and frameworks that promote reusability.
- Good working knowledge of the Informatica Data Validation Client for auditing data from production and other environments.
- Good experience in reducing testing risk using the DVO tool.
- Completed SAFe Agile courses provided by TCS and Confidential.
- Well known for driving issue resolution through to closure.
- Served in several team management roles, including technical lead and team lead; known for meeting timelines and on-time delivery throughout my career.
- Upgraded my skill set per project needs and managed various applications including .NET, TIBCO, Hyperion, and Informatica.
ETL Tool : Informatica PowerCenter 10.1/9.5.1/9.1.1/8.6
Hyperion : Hyperion Enterprise Edition Suite 10.1.3.4.1/10.1.3.3.2, Hyperion Interactive Reporting Studio 9.3.1/9.2/8.x/6.x, Hyperion Essbase 9.3.1/9.2, Hyperion Performance Suite 9/8.3.5, (Brio) On-Demand Server 6.6.x, Hyperion (Brio) Broadcast Server 6.6.x, Hyperion Report Builder (SQR) 8.2.1
Database : Oracle 9i/10g/11g
Tools : Informatica, DVO, Toad, SQL Developer, PuTTY, PL/SQL Developer, HP Service Manager, Excel, Visio; hands-on experience with other Microsoft tools
Operating Systems : UNIX, Windows XP, Windows 7
Bug Reporting Tools : HP Quality Center, DVO, Excel
Confidential, Greenwood Village, CO
Informatica Consultant/ Technical lead/ Team lead
- Responsible for migrating the large ETL code base from Informatica 9.1 to 10.1 as part of the enterprise upgrade activity.
- Provided solutions for long-standing issues and received several appreciations for tuning the existing application for better performance.
- Responsible for coordinating globally distributed offshore, onshore, and client teams to execute the project successfully.
- As a senior ETL developer, responsibilities included discussing requirements with the business, gathering the required inputs, devising test plans and test strategies, and deciding timelines.
- Assigned tasks to individual team members based on the required deliverables, as part of lead responsibilities.
- Designed, created, and executed test plans, test harnesses, and test cases using the Informatica Data Validation Client.
- Extensively involved in testing the ETL logic for auditing and error handling of loads across different projects, using Informatica DVO and custom SQL queries.
- Highly experienced in writing SQL queries; built numerous stored procedures, views, and materialized views for day-to-day activities.
- Converted SQL procedures to ETL and vice versa to improve program effectiveness.
- Responsible for handling massive data files and tables that process millions of rows daily; HIPAA certified per Confidential standards.
- Compared source and target data by running table-pair tests through the GUI included in DVO.
- Converted reporting information into various client-required formats (HTML, PDF, RTF, Excel, CSV, PS, XML, CSS, markup, etc.).
- Worked with business analysts to understand business/system requirements and transform them into effective technology solutions.
- Designed the mapplets while focusing on code reusability in order to enable future data migration to newer target systems.
- Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner transformation.
- Used Ralph Kimball Methodology for building the data warehouse.
- Worked with various active transformations in Informatica PowerCenter with B2B, including Filter, Aggregator, Joiner, Rank, Router, Sorter, Source Qualifier, and Update Strategy transformations.
- Extensively worked with passive transformations in Informatica PowerCenter such as Expression, Sequence Generator, and Lookup.
- Developed mapplets to support data migration from a source SAP system into the data warehouse environment, utilizing an Agile delivery methodology (NASCO).
- Good experience with Informatica Data Quality, which ensures that authoritative and trustworthy data is available to all stakeholders and data domains, for all projects and business applications. Configured Informatica servers for better performance and ODBC drivers for sources.
- Extensively worked with Slowly Changing Dimensions (Type 1, Type 2, and Type 3) for data loads.
- Good experience in developing and implementing applications using SSIS and SSRS, as well as Oracle Data Integrator.
- Responsible for performance tuning in Informatica PowerCenter at the target, source, mapping, session, and system levels.
- Extensively worked with both connected and unconnected Lookups.
- Extensively worked with lookup caches (persistent, static, and dynamic) to improve the performance of Lookup transformations.
- Worked with reusable objects such as reusable transformations and mapplets.
- Extensively worked with aggregate functions such as AVG, MIN, MAX, FIRST, LAST, and STDDEV in the Aggregator transformation.
- Extensively used the sorted input option for performance tuning of the Aggregator transformation.
- Extensively used PowerExchange to extract data from MQ Series.
- Extensively used SQL overrides in the Source Qualifier transformation.
- Responsible for migrating the workflows from development to production environment
- Extensively worked with Incremental Loading using Parameter Files, Mapping Variables, and Mapping Parameters
- Created Pre/Post Session/SQL commands in sessions and mappings on the target instance.
- Used the command-line program pmcmd to run Informatica jobs, and used these commands in shell scripts to schedule and control workflows, tasks, and sessions.
- Experienced in Pervasive and data conversion.
- Provided detailed technical, process, and support documentation, including daily process rollback procedures, detailed specifications, and comprehensive documentation of all projects with their workflows and dependencies.
- Created data flow and ETL process documentation for the Informatica mappings to support the project once implemented in production.
- Created and Documented ETL Test Plans, Test Cases, Test Scripts, Expected Results, Assumptions and Validations.
- Passionately drove the testing life cycle and support to ensure top-notch delivery.
- Validated report data by writing and executing SQL queries against the database.
- Tested the ETL code using the Informatica Data Validation Client and wrote SQL queries to validate it against the different source/target systems.
- Performed unit testing and dry-run testing, and tuned the mappings for better performance.
- While using DVO table pairs, used a parameter in the WHERE clause of the table-pair definition to filter the records being tested.
- Captured and documented test results thoroughly.
- Created table pairs using the enterprise license of DVO to increase test performance and store all error records from each table-pair test.
- Reviewed and validated test results submitted by other team members.
- Wrote test cases covering the complete requirement logic.
- Defined the scope for System and Integration Testing.
- Suggested performance-oriented approaches where applicable.
- Participated in daily QA meetings.
- Functional knowledge of the Facets Provider space.
- Consistently delivered and stayed on top of testing and support-related issues.
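The incremental loading described above (parameter files driving mapping variables and parameters) can be sketched with a minimal PowerCenter parameter file; the folder, workflow, session, and variable names here are hypothetical examples, not values from any actual project:

```
[DW_LOADS.WF:wf_daily_load.ST:s_m_load_orders]
$$LAST_EXTRACT_DATE=2024-01-01 00:00:00
$PMSessionLogFile=s_m_load_orders.log
```

The bracketed heading scopes the values to one session within the workflow; the `$$` prefix denotes a mapping parameter/variable read by the mapping for incremental extraction, while the `$PM` prefix is a session-level built-in override.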
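As a sketch of the pmcmd usage mentioned above — with hypothetical service, domain, folder, and workflow names, and credentials read from environment variables — a shell wrapper might build the call like this (it echoes rather than executes, since pmcmd is only available on a PowerCenter host):

```shell
#!/bin/sh
# Sketch only: names below are hypothetical examples, not real environment values.
INT_SERVICE="is_dev"          # Integration Service name
DOMAIN="dom_dev"              # Informatica domain
FOLDER="DW_LOADS"             # repository folder
WORKFLOW="wf_daily_load"      # workflow to start

# -uv/-pv tell pmcmd to read the user and password from environment
# variables, so credentials never appear on the command line.
CMD="pmcmd startworkflow -sv $INT_SERVICE -d $DOMAIN -uv PM_USER -pv PM_PASS -f $FOLDER -wait $WORKFLOW"

# A real scheduler script would execute $CMD and check its exit status;
# here we only print the command that would run.
echo "$CMD"
```

The `-wait` flag makes pmcmd block until the workflow completes, which lets the calling shell script (or scheduler) react to the workflow's success or failure via the exit code.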
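The SQL-based source/target validation described above can be sketched as a reconciliation query assembled in shell; the table and column names are hypothetical, and in practice the generated query would be run through SQL*Plus or Toad against each environment:

```shell
#!/bin/sh
# Sketch: build a source-vs-target reconciliation query (hypothetical tables).
SRC_TABLE="stg_orders"
TGT_TABLE="dw_orders"
KEY_COLS="order_id, order_date, amount"

# Oracle's MINUS set operator returns rows present in the first SELECT
# but absent from the second, i.e. rows loaded incorrectly or not at all.
VALIDATION_SQL="SELECT $KEY_COLS FROM $SRC_TABLE MINUS SELECT $KEY_COLS FROM $TGT_TABLE"

# Print the query; a validation script would pipe it to sqlplus and
# flag the load as failed if any rows come back.
echo "$VALIDATION_SQL"
```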
Environment: Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer), PowerExchange, Data Explorer, Informatica PowerConnect, Data Quality, Teradata, MicroStrategy, DB2 UDB 8.1, MQ Series, COBOL, PL/SQL, Oracle 11g/10g, Netezza 6.0.4, flat files, XML, Perl scripts, Sybase, BMC Control-M, Salesforce.com
Hyperion Essbase/Planning Consultant
- Designed BSO Cube with multiple alternate hierarchies to support various groups within the BU
- Setup Dimension Build Files to load dimensions from SAP to Essbase.
- Setup load rule files to load data from data warehouse into Essbase Cube
- Loaded five years of historical data and performed data validation and reconciliation with other Essbase applications and SAP
- Created member formulas to populate alternate hierarchies
- Created calculation scripts to generate quarter-end financial metrics and future scenarios.
- Migrated the application from Development to QA to Production.
- Provided training on the Hyperion Essbase Excel Add-in for power users.
- Wrote complex calc scripts.
- Wrote MaxL scripts to automate the Essbase load process.
- Created Planning data forms for Budget and Forecast.
- Created business rules and assigned them to various data forms according to the requirements.
- Involved in gathering requirements for the Planning and Budgeting cycles.
- Created data sources and Planning applications.
- Created Smart Lists and assigned them to data forms.
- Involved in training business users on Smart View and the Excel Add-in.
- Created multiple Hyperion Financial Reports and Web Analysis dashboards based on the requirements.
- Created multiple Smart View and Excel Add-in templates for users' everyday analysis.
- Provided 24/7 on call Production support.
Environment: Hyperion Planning, Hyperion Essbase, Financial Reporting Studio, Hyperion Web Analysis Dashboards, Microsoft Windows XP.