- ETL/BI Consultant with strong background in Informatica and Microsoft Analysis Services.
- Experienced since 2004 in leading implementations of ETL processes, data warehouses and Business Intelligence solutions using Informatica PowerCenter 6.x-10.x and MS SQL Server 2012/2014/2017.
- Skilled in the Software Development Life Cycle and Agile methodologies; able to work across domains including reinsurance, finance, investments, banking and manufacturing.
- Expertise in Informatica mappings, mapplets, workflows, transformations including Normalizer, Web Services, Unstructured, HTTP, XML, Java, B2B, User Defined Functions, Lookups, Stored Procedures, Data Quality/Profiler, Informatica DVO and Meta Query.
- Experience with ETL/BI solutions using SSIS and SSAS (Multidimensional and Tabular), including OLAP architecture and SSAS processing for maintaining the warehouse and the hierarchies, calculations, partitions and aggregations in the cube.
- Experience with Oracle, MS SQL Server, Sybase IQ/ASE and Vertica, writing queries, views, stored procedures, functions, cursors, triggers and packages in SQL and PL/SQL.
- Experience in UNIX and Shell Scripting for file operations and scheduling.
- Experience with normalized and star/snowflake schemas for designing data marts.
- Excellent analytical and communication skills, with the ability to work independently and in a team.
ETL: Informatica PowerCenter 5.x-10.x, SSIS, Data Integrator 6.0
BI/Reporting: SSAS, Business Objects, MicroStrategy, SSRS, Tableau
RDBMS: MS-SQL Server, SYBASE ASE/IQ, Oracle, Vertica
Tools: TOAD, SQL*Plus, Autosys, Cron, Jenkins
Data-Modeling: Erwin 7.0x, Visio 6.0x
Programming: SQL, PL/SQL, MDX, DAX, Unix shell scripting, C, C++, Perl
Operating Systems: HP-UX, AIX, Sun Solaris, Windows
Confidential, Chesterfield, MO
Software Engineer
Environment: Informatica PowerCenter 9.6.1/10/10.1.1, Oracle 11g/12c, SQL Server 2014, Linux, PeopleSoft, Tidal Enterprise Scheduler, SSAS, SSIS, Informatica DVO, Tableau, Vertica
- Analyze the existing system and coordinate with business analysts to understand and design the facts, measures, dimensions and other attributes, and their refresh intervals.
- Create the staging and warehouse tables, partitions, indexes and views; develop the ETL processes to load the FDW; and write stored procedures for pre- and post-processing that enable bulk loads and partition exchanges, improving performance on large data volumes.
- Review use cases, define and set up test scenarios and test datasets and implement automated DVO tests for complete validation of the historical data and ongoing refresh cycles.
- Set up Tidal Job groups, Jobs, Actions, Events and Variables for scheduling.
- Identify bugs, performance issues and enhancements by coordinating with Business Users.
- Manage configuration and releases to higher environments and provide post-production support.
- Analyze the existing systems and redesign them to improve performance; identify and change the processes to eliminate all MDM dependencies from the Staging and FDW areas.
- Create a parallel environment to test and maintain existing system change release processes.
- Run the data through both the existing and new processes and verify the results with DVO tests.
- Design and create the tables and objects required to convert manual Plan data processing from ad hoc SQL, stored procedures and temporary tables.
- Standardize and automate the load process using Informatica and TIDAL jobs.
- Data validation and integration testing by coordinating with the business users.
- Analyze the existing system to identify the performance bottlenecks to be addressed.
- Identify and eliminate the MDM dependencies, switch to the Dimensional Hub for dimensional data, and convert non-performing stored procedures and triggers to Informatica.
- Migrate the database from Oracle to Vertica to improve the performance of Tableau live connections.
- Create the metadata tables to store the MDX queries required for the cube extracts.
- Develop SSIS packages to read the cube data and write to a staging area.
- Create Informatica ETLs to load the data store for consumption by Tableau.
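The metadata-driven cube extract described above can be sketched in shell: a metadata file holds one MDX query per extract, and a driver loops over it. The file name, delimiter, staging targets and queries here are illustrative assumptions, not details from the original system.

```shell
#!/bin/sh
# Minimal sketch of a metadata-driven cube extract: each metadata line
# names a staging target and the MDX query to run against the cube.
# File name, delimiter and queries are assumptions for illustration.
META_FILE=/tmp/cube_extracts.meta

cat > "$META_FILE" <<'EOF'
stg_sales_by_region|SELECT [Measures].[Sales Amount] ON 0 FROM [SalesCube]
stg_margin_by_product|SELECT [Measures].[Margin] ON 0 FROM [SalesCube]
EOF

# In the real pipeline an SSIS package would execute each MDX query against
# SSAS and land the result set in the staging area; here we just echo the plan.
PLAN=$(while IFS='|' read -r target mdx; do
    echo "extract ${target}: ${mdx}"
done < "$META_FILE")
echo "$PLAN"
```

Keeping the queries in metadata rather than hard-coding them in packages means new extracts can be added without redeploying the SSIS or Informatica objects.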
Confidential, New York, NY
Software Engineer
Environment: Informatica Power Center 9.5.1, Sybase IQ 15 and UNIX
- Designed table structures, data load processes and UNIX scripts to read market data and perform splits and dividends calculations for T-90 and T+90 day price momentum refreshes.
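As a toy illustration of a T-90 price momentum calculation like the refresh above (current close divided by the close 90 trading days earlier, minus one), using a made-up two-row price file:

```shell
#!/bin/sh
# Toy T-90 price momentum: close(t) / close(t-90) - 1.
# The input layout and values are assumptions for illustration only.
cat > /tmp/prices.csv <<'EOF'
day,close
1,100.0
91,112.0
EOF

# Row 2 holds the T-90 close, row 3 the current close (row 1 is the header).
awk -F, 'NR==2 {p90=$2} NR==3 {printf "momentum=%.2f\n", $2/p90 - 1}' /tmp/prices.csv
# prints momentum=0.12
```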
Confidential, Weldon Spring, MO
Software Engineer
Environment: Informatica PowerCenter 9.1.0, Oracle, SQL Server, SSAS, SSIS, PeopleSoft, TIDAL.
- Analyze, design and develop the data warehouse to integrate with PeopleSoft.
- Develop Informatica ETL processes, validate transactional data against Combo Edit rules via web services, and populate the ODS and data warehouse.
- Develop stored procedures and functions to log errors and audit data movement.
Confidential, Milwaukee, WI
Software Engineer
Environment: Informatica PowerCenter 9.0.1/8.6.1, DAC, OBIEE 10.1.3.4, Oracle 10g/9i, UNIX
- Perform gap analysis and upgrade an existing OBI Applications deployment from Informatica 8.6.1 to Informatica 9.0.1.
Confidential, New York, NY
Principal Software Engineer
Environment: Informatica PowerCenter 7.1.4, Sybase IQ/ASE, SQL Server, MicroStrategy, UNIX
- Interaction with business analysts in requirement gathering and analysis.
- Developed ETL processes to load data files from order management systems such as CRD, XIP and LV, validating the data with persistent lookup caches to improve efficiency and integrity.
- Created UNIX scripts for concurrent execution of Informatica workflows and dynamic generation of parameter files to integrate with the TCA batch processing.
- Created regression testing processes for the reusable components to verify the deployments.
- Automated the migration of Plexus data into Sybase in batches, converted the SSIS packages to Informatica, and handled validation, testing and maintenance.
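The dynamic parameter-file generation mentioned above can be sketched as follows. The folder, workflow and parameter names are placeholders, and the `pmcmd` invocation (Informatica's command-line client) is shown commented out since it requires a PowerCenter installation:

```shell
#!/bin/sh
# Sketch: build a per-run Informatica parameter file, then launch workflows.
# Folder, workflow and parameter names are illustrative, not from the source.
RUN_DATE=$(date +%Y%m%d)
PARAM_FILE="/tmp/wf_load_orders_${RUN_DATE}.par"

# PowerCenter parameter files are INI-style: a [Folder.WF:workflow] section
# header followed by $$parameter=value lines.
cat > "$PARAM_FILE" <<EOF
[TCA_FOLDER.WF:wf_load_orders]
\$\$RUN_DATE=${RUN_DATE}
\$\$SRC_FILE=/data/in/orders_${RUN_DATE}.dat
EOF

# Concurrent launch (commented out; needs a PowerCenter install):
#   pmcmd startworkflow -sv INT_SVC -d DOMAIN -u "$PMUSER" -p "$PMPASS" \
#       -f TCA_FOLDER -paramfile "$PARAM_FILE" wf_load_orders &
#   wait   # block until the background launches complete

cat "$PARAM_FILE"
```

Backgrounding each `pmcmd startworkflow` call and collecting them with `wait` is one simple way to get the concurrent execution the bullet describes.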
Environment: Informatica PowerCenter 7.1, Autosys, Oracle 9i, SQL, UNIX
- Analyzed the functional specifications and prepared technical design documents.
- Developed the ETL processes and performed end-to-end testing between the sources and Fermat.