- 13+ years of work experience in Cloud Integration, Data Warehousing, Data Modeling, ETL, Business Intelligence, Performance Tuning, Data Integration, and Data Migration projects.
- Have experience designing data models and ER diagrams, defining data dictionaries, and producing design documents and mapping documents for ETL processes.
- Have experience working with PowerCenter, Informatica Cloud (IICS), Azure Data Factory, Kettle (PDI).
- Have experience extracting data from SAP, PeopleSoft, SharePoint, and web services using Informatica.
- Have experience working with Informatica Cloud using SharePoint, OData and ODBC connectors.
- Have experience working on Informatica Data Quality (IDQ) and Informatica Analyst.
- Have experience handling large volumes of data, complex business logic, and aggressive deadlines.
- Have experience working in agile and waterfall models.
- Strong communication skills.
- Have experience with Amazon web services (AWS) and Microsoft Azure.
- Have experience working with Cognos, Power BI, Pentaho Reporting, and Jasper Reports.
- Oracle Certified SQL Expert (2008) (1Z0-007)
- Oracle Certified PL/SQL Developer (Oracle Certified Associate, 2009) (1Z0-146)
ETL Tools: PowerCenter 10.4, 10.2, 9.x, 8.x, Informatica Cloud (ICS, IICS), Azure Data Factory, Kettle (PDI).
Data Quality Tools: Informatica Developer (IDQ), Informatica Analyst.
Cloud Skills: Informatica Cloud (IICS), Azure, Amazon Web Services
Reporting Tools: Cognos 11, 10.x, 8.x, Power BI, Jasper Reports
Databases: Oracle 9i, 10g, SQL Server (2008-2016), PostgreSQL, Azure SQL Database.
Operating Systems: Windows, UNIX, Sun Solaris.
Other Programming Skills: Python, Java, HTML, C++, Pro*C.
Scheduling Tools: Maestro, Tidal.
Others: SharePoint, CVS, PVCS, HP ALM, HP QC.
- Designed and developed the data warehouse from scratch.
- Have been a part of several projects over my tenure at HISD.
- Have been a part of requirements gathering, analysis, design, and development.
- Have produced several data models for the projects and created ER diagrams.
- Have defined mapping documents needed for the ETL development.
- Extracted data from PeopleSoft, Chancery, SharePoint, flat files, XML files, and web services.
- Have designed and developed security for A4E Dashboards built on Cognos.
- Successfully completed the SAP Re-implementation project to extract data from SAP.
- Successfully completed SIS upgrade project to move from Chancery to PowerSchool.
- Have successfully implemented 3 new projects in Informatica Cloud.
- Have designed and developed a reusable framework for vendor integrations using the IMS Global Learning Consortium standards.
- Have designed and developed 15+ custom integrations between HISD and external vendors.
- As the data warehouse SME, have provided guidance to power users whenever necessary.
- Have produced documentation including coding standards, requirements-gathering documents, design specifications, mapping documents, test plans, and knowledge-transition plans.
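The config-driven idea behind a reusable vendor integration framework can be sketched as follows. This is a hypothetical, minimal illustration: the column names, field names, and helper function are illustrative only and are not taken from the IMS Global specifications or the actual HISD framework.

```python
# Hypothetical sketch of a reusable, config-driven vendor integration:
# each vendor feed is described by a small config (entity name, vendor-facing
# columns, warehouse field order), and one generic exporter produces the
# delimited output for every feed.
import csv
import io

def export_feed(records, config):
    """Render source records as CSV text using the vendor's column mapping."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(config["columns"])                     # vendor-facing header row
    for rec in records:
        writer.writerow(rec[f] for f in config["fields"])  # warehouse field order
    return buf.getvalue()

# Illustrative config for a "users" feed (names are assumptions, not spec).
users_config = {
    "entity": "users",
    "columns": ["sourcedId", "givenName", "familyName"],  # vendor column names
    "fields": ["student_id", "first_name", "last_name"],  # warehouse fields
}

records = [{"student_id": "S1", "first_name": "Ada", "last_name": "Lovelace"}]
feed = export_feed(records, users_config)
```

Adding a new vendor feed then means adding a config entry rather than writing a new exporter, which is the main payoff of a framework like the one described.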
Environment: PowerCenter 10.4, 10.2, 9.6, 9.1, SQL Server (2008-2016), Cognos 11, 10.x, SAP, PeopleSoft, SharePoint, Chancery (Student Information System), PowerSchool, Windows, HP ALM.
Data Quality Consultant
- Implemented the data quality rules as reusable mapplets in Informatica Developer (IDQ) that can be used in any mapping.
- Exported the rules as mapplets into Informatica PowerCenter.
- Plugged in all the rules that needed to be applied to a specific table and built the mapping for that table. Developed several such mappings.
- Developed the data synchronization mappings that compare data between the source and target systems.
- Have built the Data Quality framework to execute around 300 custom rules in an iterative fashion.
- Loaded all the rules into a metadata-driven table that controls the execution of the rules.
- Performed data profiling on the requested tables using the Informatica Analyst service 9.1; the profiled data helps users analyze patterns in the data and define additional rules.
- Developed a web service that consumes addresses and cleanses the address data using AddressDoctor. The web service is designed to be used by any application across the organization.
- Scheduled all the jobs using PowerCenter.
- Produced mapping documents and test plans for all the code developed, along with knowledge-transition documents and plans.
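The metadata-driven control-table idea above can be sketched as follows. This is a hypothetical illustration only: the real framework ran roughly 300 rules as IDQ mapplets, while this sketch just shows how a rule table with an active flag can drive execution.

```python
# Minimal sketch of a metadata-driven data quality framework: each "rule row"
# names a target column, a predicate, and an active flag; the executor applies
# the active rules in order and collects violations. All names are assumptions.

# Hypothetical metadata "table": one dict per rule row.
RULES = [
    {"rule_id": 1, "column": "email", "active": True,
     "check": lambda v: bool(v) and "@" in v,
     "description": "email must contain @"},
    {"rule_id": 2, "column": "age", "active": True,
     "check": lambda v: v is not None and 0 <= v < 130,
     "description": "age must be 0-129"},
    {"rule_id": 3, "column": "ssn", "active": False,   # disabled rules are skipped
     "check": lambda v: True,
     "description": "disabled rule"},
]

def run_rules(rows, rules):
    """Apply every active rule to every row; return (rule_id, row_index, description) violations."""
    violations = []
    for rule in rules:
        if not rule["active"]:          # the metadata flag controls execution
            continue
        for i, row in enumerate(rows):
            if not rule["check"](row.get(rule["column"])):
                violations.append((rule["rule_id"], i, rule["description"]))
    return violations

rows = [
    {"email": "a@example.com", "age": 34},
    {"email": "not-an-email",  "age": 200},
]
violations = run_rules(rows, RULES)
```

Turning a rule on or off, or reordering rules, is then a data change in the control table rather than a code change, which is what makes the iterative execution of hundreds of rules manageable.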
Environment: Informatica Developer 9.1, PowerCenter 9.1, Oracle 10, SQL Server, UNIX, PL/SQL Developer, StarTeam.
- Analyzed the existing data warehousing system and gathered requirements for extending the data warehouse tables, such as adding new columns to critical DW tables.
- Created a separate design for implementing the new columns in the existing tables, as these columns were added for only one source system, INTERACT.
- Implemented the design and performed testing using HP Quality Center.
- Proposed a new design for the reconciliation process for the critical DW tables against several source systems.
- Implemented several mappings to complete the reconciliation process using Informatica PowerCenter.
- Created complex PL/SQL stored procedures to track data mart status and update the information in the database.
- Scheduled all the reconciliation jobs in the Tidal scheduler and the extensions to the existing tables in the Maestro scheduler.
- Developed UNIX scripts to track data mart completion status based on the touch files generated, and created Cognos reports to send mart status to business users.
- Developed several documents for the project as required by the compliance requirements of Confidential & Confidential Inc.
- Deployed all the jobs created for the project to production; monitored and supported the jobs during execution.
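The reconciliation step above can be sketched as follows. This is a hypothetical illustration of the idea only; the actual work was implemented as PowerCenter mappings and PL/SQL, and the function and field names here are assumptions.

```python
# Minimal sketch of reconciling a source system extract against a DW table:
# compare row counts and key sets, and report what is missing on each side.

def reconcile(source_rows, target_rows, key):
    """Return counts plus the keys missing from, or unexpected in, the target."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),      # lost during load
        "unexpected_in_target": sorted(tgt_keys - src_keys),   # orphaned in DW
    }

source = [{"id": 1}, {"id": 2}, {"id": 3}]   # illustrative source extract
target = [{"id": 1}, {"id": 3}, {"id": 4}]   # illustrative DW table contents
report = reconcile(source, target, "id")
```

A report like this, produced per critical table and per source system, is what lets the business confirm that the warehouse load is complete and correct.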
Environment: Informatica PowerCenter 8.6, UNIX, Oracle, Cognos Report Studio, Informatica PowerConnect, Pro*C, Mainframes, SAP, TOAD, PVCS, Maestro scheduler, Tidal scheduler, HP Quality Center.