Data Warehousing Technologies Resume Profile
Chicago, IL
Summary
- Self-starter with over 16 years of experience in IT requirements gathering, software design and development, with extensive work in Data Warehousing technologies.
- Collaborative problem solver with a proven ability to work effectively with departmental technical/functional teams.
- Worked in the USA, UK and India with global leaders such as Informatica, GSK, Fisher Scientific, Thames Water and Terex.
- Expertise in Data Warehousing/ETL architecture design, data modeling (Star Schema, Snowflake Schema) and leading technical teams.
- Extensive experience leading requirements gathering, design, development and implementation of enterprise data warehouse (EDW) solutions.
- Extensive experience developing applications and ETL processes on Oracle, DB2, Teradata and SQL Server databases.
- Extensive experience of providing IT services in pharmaceutical and manufacturing industries.
- Good knowledge of Agile Methodology.
- Knowledge of the Hadoop ecosystem: HDFS, HBase, MapReduce, Hive, Pig, Flume
Strengths
- Leading technical teams in software development and implementation projects involving Data Warehousing, Data Integration and Data Migration activities.
- Collaborating with business, functional and technical teams to capture customer requirements correctly in functional specifications, and driving the implementation of the solution through completion.
- A good team player with excellent communication skills; a cooperative and knowledgeable provider of strategic technical leadership, always open to learning and implementing new ideas.
Technology Skills
PC Productivity | TOAD, SQL Developer, Microsoft Office, MS Project, MS-Visual SourceSafe 6.0, Oracle Collaboration Suite 10g, PVCS |
Design Tools | Erwin 9.5, MS-Visio 2007, Power Designer 16.5 |
ETL Tools | Oracle Warehouse Builder OWB 9.2/10g/11g, Oracle Data Integrator ODI 11gR2/12c, Informatica Power Center 9.5 |
OLAP Tools | Cognos 10.2, Oracle Discoverer, Business Objects XI, Qlikview, OBIEE |
ERP Systems | Oracle E-Business Suite |
Big Data | Hadoop Ecosystem HDFS, HBase, Hive, Pig, Flume |
Databases | MS SQL Server 2008, Oracle 11g/12c, Teradata 13/14, MySQL |
Operating Systems | UNIX, LINUX, SOLARIS, WINDOWS |
Languages | C and C++, Java, PHP, PL/SQL, UNIX Shell Scripting |
Version Control | VSS, PVCS |
Professional Experience
Confidential
Datawarehouse Architect
Oracle/OWB Expert
Confidential makes slurries and pads used to clean silicon wafers down to the molecular level for its customers. This process generates a large volume of essential data, and customer quality requirements are very stringent. Cabot Microelectronics collects process manufacturing and quality data to measure the properties of chemicals and pads in its CMC Quality Systems data mart.
Responsibilities
- Understand the current data mart infrastructure, help fix technical issues and provide strategic assistance in upgrading the infrastructure in order to retire some of the legacy systems.
- Fix performance issues that were causing data loads to fail and tune the load performance of existing ETL processes.
- Reverse engineer the data model and make changes/enhancements to accommodate new sources; develop, test and implement new ETL processes for additional source systems added to the data mart.
- Collaborate with business teams for production support and enhancement related decision making processes for the data mart.
- Install and configure ODI 12c on Oracle XE to test a proof of concept (POC) of code migration from OWB to ODI, present the POC and prepare a plan for the migration.
Technologies
Erwin 9.5, Oracle Warehouse Builder 10g/11g, ODI 12c, Oracle 10g/11g/12c, SQL Developer, SFTP, Oracle E-Business Suite, SQL*Loader, PL/SQL
Confidential
Data Architect / Systems Analyst
This was one of the most important and ambitious projects undertaken to track the daily progress of all Confidential sites. It was also a very high-profile reporting project, since the end customers for the reports were the CEO and CFO of a Fortune 500 company. The report showed daily order bookings and actual sales across all Confidential sites and segments on a month-to-date (MTD) and quarter-to-date (QTD) basis.
Responsibilities
- Facilitate requirements gathering with key business and IT stakeholders, holding discussions with local and international IT teams and others involved to determine the appropriate solution and coordinate prioritization.
- Coordinate closely with the design teams to ensure that solutions are fit for purpose and adequately meet global enterprise-level requirements.
- Collaborate and finalize the design of the data-mart and the reporting solution to meet the enhancement requests from the top management of the company.
- Coordinate ETL design, testing and go-live activities with about 40 Terex sites across the globe for the report.
- Produce the MTD/QTD Cognos report, test it and coordinate the approval process.
- Collaborate with the technical support teams after go-live on daily monitoring of ETL processes, ensuring the report receives all data correctly from around the globe before it is sent out each day.
Technologies
Oracle Data Integrator ODI 11g, Oracle 10g/11g, SQL Developer, Cognos 10.2, SFTP, Oracle E-Business Suite, SQL*Loader
The goal of the Data Integrity project was to create a central repository of quality data to assist data conversion site users in identifying and fixing issues long before the ERP implementation process began. After that mission was accomplished, the same system was extended to create a central truth-table hub for Customers and Suppliers across the organization, serving along the same lines as an MDM mechanism. A business intelligence reporting front end written in Cognos 10 let the project manager and other corporate managers track the progress of the DI work queues generated by the DI mechanism. The ETL work was started in OWB but later completed in Informatica 9.1.
Responsibilities
- Designed and implemented a centralized data profiling mechanism to detect and report data deficiencies as well as to feed ETL engine of Data Conversion mechanism with quality data. Leveraged this tool set to provide metrics and manage global efforts for data quality improvement. This tool saved Terex 648K per year by removing hard dependencies on active ERP environments during data improvement activities.
- Collaborated with business analysts, functional teams and technical teams across Terex sites to capture functional and design specifications correctly for implementation.
- Managed solution design, data modeling, data integration code development, testing and production support for 20 Terex sites loading large data volumes into Oracle ERP.
Technologies
Oracle Warehouse Builder 10g/11g, Oracle 10g/11g, SQL Developer, Informatica 9.1, Oracle Collaboration Suite 10g
The Data Conversion project was a multi-phase, multi-year program of progressive and concurrent ERP implementations across four geographies for a Fortune 500 company. The goal of the Data Conversion team was to establish a unified business intelligence framework, providing enterprise data services while enforcing data quality requirements to ensure data was complete, conformant, consistent and correct for ERP implementation. This tool facilitated the rapid implementation of 35 distinct data elements necessary to replicate legacy business units into a centralized ERP environment.
Responsibilities
- Architected, developed, deployed and maintained a centralized data migration mechanism used by international teams for over 20 discrete ERP implementations in the US, UK, Continental Europe and Australia.
- Collaborated in establishing and enforcing business processes, policy documentation and best practices for the development and utilization of the custom software tools.
- Collaborated and supported simultaneous international site data teams during multiple large scale and complex Oracle ERP Implementations resulting in 10M records being loaded into Oracle ERP.
- Worked tactically on-site when necessary to gather requirements, design software solutions and collaborate with business and implementation teams toward issue resolution.
- Led a development team of 15 using OWB, PL/SQL and SQL Developer on Oracle 10g/11g databases.
Technologies
Microsoft Visio, Oracle Warehouse Builder 10g/11g, Oracle 10g/11g, SQL Developer, Oracle Collaboration Suite 10g, PVCS
Project Tech Lead
This project was to create a Compliance Data Warehouse solution using the Confidential reporting tool on compliance data collected every month from various systems and compliance monitoring groups across the organization. The reporting enabled the compliance office to understand compliance issues at various levels of the organization. Data was collected from as many as 15 source systems into a centralized data warehouse to create a dashboard for more than 300 GSK users.
Responsibilities
- Conduct meetings with business analysts, functional consultants and IT stakeholders, capture requirements, and produce functional and design specification documents.
- Create logical and physical data models for the data warehouse.
- Create ETL specifications, report specifications and test cases, and coordinate development efforts in Informatica and Cognos with the offshore development team.
- Provide production support for the solution post-release.
Technologies
UNIX, Windows 2003, Erwin 4.1, Informatica Power Center 7.1, Cognos 7.1/8, Toad 7, Oracle 9i
Tech Lead
This project was to extract the activities of employees logged into different systems within the organization and load them into a data mart. BO Universes were created on the data mart for ad-hoc querying using Business Objects 5.1, and predefined web reports were generated from the data mart using .NET to analyze the efforts of individuals at various levels of the organization across projects.
Responsibilities
- Worked as Architect and ETL Lead responsible for writing functional and design specification documents
- Created specifications and logical and physical data models for the data mart, along with test cases.
- Developed complex Informatica mappings and performed unit and system testing
- Implemented row-level security on data mart tables to restrict ad-hoc and reporting users using Oracle's Fine-Grained Access Control (FGAC) mechanism.
Technologies
UNIX, Windows XP, MS Visio 2003, Informatica Power Center 6.2, Business Objects 5.1, Toad 7, Oracle 9i
This solution was to create an enterprise data warehouse to generate analytical reports on the manufacturing and quality processes followed for drugs produced at GSK's Ware manufacturing plant. The project was executed in a UNIX environment using the Informatica ETL tool to extract data from 11 source systems, including SAP, PeopleSoft, Oracle and paper sources or flat files.
Responsibilities
- Worked as Architect responsible for writing the requirement, functional and design specification documents
- Created Logical and physical data models for the data warehouse
- Created ETL specifications, complex ETL mappings and test cases
- Developed UNIX shell scripts for flat file formatting, data conversion and automating data transfer from various UNIX servers onto the shared server
- Coordinated the development efforts with an offshore team
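The flat-file formatting work described above typically amounts to a small filter script. The sketch below is illustrative only; the file names, the pipe delimiter and the target server are assumptions, not details from the original project:

```shell
#!/bin/sh
# clean_extract.sh -- normalize a pipe-delimited flat-file extract
# before loading. Delimiter and file names are illustrative.

clean_extract() {
    # Strip Windows carriage returns, drop blank lines, and trim
    # leading/trailing whitespace around every pipe-delimited field.
    tr -d '\r' < "$1" | awk -F'|' 'NF {
        for (i = 1; i <= NF; i++)
            gsub(/^[ \t]+|[ \t]+$/, "", $i)
        OFS = "|"; $1 = $1; print
    }'
}

# Usage: clean one extract, then push it to the shared server
# (host and path are hypothetical):
#   clean_extract orders.dat > orders.clean
#   scp orders.clean etl@sharedserver:/data/inbound/
```

A filter like this would be invoked per file from a driver script, keeping the formatting logic separate from the transfer step.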
Technologies
UNIX, Shell Scripting, Erwin 4.1, Informatica 7.1, Oracle 9i
Tech Lead
This project was to upgrade Thames Water's existing data warehouse infrastructure to the latest versions of the tools and technologies in use. The effort involved upgrading Informatica from Power Center 5 to Power Center 7.1.1, along with upgrading the database from version 7 to 9i. The scope of work included performing the software upgrades, performance tuning of mappings, and testing and implementing the warehouse in the new operating environment.
Responsibilities
- Writing the migration plan
- Installation and configuration of Informatica Power Center 7.1.1
- Migration of the repository from Power Center 5.0 to the new environment
- Performing unit and system testing of all existing mappings and fixing problems encountered during testing.
- Optimization of the mappings that had shown poor performance
Technologies
UNIX, Informatica Power Center 5.0/7.0/7.1, Oracle 9i
Sr. Developer
This project was to create an operational data store (ODS) of the products manufactured by Fisher Scientific, used to facilitate a web-based online catalog of products. The project was executed in two phases: in the first phase, development took place using the OWB tool; in the second phase, the development was redone using Informatica Power Center, along with a few enhancements to the existing system. The effort was also extended to load data from the ODS into Siebel's EIM tables and finally into Siebel.
Responsibilities
- Wrote ETL Specifications, Job scheduling documents and Unit Test Cases
- Developed initial OWB mappings to translate the business logic defined into ETL processes
- Reworked the ETL specifications to fit the Informatica tool
- Developed complex mappings using Informatica and performed Unit testing
- Defined target load plans and constraint based loading strategies
- Created Workflows and scheduled them using Workflow Manager
- Wrote UNIX Shell Scripts for flat file formatting and cleansing data.
Technologies
UNIX, Oracle Warehouse Builder 9.2, PL/SQL, Shell Scripting, Informatica Power Center 6.2
The Pricing System solution was to extract data from mainframe systems in flat-file format, transform the extracted data to suit the requirements of a pricing tool (Confidential), and load the transformed data into Oracle database tables. Confidential would access those tables to determine product prices based on this data. After Confidential updated the pricing information, the data was transformed back to flat-file format and loaded back to the mainframe.
Responsibilities
- Wrote ETL Specifications and Unit Test Cases
- Developed complex OWB mappings to translate the business logic defined into ETL processes
- Performed Unit and System testing
- Optimized the mappings that had shown poor performance
- Wrote UNIX Shell Scripts for flat file formatting, cleansing and scheduling of OWB jobs
- Mentored the development members on OWB
- Performed code and document reviews
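The formatting, cleansing and scheduling scripts mentioned above can be sketched as a nightly wrapper. Everything here is an illustrative assumption; the directory names, schedule and the OWB hand-off are not details from the original project:

```shell
#!/bin/sh
# stage_pricing_extracts.sh -- nightly wrapper: pick up mainframe flat-file
# extracts, cleanse them, and stage them for the OWB load.
# All paths are illustrative; override INBOX/STAGE in the environment.

INBOX=${INBOX:-./pricing_inbox}
STAGE=${STAGE:-./pricing_stage}

stage_files() {
    mkdir -p "$STAGE"
    for f in "$INBOX"/*.dat; do
        [ -f "$f" ] || continue                    # nothing to pick up
        # Cleanse: drop carriage returns and any non-printable characters.
        tr -d '\r' < "$f" | tr -cd '[:print:]\n' > "$STAGE/$(basename "$f")"
        echo "staged $(basename "$f")"
    done
}

stage_files

# A cron entry would schedule this nightly, e.g.:
#   0 2 * * * /opt/etl/stage_pricing_extracts.sh >> /var/log/pricing.log 2>&1
# The OWB job itself would be kicked off after staging; the invocation is
# omitted here because it is environment-specific.
```

Separating the cleanse/stage step from the load itself makes failed loads rerunnable without re-pulling files from the mainframe.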
Technologies
UNIX, Oracle Warehouse Builder 9.2, PL/SQL, Shell Scripting
Confidential
Sr. Developer
This project was to customize WISDOM, Informatica's existing data warehouse for Sales and Marketing processes. The scope of work was to customize the existing mappings of the data warehouse, optimize them for performance and perform integration testing on the system. Siebel was the source system, accessed using Power Connect, and DB2 was the target database for WISDOM.
Responsibilities
- Understand the problem description stated in USE-CASES and design and write ETL Specifications
- Install and configure Informatica at offshore location
- Customize the existing mappings and optimize them for performance
- Write complex new mappings and test them
- Write Test Cases and perform integration testing
- Perform code reviews
Technologies
Win 2000, UNIX, Informatica Power Center 6.2, DB2