Lead Oracle DBA Resume Profile
NJ
OBJECTIVE: To obtain a position managing and delivering innovative solutions.
SUMMARY OF QUALIFICATIONS:
Confidential, Solution and Enterprise Architect with Project Manager experience in data governance and ETRM SolArc RightAngle solutions, and fifteen years of experience managing, designing and implementing applications and systems with web-enabled and client-server technologies and relational databases on Oracle/UNIX, DB2, SQL Server and Teradata platforms. 4 years of Ab Initio experience, 2.5 years of Informatica and 1 year of OWB implementing high-performance ETL code for data warehouses, plus 5 years of Business Objects and 3 years' experience with Dataflux and Optim archiving. Built data marts for Marketing, Financial and Sales CRM databases in the Financial, Retail, Healthcare and Telecom industries. Specialization in Business Intelligence (Business Objects, Discoverer, Dataflux), Customer Relationship Management (CRM), Data Warehousing, Data Mining and Oracle Financial Applications. MBA from DePaul, Chicago (ranked #5 nationally among part-time MBA programs in 2000); also completed an MS in Computer Engineering in 1994. Strong technical and communication skills. Goal-oriented, client-focused, bottom-line-oriented team player. Published two research papers in international computer journals. Managed and delivered US $10 million solutions to customers and generated multimillion-dollar sales for consulting companies.
PROFESSIONAL EXPERIENCE:
Lead Oracle DBA
Primary responsibility: maintaining data quality and leading a team of Dataflux consultants to keep the DQ platform up and running on UNIX servers. Maintained DEV/QA/PROD/DR platforms with DB2/Oracle/Teradata/SQL Server data stores to deliver DQ monitoring using Web Studio and DM Studio. Also used US Postal Service data to assess data quality. Automated data profiling using DM Server and established Autosys scheduling.
Enterprise Architect
Established the Data Foundation for Business Units and Petrotechnical Functions (Reservoir Management, Drilling and Completions, Base Business, and Facilities/Equipment) by engaging stakeholders, setting expectations, and applying strategy and architecture guidelines and governance. Documented design instructions for deploying enterprise-wide standards: Systems of Record and Application Information Standards such as Wellview 9.0 for reservoir pressure tests. Set up Architecture Maturity for Chevron Upstream and engaged with the overall architecture community to develop the Target Architecture for Petrotechnical Functions for future years. Created roadmaps and processes for transitioning to the Target Architecture using gap analysis and planning portfolios. Created a taxonomy of products in Enterprise Architecture. Worked with the System Architect tool as an enterprise architecture repository to maintain artifacts and models/diagrams, and developed the Upstream strategy for using System Architect as the system of record for EA artifacts. Used the UPAP process to help incorporate standards. Additionally, developed Upstream Data Elements containing enterprise information objects and architecture-related information usable by the business and the IT management team.
Solution Architect
- Solution Architect for the Request-to-Pay e-commerce OLTP system, which handled customer requests in real time for products and services inside Shell, supported tens of thousands of customers, and integrated with 15 other systems internally and externally. Used BizTalk for real-time messaging, with SQL Server and .NET for the application. Responsible for end-to-end requirements management and capacity/configuration management, as well as functional specifications, design and development for 2 parallel releases. Drove performance, scalability and quality improvements through the tooling aspects of the project. Developed the Business, Information, Application and Reporting Architectures and presented them to the leadership team. Gained stage-gate approval through the Architecture Review Board. Documented risks/issues and drove them to resolution. Implemented SharePoint FAST Search for advanced search capabilities and used ProVision for business processes.
- Helped with operational management and analyzed end-to-end business processes, integration bottlenecks and master data management improvements by enabling metadata, documentation and process management. Helped ensure end-to-end reporting and monitoring capabilities were established and agreed upon. Engaged with the K2 Monitoring CoE.
- Analyzed performance and scalability issues in the analytical data warehouse. Traced and suggested improvements to the data model and business processes to improve performance and data quality.
Data Architect
Confidential: drove the business case, engaged stakeholders to drive the solution, defined business processes using ARIS, and ensured best fit for BP. Maintained standards, delivered the Architecture Quality Plan, adhered to non-functional requirements, and addressed overall Enterprise Architecture. Specific solutions worked on:
- Assessed ETRM vendors (e.g. Allegro, OpenLink, TriplePoint, SolArc) and recommended the solution with the least cost and highest benefit; SolArc was chosen. Developed a methodology for mapping functions to data systems, used in vendor selection.
- Managed a team of data analysts and developers to gather requirements, create design artifacts and deliver Informatica ETL code for a BI data warehouse solution. Created the logical data model and Informatica mappings, and managed integration requirements and delivery considerations for 4 downstream projects. As project manager, created and managed the project plan.
- Facilitated data migration and integration to produce best-in-class results in BP IST. Finalized the scope for the Integration Architecture and obtained buy-in from the business. Managed the reporting and business intelligence efforts for the program and outlined the strategy and planning around them.
- Led the data quality initiative inside BP using Dataflux. Deployed Dataflux in our project using Shared Services and used it for Master Data Management between SAP, SolArc, ETC and other applications. Encouraged the use of data quality best practices to achieve high-performance results in the project.
- Drove archiving and purging best practices using Optim throughout BP. Managed the archiving and purging architecture for the Oasis program, including its requirements, and drove the solution. Used advanced custom data subsetting and archiving functions with Optim Basic. Developed Optim discovery and analysis processes. Performed Information Lifecycle Management archiving of production systems, with reporting off the archived repository on cheaper storage. Implemented the requirement to archive multibillion-row tables with minimal impact to production; the solution required automating archive files and loading them into a secondary archived repository, then using Attunity to integrate Business Objects with it.
- Encouraged use of ER/Studio for data lineage within the data architecture community.
- Served as Data and Integration Architect on the NAGP NGL Oasis programme. Drove the strategy and architecture for the program's data migration activities from Select to Define and Execute in 2010. As Delivery Lead for the Data Migration and Integration workstream, ensured program objectives were met within budget and time.
- Identified the technology, methodology and type of integration for each integration point, using ETL, IL or SAP/PI where needed. Took a leadership role in driving the SAP/BW solution for the NGL Portal. Drove the solution for ETL integration, including Credit Risk, Market Risk, etc. Estimated the integration delivery effort and hired resources as needed.
Confidential
- Responsible for the end-to-end solution: building and architecting a star-schema data mart sourced from the OLTP database and delivering a reporting solution for the project. Designed the Business Objects universe and led the team to develop reports and deploy them to production. Tool selection criteria for data analytics and ETL were part of the project.
- Delivered a project with Informatica to move data from source to target using both batch and real-time Change Data Capture. Implemented real-time Master Data Management using messaging and a universal TIBCO bus.
- Architectural responsibility for driving functional requirements via use cases and change requests to deliver Scheduling and Logistics/Inventory Management of Physical Operations for BP. Analyzed requirements and led the effort to create design and development artifacts, architecting a highly available, top-performing solution. Performance, availability and usability criteria were built into the design, balancing each criterion with a cost/benefit analysis of the different options.
- Integrated SAP and Oracle NextGen systems in real time with messaging and data conversion/cleansing activities for Master Data Management using Dataflux, and deployed the Dataflux tool in our project using the Shared Services architecture. Understanding the NextGen and SAP data models, as well as the data itself, was an important primary activity. Physical deals, exchange deals, shipments, parcels and master data were culled for loading into the application's IMOS database.
- Managed a team of 3 data modelers and 4 ETL programmers, and guided a team of data analysts and business analysts, to deliver the data architecture for a US $125M project. Used MS Project to deliver the solution and implemented standards, guidance and tools across the project and the enterprise.
- Led the design of the XML message structure used in real-time data movement via a Canonical Data Model. Responsible for delivery of a quality data model, data integration and data quality. Used a metadata repository to keep source-to-target mappings and delivered real-time data movement from multiple sources into the OLTP database. Delivered high-quality transactional and master data from SAP and other sources into the target environment using ETL automation for data conversion and cleansing.
- Drove data quality expertise and architecture using standards and best practices in BP.
Technology used: Oracle, SQL Server, TIBCO BW, SAP data integration using IDocs, Dataflux 8.1, Java, Business Objects XI R2, Informatica 9.0.1, Optim, and ER/Studio 8.0.1.
Datawarehouse Architect
Confidential
- Worked as a solutions architect for Acxiom to integrate customers' data and business processes, such as Customer Data Integration (CDI), to enable campaign management and data analytics. Employed database-driven multichannel marketing using tools like Unica to create a CRM application with Business Objects XI R3, providing a 360-degree customer view and facilitating marketing across diverse channels. Led the analysis, requirements, design and development efforts to enable business needs for a multi-terabyte data warehouse. Used Oracle, UNIX, ERwin and DataStage for data modeling, source-to-target mappings and architecture, and created design deliverables. Embedded data quality and data governance for Master Data Management in the solution for client data requirements, and implemented internal processes to cleanse, convert and load data into the data warehouse from over 16 data sources after profiling and analyzing them. Also designed, architected and led the build of a bespoke Test Data Management utility for the customer, designing the data model and the front-end and back-end processes to move data via ETL from the data warehouse into the testing environment in real time, based on customer needs.
- As the data architect, led the design and development of Optim 8.1-based archiving from 50 SQL Server databases and 24 Oracle databases from production into the test environment. Employed source analysis, automated DDL creation and data-model reverse engineering to automate the extract-and-load process. Also used Optim for data masking, and used IBM Access Definitions to source from Oracle E-Business Suite into the target database.
- As Integration Architect, created the reference architecture for integration patterns, including Level 1, 2 and 3 architectures: an end-to-end integration data flow diagram, a layout of the different technologies, categories and use cases for integration, and documented reference patterns for the data flows.
- As the data architect, led the design and development of an Oracle 10g data warehouse using Oracle Warehouse Builder (OWB), Cognos and Discoverer. Extracted data from Oracle Apps sources into the target using materialized views, with OWB running on top.
- Implemented continuous-flow, XML-based data movement in real time from SAP 4.7 data sources to a canonical model, based on the design of the 20 TB Teradata data warehouse and Oracle Financial Apps, as well as i2. Wrote best practices and design methodology documents, and used the Ab Initio metadata tool EME to source from ERwin models to create business rules. Used Ab Initio queues and JMS-based queues in publish/subscribe mode. Architected the data warehouse model and created ETL mappings from many sources to the data warehouse. Designed and implemented data quality for Master Data Management.
- For this Unica-based campaign management project, mapped business rules in an MS Access database and wrote a code generator that produced the Ab Initio transformations, so the code was written only once. Loaded a DB2 database using an ETL server on the Windows NT platform. Also initiated design process flows for communication between in-house Unica sources and Experian outside. Used the First Logic application to cleanse the data for addresses and other data management.
- Implemented EME on this Ab Initio project and streamlined code management. Developed graphs, and converted existing graphs using vectors, to source data from various legacy COBOL-based data sources and push it into a new Teradata data repository from an Oracle RDBMS. Source-to-target mapping was done in an MS Access database.
Sr. Enterprise Database Architect
- Manager and Lead Enterprise Architect for dev/test/production systems of 9i and 10g databases, leading a team of 3 DBAs on Oracle 10gR2. Guided and mentored the team and the company toward scalable, highly available systems using RAC clusters and RMAN, including a 2 TB OLTP production system. Managed 2-3 outside contractors for DBA support, giving direction and setting strategy for the team.
- Implemented a data archiving and purging strategy using Optim that reduced EMC storage cost by 50%, yielding savings of more than US $800,000. Drove Optim subsetting for testing requirements and for enabling subset copies of production. Automated archiving, purging and restore to a secondary data store on cheaper EMC storage using Optim in a scheduled daily batch program. Maintained referential integrity between all entities in the archived repository using Optim.
- Budgetary and financial responsibility for a US $1.7M annual outlay involving the following:
- Enterprise lead architect for designing the company's Oracle data warehouse and ETL processes. Executed tool selection, resource allocation, capacity planning and budgetary implementation. Designed the ODS and data warehouse models, as well as implementing the ETL and BI front-end strategy for reporting needs. Implemented the reporting solution using Oracle Discoverer.
- Implemented Master Data Management for reference and master data across multiple in-house applications. Set up data governance for managing master data in the enterprise, using automation and manual processing where optimal.
- Set up Oracle 10g iAS Portal/Discoverer/Reports environments and architecture for development, test and production. Set up change control for reporting and scheduling of Discoverer analyses.
Sr. Managing Consultant
- Led a team of data architects, Oracle/Teradata DBAs, data modelers and Ab Initio ETL developers to build a collections data warehouse. Designed and implemented the database model, the physical database on Oracle, and data movement using Ab Initio to push data from legacy and analytics sources into the collections repository. Designed the star-schema database model from business requirements and refined it. Configured development, test and production ETL servers and databases for Oracle 9.2 (9i) and Teradata RDBMS.
- Managed teams of 5-25 people, including client employees, IBM employees and third-party consultants.
- Led the Ab Initio ETL team, doing hands-on development with 5 Ab Initio developers and designers to build and implement a banking financial-system data warehouse. Developed 25 graphs to extract data from different sources, transform it, and load it into a central Oracle and Teradata database. Used EME for code management and to track the performance of individual jobs.
Lead Oracle DBA
- Led and managed e-commerce projects and solutions with a variety of clients (e.g. Walgreens, ServiceMaster, DISCOVER/NOVUS, US WEST) to manage, design and implement highly available solutions. Worked with Very Large Database (VLDB) systems using Oracle on open platforms with clustered UNIX computers and web technologies. Emphasis on OLTP and decision-support warehouse systems.
- Managed a team of DBAs and IT staff for this leading Internet startup. Designed the logical and dimensional models for the data warehouse. Moved the company to a highly scalable, highly available co-location facility with Exodus and Storage Networks, reducing downtime from 5 hours per day to less than 5 minutes per month. Reported to the Vice President of Operations and helped move the company's business model from B2C to B2B to become a managed streaming media service provider. Built the reporting and data warehouse database on Oracle for the dot-com and dot-net models on Sun UNIX/Oracle servers.
- Led a team of data architects, Oracle DBAs and batch Ab Initio ETL developers to build an in-house Accounts Receivable financial data warehouse for Walgreens. Used Ab Initio to push data from 81 different legacy sources and DSS sources, via a canonical data model, into the PARS repository. Designed the star-schema database model from business requirements and implemented it. Helped implement a standby database using EMC's BCV/SRDF-based TimeFinder. Set up development, test and production configurations/databases and designed the object and database models on Oracle 9.2 (9i) using ERwin and Rational Rose (limited experience on Rose). Set up physical database parallelism, partitioning, indexes, etc. to tune the database; improved application performance, in some cases by a factor of 10. Used PL/SQL to tune application performance. Handled data warehouse batch processing for this 4-terabyte Oracle 9i database on IBM's AIX UNIX machine. WebSphere and Apache were the web/app servers. A team of 30 people on this full-lifecycle project used Java and Business Objects for front-end online access. Also put in place the beginnings of data governance for Master Data Management at Walgreens.
- For this financial services company, built an e-business Customer Relationship Management (CRM) application using an Oracle 8i database. Designed and implemented the database model on IBM's UNIX web server. The application required a scoring and modeling SAS data mining application running on 48 processors. Ab Initio was the ETL tool used to extract and push data into data warehouses and other databases. An Apache server provided web-enabled intranet access for the application's end users.
- For this health industry company, helped manage, design, create and implement an e-commerce solution with an Oracle 8 data warehouse, from design through end-user tool access. The platform was IBM's UNIX web server using Apache for web access, in a 3-tier solution with the mid-tier web server being a Windows NT machine running IIS for intranet access. Helped choose the OLAP tool, implement the ETL process, and set up the data warehouse and database.
- For this management services company, tuned an e-commerce application using Oracle's Web Application Server and WebDB. Also resolved performance issues of a homegrown web-enabled application using an EMC disk farm, bringing response time down by 500% (5 times) after analyzing the database and the application. Partitioned the database server between different processors to minimize resource contention. Helped train the in-house DBAs and provided a baseline metric analysis for the DBAs to work with.
- For this telco, worked as an Oracle Data Warehouse Architect to re-design the data warehouse, including changing backup strategies and tuning the database and the Extract/Transform/Load procedures. Added 3 more consultants at the client site, resulting in additional business for IBM. Tuned the data warehouse and made recommendations for purchasing a new 12-processor machine, an approximately US $1 million deal; the deal was successful.
- For this telco, worked as a performance tuning specialist on a fraud detection project running on Sun/Oracle/HDS. Diagnosed problems on the Oracle database running on the Sun machine, which allowed my company to win a large follow-on consulting project and led to a sales proposal of US $1.8 million to replace Sun with IBM.
- For this furniture company, worked as a DSS Business Intelligence Architect gathering departmental (Product Marketing, Purchasing, Finance, etc.) requirements by conducting end-user interviews as part of the BBA. The interviews resulted in a technology feasibility matrix that allows a company to pursue one particular data mart and stay focused. Also worked with Decision Point Applications (DPA), a product that provides a canned Oracle data warehouse sourcing data from an Oracle Financial Applications database.