Data Architect Resume Profile
Personal Summary
- Kolawole has over 15 years of experience in information systems analysis and design. In addition, he has led various projects and collaborated strategically with key stakeholders, from Director through C-level, across enterprises.
- He has a formal education in Management Information Systems at the Master's degree level.
Capability Synopsis
- His experience and knowledge of Enterprise Data Architecture cover:
- Enterprise Data Strategy Roadmap
- Knowledge of TOGAF
- Knowledge of Big Data concepts
- Enterprise Data Modeling and Data Warehousing
- Creation and management of:
- Reference and Master Data
- Enterprise Metadata
- Enterprise Data Flow Diagram
- Data Governance Framework
- Enterprise Data Standards and Processes
- Enterprise Data Integration, featuring Data Extraction, Transformation and Loading (ETL) design
- Analytics and Reports
Technical Synopsis
- CASE Tools: ERwin Model Manager, E/R Studio, Designer 2000
- Databases: DB2, SQL Server 2000, Teradata, Oracle, dBase
- Metadata Repository: ROCHADE
- ETL Tools: SQL Server Integration Services (SSIS) 2008, Ab Initio Suite
- Document Management: Mercury Quality Center, SharePoint, StarTeam
- DB Query: SQL
Experience Synopsis
Confidential
Projects: Leading the transformation of the United Health One enterprise data environment to an architected one.
The effort entails requirements gathering, analysis, modeling, mapping, data integration, application coding, and business intelligence reporting.
- Created a data architecture vision to address stakeholders' concerns
- Created a gap analysis between the current environment and the target environment
- Created a roadmap of activities and resources for enterprise data assets, from current state to future state
- Created a strategy for a beginning-to-end proof of concept for:
- Data Integration Hub (a hybrid of Repository and Registry strategies)
- Operational Data Store and Atomic Data Warehouse
- Master Data
- Metadata
- Source Extraction, Staging, Profiling, Transforming and Loading
- Business Intelligence
- Created a dimensional data model to enable analytics and reporting of business processes from the Atomic Data Warehouse (an illustrative star-schema sketch appears at the end of this section)
- Created data standards and governance processes for:
- Internal processes, including:
- Requirements gathering
- Requirements Specification
- Peer Reviews
- Hand-off and sign-off of effort to other teams
- Testing, verification and validation of processes
- Knowledge management
- Project management, using agile methodology
- Outbound and inbound data flow from third party service providers
- Created a Data Flow Diagram to identify the enterprise's:
- Systems boundaries and the enterprise's data scope
- Data sources and destinations
- Data Processing points
- Storage points: Integrated Data Storage (IDS)
- Liaised with database vendors on proofs of concept for databases to accommodate data warehouse data and growth
- Liaised with modeling tool vendors for proofs of concept and licensing agreements
- Recruited human resources to enable the proposed architected environment:
- Data Modelers
- Quality Assurance Testers
- ETL Developers
- Manage the efforts of my data architecture team direct reports
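For illustration, a minimal star-schema sketch of the kind of dimensional model described above; all table and column names are hypothetical and are not drawn from the actual engagement:

    -- Illustrative star schema: one fact table with two conformed dimensions.
    -- All names are hypothetical examples, not objects from the engagement.
    CREATE TABLE dim_date (
        date_key      INTEGER      NOT NULL PRIMARY KEY,  -- surrogate key (e.g. 20240131)
        calendar_date DATE         NOT NULL,
        fiscal_period CHAR(6)      NOT NULL
    );

    CREATE TABLE dim_member (
        member_key    INTEGER      NOT NULL PRIMARY KEY,  -- surrogate key
        member_id     VARCHAR(20)  NOT NULL,              -- natural key from the source system
        plan_code     CHAR(3)      NOT NULL
    );

    CREATE TABLE fact_claim (
        date_key      INTEGER       NOT NULL REFERENCES dim_date (date_key),
        member_key    INTEGER       NOT NULL REFERENCES dim_member (member_key),
        claim_count   INTEGER       NOT NULL,
        paid_amount   DECIMAL(12,2) NOT NULL
    );

Analytics and reporting then aggregate the fact table by dimension attributes, which is what makes business processes queryable from the Atomic Data Warehouse.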
Confidential
Projects: HCSC enterprise data strategy, plans and policy to fulfill the Next Generation (NGEN) Health Insurance Marketplace
- Created a roadmap for HCSC enterprise data assets, from current state to future state
- Created a strategy for source data integration and loading to targets:
- Third-party service providers:
- Risk Analysis (RSA) third-party service provider outbound and inbound data
- Neolane Healthcare Plan Application Management System customers and prospects data
- Transitional Data Warehouse for Customer Intelligence: inbound Risk Analysis (RSA) data for Health and Wellness and Neolane
- Operational Data Store and Atomic Data Warehouse
- Master Data
- Subsidiary retail insurance data assets - Hallmark
- Created a dimensional data model to enable analytics and reporting of business processes from the Atomic Data Warehouse
- The business processes are:
- Managed Health and Wellness
- Enrollment, Eligibility and Fulfillment
- Shopping Experience
- Consumer Intelligence
- Created data standards and governance processes for:
- Requirements gathering
- Internal Data Processes, using agile methodology
- Outbound and inbound data flow from third party service providers
- Created a Data Flow Diagram to identify the enterprise's:
- Systems boundaries and overall enterprise data scope
- Data sources (ODS, RSA, Neolane) and destinations (ODS, RSA)
- Data processing points
- Storage points: Integrated Data Storage (IDS)
- Created dimensional entities from the Atomic Warehouse Model - Healthcare Plan Data Model (HPDM)
- Modeled inbound data from third-party service provider Risk Analysis (RSA) in the Transitional Data Warehouse (TDW)
- Reengineered the processes in the data modeling environment for the management of:
- Multiple HPDM model users in Model Manager
- Data model release strategy
- Metadata repository
- Change request
- Integrated message portal
- Collaborated with senior management and stakeholders on their concerns, views and viewpoints, and secured their buy-in during the re-engineering of enterprise data environment processes
- Managed the efforts of my data architect and data modeling direct reports
Project: FEP Claim Adjudication Streamline. This project integrated duplicate claim adjudication in Blue Chip and FEP Direct.
- Gathered requirements and created relational data models for the Claims Payment Schedule, Dynamic Mapping, User ID Cross-Reference and Message Queue Management solutions
- Created physical names and generated DDL for the optimized physical model (an illustrative sketch follows this list)
- Mapped metadata names to attribute names in the logical data model
- Facilitated peer review sessions for data model quality and its alignment with metadata
- Created and managed Change Requests for the inclusion of new objects and the modification of current objects in the Claim Transaction System (CTS) database
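As a sketch of the logical-to-physical step described above (the actual CTS objects are confidential, so all names here are hypothetical): a logical attribute such as "Claim Payment Scheduled Date" is abbreviated into a standards-compliant physical name, and the DDL is generated from the resulting physical model.

    -- Hypothetical DDL generated from a physical model; the abbreviated physical
    -- names (CLM = claim, PYMT = payment, SCHED = schedule) follow a naming standard.
    CREATE TABLE CLM_PYMT_SCHED (
        CLM_ID        INTEGER       NOT NULL,
        PYMT_SCHED_DT DATE          NOT NULL,
        PYMT_AMT      DECIMAL(11,2) NOT NULL,
        PRIMARY KEY (CLM_ID, PYMT_SCHED_DT)
    );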
Confidential
Data Architect Consultant, DISYS
Program: The integration of all HSBC Global data assets
- Created Architecture Vision for the Data Architecture as part of the strategy
- Created relational models for ICCM (Integrated Customer Communication Module): OHD (One HSBC Distribution), OHSS (One HSBC Sales and Services), and Trade Common Services
- Mapped enterprise data model objects to enterprise metadata objects
- Facilitated peer review sessions for data model quality and its alignment with metadata
Metadata Analyst
- Facilitated JAD sessions during the gathering of data attribute requirements for data models into the Requirements Spreadsheet (RSS)
- Created the enterprise metadata repository, and performed cyclical scans of validated Group Metadata Repository (GMR) metadata into production enterprise metadata in the ROCHADE Repository
- Created and enforced standards of best practice in the metadata environment, such as reuse of existing validated objects
- Generated physical names for attributes in the GMR in accordance with Oracle, DB2 and UDB database conventions (see the naming sketch after this list)
- Mapped physical-only elements to logical/physical and logical-only attributes
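A minimal sketch of standards-driven physical name generation; the abbreviation list below is hypothetical and is not HSBC's actual standard:

    -- Hypothetical controlled-abbreviation list that drives physical name generation.
    CREATE TABLE name_abbreviation (
        logical_word  VARCHAR(30) NOT NULL PRIMARY KEY,
        physical_abbr VARCHAR(8)  NOT NULL
    );

    INSERT INTO name_abbreviation VALUES ('CUSTOMER', 'CUST');
    INSERT INTO name_abbreviation VALUES ('ACCOUNT',  'ACCT');
    INSERT INTO name_abbreviation VALUES ('NUMBER',   'NBR');

    -- A logical attribute such as 'Customer Account Number' then becomes the
    -- physical column CUST_ACCT_NBR, kept within each database's name-length limit.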
Process Improvement Analyst
- Defined the Data Reuse Process for:
- A method of comparing Message Specification data with the GMR in ROCHADE in order to shorten data analysis turnaround time
- Data checks, verification and validation processes for data input from business consultants in the data analysis process
Data Governance
- Conducted an audit of the enterprise data governance framework to update all current enterprise data asset processes and practices
- Reviewed data fields and definitions for compliance with data naming standards
- Formulated the Change Request process, and ensured strict adherence to and management of the process
- Enforced data modeling standards across HSBC locations: Hong Kong, the United Kingdom, Canada and the United States
- Facilitated peer review sessions for quality assurance of the metadata analysis process
Confidential
Lead Data Architect
Program: The PepsiCo 1UP program integrated the data assets of the five divisions of PepsiCo Inc.
PepsiCo Enterprise Data Model
- Led and coordinated the efforts of a team of four data architects, six data modelers, three ETL developers, and DBAs
- Interfaced with C-level executives, Directors, Senior Managers, Subject Matter Experts, and other stakeholders during requirements gathering sessions, stage-gate meetings, change requests, business strategy evaluations and updates, etc.
- Created the Data Architecture Vision to address stakeholders' concerns, and the Statement of Architecture Work
- Created a data architecture strategy in alignment with PepsiCo's business goals and strategies (value discipline: Operational Excellence), and for the inclusion of corporate policies, principles, standards, and data governance
- Created Roadmap for PepsiCo's enterprise data architecture
- Published a white paper on the PepsiCo Enterprise Data Model (OLTP and Data Warehouse models)
- Data Integration Information Framework - Source Conversion Business Process Analyst (R7)
- Profiled and analyzed source legacy application master and transaction data in readiness for integration and migration to the target destinations: the Master Data Hub and the ODS, respectively
- Created detailed Functional Design and Technical Design documents for legacy source data migration to targets (SAP, Ab Initio data hub, etc.)
- Configured the data migration specification (source-to-target mapping) in the Master Specification document
- Configured business rules for the data migration specification (Master Specification) in Ab Initio's Business Rules Environment (BRE)
- Created and executed unit test cases for data conversion in the Master Specification
- Performed functional analysis in order to strike the right balance between consensus and direction, and between an inadequately simple and a perfect design solution, to move the team's deliverables forward for the Technical Design, Data Mapping Process and Master Specification
- Performed Technical Design, Data Mapping Process and Master Specification
- Presented the Technical Design Process design to stakeholders in business units for 1UP conversion processes
- Planned, assigned, coordinated and monitored business SMEs, technical analysts and developers for process design alignment under an integrated project plan
- Adopted the Inmon methodology for the creation of the Data Warehouse by customizing the Atomic Data Warehouse Model in the Teradata Manufacturing Data Model (MDM) for the PepsiCo Enterprise Logical Data Model (PELDM)
- Created dimensional models for business process data marts, and created mapping to the semantic layer for reporting (see the sketch after this list)
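To illustrate the Inmon-style atomic-to-dimensional step (all object names are hypothetical; the PELDM itself is proprietary), a data mart can be derived from atomic third-normal-form warehouse tables as a view, which the semantic layer then maps for reporting:

    -- Hypothetical mart view derived from atomic (3NF) warehouse tables.
    CREATE VIEW mart_weekly_shipments AS
    SELECT d.fiscal_week,
           p.product_line,
           SUM(s.shipped_qty) AS shipped_qty,
           SUM(s.net_amount)  AS net_amount
    FROM   atomic_shipment s
    JOIN   atomic_product  p ON p.product_id = s.product_id
    JOIN   atomic_date     d ON d.date_id    = s.ship_date_id
    GROUP BY d.fiscal_week, p.product_line;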
Process Framework Design
- Designed a process document to manage data model releases to the DBA, EDM and ETL teams
- Created a fit-gap analysis for the future-state model release process
- Analyzed source enterprise data using the Ab Initio profiler (a SQL equivalent of typical profiling checks is sketched after this list)
- Mapped and loaded:
- Source metadata into the Enterprise Metadata Environment (EME) datastore
- SAP source data to the PepsiCo Enterprise Logical Data Model (PELDM)
- Migrated the PepsiCo Enterprise Logical Data Model (PELDM) from ERwin to the E/R Studio CASE tool
- Created and managed the enterprise data model repository, data model check-in and check-out strategies, and the model release strategy
- Collaborated with stakeholders during the creation of the PepsiCo Enterprise Logical Data Model (PELDM) environment management process for:
- Business requirements validation
- Data Model Repository Objects Naming Strategy
- PepsiCo Enterprise Logical Data Model (PELDM) validation and release framework strategy
- PELDM change management in StarTeam
- Designed the Change Request Management process for handling EME users' requests for access to the EME
- Designed a workflow based on users' categories and requirements, which were analyzed to grant them appropriate access; the documentation was used for user training
- Created workflows for change management process
- Presented the process design to the user community for critique and acceptance
- Created and executed unit and system test cases for the document in Mercury Quality Center
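The following is a plain-SQL equivalent of the kind of checks a profiler performs (table and column names are hypothetical; the actual work used the Ab Initio profiler):

    -- Hypothetical profiling query: row count, null rate, cardinality and range
    -- of a single candidate-key column in a legacy source table.
    SELECT COUNT(*)                                         AS row_cnt,
           SUM(CASE WHEN cust_id IS NULL THEN 1 ELSE 0 END) AS null_cnt,
           COUNT(DISTINCT cust_id)                          AS distinct_cnt,
           MIN(cust_id)                                     AS min_val,
           MAX(cust_id)                                     AS max_val
    FROM   src_legacy_customer;

If row_cnt equals distinct_cnt and null_cnt is zero, the column is a viable natural key for integration.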
PepsiCo Enterprise Metadata Framework Process
- Designed the Metadata Framework process for metadata acquisition from numerous data sources into the Enterprise Metadata Repository
- Gathered requirements from numerous user sources and logged them in Caliber. Worked with the Data Warehouse Realization (DWR) team to source and define metadata, using the SAP Logon 640 client for research into Development 2 (BW 3.5, SID D22), and to port metadata from BW extracts to the EME.
- Profiled heterogeneous data sources for migration to multiple target destinations: SAP EDW and the Ab Initio HUB
- Created detailed functional design for domain metadata framework requirements in Caliber
- Created and executed unit test cases for domain metadata framework in Mercury Quality Center
- Managed test case defects
- Created Metadata Operational Template and Domain Operational document in the BDD environment, for user training and knowledge management
Global Enterprise Metadata Administration
- Designed EME customization in terms of identified EME objects and the objects required to support business requirements
- Created shell scripts and graphs in Ab Initio GDE for design implementation in the EME
- Wrote test cases for Access Control and Security in the Enterprise Metadata Environment data store, and executed them in Mercury Quality Center
- Created and maintained an events log and environment process narratives to enable knowledge management and continuity in the environment
Confidential
Projects: Bank One and JPMorgan's legacy CRM Data Interface Rewrite project.
- This project compared the systems and processes of Bank One with those of JPMorgan, in order to keep and use the better of the two banks' systems and processes during their merger.
- Gathered database interface redirection requirements from SMEs
- Profiled, analyzed and created ETL requirements for Bank One's and JPMorgan's:
- Business processes
- Core data table interfaces
- Created the interface redirection requirements specification from the profiling and analysis of source data and processes
Confidential
Projects: OLTP Rewrite
- Conducted gap analysis in order to deduce the best design path for the realization of user requirements for the following projects:
- CruiseCritic, Behind The Smiles, Fuel Consumption, Booking Incentives, Message Center, Location Reference, Marketing Promotions, Port Security, Sales Region Location, Luggage Bag Tag, Advance Boarding Pass System, and Itinerary Change Notice
- Created a logical data model to bridge flat-file data and relational database data, and generated the logical and physical models for the above projects
- Created source-to-target mapping of data for the daily refresh from the shore-side to the seaboard database
- Worked in liaison with DBAs, business analysts, and developers during physical model implementation to determine:
- Appropriate indexes on table keys
- Primary key type to use: natural keys or surrogate keys
- Appropriate view objects to create for bridging flat-file data with relational data: a join-table materialized view, or a join-table aggregate view (an illustrative sketch follows this list)
- Facilitated requirements gathering and follow-up meeting sessions before and during the projects mentioned above
- Authored the Design Standards Document in line with standards of best practice for the business data environment
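As an illustration of the view-bridging choice above (hypothetical names, assuming an Oracle-style materialized view over a staged flat-file extract):

    -- Hypothetical join-table materialized view that bridges staged flat-file
    -- data with a relational reference table for the daily refresh.
    CREATE MATERIALIZED VIEW mv_booking_port AS
    SELECT b.booking_id,
           b.sail_date,
           p.port_code,
           p.port_name
    FROM   stg_booking_flatfile b   -- staged flat-file extract
    JOIN   ref_port p               -- relational reference data
           ON p.port_code = b.port_code;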
Confidential
Data Architect - Consultant
Projects: OLTP
Datacom database data migration into a DB2 database
- Created the Data Element Naming Standards Guidelines document for enterprise implementation to promote data sharing and reuse
- Created relational data models for Risk Management Information Systems data in the legacy mainframe Datacom database
- Created the data model for the EDI (Electronic Data Interchange) Administration System
- Integrated claims data islands in the Datacom database and other sources, in liaison with the data acquisition group, in preparation for migration to the DB2 database
- Created source-to-target mapping during the migration of claims data from Datacom to the DB2 database (see the sketch after this list)
- Created and defined the data model for the Rules Center expert system implementation for an Alert Center, as part of the enterprise-wide data model
- Created the Data Administration Charter and Procedure document
- Held training classes on relational database design in order to promote the benefits of enterprise and relational database design
- Analyzed and authored the metadata repository strategy for the Risk Management Information System
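A minimal sketch of how a source-to-target mapping row translates into a load statement (all names and rules here are hypothetical, not the actual Datacom or DB2 objects):

    -- Each target column traces to a source field plus a transformation rule
    -- taken from the mapping specification.
    INSERT INTO dw_claim (claim_id, claim_status_cd, paid_amt)
    SELECT c.clm_no,                            -- direct move
           CASE c.stat WHEN 'O' THEN 'OPEN'
                       WHEN 'C' THEN 'CLOSED'
                       ELSE 'UNKNOWN' END,      -- decode legacy status code
           c.paid_amt / 100.0                   -- legacy field carries an implied decimal
    FROM   stg_datacom_claim c;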
Confidential
Projects: Data Migration and Integration
- Gathered and analyzed requirements for the integration of disparate data application sources
- Liaised with technical teams (Enterprise Application Integration and Database Administrators) in order to design efficient data extraction and conversion techniques to service business requirements
- Integrated data from disparate application sources in an XML based message-brokering system for the Client Split/Merge project
- Migrated data from source systems (Publishers) to target tables so that the data are available to Subscribers in an acceptable format.
- Documented data transformation rules for program specification during data mapping
- Researched and analyzed tag names from the ACORD XML and Code Management Standard in order to name Allstate's Property and Casualty data elements
- Named atomic tag elements in compliance with ACORD tag naming convention and standard
- Performed logical data modeling in the ERwin environment
Confidential
Projects: CAPA III Project; LIMS MediSense Data Integration into the IDEA Data Warehouse; CRM Project
- Gathered and analyzed user requirements (relational data analysis by normalization), designed and documented business processes, and communicated system change requirements to fellow data modelers
- Logically modeled enterprise and project data with their relationships, and transformed them into physical objects in Designer 2000
- Generated the CAPA schema and SQL scripts from the data model in Designer 2000 for the DBA
- Documented table usage by functions with the aid of a CRUD matrix in Designer 2000 in order to validate the data model's representation of use cases
- Created a dimensional model for the sales department's OLAP system
- Prepared program specification for developers in the form of Source-to-Target mapping
- Collaborated with the Data Architect and DBA during the migration of MediSense data into the IDEA data warehouse
- Participated in Performance Qualification Testing
- Analyzed and logically modeled data for the Customer Relationship Management project; worked in liaison with Sales, Customer Support, and business users during systems and data analysis
- Used ROCHADE repository for enterprise metadata management
- Administered ROCHADE Repository users' needs and performed monthly data scans from the IDEA data warehouse into the ROCHADE MVS Repository Server
- Analyzed, recommended, and implemented a seamless information-generation strategy for ROCHADE Repository users
- Analyzed, recommended, and implemented a revised standard of best practice for the Oracle CASE tool during housekeeping
- Created and defined standard table and column names in accordance with Oracle naming conventions and Abbott's standards of best practice
- Used the Cognos Impromptu reporting tool to run queries against IDEA tables and columns for impact analysis during change management and other maintenance purposes (a sketch of such a query follows)
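A sketch of the impact-analysis idea (the query below runs against Oracle's catalog view all_tab_columns rather than the Impromptu catalog, and the column name is hypothetical):

    -- Find every table carrying a column slated for change, so downstream
    -- reports and ETL jobs can be assessed before the change is approved.
    SELECT table_name, column_name, data_type
    FROM   all_tab_columns
    WHERE  column_name = 'CUST_ACCT_NBR'  -- hypothetical column under review
    ORDER  BY table_name;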