Enterprise Data Architect / Consultant Resume
SUMMARY
- Strategy, design, and development of on-premises and cloud-based enterprise- and solution-level BI and data applications
- Ability to explore, assess, and apply new enabling technologies to support business needs
- Able to interface successfully with internal and external stakeholders at all levels of a corporation and create insightful presentations and communications for both business and technical audiences.
- Ability to lead/gain consensus among businesses and teams to collectively achieve objectives
- Development and technology evaluation skills
PROFESSIONAL EXPERIENCE
Confidential
Enterprise Data Architect / Consultant
Responsibilities:
- Model data for an enterprise-wide Human Resources / Human Capital Management application. The data model stores information to track requisitions, candidates, recruiting/interview activities, offers, and positions, and to define and track employee key performance indicators as well as corporate ratings and reviews. The model supports BI dashboards and reports.
- Design and implement Informatica mappings for the Quality in Patient Care (QIP) application that source data from various legacy operational data stores and consolidate it into an Oracle 11gR2 data mart for BI/reporting access. Review and optimize the physical model to improve load and query times.
- Integrate the operational Patient Population Analytics Oracle data store into the federated enterprise Netezza data warehouse via Informatica workflows. Design and implement Informatica workflows to preload 1 billion fact records and to provide monthly updates of the related reference data history.
- Port Informatica workflows for client applications from the legacy Netezza data warehouse so they source and load data to and from the new federated enterprise Exadata data warehouse.
- Create logical and physical data models to store monthly patient care submission and feedback information for compliance with the NHSN (National Healthcare Safety Network). Design and implement Informatica mappings to shred and convert XML feedback files received from NHSN into the Oracle data mart's relational representation (see the sketch after this list).
- Write the Technical Specification Document (TSD) for the Quality in Patient Care application to implement the Centers for Medicare &amp; Medicaid Services' new 2017 measurements for patient care at affiliate dialysis centers.
- Executive summary: feature/implementation analysis and evaluation of a cloud-based SaaS BI application, implemented in Logi BI and SQL Server, that supports affiliate nephrology practices in managing key performance indicators via clinic-, practice-, and physician-level dashboards and reports. Inventoried data subject areas and entities; assessed ease of development and change, performance, security, HIPAA compliance (privacy, change logging, auditability), scalability, and maintainability.
- Provide support to client QA team for data validation during acceptance testing.
- Attend prospective client meetings as a development partner for the cloud-based SnapLogic integration tool, to investigate and promote its potential use in client efforts to migrate their enterprises to the cloud on a single integration platform.
- Attend prospective client meetings as a development partner for ThoughtSpot, to investigate and promote its potential use for more dynamic, search-driven BI capabilities.
- Participate in evaluation and development work with partners offering the latest data technologies, such as the cloud-based Snowflake data warehouse.
- Contribute content for company marketing materials.
- Influence daily process for managing projects and staff to allow better tracking of project and non-project activities.
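A representative sketch of the XML-shredding approach referenced above. The staging table, target table, and element names (nhsn_feedback_stage, nhsn_feedback, facilityId, and so on) are hypothetical stand-ins rather than the actual NHSN schema, and the production logic was built in Informatica mappings rather than hand-written SQL:

    -- Hypothetical example: shred an XML feedback document held in an XMLType
    -- column into relational rows for the Oracle data mart.
    INSERT INTO nhsn_feedback (facility_id, report_month, measure_code, measure_value)
    SELECT x.facility_id, x.report_month, x.measure_code, x.measure_value
    FROM   nhsn_feedback_stage s,
           XMLTABLE('/feedback/measure'
                    PASSING s.xml_doc
                    COLUMNS facility_id   VARCHAR2(20) PATH 'facilityId',
                            report_month  VARCHAR2(7)  PATH 'reportMonth',
                            measure_code  VARCHAR2(10) PATH 'measureCode',
                            measure_value NUMBER       PATH 'value') x;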
Confidential
Data Architect Consultant
Responsibilities:
- Member of a small agile consulting team providing redesigned Securities Finance (SF) capabilities based on a new Entity Profile reference MDM.
- Responsible for design and delivery of an industry-best MDM conceptual and logical data model to support consolidated securities lending capabilities for Agency and Principal/Enhanced Custody SF products. This involved:
- Review of online and offline SF onboarding, operational, and administrative processes for both the Agency and Principal/Enhanced Custody product offerings
- Review of applications and data stores in the existing SF enterprise landscape that would need to be integrated with the enterprise SF solution
- Review of existing SF entity profile data sources, supporting technologies, and the technology roadmap
- Requirement-gathering sessions with stakeholders, including the Credit and Risk departments, the Agency and Enhanced Custody (EC) onboarding teams, Data Governance, and technical support teams.
- Create the logical data model for the reference MDM DB in Visio, with approximately 180 entities and their relationships. Import the model into a Microsoft Access micro-application I developed to automatically generate data model documentation.
- Create a mapping and integration plan from legacy data elements to the new enterprise reference data model, with an approach for cleansing and de-duplication
- Modeled new mission-critical capabilities:
- Rules and Parameters for declarative control of operational risk, compliance, and credit.
- Activity-specific organizational hierarchies for generating Netting Sets to support compliance with Regulations such as Basel II/III.
- Legal Agreements and how they govern entity participation in SF Lending Programs.
- An elastic, data-driven design that allows new entity types and their attributes and relationships to be defined to support industry evolution (see the DDL sketch after this list)
- Contributed to Solution Architecture, Road Map, Data Integration, and Data Governance work streams
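An illustrative DDL sketch of the elastic, data-driven design mentioned above. Table and column names are hypothetical, not the delivered model; the point is that new entity types, attributes, and relationship types become rows rather than schema changes:

    -- Hypothetical sketch: entity types, attributes, and relationship types are data,
    -- so new SF entity kinds can be introduced without altering the schema.
    CREATE TABLE entity_type (
        entity_type_id  NUMBER        PRIMARY KEY,
        type_name       VARCHAR2(100) NOT NULL
    );
    CREATE TABLE attribute_def (
        attribute_id    NUMBER        PRIMARY KEY,
        entity_type_id  NUMBER        NOT NULL REFERENCES entity_type (entity_type_id),
        attr_name       VARCHAR2(100) NOT NULL,
        data_type       VARCHAR2(30)  NOT NULL
    );
    CREATE TABLE entity (
        entity_id       NUMBER        PRIMARY KEY,
        entity_type_id  NUMBER        NOT NULL REFERENCES entity_type (entity_type_id)
    );
    CREATE TABLE entity_attribute (
        entity_id       NUMBER        NOT NULL REFERENCES entity (entity_id),
        attribute_id    NUMBER        NOT NULL REFERENCES attribute_def (attribute_id),
        attr_value      VARCHAR2(4000),
        PRIMARY KEY (entity_id, attribute_id)
    );
    CREATE TABLE relationship_type (
        rel_type_id     NUMBER        PRIMARY KEY,
        rel_name        VARCHAR2(100) NOT NULL
    );
    CREATE TABLE entity_relationship (
        from_entity_id  NUMBER        NOT NULL REFERENCES entity (entity_id),
        to_entity_id    NUMBER        NOT NULL REFERENCES entity (entity_id),
        rel_type_id     NUMBER        NOT NULL REFERENCES relationship_type (rel_type_id),
        PRIMARY KEY (from_entity_id, to_entity_id, rel_type_id)
    );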
Confidential
Business Intelligence (BI) for Product Distribution
Responsibilities:
- Member of a small agile consulting team providing the roadmap, operating model, solution architecture, and data architecture for industry-best BI capabilities
- Responsible for accelerated delivery of an industry-best fund distribution conceptual and logical data model
- Reviewed business processes for fund distribution and recommended BI capabilities to create the initial conceptual model
- Reviewed applications in the enterprise landscape that would need to be integrated with the BI solution, including CRM (SalesFocus, Salesforce) and marketing (Pardot, MarkIT) systems
- Reviewed existing data sources and reporting processes
- Created the logical data model for the reference MDM DB in Visio, with approximately 150 entities and 300 relationships, providing comprehensive support for Finance, Product, Marketing, Sales, and executive-team BI dashboards and analytics.
- Created the logical data model for the analytic MDM, which includes specialized entity-characteristic and segmentation support for slicing and dicing data in advanced BI analytics (see the DDL sketch after this list)
- The data model was well received by the business
- Contributed to Operating Model, Road Map, and Solution Architecture work streams
- Researched BI technologies (Birst, Tableau, Oracle BI Enterprise Edition, Cognos BI 10.2) to determine whether a living prototype of the report requirements could be built.
- Created a Microsoft Access micro-application to import the Visio model, enter and maintain entity/relationship documentation, and generate deliverable documents as well as database/technology-independent DDL.
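An illustrative, technology-independent DDL sketch of the segmentation support mentioned above (hypothetical names, not the delivered model): segment definitions plus a bridge table so clients or accounts can carry any number of analytic segments for slicing and dicing in BI tools:

    -- Hypothetical sketch: segment definitions and a client-to-segment bridge,
    -- so new segmentation schemes can be added as data rather than schema changes.
    CREATE TABLE segment (
        segment_id    INTEGER      NOT NULL PRIMARY KEY,
        segment_type  VARCHAR(50)  NOT NULL,  -- e.g. channel, region, advisor tier
        segment_name  VARCHAR(100) NOT NULL
    );
    CREATE TABLE client_segment (
        client_id     INTEGER      NOT NULL,
        segment_id    INTEGER      NOT NULL REFERENCES segment (segment_id),
        valid_from    DATE         NOT NULL,
        valid_to      DATE,
        PRIMARY KEY (client_id, segment_id, valid_from)
    );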
Confidential
Director of Data Development
Responsibilities:
- Design and implement a historic market data pricing data warehouse in Oracle 11gR2 using Oracle SQL Developer and Oracle analytic functionality, at both the logical and physical data level. Apply partitioning, compression, and hash indexing to optimize storage and retrieval performance for both bulk loads and data access queries/updates (see the first sketch after this list).
- Build a data access layer that provides fast time-series and latest-price retrieval, implemented as PL/SQL stored procedures and views (see the second sketch after this list).
- Implement daily feed handlers using ETL technologies (Syncsort DMExpress) on Linux for security symbology, pricing adjustment factors, and exchange/evaluated pricing data.
- Build a rule-based data cleansing/validation/QA framework using ETL technologies to compare data stored in heterogeneous systems, data stores, and retrieval formats (Oracle, Sybase, MySQL, and a proprietary mainframe database); see the reconciliation sketch after this list.
- Write Java, SQL*Plus, and Perl data query scripts that access new and legacy data stores through their data access APIs for data comparison.
- Port a SQL Server data store of options history, with complex Transact-SQL functions and stored procedures that provide volatility estimates, to Oracle 11g PL/SQL.
- Integrate market data feed handlers with the company's operational framework for job scheduling, subscriptions to data feeds, and publishing of data updates, using Korn/Bash shell scripts and Perl.
- Extract, transform, and load 50 years of pricing history stored in a proprietary mainframe data store into a relational, open-systems representation in Oracle 11gR2.
- Evaluate ETL tools that support rapid migration of 50 years of pricing history from proprietary data stores on the mainframe.
- Support the BI team and mentor members on how to use ETL tools to mine and analyze data.
- Hadoop/Hive pilot: investigate an alternative modernization approach to an Oracle warehouse for processing, storing, and accessing 50 years of international securities pricing data amounting to 5.4 billion aggregated records. Reviewed the DMExpress product, which provides Hadoop/HDFS sort, put, and get accelerators.
- Review NoSQL data technologies (key-value, document/aggregate, column-family, and graph data stores) and their best uses, strengths, and weaknesses.
- Mentor team in new technologies, data modeling, and data delivery and exchange formats such as XML. Assist staff as needed with development.
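A minimal sketch of the physical-design approach used for the pricing warehouse (the first sketch referenced above). Table, column, and partition names are illustrative; the idea is range partitioning by price date with hash subpartitions on the security key, plus compression and a local index, to balance bulk-load and query performance:

    -- Hypothetical example: range/hash composite partitioning with compression.
    CREATE TABLE price_history (
        security_id  NUMBER        NOT NULL,
        price_date   DATE          NOT NULL,
        price_type   VARCHAR2(10)  NOT NULL,
        price        NUMBER(18,6),
        adj_factor   NUMBER(18,9)
    )
    COMPRESS
    PARTITION BY RANGE (price_date)
    SUBPARTITION BY HASH (security_id) SUBPARTITIONS 16
    (
        PARTITION p2012 VALUES LESS THAN (DATE '2013-01-01'),
        PARTITION p2013 VALUES LESS THAN (DATE '2014-01-01'),
        PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );

    -- Local composite index to support time-series lookups by security.
    CREATE INDEX price_history_ix ON price_history (security_id, price_date) LOCAL;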
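The second sketch referenced above: the "latest price" access pattern expressed with an analytic function and exposed as a view (the same logic can be wrapped in a PL/SQL stored procedure). It continues the hypothetical price_history table from the previous sketch:

    -- Hypothetical example: most recent price per security and price type.
    CREATE OR REPLACE VIEW latest_price_v AS
    SELECT security_id, price_date, price_type, price
    FROM (
        SELECT ph.security_id, ph.price_date, ph.price_type, ph.price,
               ROW_NUMBER() OVER (PARTITION BY security_id, price_type
                                  ORDER BY price_date DESC) AS rn
        FROM price_history ph
    )
    WHERE rn = 1;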
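The reconciliation sketch referenced above: one rule from the rule-based comparison framework, expressed as SQL once both sides have been staged into like-for-like Oracle tables (legacy_price_stage is a hypothetical staging table):

    -- Hypothetical example: rows present in the legacy extract but missing (or
    -- different) in the new store; the reverse MINUS catches the other direction.
    SELECT security_id, price_date, price
    FROM   legacy_price_stage
    MINUS
    SELECT security_id, price_date, price
    FROM   price_history;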
Confidential
Architecture, Analysis, Independent Validation and Verification
Responsibilities:
- Provide enterprise architecture, SDLC, testing, and strategic planning services for clients.
- Design and Implement test procedures for validation of security requirements for FAA distributed ground control system.
- Respond to RFPs for DoD, state, federal, and locally funded projects. Write technical sections on topics including enterprise architecture services, integrated architectural products, service-oriented architectures (SOA), information support plans (ISP), information assurance (IA), configuration management, master data management (MDM), capabilities-based development (JCIDS/DoDAF), the Federal Enterprise Architecture Framework (FEAF), data center consolidation, integrated testing, independent verification and validation (IV&V), and more.
- Support partnerships, on behalf of the client, with vendors of enterprise architecture products that satisfy current DoDAF and FEA standards for SOA and Enterprise Service Bus (ESB) technologies, such as Fiorano software. This involved review of the latest SOA and business intelligence industry standards, including SCA/SDO, BPMN/BPEL, WSDL, XPath/XML/XSD, JMS, and WS-*, and their application.