
Chief Data Architect Resume


Edison, NJ

SUMMARY:

  • A highly accomplished technologist with a demonstrated ability to get more value from data assets, who has successfully reorganized the day-to-day operations of several globally distributed Confidential clients. Significant experience in designing business architecture, information architecture, service management and service operations frameworks to support post-merger and post-acquisition environments.
  • Business Strategy and Planning; Organizational M&A Integration; Enterprise Information Architecture & Integration
  • Service Management & Operations (EAI, EII, ESB, generic DB, etc.); Master Data Management (MDM, PIM, CDI, VIM)
  • Sarbanes-Oxley; Blue Sky; AML; Basel II; Solvency II; Data Mining; Data Visualization; Signals Processing
  • Health/Life Sciences (21 CFR Part 11, HL7, HIPAA); Software as a Service (SaaS/SOA); Supply Chain
  • Business Process Management & Design (BPM); Data Quality; Business Intelligence; Enterprise Search; FIX; CMS; EDI

TECHNICAL SKILLS:

Data Architecture: SOA, data integration (virtual & physical), data modeling (relational, hierarchical, generic)

Business Intelligence & Data Tools: PowerDesigner, Erwin, Business Objects, MicroStrategy, Cognos, Tableau, SQL Server, PowerView/PowerPivot, QlikView, Essbase, Siperian, Attivio, Palantir, SAP ECC 6.0, SAP MDM, FAST, Global IDs, Informatica, Talend, Ab Initio, rules engines, data replication, Composite, Hibernate, IBM MDM Server, MQ Series, TIBCO, Trillium, DataLever, DataFlux, DB2/UDB, Oracle, Sybase, Bigtable, Hadoop, MongoDB, Cassandra, NoSQL, GemFire, Oracle Coherence, Teradata, PeopleSoft, XML, J2EE, Java, SQL, WebSphere, InfoSphere, CORBA, Zachman, SFDC, Siebel, Siebel UCM, Autonomy

PROFESSIONAL EXPERIENCE:

Confidential, Edison, NJ

Chief Data Architect

Responsibilities:

  • Insurance is governed at the state level, so the company has to rely on dozens of Third Party Agents (TPAs) operating in various states to collect and handle claims submissions. This creates major data quality/cleansing issues, as TPAs rarely comply with data format standards and typically send data in their preferred format.
  • This $8 billion P&C insurance company already owned three data quality/cleansing tools to address its claims validation and cleansing processing. Two of these tools were in the Gartner Magic Quadrant for data quality tools, and one was custom designed specifically to address insurance claims data quality issues.
  • Nevertheless, their claims data quality/cleansing process remained a challenge: claims are long-tail transactions often plagued with data issues caused by non-conforming claims submissions originating from the TPAs, which complicate the review process. The highly regulated nature of the industry increases the risk and cost involved in mistakes.
  • Applying the run-time rules/data authoring capabilities in my Data ReVizor data quality product, business users were able to quickly and safely exercise "what if" hypotheses to cleanse and repair claims for the approval process, relying on the product's undo/rollback feature. Traditionally these "what if" hypotheses are exercised through hand-modified ETL and SQL processes that require overnight runs; claims analysts typically run 6-10 iterations to fully resolve a monthly batch of claims, which implies over a week of work per batch. The run-time authoring capability enables a hypothesis to be tested in minutes and a batch to be completed in hours (a minimal sketch of this apply-and-undo pattern follows this list).
  • Their claims processing team typically required 2 weeks to cleanse and process an average month's claims submissions; with the deployment of the run-time data quality/cleansing tool, the team was able to process an average month's submissions within a single day.
  • The productivity gain from the run-time data quality solution versus typical batch approaches allowed them to redeploy 5 staff from the claims processing team to other needed projects.
  • The acceleration of the monthly claims cleansing process greatly benefited their financial close and actuarial reserve calculations, expediting statutory reporting and reducing operational risk.
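The bullets above describe run-time rule authoring with undo/rollback but not Data ReVizor's internals, so the following is only a minimal sketch of that apply-and-undo pattern; the class, field and method names are hypothetical, not the product's API.

```java
import java.util.*;
import java.util.function.Function;

/** Minimal sketch of run-time "what if" cleansing with undo (all names hypothetical). */
public class WhatIfCleanser {
    private final List<Map<String, String>> claims;                      // in-memory working copy of the batch
    private final Deque<List<Map<String, String>>> undoLog = new ArrayDeque<>();

    public WhatIfCleanser(List<Map<String, String>> batch) {
        this.claims = batch;
    }

    /** Apply a cleansing rule in memory; snapshot first so the change can be undone. */
    public void apply(Function<Map<String, String>, Map<String, String>> rule) {
        List<Map<String, String>> snapshot = new ArrayList<>();
        for (Map<String, String> c : claims) snapshot.add(new HashMap<>(c));
        undoLog.push(snapshot);
        claims.replaceAll(rule::apply);
    }

    /** Roll back the most recent rule if the hypothesis didn't pan out. */
    public void undo() {
        if (!undoLog.isEmpty()) {
            List<Map<String, String>> snapshot = undoLog.pop();
            for (int i = 0; i < claims.size(); i++) claims.set(i, snapshot.get(i));
        }
    }
}
```

An analyst could, for example, apply a rule that normalizes a TPA's nonstandard state codes, inspect the result, and call undo() if the hypothesis was wrong; no overnight ETL run is involved because each iteration happens on the in-memory working copy.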

Confidential, Basking Ridge, NJ

Global Head of Enterprise Data Services & SAP COE

Responsibilities:

  • Delivered an Enterprise Data Warehouse in SAP BW to facilitate consolidation of multiple SAP BW instances.
  • Developed a Reference Architecture and Roadmap for tackling Confidential's information management challenges, calling for consolidation (physical and virtual) of redundant processes and data masters.
  • Collaborated with business executives to formulate, propose and implement the first 3 phases of a global customer data remediation initiative.
  • Collaborated with the business to establish and co-chair Confidential's Enterprise Data Governance council.
  • Proposed and delivered a breakthrough design for Confidential's Customer Access Portal, simplifying an existing design of 1,000+ modules that suffered scalability and performance issues. The new design requires fewer than 50 modules and has demonstrated dramatic scalability and performance improvements.
  • Successfully completed the first phase of the strategic roadmap, which called for upgrading Confidential's international SAP instance from release 4.6C to ECC 6.0. This was done in conjunction with consolidating the respective BW instance into the newly delivered EDW BW instance.
  • Oversaw the consolidation of the SAP instance serving Confidential's domestic business with the SAP instance serving the international business.
  • Successfully led the overhaul of Confidential's book-close process through automation and process integration, making it a stable, predictable, transparent process in anticipation of Confidential's forthcoming IPO.

Confidential, Westport, Connecticut

Head of Research Data Platform

Responsibilities:

  • Oversaw the data development team, which supports the existing data platform and ongoing research needs.
  • Developed a Reference Architecture for the data platform in accordance with the stakeholder vision from Research Data Operations.
  • Planned, proposed and acquired funding to bring on a service partner to flesh out requirements and build a POC.
  • Completed a Proof-of-Concept (POC) system that successfully provided new capabilities for managing structured/unstructured data and metadata. The most significant software components of the POC system were:
  • Informatica 9.x PowerCenter, DX, DQ and MDM (Siperian)
  • Global IDs data governance and data management
  • Attivio's Active Intelligence Engine™ unified information access platform

Confidential, New York, New York

Management Consulting / Lead Data Solutions Architect

Responsibilities:

  • Delivered the Morcom Enterprise Service Bus (ESB) project, which had missed two prior rollouts. Stepped into the project/tech lead role and provided the transparency and technical leadership needed to drive the project to a successful rollout. The project provides a messaging framework based on IBM MQ and TIBCO EMS to offer robust and flexible messaging.
  • The bus supports real-time trade support data (margins, dividends, prices, etc.), account entitlements and dashboard applications; a minimal pub/sub sketch follows this list.
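The résumé doesn't include the Morcom framework's code. Since both IBM MQ and TIBCO EMS expose the standard JMS API, the sketch below shows a generic JMS publish/subscribe flow of the kind of real-time delivery the bullets describe; the JNDI lookup names and the topic name are hypothetical.

```java
import javax.jms.*;
import javax.naming.InitialContext;

/** Generic JMS pub/sub sketch; works against any JMS provider (e.g., TIBCO EMS, IBM MQ). */
public class PriceFeedPubSub {
    public static void main(String[] args) throws Exception {
        // Look up the provider's connection factory via JNDI (configured in jndi.properties);
        // "ConnectionFactory" and the topic name below are hypothetical.
        InitialContext jndi = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) jndi.lookup("ConnectionFactory");
        Connection conn = factory.createConnection();
        Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Topic prices = session.createTopic("trade.support.prices");

        // Subscriber side: consume real-time price updates as they are published.
        MessageConsumer consumer = session.createConsumer(prices);
        consumer.setMessageListener(msg -> {
            try { System.out.println("update: " + ((TextMessage) msg).getText()); }
            catch (JMSException e) { e.printStackTrace(); }
        });
        conn.start();

        // Publisher side: broadcast one price update to every subscriber on the topic.
        MessageProducer producer = session.createProducer(prices);
        producer.send(session.createTextMessage("IBM,142.17"));

        Thread.sleep(1000);   // give the listener a moment before shutting down
        conn.close();
    }
}
```

Because consumers subscribe to a topic rather than to a specific upstream system, new dashboard or entitlement applications can be added without touching the publishers, which is the flexibility the bullet credits to the ESB.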

Confidential, Denver, Colorado

Management Consulting / Chief Data Architect

Responsibilities:

  • Designed and implemented a generic, rules-driven data model that can support any healthcare or insurance institution's admission, scheduling and claims processing. This generic model revolutionizes the data modeling process, eliminating the struggle to reach near third normal form to achieve its benefits. The implementation offers performance curves that are insensitive to table-join complexity and sparse data, a real breakthrough in data architecture. The extensibility and adaptability this approach affords are mandatory for the success of Confidential's business model and can be applied to any SaaS shop (a sketch of one common realization follows this list).
  • Designed a model-driven Data SOA to interface with the generic DB and facilitate all data access. This addresses the fundamental optimization and stability flaws of available data persistence tools such as Hibernate and JDBC, and it greatly simplifies schema changes and their effect on applications. Data model attribute/relationship changes now require little to no programmer involvement at Confidential; data changes that once required weeks of effort can now be deployed in a day. This provides tremendous efficiency and flexibility for data servicing and custom configuration for new clients, a must-have capability for Confidential's SaaS model to scale and be profitable.
  • Assisted in the design and implementation of a rules-driven client interface that allows full customization of the end-client interface by institution, role and user level. These custom configurations are managed at the rules level, and all GUIs are dynamically generated at runtime to serve client needs; this too is an aggressive implementation of the typical MVC framework.
  • Most significantly, the new architecture successfully insulates the applications from configuration and data changes and greatly reduces operating cost. The net result of the above deliveries is that Confidential can now onboard upwards of 10 new member hospitals per month, where previously a single new member hospital could take 3+ months to onboard.
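The bullets above don't spell out the generic model's schema, so the sketch below shows one common way such a "schema as data" model is realized: entity-attribute-value (EAV), with the model itself stored as metadata rows. Whether Confidential's design matched this exactly is an assumption, and all names are hypothetical.

```java
import java.util.*;

/** EAV-style sketch of a generic data model: three fixed structures replace
    per-entity schemas, so extending the model is a data change, not a DDL change. */
public class GenericModel {
    record AttributeDef(String entityType, String name, String dataType) {}
    record AttributeValue(long entityId, String name, String value) {}

    private final Map<Long, String> entities = new HashMap<>();     // entity id -> entity type
    private final List<AttributeDef> metadata = new ArrayList<>();  // the "model" lives here as rows
    private final List<AttributeValue> values = new ArrayList<>();

    /** Extending the model is an insert into metadata, not a schema migration. */
    public void defineAttribute(String entityType, String name, String dataType) {
        metadata.add(new AttributeDef(entityType, name, dataType));
    }

    public void put(long entityId, String entityType, String name, String value) {
        entities.put(entityId, entityType);
        values.add(new AttributeValue(entityId, name, value));
    }

    /** Read an entity back as a name -> value map, regardless of its shape. */
    public Map<String, String> get(long entityId) {
        Map<String, String> row = new LinkedHashMap<>();
        for (AttributeValue v : values)
            if (v.entityId() == entityId) row.put(v.name(), v.value());
        return row;
    }
}
```

Under a layout like this, a new institution's custom claim field is a defineAttribute(...) call, i.e., configuration data rather than a DDL migration, which is what would let model changes bypass programmer involvement as described above.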

Confidential, Jersey City, New Jersey

Executive Director

Responsibilities:

  • Received, in May 2008, the "Best Reference Data Initiative" award for the Data SOA and the "Best Data Management Initiative" award for the Data-as-a-Service approach from Inside Reference Data magazine. The Data SOA is a services layer that insulates and abstracts Merrill's fractured legacy data environment and provides data consumers robust access methods (Pub/Sub, Request/Reply, etc.). It serves as a virtual integration and reconciliation facility that unravels the fractured legacy reference data systems for users and enables elimination and re-engineering of those systems with minimal impact (a hypothetical facade sketch follows this list).
  • Delivered the Contacts Repository, GMI's client data integration project and the most strategic initiative in Capital Markets Sales for 2007. The objective was to reconcile, collapse and provide centralized stewardship of Merrill's client contact information. Partnered with Siperian to extend their product from "person"-oriented processing to support the notion of "entity" that financial institutions are based on. Completion of the initial phase enabled the business to improve client servicing, accurately classify high-value clients and correct internal processes.
  • Partnered in the design and delivery of Confidential's FUSION initiative, the core of Merrill's Master Data Management strategy to re-engineer and consolidate all of their bi-temporal product & pricing reference data, including equity, futures, fixed income and time series. Managed delivery of feed-handler processing, data integration and real-time delivery to over 100 trading systems across the global capital markets.
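Neither bullet includes the actual service contracts, so the following is only a hypothetical sketch of the shape such a Data SOA facade might take, showing the two access methods named above (request/reply and pub/sub); every name here is invented for illustration.

```java
import java.util.Map;
import java.util.function.Consumer;

/** Hypothetical contract for a reference-data facade: consumers code against
    this interface while the layer reconciles data across legacy systems behind it. */
public interface ReferenceDataService {

    /** Request/reply: fetch the reconciled "golden" record for an instrument. */
    Map<String, Object> getSecurity(String identifier);

    /** Pub/sub: be notified whenever a security's reference data changes,
        without knowing which legacy system the change originated in. */
    void subscribe(String identifier, Consumer<Map<String, Object>> onUpdate);
}
```

Because consumers depend only on the contract, legacy systems behind the facade can be re-engineered or retired with minimal impact, as the award-winning bullet describes.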

Confidential, New York, New York

Executive Director

Responsibilities:

  • Program-managed the Client Data Integration (CDI) initiative as part of Morgan's MDM strategy, a multi-year program costing upwards of $50M. The CDI program was Confidential's effort to achieve client-centricity: a 360-degree view of clients' roles, responsibilities and relationships serving a multiplicity of goals. The first phase, successfully completed, consisted of extensive data validation and reconciliation exercises that surfaced over 55MM data discrepancies in Morgan's client data. We initiated a data QA process that flushed out the numerous process and system flaws behind the discrepancies, corrected those problems, and cleansed all 55MM+ discrepancies, leading to improved client servicing and lower client communications and operations costs.
  • The team was instrumental in delivering many projects and initiatives of varying scope; in 2006 we successfully delivered 115 projects. The most significant included Client Data Integration, Statement Re-engineering, Financial Data Warehouse, Basel II, OFAC, AML, Blue Sky, the Bank Deposit Program, Client Data Householding, and risk and regulatory compliance initiatives.
  • Co-led the $50MM Statements Renovation program to re-engineer GWM's antiquated statements system, which comprised over 400 program modules and over 1.4MM lines of COBOL code. Achieved both savings and improved client satisfaction, and, using an innovative approach, completed the first phase implementation in just 9 months. This equipped FAs to approach clients for in-depth financial planning and cross-sell opportunities.
