- Experienced senior leader with a successful career in enterprise data architecture, strategy, and technology in the financial sector.
- Proven expertise in defining, articulating, and delivering on the enterprise vision for data architecture.
- Led system studies and the Enterprise Data Platform design strategy roadmap.
- Conducted feasibility analyses and selected the technologies best suited to the proposed solutions.
- Extensive experience in Data Management, Data Governance, Data Quality, Data Analytics, Machine Learning, Data Science, Big Data implementations and Cloud Computing.
- Established strong alliances with internal and external business sponsor heads to budget, plan, and secure funding for applications' design and development. Handled multi-million-dollar budget aggregates annually.
- Provided business justification/ROI for application induction and enhancement in terms of FTEs saved, license cost savings, migration to commodity hardware, operational loss avoidance, improved customer experience, safeguarded business reputation, and support for regulation and innovation.
- Knowledgeable in all aspects of designing and constructing data architecture, operational data stores, and data marts.
- Skilled in enterprise-wide data modeling and database design.
- Expertise in defining data/information architecture standards, policies, and procedures for the organization, structure, attributes, and nomenclature of data elements, and in applying accepted data content standards to technology projects.
- An effective technology manager with the skills necessary to direct, train, and motivate highly qualified staff to their fullest potential.
Databases: Oracle, SQL Server, MongoDB, HBase, Neo4j, MemSQL, Snowflake, MariaDB, Hive
Data Modeling: erwin Data Modeler (DM), ER Studio, SQL Developer, Toad, IBM Data Architect, XSD
Big Data technologies: Cloudera HDFS, Apache Spark, Spark Streaming, Apache Impala, MapR, Kafka, Confluent
BI / ML: MicroStrategy, IBM Cognos, Arcadia Data, Paxata, Datameer, H2O, R, Python
Cloud Computing Services: Amazon Web Services (AWS), Google Cloud Platform, Microsoft Azure
Architecture principles and frameworks: TOGAF, Lambda Architecture (Big Data applications)
ETL: Talend, Ab Initio, Informatica, Flume, Sqoop
Programming and scripting languages: Oracle PL/SQL (all constructs: stored procedures, packages, triggers), UNIX shell scripting, Java (JDK 1.2.2), XML, object-oriented programming in C++
Master Data Management: EBX5, multidomain MDM platforms
iPaaS and API management: MuleSoft Anypoint Platform
- Build foundational data capabilities that turn data into information and actionable intelligence, and make it available to the business, customers, and distributors.
- Design and build an Integrated Data Platform (IDP) comprising an ingestion framework; a data lake; a standardized, validated, and cataloged data repository; an integrated multi-domain master data management (MDM) platform; and a delivery layer supporting batch, real-time, and self-service consumption, including provisioning of user sandboxes via cloud capabilities.
- Establish and enforce data standards, data enrichment strategy and data cataloging and lineage capabilities.
- Model the extensive and complex data tiers of the IDP.
- Extend data lake to cloud storage (Amazon S3, Azure Data Lake Storage, or Google Cloud Storage) and use Snowflake to accelerate data transformations and analytics in the existing data lake.
Senior Vice President
- Establish and enforce data standards, processes, frameworks, tools and best practices consistently in the operating units and functions across the enterprise. Understand data flows and lineage from operational data sources to analytical data repositories.
- Design and deploy data pipelines to source diverse data from disparate sources, contributing to Confidential's enterprise data lake. Process, transform, curate, and prepare the data to generate target data sets for advanced analytics, modeling, machine learning (ML), and visualization.
- Performed extensive and complex data modeling for the persistence tier using erwin, and produced the data architecture design deliverable.
- Partner with CDO and platform engineering to develop and leverage a Metadata Management (data catalog) system that scans data repositories and ETL processes to catalog metadata, including lineage information and mapping to business glossaries, and makes this information available to business and technical users. Design capabilities to drive data discovery and self-service data consumption.
- Provision and monitor data consumption through Confidential's Data Bridge (an Ab Initio-based federated data sourcing platform).
- Control and provision data-level entitlements. Design and build authorization leveraging role-based access control (RBAC), in alignment with Confidential's data management policy (CDMP) and regulatory mandates.
- Collaborate with the CDO office to define, create, maintain, and leverage enterprise data standards, critical data elements (CDEs), and enterprise information assets (IFAs). Identify, certify, and maintain approved data sources (ADSs) and strategic data repositories (SDRs) for IFAs.
- Create and refresh (annually) a data roadmap that documents data principles and the target-state data architecture. The roadmap conforms to Confidential's data management policy (CDMP) and is an input to the yearly regulatory submission (OCC).
- Establish strong alliances with Confidential's business sponsor heads to budget, plan, and secure funding for applications' design and development. Budget aggregates in the multi-million-dollar range annually.
- Work with business sponsors spanning multiple lines of business for applications with enterprise-wide scope.
- Ensure applications adhere to and leverage the enterprise’s data management standards, data governance framework, and data strategy.
- Participate in the development (Agile and Waterfall) and maintenance of corporate data architecture, data management standards and conventions, data dictionaries and critical data elements for multiple computing environments.
- Design and implement cloud computing reference architectures on Google Cloud Platform (GCP) and Amazon Web Services (AWS) for advanced analytics, machine learning (ML), and ETL, leveraging Google BigQuery, Cloud Dataproc, Bigtable, Data Fusion, Amazon EMR, Redshift, S3, and AWS/Snowflake.
- Regularly review emerging technologies to assess their relevance and viability in solving ongoing information management challenges, including Big Data, cloud computing, data visualization, data masking, enterprise business metadata management, and Data SOA.
- Leverage Lambda Architecture framework in Big Data applications to cater to varying latency requirements.
- Review and approve project designs including data models, data access and application integration.
- Recruit, lead, train and motivate a team of architects in accomplishing data architecture tasks.
- Responsibilities are balanced between managing the existing data infrastructure and designing and delivering the future-state vision of data technology.
Senior Data Architect
- Solution/Data Architect for the banking, brokerage, and commercial lines of business.
- Responsible for master data strategy and the MDM solution for client and account data; logical and conceptual data models; XML schema definitions (XSD); data classifications and translations across disparate systems; data service-to-data-provider mappings; and metadata repository creation and maintenance. Leveraged a multidomain MDM platform.
- Lead author of or contributor to multiple Software Requirements Specifications and HLD/LLD documentation.
- Responsible for architecting and implementing scalable, well-performing, cohesive, and integrated ETL processes. Provided technical leadership in the design and development of data integration solution architecture and best practices.
- Provided strategic technology platforms recommendations.
- Worked with the Enterprise Architecture Team to ensure proposed designs and approaches met overall enterprise architecture requirements.
- Customer Data Integration (CDI) and data quality: corrected, standardized, and verified data; performed data profiling.