
Enterprise Information Architect (VP) Resume

PROFILE:

  • Experienced, hands-on Data and Information Architecture professional with a track record of successfully designing, developing, and delivering complex DW, MDM, Metadata, and Data Integration solutions and their associated Data Architecture, Data Modeling, Data Profiling, Data Cleansing, ETL, Data Quality, and Metadata components. Adept at working hands-on as well as leading teams to develop and deliver both tactical and strategic solutions. Skilled in establishing collaboration and communication across teams and rescuing problematic projects.
  • DATA DISCOVERY, ANALYSIS, ACQUISITION AND INTEGRATION
  • Analyzing Business Requirements and translating them into conceptual, logical, and physical data models.
  • Analyzing data sources to identify Systems of Record (SOR) and map them to Target Logical Data Models.
  • Performing data analysis and profiling across source systems to identify data domains, data contents, data quality, data relationships, business rules, transformation rules, data rules, intra-column, intra-table, intra-source relations, and other patterns to assess data quality and to develop data cleansing and ETL routines.
  • Performing data quality trend analysis and generating reports to track data quality improvement or deterioration over time, to identify root causes, and to implement tactical and strategic remediation measures.
  • Architecting and implementing Data Acquisition and Data Integration rules including Data Survivorship, Change Data Capture (CDC), Slowly Changing Dimensions (SCD) & Data Versioning, Surrogate Key Assignment, Data Lineage, Data Balancing & Reconciliation, Exceptions Handling & Rejects Reprocessing
  • Enforcing data quality through checks and balances throughout the data processing life cycle
  • Designing processes and data models to capture Business, Process, and Technical Metadata
  • DATABASE DESIGN, DEVELOPMENT AND SUPPORT
  • Designing Inmon and Kimball methodology Data Warehouses and logical/physical data models in both 3NF and dimensional styles, including Conformed Dimensions, Facts, Star Schemas, and Snowflake Schemas
  • Reverse engineering existing schemas into Data Models and Data Dictionaries containing business definitions, business rules, validation rules, transformation rules, data rules, intra-column, intra-table relationships.
  • Designing Modular, Reusable data structures and processes for data integration, migration and conversion
  • Performing database design, sizing, capacity planning, optimization, tuning, archival & retrieval strategies, etc.
  • Providing Production Support and resolving production issues by designing and implementing tactical and strategic fixes to minimize business impacts and to eliminate root causes.
  • TEAM LEADERSHIP & MANAGEMENT
  • Estimating task durations & resource requirements; prioritizing tasks; monitoring progress & reporting status; identifying & documenting risks & issues
  • Working with diverse and globally distributed technology and business teams including Business Users, Data Governance, AML, Regulatory Compliance, PMO, Change Mgmt, Release Mgmt, Developers, Data Modelers, BAs, Data Analysts, DBAs, QA/QC Testers, etc.
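The Slowly Changing Dimension (SCD) Type 2 and data-versioning work listed above can be illustrated with a minimal sketch. All table and column names here are hypothetical, and SQLite stands in for Oracle/DB2 purely for portability; the pattern is the same: close the current version row and insert a new one with a fresh surrogate key.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical dimension table: surrogate key, natural key from the SOR,
# one tracked attribute, and SCD Type 2 versioning columns.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_sk    INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        customer_id    TEXT,                               -- natural key
        address        TEXT,
        effective_from TEXT,
        effective_to   TEXT,
        is_current     INTEGER
    )
""")

def apply_scd2(customer_id, new_address, load_date):
    """If the tracked attribute changed, close the current version and insert a new one."""
    row = cur.execute(
        "SELECT customer_sk, address FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)
    ).fetchone()
    if row and row[1] == new_address:
        return  # no change detected (CDC would normally filter this upstream)
    if row:
        cur.execute(
            "UPDATE dim_customer SET effective_to = ?, is_current = 0 "
            "WHERE customer_sk = ?", (load_date, row[0])
        )
    cur.execute(
        "INSERT INTO dim_customer "
        "(customer_id, address, effective_from, effective_to, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)", (customer_id, new_address, load_date)
    )

apply_scd2("C001", "12 Oak St", "2020-01-01")   # initial load
apply_scd2("C001", "99 Elm Ave", "2020-06-15")  # address change -> new version
versions = cur.execute(
    "SELECT address, is_current FROM dim_customer ORDER BY customer_sk"
).fetchall()
print(versions)  # two versions; only the latest is current
```

In a production warehouse this logic would typically live in an ETL tool (DataStage/Informatica) or a PL/SQL MERGE, with the surrogate key assigned by a sequence rather than AUTOINCREMENT.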

TECHNOLOGY SUMMARY:

DATABASES: Oracle 11g/10g/9i/8i/7.x, IBM DB2 UDB 8.x/10.x, Sybase, MS SQL Server, Oracle Express

DATA MODELING: ERwin, Embarcadero, Dimensional Modeling, 3NF Modeling

ETL: DataStage 8.x/7.x/6.x, Informatica 6.2/8.1/9.x, Custom ETL Design & Dev

BI REPORTING: Cognos 8/7/6, Discoverer 4i/3i, Oracle Sales Analyzer, Oracle Financial Analyzer, Oracle Reports/Forms

PROGRAMMING: PL/SQL, SQL, SQL*Plus, Oracle Express SPL, UNIX Shell

AUTOMATION: Appworx SQL*Operator, Autosys, Maestro

SOURCE VERSION CONTROL: MS Visual SourceSafe, PVCS, eROOM, SVN

OPERATING SYSTEMS: MS Windows, UNIX (Solaris 2.6/2.5, AIX 4.3, HP-UX 10.x/11.x), DOS

METADATA: Informatica Metadata Manager & Business Glossary

MDM: Informatica (SIPERIAN) MDM HUB & IDD (BDD) Configuration, Administration

EXPERIENCE DETAILS:

Confidential

Enterprise Information Architect (VP)

Responsibilities:

  • Hands On - Requirements Analysis; Solution Architecture & Design; Conceptual & Logical Data Modeling; Data & Metadata source discovery, analysis, and Systems of Record identification; mapping data sources, entities, and attributes to target data models and business requirements; Enterprise Data Landscape discovery & documentation; working with development teams, source system owners, and vendors to ensure successful implementation of the solution
  • Business Solution Being Implemented:
  • Working on designing and developing an enterprise metadata repository that will catalog the firm's structured and unstructured data assets, as well as infrastructure assets and associated metadata. This repository will provide capabilities to support:
  • Compliance and Regulatory Functions - including Regulatory and Legal queries, Dodd Frank Records Retention and eDiscovery functions, Records Management including Data Archival and Defensible Disposal functions, BCBS239, etc.
  • Data Security & Privacy Functions - enforcement of Data Security and Privacy policies through identification of relevant data stores and associated privacy related jurisdictions and regulations.
  • Data Architecture Functions - supporting data sourcing and SDLC for new applications and associated databases by directing them to Systems of Record and Golden / Authoritative Data Sources that would support their data requirements.
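The repository's core functions described above (SOR identification for data sourcing, and jurisdiction-driven privacy enforcement) can be sketched with a toy asset catalog. The schema, asset names, and jurisdiction values below are hypothetical illustrations, not the firm's actual model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical catalog table: data assets, their data domain, whether the
# asset is the System of Record (SOR) for that domain, and the jurisdiction
# that drives privacy/regulatory policy.
cur.execute("""
    CREATE TABLE data_asset (
        asset_id     INTEGER PRIMARY KEY,
        name         TEXT,
        domain       TEXT,
        is_sor       INTEGER,
        jurisdiction TEXT
    )
""")
cur.executemany(
    "INSERT INTO data_asset VALUES (?, ?, ?, ?, ?)",
    [(1, "HR_CORE",    "Employee", 1, "US"),
     (2, "HR_EXTRACT", "Employee", 0, "US"),
     (3, "CRM_EU",     "Customer", 1, "EU")],
)

# Data Architecture function: direct a new application to the SOR for a domain.
sor = cur.execute(
    "SELECT name FROM data_asset WHERE domain = ? AND is_sor = 1", ("Employee",)
).fetchone()[0]
print(sor)  # HR_CORE

# Privacy function: identify data stores subject to EU jurisdiction rules.
eu_assets = [r[0] for r in cur.execute(
    "SELECT name FROM data_asset WHERE jurisdiction = 'EU'")]
print(eu_assets)  # ['CRM_EU']
```

A real enterprise repository would extend this with entity/attribute-level metadata, retention classes for Records Management, and lineage links, but the lookup pattern is the same.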

Confidential

Manager (Managing Director)

Responsibilities:

  • Planned & implemented an enterprise-wide upgrade of Informatica MDM Hub from v9.0.1 to v9.5.1
  • Planned & implemented an enterprise-wide upgrade of Informatica PowerCenter from v9.0.1 to v9.5.1
  • Ongoing design, development, and enhancement of an MDM Data Hub that integrates Entity/Party data existing across disparate systems into a single golden master record used across the organization's HR, Alumni, Corps, Applicants, and Contact Mgmt (SFDC) systems. Responsible for MDM design, development, support, and operations, including source system analysis and discovery, defining Entity/Party instances across different source systems, defining and refining data matching & merging rules across source systems, and establishing trust & survivorship rules between source systems.
  • Responsible for building Enterprise Metadata Repository and Data Dictionary using Informatica Metadata Manager and capturing Business and Operational metadata, and data lineages across multiple systems.
  • Trained in Informatica MDM Configuration, Informatica MDM Administration, Informatica Data Director (IDD/BDD) Configuration, Informatica Metadata Manager and Business Glossary
  • Responsible for data migration and conversion of a 750-table database, along with its associated indexes, procedures, functions, triggers, views, and other database objects, from PostgreSQL 8 to Oracle 11g.
  • Responsible for migrating data from Taleo Applicant Mgmt System to a new Oracle database system.
  • Responsible for providing enterprise-wide ETL and DI services including those for moving data from seven source systems to MDM Landing, as well as intra-applications data integrations outside of MDM.
  • Responsible for providing application database services, including provisioning of database instances, database schemas, logical and physical data models, performance tuning, and roles/privileges/users administrations for pre-production environments.
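The trust and survivorship rules mentioned above determine, attribute by attribute, which source's value wins in the golden record when matched source records disagree. A minimal sketch of that merge step follows; the source names, trust scores, and attributes are hypothetical, and a real Informatica MDM Hub configures this declaratively rather than in code:

```python
# Hypothetical per-source, per-attribute trust scores: when matched source
# records disagree, the value from the most-trusted source survives.
TRUST = {
    "HR":     {"name": 90, "email": 60},
    "SFDC":   {"name": 50, "email": 80},
    "ALUMNI": {"name": 40, "email": 70},
}

def build_golden(records):
    """records: list of (source, {attribute: value}) for one matched party.
    Returns the surviving golden record, attribute by attribute."""
    golden, best = {}, {}
    for source, attrs in records:
        for attr, value in attrs.items():
            score = TRUST.get(source, {}).get(attr, 0)
            # Null values never survive; otherwise the highest trust wins.
            if value is not None and score > best.get(attr, -1):
                golden[attr] = value
                best[attr] = score
    return golden

matched = [
    ("HR",     {"name": "Jane Q. Doe", "email": None}),
    ("SFDC",   {"name": "Jane Doe",    "email": "jdoe@corp.example"}),
    ("ALUMNI", {"name": "J. Doe",      "email": "jane@alum.example"}),
]
golden = build_golden(matched)
print(golden)  # name survives from HR (highest trust), email from SFDC
```

Production survivorship also typically factors in recency decay and validation rules, which this sketch omits.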

Confidential

Data Quality Consultant

Responsibilities:

  • Business Solution Being Implemented:
  • Responsible for designing and developing Data Quality and Data Validation processes, data models, and a dashboard for a Risk Reporting application, capturing data quality parameters on an ongoing basis so quality can be measured against a baseline and compared between builds, with DQ reports generated from the dashboard.
  • Also designing an ETL Metadata/Audit model that captures ETL metadata and provides, for every record loaded or updated in the target, complete lineage back to its source system along with every transformation applied along the ETL pipeline. The dashboard allows generation of metadata and audit reports.
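The per-record lineage idea above can be sketched as an audit trail that each target row accumulates as it moves through the pipeline. Everything here (step names, source system, keys) is an illustrative stand-in for the actual audit model:

```python
import json

# Hypothetical ETL audit trail: each target row carries a lineage record noting
# its source system/key, load batch, and every transformation applied.
audit_log = []

def transform(row, step_name, fn, lineage):
    """Apply one pipeline step and record it in the row's lineage."""
    lineage["steps"].append(step_name)
    return fn(row)

def load_row(raw, source_system, source_key, batch_id):
    lineage = {"batch_id": batch_id, "source_system": source_system,
               "source_key": source_key, "steps": []}
    row = transform(raw, "trim_whitespace",
                    lambda r: {k: v.strip() for k, v in r.items()}, lineage)
    row = transform(row, "uppercase_country",
                    lambda r: {**r, "country": r["country"].upper()}, lineage)
    audit_log.append(lineage)
    return row

loaded = load_row({"name": " Acme Corp ", "country": "us"},
                  source_system="CAMRA", source_key="SEC-42", batch_id="B1001")
print(loaded)                     # cleaned target record
print(json.dumps(audit_log[0]))  # full lineage back to the source system
```

In the warehouse itself, the audit trail would be persisted as metadata tables keyed by batch and record identifiers rather than held in memory, enabling the metadata and audit reports mentioned above.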

Confidential

Data Architect

Responsibilities:

  • ETL/Data Integration technical lead on three projects relating to Investment Plan Reporting Dashboard, Regulatory Disclosure Requirement Reporting, and Investment Reporting Data Warehouse.
  • The Investment Reporting Data Warehouse is being created for the Asset Management Group and enables financial reporting at the Securities, Holdings, Transactions, and Organization levels. The architecture includes sourcing Accounting and Assets data files from CAMRA; transforming and loading (ETL) the CAMRA data into the Data Warehouse; reconciliation against the GL; data quality monitoring and reporting at source and target; manual adjustments handling; and OLAP/BI reporting.
  • The Plan Dashboard provides real-time snapshots of Retirement Plan financial information such as Contributions, Distributions, Loans, and Fund Transfers through the use of Web Data Services and XML.
  • Responsibilities include analyzing business and functional requirements; decomposing them into data modeling, data integration, and ETL requirements and tasks; developing data flow diagrams (DFDs), data mappings, data rule mappings, and business rule mappings to capture data lineage and to define and enable data audit, balance, and control criteria; and defining unit, integration, and regression testing scenarios and test cases to ensure quality control.
  • Triage production defects, provide resolution by correcting data modeling and data integration processes.
  • Managing onshore and offshore development teams in environments including Oracle 11g, SQL Server 2008, DataStage, AutoSys, and UNIX. SDLC methodologies included Agile Scrum and Waterfall.

Confidential

Data Architect (MDM) (VP)

Responsibilities:

  • Developed and enhanced an MDM Data Hub that integrates enterprise Client and Account reference data into a Single Customer Profile (SCP) used across the organization’s Account Opening, Client On-boarding, Householding, CRM, and AML-KYC Compliance applications within Merrill and BOA.
  • Used Web Data Services and Batch Processing to orchestrate the Master Data across applications.
  • Implemented regulatory Anti-Money Laundering and Know Your Client (AML-KYC) solutions.
  • Performed source data analyses to identify data to fulfill the regulatory information needs.
  • Performed top-down business data requirement analysis to design logical data models that meet business needs.
  • Performed bottom-up data analysis and profiling across multiple source systems to identify data domains, contents, relationships, business rules, transformation rules, and data gaps, and to validate the logical data models.
  • Defined data integration steps and developed data flow diagrams (DFDs), data mappings, data rule mappings, and business rule mappings to capture data lineage and to enable data audit, balance, and control.
  • Defined data audit criteria and unit, integration, and regression testing scenarios and test cases to ensure quality control.
  • Supported the INT, QA, and UAT testing phases and production support of projects, leading issues management by providing tactical and strategic resolutions for discovered issues.
  • Collaborating with the Business Analysis teams to validate the business rules, requirements and use cases against the existing state of data.
  • Collaborating with the PMO and Release Mgmt teams in estimating and planning for new projects and functionalities, solutions and change requests.
  • Working with onshore and offshore development teams in an environment that includes Oracle 10.x, DataStage, AutoSys, Siperian/Informatica MDM Hub, UNIX, and Windows.
  • Triage production issues and data anomalies with data governance and stewardship teams, application owners, SMEs, Business Users, and tech teams.
  • Analyzed and revised data models, data architectures, business rules, and ETL processes to implement short-term tactical fixes that minimize business impact from production issues, and long-term strategic fixes that eliminate their root causes.
  • Environment: Oracle 10g, SQL Server 2008, DataStage, AutoSys, Siperian MDM, ERwin, UNIX, Windows.
