Enterprise Information Architect Resume

SUMMARY:

  • Experienced, hands-on Data and Information Architecture professional with a track record of successfully designing, developing, and delivering complex DW, MDM, Metadata, and Data Integration solutions and their associated Data Architecture, Data Modeling, Data Profiling, Data Cleansing, ETL, Data Quality, and Metadata components. Adept at working hands-on as well as leading teams to deliver both tactical and strategic solutions. Skilled in establishing collaboration and communication across teams and in rescuing problematic projects.
  • Analyzing Business Requirements and translating them into conceptual, logical, and physical data models.
  • Analyzing data sources to identify Systems of Record (SOR) and map them to Target Logical Data Models.
  • Performing data analysis and profiling across source systems to identify data domains, data contents, data quality, data relationships, business rules, transformation rules, data rules, intra-column, intra-table, intra-source relations, and other patterns to assess data quality and to develop data cleansing and ETL routines.
  • Performing data quality trend analysis and generating reports to track data quality improvement or deterioration over time, to identify root causes, and to implement tactical and strategic remediation measures.
  • Architecting and implementing Data Acquisition and Data Integration rules including Data Survivorship, Change Data Capture (CDC), Slowly Changing Dimensions (SCD) & Data Versioning, Surrogate Key Assignment, Data Lineage, Data Balancing & Reconciliation, and Exceptions Handling & Rejects Reprocessing.
  • Enforcing data quality through checks and balances applied throughout the data processing life cycle.
  • Designing processes and data models to capture Business, Process, and Technical Metadata
  • Designing Inmon- and Kimball-methodology Data Warehouses and logical/physical data models, both 3NF and Dimensional, including Conformed Dimensions, Facts, Star Schemas, and Snowflake Schemas.
  • Reverse engineering existing schemas into Data Models and Data Dictionaries containing business definitions, business rules, validation rules, transformation rules, data rules, and intra-column and intra-table relationships.
  • Designing Modular, Reusable data structures and processes for data integration, migration and conversion
  • Performing database design, sizing, capacity planning, optimization, tuning, archival & retrieval strategies, etc.
  • Providing Production Support and resolving production issues by designing and implementing tactical and strategic fixes to minimize business impacts and to eliminate root causes.
  • Estimating task durations & resource requirements; prioritizing tasks; monitoring progress & reporting status; identifying & documenting risks & issues
  • Working with diverse and globally distributed technology and business teams including Business Users, Data Governance, AML, Regulatory Compliance, PMO, Change Mgmt, Release Mgmt, Developers, Data Modelers, BAs, Data Analysts, DBAs, QA/QC Testers, etc.
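Several of the patterns named above, such as Slowly Changing Dimensions with data versioning and surrogate key assignment, follow well-known logic. As an illustrative sketch only (not drawn from any specific engagement; the record layout is invented), Type 2 SCD processing can be expressed as:

```python
from datetime import date

def scd2_merge(dimension, incoming, today=None):
    """Apply Type 2 SCD logic: expire changed rows and insert new
    versions with fresh surrogate keys; unchanged rows pass through.
    `dimension` is a list of dicts with surrogate_key, natural_key,
    attrs, effective_from, and effective_to (None = current version)."""
    today = today or date.today()
    next_sk = max((r["surrogate_key"] for r in dimension), default=0) + 1
    current = {r["natural_key"]: r for r in dimension if r["effective_to"] is None}
    for row in incoming:
        existing = current.get(row["natural_key"])
        if existing and existing["attrs"] == row["attrs"]:
            continue  # no change detected -> keep the current version
        if existing:
            existing["effective_to"] = today  # expire the superseded version
        dimension.append({
            "surrogate_key": next_sk,  # assign a fresh surrogate key
            "natural_key": row["natural_key"],
            "attrs": row["attrs"],
            "effective_from": today,
            "effective_to": None,
        })
        next_sk += 1
    return dimension
```

In practice this logic would live in an ETL tool or database MERGE, but the expire-and-insert pattern is the same.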

TECHNICAL SKILLS:

DATABASES: Oracle 11g/10g/9i/8i/7.x, IBM DB2 UDB 8.x/10.x, Sybase, MS SQL Server, Oracle Express

DATA MODELING: ERWin, Embarcadero, Dimensional Modeling, 3NF Modeling

ETL: DataStage 8.x/7.x/6.x, Informatica 6.2/8.1/9.x, Custom ETL Design & Dev

BI REPORTING: Cognos 8/7/6, Discoverer 4i/3i, Oracle Sales Analyzer, Oracle Financial Analyzer, Oracle Reports/Forms

PROGRAMMING: PL/SQL, SQL, SQL*Plus, Oracle Express SPL, UNIX Shell

AUTOMATION: Appworx SQL*Operator, Autosys, Maestro

SOURCE VERSION CONTROL: MS Visual SourceSafe, PVCS, eROOM, SVN

OPERATING SYSTEMS: MS Windows, UNIX (Solaris 2.6/2.5, AIX 4.3, HP-UX 10.x/11.x), DOS

METADATA: Informatica Metadata Manager & Business Glossary

MDM: Informatica (SIPERIAN) MDM HUB & IDD (BDD) Configuration, Administration

PROFESSIONAL EXPERIENCE:

Confidential

Enterprise Information Architect

Responsibilities:

  • Working on designing and developing an enterprise metadata repository that will catalog the firm's structured and unstructured data assets, as well as infrastructure assets and associated metadata. This repository will provide capabilities to support:
  • Compliance and Regulatory Functions - including Regulatory and Legal queries, Dodd-Frank Records Retention and eDiscovery functions, Records Management including Data Archival and Defensible Disposal functions, BCBS 239, etc.
  • Data Security & Privacy Functions - enforcement of Data Security and Privacy policies through identification of relevant data stores and associated privacy-related jurisdictions and regulations.
  • Data Architecture Functions - supporting data sourcing and SDLC for new applications and associated databases by directing them to Systems of Record and Golden / Authoritative Data Sources that would support their data requirements.
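The data-sourcing capability described above boils down to looking up authoritative sources in the catalog. As a purely hypothetical sketch (asset names and fields invented for illustration):

```python
def find_authoritative_sources(catalog, domain):
    """Return the assets flagged as golden/authoritative for a data
    domain, so new applications can be directed to a System of Record
    rather than an uncontrolled copy of the data."""
    return [entry["asset"] for entry in catalog
            if entry["domain"] == domain and entry["authoritative"]]

# Hypothetical catalog entries
catalog = [
    {"asset": "CRM_DB", "domain": "customer", "authoritative": True},
    {"asset": "LEGACY_EXTRACT", "domain": "customer", "authoritative": False},
]
```

A real repository would of course hold far richer metadata (stewardship, jurisdiction, retention class), but the lookup pattern is the same.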

Confidential

Manager (Managing Director)

Responsibilities:

  • Planned & implemented an enterprise-wide upgrade of Informatica MDM Hub from v9.0.1 to v9.5.1
  • Planned & implemented an enterprise-wide upgrade of Informatica PowerCenter from v9.0.1 to v9.5.1
  • Ongoing design, development, and enhancement of an MDM Data Hub that integrates Entity/Party data existing across disparate systems to develop a single golden master record used across the organization's HR, Alumni, Corps, Applicants, Contact Mgmt (SFDC), etc. systems. Responsible for MDM design, development, support, and operations, including source system analysis and discovery, defining Entity/Party instances across different source systems, defining and refining data matching & merging rules across source systems, and establishing trust & survivorship rules between source systems.
  • Responsible for building Enterprise Metadata Repository and Data Dictionary using Informatica Metadata Manager and capturing Business and Operational metadata, and data lineages across multiple systems.
  • Responsible for data migration and conversion of a 750-table database, along with associated indexes, procedures, functions, triggers, views, and other database objects, from PostgreSQL 8 to Oracle 11g.
  • Responsible for migrating data from Taleo Applicant Mgmt System to a new Oracle database system.
  • Responsible for providing enterprise-wide ETL and DI services including those for moving data from seven source systems to MDM Landing, as well as intra-applications data integrations outside of MDM.
  • Responsible for providing application database services, including provisioning of database instances, database schemas, logical and physical data models, performance tuning, and roles/privileges/users administrations for pre-production environments.
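Trust and survivorship rules of the kind described above are typically expressed as per-source precedence applied attribute by attribute. A minimal sketch, with source names and trust scores invented for illustration:

```python
def survive(records, trust_scores):
    """Build a golden record attribute-by-attribute: for each attribute,
    the non-null value from the most trusted contributing source wins.
    `records` maps source system -> {attribute: value}."""
    golden = {}  # attribute -> (winning source, value)
    for source, attrs in records.items():
        for attr, value in attrs.items():
            if value is None:
                continue  # a source cannot contribute a missing value
            best = golden.get(attr)
            if best is None or trust_scores[source] > trust_scores[best[0]]:
                golden[attr] = (source, value)
    return {attr: value for attr, (src, value) in golden.items()}
```

Informatica MDM Hub implements this with configurable trust and validation rules; the sketch only shows the survivorship idea.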

Confidential

Data Quality Consultant (Short-Term Consulting Role)

Responsibilities:

  • Responsible for designing and developing Data Quality and Data Validation processes, data models, and a dashboard for a Risk Reporting application, capturing data quality parameters on an ongoing basis so quality can be measured against a baseline and compared between builds, with DQ reports generated off the dashboard.
  • Also designing an ETL Metadata/Audit model that captures ETL Metadata and provides, for every record loaded or updated in the target, complete lineage back to its source system along with every transformation applied along the ETL pipeline.
  • The dashboard would allow generation of metadata and audit reports.
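An audit model like the one described can be pictured as each pipeline step appending an entry to a lineage trail carried with the record. A hypothetical sketch (step and field names invented):

```python
import json
from datetime import datetime, timezone

def apply_step(record, step_name, transform):
    """Apply one ETL transformation and append an audit entry, so the
    loaded record retains full lineage back to its source system."""
    before = json.dumps(record["data"], sort_keys=True)  # pre-image for audit
    record["data"] = transform(record["data"])
    record.setdefault("lineage", []).append({
        "step": step_name,
        "input_snapshot": before,
        "applied_at": datetime.now(timezone.utc).isoformat(),
    })
    return record
```

In a warehouse this trail would live in audit tables keyed to the target row; the in-memory list here just illustrates the capture of per-step metadata.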

Confidential

Data Architect

Responsibilities:

  • ETL/Data Integration technical lead on three projects relating to Investment Plan Reporting Dashboard, Regulatory Disclosure Requirement Reporting, and Investment Reporting Data Warehouse.
  • The Investment Reporting Data Warehouse is being created for the Asset Management Group and enables financial reporting at the Securities, Holdings, Transactions, and Organization levels. The architecture includes sourcing of Accounting and Assets data files from Confidential, transformation and loading (ETL) of the Confidential data into the Data Warehouse, reconciliation against the GL, Data Quality monitoring and reporting at source and target, Manual Adjustments handling, OLAP/BI reporting, etc.
  • The Plan Dashboard provides real-time snapshots of Retirement Plan financial information such as Contributions, Distributions, Loans, and Fund Transfers, through the use of Web Data Services and Confidential.
  • Responsibilities include analyzing business and functional requirements and decomposing them into data modeling, data integration, and ETL requirements and tasks; developing data flow diagrams ( Confidential ), data mappings, data rules mappings, and business rules mappings to capture data lineage and to define and enable data audit, balance, and control criteria; and defining unit, integration, and regression testing scenarios and test cases to ensure quality control.
  • Triaging production defects and providing resolutions by correcting data models and data integration processes.
  • Managing onshore and offshore development teams in environments including Oracle 11g, SQL Server 2008, DataStage, AutoSys, UNIX, etc. SDLC methodologies included Agile Scrum and Waterfall.
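Reconciliation against the GL, as mentioned above, generally amounts to comparing control totals between source and target. A minimal balance-and-control sketch, with invented field names:

```python
def reconcile(source_rows, target_rows, amount_field="amount", tolerance=0.01):
    """Compare record counts and summed amounts between source and
    target; return a balance-and-control result flagging any break."""
    src_count, tgt_count = len(source_rows), len(target_rows)
    src_total = sum(r[amount_field] for r in source_rows)
    tgt_total = sum(r[amount_field] for r in target_rows)
    return {
        "count_match": src_count == tgt_count,
        "amount_break": round(abs(src_total - tgt_total), 2),
        "balanced": src_count == tgt_count
                    and abs(src_total - tgt_total) <= tolerance,
    }
```

A production implementation would reconcile by control group (e.g. per file, per business date) and persist breaks for exception handling, but the check itself is this simple comparison.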

Confidential

Data Architect (MDM) (VP)

Responsibilities:

  • Developed and enhanced an MDM Data Hub that integrates enterprise Client and Account Reference data to develop a Single Customer Profile (SCP) to be used across the organization’s Account Opening, Client On-boarding, House-holding CRM and AML-KYC Compliance applications within Merrill and BOA.
  • Used Web Data Services and Batch Processing to orchestrate the Master Data across applications.
  • Implemented regulatory Anti-Money Laundering and Know Your Client (AML-KYC) solutions.
  • Performed source data analyses to identify data to fulfill the regulatory information needs.
  • Performed top-down business data requirement analysis to design logical data models that meet business needs.
  • Performed bottom-up data analysis and profiling across multiple source systems to identify data domains, contents, relationships, business rules, transformation rules, and data gaps, and to validate the logical data models.
  • Defined data integration steps and developed data flow diagrams ( Confidential ), data mappings, data rules mappings, and business rules mappings to capture data lineage and to enable data audit, balance, and control.
  • Defined data audit criteria and unit, integration, and regression testing scenarios and test cases to ensure quality control.
  • Supporting the INT, QA and UAT Testing Phases and Production Support of projects and providing and leading Issues Management through providing tactical and strategic resolutions of issues discovered.
  • Collaborating with the Business Analysis teams to validate the business rules, requirements and use cases against the existing state of data.
  • Collaborating with the PMO and Release Mgmt teams in estimating and planning for new projects and functionalities, solutions and change requests.
  • Working with onshore and offshore development teams in an environment that includes Oracle 10.x, DataStage, AutoSys, Siperian/Informatica MDM Hub, UNIX, Windows.
  • Triage production issues and data anomalies with data governance and stewardship teams, application owners, SMEs, Business Users, and tech teams.
  • Analyze and revise data models, data architectures, business rules, and ETL processes to implement short-term tactical fixes that minimize the business impact of production issues, and long-term strategic fixes that eliminate their root causes.
  • Environment: Oracle 10g, SQL Server 2008, DataStage, AutoSys, Siperian MDM, Erwin, UNIX, Windows.

Confidential

Data Architect (Data Warehouse) (Director)

Responsibilities:

  • As a Team Lead participated in the analysis, design, and implementation of an enterprise-wide Data Warehousing and Statutory Financial Reporting solution.
  • Performed source data analysis and created conceptual, logical, and physical data models to document the informational needs. Led the design of Staging and Target Data Models, Data Sourcing, ETL, Data Integration, Data Quality Assurance, and Metadata Management strategies and processes.
  • Implemented Data Mappings, Data Models, and Data Flow designs for enterprise Data Warehouse and reporting applications. Designed processes and data structures to monitor and capture process and business Metadata, including ETL usage and performance statistics, data lineage, data reconciliation, data quality, etc. Led the hands-on implementation of all of the above strategies and designs, resulting in the successful deployment of an enterprise-wide ODS, Data Warehouse, Data Marts, ETL Infrastructure, and a repository of repeatable processes.
  • Designed and adopted data integration best practices, standards, templates, and repeatable, restartable processes and methods to process data based on data patterns such as dimensions, facts, and hierarchies.
  • Designed and adopted processes and data models to monitor and capture the utilization and performance statistics of the ETL infrastructure.
  • Designed and implemented Metadata Management processes, models and data structures to capture and document various types of Metadata associated with existing legacy data repositories and new enterprise data warehouse and dependent data marts. Business metadata included business entities and attributes definitions, business rules, data rules, survivorship and precedence rules, dimension/hierarchy value domains and assignment rules, formulas and calculations, etc. Technical metadata included information related to ETL processes and operations, process streams schedules and dependencies, data availability, data rejection and re-processing rules, data quality control rules, data lineage, etc.
  • Responsible for troubleshooting and resolving issues related to data sourcing, data quality, ETL development, system performance, Quality Assurance and User Acceptance.
  • Developed data quality control processes to automate & standardize the detection and resolution of Data Quality issues and conformance of reference data to organization’s standardized copy.
  • Partnered with tech teams, user groups, senior management to ensure timely and budgeted delivery.
  • Developed Project Plans & Estimates, handled multiple projects, hired & mentored staff and built teams.
  • Participated in development and mapping of BI Metadata for Cognos7 BI tool implementation and mapping of DW metadata to Cognos metadata.
  • Provided all levels of Production Support for all reporting applications during their lifecycles. Revisited and revised data models and ETL / data integration processes to fix production defects and to provide enhanced functionalities.
  • Developed a new ODS and an EDW consisting of conformed normalized subject areas and dimensional data marts to enable operational and BI reporting for the Treaty, Facultative, Financial, Marketing, and other business areas within the re-insurance business. The new ODS and EDW are sourced from legacy re-insurance applications, a legacy operational data store, and an old data warehouse. The old data warehouse was an accumulation of un-conformed and redundant data structures in an Oracle 8i RDBMS, built up over many years to meet users' tactical and ad-hoc reporting requirements and sourced from the legacy re-insurance applications. The old operational data store was a DB2 mainframe database, also sourced from the legacy re-insurance applications.
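The automated conformance checks against the organization's standardized reference data, mentioned a few bullets above, can be sketched as splitting incoming rows by whether a coded field matches the reference domain. Field and code values here are hypothetical:

```python
def conformance_check(rows, field, reference_values):
    """Split rows into conforming and non-conforming sets based on
    whether `field` matches the standardized reference domain;
    non-conforming rows become candidates for cleansing or
    reject reprocessing."""
    good, bad = [], []
    for row in rows:
        (good if row.get(field) in reference_values else bad).append(row)
    return good, bad
```

Run per load, the counts of `good` versus `bad` also feed data quality trend reporting over time.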
