Enterprise Information Management Architect Resume


Chandler, AZ

SUMMARY:

  • A highly experienced information technology data architect, data modeler, data analyst, requirements analyst, and consultant with a broad data warehouse and enterprise subject-area background. Works as a dedicated member of one or more teams, clarifying and transforming requirements into deliverables.
  • Extensive experience navigating data models and databases to answer complicated questions, verify data quality, perform root cause analysis on identified exceptions, and specify, negotiate, and perform corrections. Recent work has been based in a Teradata data warehouse environment with a focus on Manufacturing, Quality, and Supply Chain.

TECHNICAL SKILLS:

Data Modeling: Logical, Physical, Dimensional, Semantic (ERWin)

BI Analytics: SQL, Excel, PowerBI, Tableau, Hyperion, JMP, R

PROFESSIONAL EXPERIENCE:

Confidential, Chandler, AZ

Enterprise Information Management Architect

Responsibilities:

  • Held an Enterprise Information Management (EIM) kick-off symposium requesting client input and identifying priority cross-organization projects (Bookings/Billings/Backlog client-dependent data mart port and enhancement; introduction of Teradata Data Labs). Began formally aligning Business Stewards with specific data domains, focusing on cross-functional areas.
  • Modeled and instantiated standard Calendar, Product, and Customer master versions from multiple sources with incremental time-variance support. Extended the current Product/Material time-variant master collections with historic data mined from multiple sources prior to Product SOR retirement, identifying differences and reconciling them into one master version. An Agile SDLC was used.
  • Worked with the Global Marketing Organization to sort out and converge multiple semi-documented, semi-automated regional management hierarchies into a standard analysis hierarchy structure.
  • Engaged with the Quality organization to remodel and tune the data extracts behind its re-engineered analysis applications delivered in Tableau.
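The incremental time-variance support mentioned above is commonly realized as an effective-dated (Type 2) master table. A minimal sketch, using SQLite and hypothetical table and column names (the actual implementation described was Teradata-based):

```python
import sqlite3

# Hypothetical time-variant product master: each change closes the current
# row and opens a new one, preserving full history.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("""
    CREATE TABLE product_master (
        product_id  TEXT,
        description TEXT,
        eff_from    TEXT,                       -- row valid from this date
        eff_to      TEXT DEFAULT '9999-12-31'   -- open-ended until superseded
    )""")

def upsert(product_id, description, as_of):
    """Close the currently open row (if any) and open a new one as of `as_of`."""
    cur.execute("UPDATE product_master SET eff_to = ? "
                "WHERE product_id = ? AND eff_to = '9999-12-31'",
                (as_of, product_id))
    cur.execute("INSERT INTO product_master VALUES (?, ?, ?, '9999-12-31')",
                (product_id, description, as_of))

upsert("P100", "Widget rev A", "2008-01-01")
upsert("P100", "Widget rev B", "2009-06-15")   # supersedes rev A, history kept

# Current view: only the open-ended rows.
current = cur.execute("SELECT description FROM product_master "
                      "WHERE eff_to = '9999-12-31'").fetchall()
print(current)  # only 'Widget rev B' is current; rev A remains as history
```

Historic data mined from retiring sources can be loaded the same way, with `eff_from`/`eff_to` reflecting when each version was actually in effect.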

Confidential

Enterprise Data Warehouse (EDW) Data Architect

Responsibilities:

  • Adapted Enterprise Data Management Standards and Review Process into the Enterprise Data Warehouse Organization. Integrated Logical and Physical Data Model development and reviews into the development process with active business client/steward participation.
  • Defined data model and transaction process specifications for LotMaster, the Teradata-based enterprise lot history master. Delivered single-second latency for 84 Business Transactions (BizTs), each decomposing into multiple database retrievals and updates. The architecture and implementation were, and remain, on the Teradata bleeding edge. Managed the legacy genealogy data conversion, introducing physical rows into the new schema that reflected all the analysis software tricks developed to successfully reconstruct the network.
  • Migrated the Lot Traceability application (WebLotG) to use the rich LotMaster data. Implemented “batch” execution functionality for a 100X+ cycle-time improvement. Defined a “punch list” analysis process to identify where all starting lot material ended up. Performed data validations against the LotMaster data and initiated software, process, and data corrections. (2009)
  • Pulled cycle time out of the traceability analysis process by implementing former human analysis steps as new reports and features. Introduced an Oracle intermediate database driving multiple report types from the same data extraction, saving 99% of time/CPU with each reuse. Introduced multi-step execution scripting, cutting data extractions by 40-85%. Performed multiple spot tunings with up to 40X improvement. (Note: a 15-thread batch consumed 6% of Teradata CPU.) (2011)
  • Worked business and manufacturing process changes limiting material recall scope and cost; one case involved 45% fewer units, worth $145,000. Identified an additional shop-floor algorithm issue, failure to use remnant/partial lots, resulting in orphan lot scrappage ($190,000 recently in one group). Wrote a modified algorithm and proved it out with an Excel simulation. Extended WebLotG to provide exacting detail on where each unit went.
  • Performed multiple data quality audits looking for inconsistencies, and worked with the responsible parties to identify and fix root causes. Subject areas included:
  • Inventory and Genealogy Master (LotMaster)
  • Customer Quality Incidents
  • Bill-of-Materials (BOM) including reconciling multiple representations
  • Supply Chain Planning Output Analysis
  • Product Marketing End-use Segmentation Rules and Results
  • Served as secondary Teradata DBA with a focus on performance tuning.
  • Managed migration of the triaged old, complicated, and critical “nasty” reports into the new Teradata environment, as they needed concentrated attention.

Confidential

Enterprise Data Architect

Responsibilities:

  • Responded to the #1 Sales Customer's threat to pull all business over quality recall traceability issues by developing a web-based Lot Genealogy application (WebLotG) directly accessing DB2 data. The customer was retained with additional business, and WebLotG became a corporate asset. However, the 35% complete-traceability rate triggered multiple incremental improvement projects over multiple years.
  • Served as the “external” architecture and process representative on the next-generation data warehouse architecture project, which resulted in a Teradata, DataStage, and Hyperion environment. Participated in migration planning from the existing Oracle data warehouse and home-grown ETL and scheduling tools.
  • Initiated the Freescale Enterprise Data Model (FEDM), integrating all new work in the new Enterprise Data Warehouse (EDW) Teradata environment.
  • Wrote SMART requirements against New Product Introduction (NPI) Product Portfolio and Technology Management needs for a SaaS RFP.

Confidential, Tempe, AZ

Enterprise Data Architect

Responsibilities:

  • Mapped legacy Product, BOM, and Location masters into corresponding SAP R/3 structures. Worked with the conversion team on data validation and transformation, and with the backfill team on “replaying” transactions containing new R/3 master data into the now-slaved legacy master data to keep other legacy systems running.
  • Designed the foundry and contract house (vendor) contract negotiation and management data model, covering hundreds of millions of dollars in annual spend. Defined data flows tying manufacturing flows to consumed resources, commitments, and PO payment authorizations. Insisted that product resource profiles and capacity statements be used in Product Planning.
  • Defined and specified load mapping for an Operational Planning Database (ODS) populating the SAP Advanced Planning & Optimization (APO) Supply Network Planning (SNP) engine. The defined data quality process focused heavily on surfacing referential integrity (RI) issues so they could be corrected before a planning run.
  • Reverse-engineered the i2 Technologies Supply Chain Planning Engine data model into an ERWin model, which was implemented, with intermediate extensions, as an Oracle staging model. Using process- and database-enforced referential integrity (RI) virtually eliminated input errors (which cost 10x the resources of the normal flow) and avoided a major hardware upgrade. Passed the model to i2 Technologies for use as a work aid.
  • Defined and implemented the data model for an internally developed PDM (Product Data Management) application, consolidating requirements and processes from 17 “like” organizations and automating feeds into multiple targets. Drastically improved cycle time, data quality, and accountability, and reduced headcount by more than 50%.
  • Worked a massive Product and BOM clean-up campaign, enforcing drastically tightened Product Class and BOM Structure rules and introducing semiconductor industry standard product statuses supporting Product Management and Planning improvements. Over 160,000 products were retired, 40% of the remainder reclassified, 70% of active BOMs reworked, and 100% had product statuses calculated.
  • Worked the Strategic Plan with a focus on Supply Chain Planning and introducing SAP R/3 core functionality. Drove R/3 environment sizing (and got it right).
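The RI pre-checks described above (surfacing orphan rows before a planning run) reduce to an anti-join against the master table. A minimal sketch using SQLite, with hypothetical table names and data (the actual staging environments described were Oracle and Teradata):

```python
import sqlite3

# Hypothetical staging tables: planning demand rows must reference a
# product master row, or the planning engine will choke on them.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
    CREATE TABLE product (product_id TEXT PRIMARY KEY);
    CREATE TABLE demand (demand_id  INTEGER PRIMARY KEY,
                         product_id TEXT);   -- should reference product
    INSERT INTO product VALUES ('P100'), ('P200');
    INSERT INTO demand (product_id) VALUES ('P100'), ('P999');  -- P999 orphan
""")

# RI check: demand rows whose product_id has no master row.
orphans = cur.execute("""
    SELECT d.demand_id, d.product_id
    FROM demand d
    LEFT JOIN product p ON p.product_id = d.product_id
    WHERE p.product_id IS NULL
""").fetchall()
print(orphans)  # rows to correct before the planning run
```

Running this kind of check per foreign-key relationship before each load keeps RI errors out of the planning run, where they are far more expensive to diagnose.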

Confidential

Strategic Information Coordinator

Responsibilities:

  • Defined a relational Product model with co-population by legacy IMS transactions, supporting an evolutionary migration to DB2, where all extensions were executed.
  • Drove multiple Product data quality and clean-up projects, correcting software defects and the data they had mangled, which reduced supporting staff by half.
  • Co-led the SEI CMM process improvement effort, achieving Level III.
  • Implemented a manufacturing inventory flow analysis model that served as the basis for all Manufacturing Metrics until it was retired in 2009. Building the model and cleaning the data surfaced many data and process issues, driving corrective changes in the source applications.
  • Defined Y2K impact assessment criteria and strategies for handling each identified type.
  • Defined and implemented, in 90 days and against 300K+ products, an Export Compliance engine supporting five countries. Meeting the government mandate avoided shuttering the company through inability to ship internationally.