- Development of enterprise data architecture plans to support business strategy and objectives.
- Application of data modeling methodology including enterprise, E/R and dimensional modeling to ensure business rules and metrics are captured, prioritized and incorporated in the technical design.
- Transition of logical models to operational data store, data warehouse, or OLTP applications.
- Development of enterprise metadata and master data strategies, data governance, and data quality plans.
- Strong facilitation experience in capturing business objectives and rules and in defining specifications. Analysis of current enterprise information technology against future data requirements, gap analysis, and roadmap development toward future technology.
- Experience with tools including ER/Studio Data Architect, ER/Studio Repository, ER/Studio Portal, ERwin, Visio, DB2, Oracle, and SQL Server.
- Built data models for online banking, finance, insurance, healthcare insurance, mortgage securities, mortgage banking, bank examinations (FDIC), oil exploration, manufacturing, and utility sectors.
Data Architecture Consultant
- Developed data models for bank receivership resolution, public feedback tracking, the Bank Examination Tools Suite, Corporate Reference Data (ERD), the Office of International Affairs, an enterprise interface database for PeopleSoft financial applications, compliance examination reporting, the Analysis Toolkit (DIR data), the Corporate Data Dictionary, and Contract Information Management applications. Refined and documented data migration and ETL procedures and rules. Participated in the development of Master Data Management tables.
- Installed ER/Studio Data Architect, Repository, and Portal for the data team. Migrated ERwin models to the new platform.
- Worked with SMEs from the Procurement, Contractor HR, Legal, and Financial (PeopleSoft) functions to analyze contract information in preparation for a proposed data mart supporting corporate performance analysis. Worked with multiple back-office functions to review existing technology, business information flow, future requirements, gap analysis, and data quality procedures, and to develop plans for future scorecard and threshold-analysis applications.
- Worked with the project team to build a database model that provided information for resolving failed financial institutions and generating final reports. All resolution transactions were categorized to support financial and regulatory reporting. Generated the database on the SQL Server platform.
- Performed source data analysis and identified Authoritative Sources (Systems of Record). Built a data model for bank examiners that supported multiple file inputs and data editing via templates, expressions, and constraints. The input might be bank loans, mortgage assets, deposits, or collateral supplied by banks to the FDIC for bank examinations, or existing bank information from FDIC applications. The data structures allowed different input formats. The database ran on the SQL Express platform.
- Participated in business event analysis. Modeled data requirements in support of examination of financial institution assets and liabilities, including loan review, institution ownership, asset evaluation, violations, mortgage asset/collateral review, and examination report generation.
- Developed a phased approach to data development that met urgent near-term Phase 1 goals while still meeting longer-term data architecture objectives. Led facilitation discussions with technical and business experts to integrate data from multiple sources and develop the target data architecture. Communicated the data architecture and model metadata to all parties via data flow diagrams, model reports, maps, spreadsheets, and presentations.
- Built the model for Corporate Reference Data in support of the future data warehouse. Profiled existing data, compared it with future requirements, and recommended roles for target data creation, update, storage, and services to allow access across the enterprise. Participated in the building of Master Data tables.
- Documented data migration and ETL rules and processes.
- Worked with business experts to create a model to track public comments on banking regulations as required by the Dodd-Frank Act. The model supported web-based interfaces to different Federal agencies. Proposed enhancements to the enterprise data model based on new requirements introduced by the Dodd-Frank Act.
- Worked with project team to review data risks in support of their project planning tasks.
- Analyzed the risks of moving data between different platforms (DB2 to Oracle).
- Participated in the evaluation of tools that would support future FDIC data administration productivity requirements. The tools would have to support metadata information access by all business functions across the enterprise.
- Key member of the team that set up ER/Studio Repository, Portal, and Data Architect. Loaded the Oracle database for the Repository and conducted acceptance tests.
- Wrote procedures for smooth conversion of ERwin models to ER/Studio, and developed procedures for problem identification and resolution.
- Acquired in-depth knowledge of ER/Studio and ER/Portal.
- Built a data model to support the International Association of Deposit Insurers (IADI) data requirements and reporting data mart. Documented roles for parties that created and/or consumed the data. Recommended tools to support back-office analysis of the data.
- Modeled the database serving as the interface between the PeopleSoft Financial and HR applications and other applications such as Legal and Procurement. Modeled tables to support threshold analysis of the interface data by functions across the enterprise; one example was enabling analysis of contract compliance by combining legal, procurement, and financial data. Recommended a phased approach and future tools for performance analysis.
- Worked with bank insurance experts to build a new data mart supporting the banking-ratios process that runs on SAS reporting software, such as CAMELS ratings and related computations, and performed risk data mapping. The data mart supported statistical analysis of banking data, with predictive analysis via slice-and-dice techniques in SAS.
- Built data models for the Corporate Data Dictionary and Enterprise Repository applications that would be flexible to accommodate future FDIC data administration requirements. Built acceptance criteria and test cases.
- Set up financial and contract systems.
- Trained team members for data management tasks.
- Built data models for their enterprise data warehouse and data mart.
- Worked with business and technical teams to specify data quality verification procedures, and built a data warehouse that integrated multi-source web, internal transaction, and external data for business analysis. The data warehouse supported pattern and margin analysis and improved customer service.
- Developed dimensional data model and physical database design, using ER/Studio Data Architect, Repository and Rapid SQL tools, to support the mortgage securities (single and multi-family) portfolio tracking data mart (Oracle 9i). The data mart supported statistical analysis, trend analysis, time series, profitability analysis and performance indicators of the acquisition and sales of mortgage backed securities and bonds portfolios.
- Worked with business experts to develop attribute names and definitions that complied with enterprise architecture standards. This produced reusable names and definitions for shared data, avoided data redundancy, promoted data sharing, and increased dissemination of business knowledge.
- Built a data model to analyze mortgage payments and predict refinancing patterns. The model facilitated risk analysis, statistical analysis, and predictive modeling.
- Built a data mart to support analysis of single-family mortgage payments.
- Transitioned to full time employment in September 2006.
- Developed the data model and physical database design for the Automated Clearing House (ACH) database for monetary transmission to and from the Federal Reserve inter-bank system.
- Developed strategy for analyzing data and processes to ensure they complied with the US Federal Government's information security categorization requirements.
- Analyzed business processes and built data architecture and models to support the enterprise digital document library Documentum. This library supported document access across the enterprise by multiple back office functions.
- Analyzed information requirements, and modeled the data mart to support the Balanced Scorecard analysis of bank examiners’ activities. Planned and assisted the migration to a new technology platform for performance analysis.
- Identified source information and transformation rules, performed data quality checks, and built dimensions and facts to support the business's key performance indicators.
- Analyzed data requirements and developed logical data models using Cool:Gen/CA Advantage for the IRS enterprise data warehouse project.
- Analyzed business data rules and performed data gap analysis.
- Advised the data warehouse team on the implementation of a central repository. The business users were able to browse multiple definitions from different systems, to support data mining of vehicle parts performance information.
- Developed the metadata strategy and data administration plan to support the data warehouse. The plan allowed metadata from multiple data warehouse tools to be integrated into a central repository.
- Developed the enterprise metadata strategy for a healthcare insurance organization.
- Worked with the business process teams to guide the client toward building an enterprise data model, common data definitions, and standard business terminology, using ERwin and the PR/MVS repository.
- Led team of analyst/programmers to develop the Retail and Corporate Planning systems, which analyzed sales and margins by stores, regions, product groups, and identified trends.
- The system helped management to respond quickly to market changes and contributed to the firm's survival during the recession of the early 1980s.
- Developed a Cash Management System that analyzed cash register data to identify customers’ purchasing patterns, enabling management to target customer trends.