- I have been providing data architecture design and support to multiple concurrent development projects at large organizations for more than 20 years. My niche business sector is banking and finance; other sectors include pharmaceuticals, retail, healthcare and logistics.
- Services include data modeling, metadata management, requirements analysis, process analysis, data profiling, source to target data mapping, data migration solution design, data quality monitoring and root cause analysis.
- Banking and Finance domain includes: Bill Pay, Mobile Banking, Mortgage & Commercial Loans, Construction Loan, Student Loan, Retirement Plans, Stock Plan Services, Commercial Credit Card Management, Cloud Based Payment, Online Checkout Automation, Payables Automation.
- Have experience in reviewing existing OLTP, EDW and Big Data data architecture and refining it to comply with globalization standards and naming standards.
- Big data projects include: designing Hadoop tables for reporting and providing the DBA with Hive DDL; creating a 2-tier staging and reporting data warehousing architecture on Hadoop and specifying ETL data mapping; and occasionally reverse engineering the data model from a NoSQL database design.
- Able to specify technical solutions involving complex data transformation rules using a combination of narratives, data flow diagrams, process diagrams, pseudo code and syntax-free SQL statements.
- Collaborate effectively with development team leads, solution architects, business users, business analysts, data architects, data modelers, DBAs, ETL engineers and reporting engineers.
- Most-used data modeling tools: Erwin release 9 and Embarcadero ER/Studio
- Most-used RDBMS: Teradata, Oracle, DB2 UDB, SQL Server
- SQL experience: advanced
- Data profiling tool: Informatica Data Explorer
- Data analysis and data load tools: TOAD, SQL Assistant
- Reporting environment experience: MicroStrategy, Business Objects, Cognos, Oracle OBIEE
- ETL environment experience: Informatica, Ab Initio, Ascential DataStage, Oracle SQL*Loader
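A minimal sketch of the kind of Hive DDL deliverable described above, assuming a hypothetical 2-tier layout (a raw staging table over delimited extract files, and a partitioned reporting table); all table and column names are invented for illustration:

```sql
-- Staging tier: external table over pipe-delimited extract files.
CREATE EXTERNAL TABLE stg_account_txn (
  account_id   STRING,
  txn_ts       STRING,
  txn_amount   DECIMAL(18,2),
  txn_type_cd  STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '/data/staging/account_txn';

-- Reporting tier: typed, partitioned by load date, stored as ORC for BI queries.
CREATE TABLE rpt_account_txn (
  account_id   STRING,
  txn_ts       TIMESTAMP,
  txn_amount   DECIMAL(18,2),
  txn_type_cd  STRING
)
PARTITIONED BY (load_dt STRING)
STORED AS ORC;
```

Partitioning the reporting tier by load date lets daily ETL append a single partition while reporting queries prune to the dates they need.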
Senior Data Architect
- One of the senior data architects in a data architecture group, leading the design and implementation of data models that support multiple concurrent company projects.
- Designed and implemented OLTP and ODS data models in third normal form, and OLAP star schema models, in compliance with corporate data naming standards.
- Designed and implemented Hadoop data platform for supporting analytical reporting.
- Maintain data modeling, data naming and abbreviation standards.
- Create tickets to submit DDL to the DBA for implementation in Dev, QA and Production.
- Review access path, SQL and contribute to performance tuning.
- Contribute to project planning discussions, provide daily standup status updates on database design progress and serve as a critical resource for issue resolution.
- DBMS involved: SQL Server, Oracle, DB2, Hive
- Manage BI Data Services requests from Assigned through Production status. Requests range from one table to 20 tables, mostly involving offshore ETL development and local Business Objects reporting.
- Subject areas covered: Accounts, Customers, Events, Stock Plan Services, Retirement Plan, Stock Orders, Pricing Negotiation, Access Management, Incentives.
- Personally accountable for analyzing the BI requirements, 3NF and dimensional data modeling, data acquisition design, data profiling, source to target data mapping, BI views design, metadata dictionary maintenance, QA, UAT and root cause analysis.
- Conduct conceptual design review, data architecture review and ETL data mapping review.
- Tools used: Erwin release 8, Teradata SQL Assistant, TOAD
- DBMS involved: SQL Server, Oracle, DB2 on mainframe and Teradata
- Source file formats: pipe-delimited flat file, Excel CSV, fixed-width flat file.
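The 3NF and star schema modeling described in this role can be illustrated with a minimal dimensional sketch; the subject area (stock orders) echoes the list above, but every table and column name here is hypothetical, not the actual model:

```sql
-- Hypothetical star schema: one fact table keyed to conformed dimensions.
CREATE TABLE dim_customer (
  customer_key  INTEGER      NOT NULL PRIMARY KEY,  -- surrogate key
  customer_id   VARCHAR(20)  NOT NULL,              -- natural key from source
  customer_nm   VARCHAR(100),
  effective_dt  DATE         NOT NULL               -- slowly-changing-dimension support
);

CREATE TABLE dim_date (
  date_key      INTEGER NOT NULL PRIMARY KEY,       -- e.g. 20240131
  calendar_dt   DATE    NOT NULL
);

CREATE TABLE fact_stock_order (
  date_key      INTEGER NOT NULL REFERENCES dim_date (date_key),
  customer_key  INTEGER NOT NULL REFERENCES dim_customer (customer_key),
  order_cnt     INTEGER       NOT NULL,             -- additive measures
  order_amt     DECIMAL(18,2) NOT NULL
);
```

Surrogate keys on the dimensions keep the fact table narrow and insulate it from changes to source natural keys.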
- Confidential (parent company Roche) is a leading pharmaceutical organization managing product development from R&D to commercialization. Under the new Sunshine Act, Confidential is required to track its spending on healthcare professionals at the federal and state level during the product development and commercialization life cycle. To comply with the new Aggregate Spend reporting requirements, the existing data architecture and solutions were modified and enhanced.
- Performed logical and physical data modeling, data profiling, source to target data mapping, technical solution specifications, data quality monitoring and root cause analysis.
- Supported the ETL Informatica developers, Business Objects developers, QAs and KPMG System Auditors.
- Developed data models using Embarcadero ER/Studio and implemented the physical database on Oracle 11.
- Data architecture layers include OLTP source systems, Landing, Staging, Integrated Data Store, ODS and Data Mart.
- Profiled incoming extract files from multiple sources using Informatica Data Explorer and recommended corrective actions.
- Analyzed and reconciled data across all data layers using TOAD and custom SQL queries.
- Developed Technical Solution Specifications based on in-depth understanding of the data.
- Discovered root causes of data discrepancies and recommended corrective actions.
- Developed a data quality monitoring data model for tracking daily operational data load volumes and trends.
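A load-volume monitoring model of the kind described above might be sketched as follows; the audit table and the 50% deviation threshold are illustrative assumptions, not the actual design:

```sql
-- Hypothetical monitoring table: one row per table per daily load.
CREATE TABLE dq_load_audit (
  load_dt     DATE         NOT NULL,
  table_nm    VARCHAR(128) NOT NULL,
  row_cnt     INTEGER      NOT NULL,
  reject_cnt  INTEGER      NOT NULL,
  PRIMARY KEY (load_dt, table_nm)
);

-- Trend check: flag any table whose latest daily volume deviates by more
-- than 50% from its trailing 30-day average.
SELECT a.table_nm, a.load_dt, a.row_cnt, avg30.avg_cnt
FROM   dq_load_audit a
JOIN  (SELECT table_nm, AVG(row_cnt) AS avg_cnt
       FROM   dq_load_audit
       WHERE  load_dt >= CURRENT_DATE - 30
       GROUP  BY table_nm) avg30
  ON   a.table_nm = avg30.table_nm
WHERE  a.load_dt = CURRENT_DATE
  AND  ABS(a.row_cnt - avg30.avg_cnt) > 0.5 * avg30.avg_cnt;
```

Keeping one audit row per table per load makes both volume trending and reject-rate reporting simple GROUP BY queries.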
- Participated in the Wachovia to Confidential commercial online banking conversion/integration projects, performing data profiling and data analysis and developing data mapping specifications. Supported the offshore ETL Informatica developers and onshore QAs.
- Maintained the data models of the Internet Services Group (ISG) Money Movement solutions (MMS) for each enhancement project. The applications were online banking, bill pay and inter-financial institution payment.
- Completed a Bill Pay Data Purge data analysis and developed a Data Purge Framework data model for automating the PL/SQL purge script generation.
- Performed Data Compression data analysis to identify which tables to compress.
- Prepared sizing for data modeling efforts.
- Maintained metadata and training materials.
- Worked closely with the project managers, product managers, application architects, development teams, DBA and QA from project conception to implementation.
- Maintained the data modeling processes and standards, ensuring that all data modelers are engaged at an early project stage and that all data models are completed with consistent format and quality.
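The metadata-driven purge framework mentioned above (automating PL/SQL purge script generation) can be sketched roughly as follows; the rule table and generated statements are hypothetical examples of the approach, not the actual framework:

```sql
-- Hypothetical rule table: each row describes one table to purge and its retention.
CREATE TABLE purge_rule (
  table_nm        VARCHAR2(128) NOT NULL,
  date_column_nm  VARCHAR2(128) NOT NULL,
  retention_days  NUMBER        NOT NULL
);

-- Generate one DELETE statement per rule; the output would be spooled
-- into a PL/SQL purge script rather than maintained by hand.
SELECT 'DELETE FROM ' || table_nm ||
       ' WHERE ' || date_column_nm ||
       ' < SYSDATE - ' || retention_days || ';' AS purge_stmt
FROM   purge_rule;
```

Driving the purge from a rule table means adding a new table to the purge cycle is a one-row insert instead of a script change.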
- Confidential Bank is the second largest bank in California after Confidential. After the merger with United Commercial Bank on 10/6/09, Confidential Bank became the largest US bank serving the Asian community. I worked as a Data Architect responsible for establishing an integrated Customer & Account data warehouse architecture to support management reporting and BI development.
- Integrated all existing loan system structures into an enterprise data warehouse architecture. Source systems include Fidelity (1-4, Multi-family mortgage), Metavante (Deposit, Commercial loan, Commercial Real Estate, HELOC and EZline), PCFS (SBA loan), and Trade Innovation (Trade finance).
- Worked with the Business Analyst in defining the reporting requirements and identifying the authoritative data source systems.
- Studied existing system documentation and online real-time systems extensively to understand the business concepts.
- Carried out source data profiling to review the domain of values, percentage distribution, uniqueness and referential integrity.
- Developed a complete and accurate source to target data mapping with pre-load validation rules and transformation rules.
- Led and coordinated a team of data modelers in developing the logical and physical data models for the target data warehouse and report application layer.
- Developed data modeling and naming standards.
- Resolved data quality issues reported during QA and UAT.
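Profiling checks like those listed above (domain of values, percentage distribution, uniqueness, referential integrity) are typically expressed as simple SQL; a hypothetical example against staged loan extracts, with all table and column names invented for illustration:

```sql
-- Domain of values and percentage distribution for a code column.
SELECT loan_type_cd,
       COUNT(*) AS cnt,
       100.0 * COUNT(*) / SUM(COUNT(*)) OVER () AS pct
FROM   stg_loan
GROUP  BY loan_type_cd;

-- Uniqueness: duplicates on the candidate key.
SELECT loan_id, COUNT(*) AS dup_cnt
FROM   stg_loan
GROUP  BY loan_id
HAVING COUNT(*) > 1;

-- Referential integrity: loans whose customer is missing from the source.
SELECT l.loan_id
FROM   stg_loan l
LEFT   JOIN stg_customer c ON c.customer_id = l.customer_id
WHERE  c.customer_id IS NULL;
```

Results from queries like these feed directly into the pre-load validation rules recorded in the source to target mapping.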
- Maintained the data models of the Internet Services Group (ISG) Customer Information & Analytics (CIA) data warehouse, which provides a 360-degree view of a customer and supports online sales & marketing performance tracking, campaign management, marketing asset tracking and marketing resource management.
- Major projects completed: XML Event Framework, Online Sales & Marketing, Mobile Banking, Virtual Safe, Credit Card, Student Loans, Same Day Payment, Wachovia Bank data integration and Email quality tracking.
- Participated in Joint Requirements Discovery sessions in order to facilitate the business users in expressing their data sourcing and reporting requirements.
- Provided requirements level and design level sizing to PM.
- Developed the logical data models using Erwin for data source layer, data integration layer and data application layer.
- Attended weekly data model peer reviews with data architect, data modelers and DBAs.
- Walked through data models and data mapping with project team members and reporting users.
- Provided DBA with a first cut physical data model for implementation on Teradata database.
- Prepared extract files for updating the Enterprise Metadata Repository (using Rochade).
- Maintained the data models of the Internet Services Group (ISG) Money Movement solutions (MMS) using Erwin. The models were implemented on Oracle 11 database platform for supporting online banking, bill pay and inter-financial institution payment.
- Analyzed the business requirements and clarified data requirements.
- Completed metadata and training slides for the Bill Pay data model with over 500 entities and 2500 data elements.
- Developed an Operational Data Store data model for supporting operational reporting and business intelligence reporting requirements.
- Completed the data models for a new data warehouse which was implemented on Teradata platform for the Mortgage, Credit Card and Investment Checking accounts. This data warehouse provided the Bank marketing representatives the capability in cross-selling bank products to the traditional Brokerage Dealers customers, tracking performance and paying incentives.
- Huge volumes of source data arrived daily/monthly from multiple vendors on the IBM platform, were converted into flat files, loaded into the Teradata platform via Informatica and reported via Business Objects.
- Developed logical and physical data models in 3rd Normal Form and dimensional form using Erwin.
- Maintained metadata and prepared extract files for maintaining the Metadata repository and website.
- Conducted source data profiling and defined data integrity checks.
- Prepared source to target data mapping documents.
- Supported ETL development, Reporting development, QA and performance tuning.
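The data integrity checks mentioned above can be illustrated with a hypothetical load reconciliation query, comparing staged row counts against what landed in the warehouse for the same business date (names are invented, not the actual schema):

```sql
-- Hypothetical reconciliation: staged rows vs. rows landed in the
-- warehouse, per load date; report only dates with a variance.
SELECT s.load_dt,
       s.src_cnt,
       COALESCE(t.tgt_cnt, 0) AS tgt_cnt,
       s.src_cnt - COALESCE(t.tgt_cnt, 0) AS variance
FROM  (SELECT load_dt, COUNT(*) AS src_cnt
       FROM   stg_mortgage_acct
       GROUP  BY load_dt) s
LEFT  JOIN
      (SELECT load_dt, COUNT(*) AS tgt_cnt
       FROM   dw_mortgage_acct
       GROUP  BY load_dt) t
  ON   s.load_dt = t.load_dt
WHERE  s.src_cnt <> COALESCE(t.tgt_cnt, 0);
```

A nonzero variance points the root cause analysis at either ETL rejects or filter rules in the mapping.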
Senior Data Architect
- Reported to the Director of Data Architecture and Database Technologies and to Project Managers.
- Processed change requests according to specified priorities.
- Analyzed business requirements developed by business analysts.
- Provided sizing and timeline.
- Participated in work product peer reviews and data architecture planning meetings.
- Developed logical/physical data model for OLTP and data warehouse projects using ERwin and Oracle Designer.
- Worked with DBAs in transforming a logical data model to physical data model on Oracle 9i database.
- Developed source to target data mapping documents.
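A source to target data mapping document of the kind described above typically pairs each target column with a source expression and transformation rule; expressed here as annotated SQL, with every name hypothetical and purely illustrative:

```sql
-- Hypothetical mapping rendered as annotated INSERT..SELECT:
-- each target column carries its transformation rule as a comment.
INSERT INTO dw_customer (customer_key, customer_nm, open_dt, status_cd)
SELECT seq_customer.NEXTVAL,              -- surrogate key, generated
       TRIM(s.cust_name),                 -- direct move, trimmed
       TO_DATE(s.open_date, 'YYYYMMDD'),  -- convert from source text format
       CASE WHEN s.active_flag = 'Y'      -- decode source flag to standard code
            THEN 'ACTIVE' ELSE 'CLOSED' END
FROM   stg_customer s;
```

Writing the rules column-by-column this way gives ETL developers and QAs one unambiguous statement of intent per target table.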