
Data Architect Resume


Pennington, NJ

SUMMARY:

  • 18+ years of Information Technology experience in Business Analysis, Data Modeling, Data Architecture, Data Warehousing, NoSQL, and Business Intelligence. Able to deliver management vision, goals, priorities, design principles, and operating policies in support of the business goals of the organization. Extensive experience in multiple full life cycle development projects (including enterprise and DW/BI), covering gathering business requirements, analyzing source systems, and designing data strategies for both transactional/operational and dimensional/analytical systems.
  • Key areas of expertise in Data Architecture include:
  • Define and deliver data strategy and develop conceptual and logical data models and data architecture diagrams, including current, interim, and target state.
  • Design and develop data warehouse architecture, data modeling/conversion solutions, and ETL mapping solutions within structured data warehouse environments
  • Define best practices for data loading and extraction and ensure architectural alignment of the designs and development
  • Drive innovation by keeping current on emerging technology and data trends such as Big Data, Hadoop, NoSQL, Data Virtualization, and Data Services.
  • Extensive experience in optimized database design of On-Line Transactional Processing (OLTP) systems, including logical data models based on the Retail Point of Sale Logical Data Model paradigm in fully normalized (3NF) form.
  • In-depth understanding of the Kimball and Inmon data warehouse architectures.
  • Extensive experience with the data modeling tools Erwin and PowerDesigner for the design of logical and physical data models.
  • Hands-on experience tuning SQL queries for data warehouse (OLAP) and On-Line Transactional Processing (OLTP) systems.
  • Foster and implement Data API best practices and methodologies in data analysis, design, and modeling.
  • Provide technical leadership in resolving internal or external customer issues
  • Write product release notes, feature/function documents, white papers, and other documents in support of sales and marketing.
  • Provide guidance on structure and content of product documentation, and review for quality
  • Provide assistance to sales (gathering feedback from the field, helping prepare standard sales tools, helping respond to RFPs, and participating in sales demonstrations)
  • Conceptualized Data Hub Product for External Vendor Data and Developed AWS Private Cloud Data Hub Product Architecture.
  • Exposure to HDFS Concepts and Discovery Platform Architecture

TECHNICAL SKILLS:

Languages: SQL, PL/SQL, T-SQL, C, C++, Java, JSP

Modeling Tools: Erwin Data Modeler 7.3.10, MS Visio, Power Designer

RDBMS: Oracle 10g (RAC)/9i/8i, Sybase 10/11, DB2 and SQL

Reporting Tools: Cognos, Oracle Reports 6i/9i/10g

Front End: PowerBuilder 5.0/6.5/8.5 and Oracle Reports 6i/10g

ETL Tools: Oracle SQL*Loader, Informatica 8.5

Developer Tools: TOAD, PL/SQL Developer, SQL*Plus, Rapid SQL, DBArtisan, Arc Server 2000

Operating Systems: AIX 5.3, HP UX, Sun Solaris UNIX, Windows NT/2000/XP

Legacy Platforms: IBM iSeries AS/400, IBM Mainframe

SMAC Technologies: AWS Cloud, Salesforce, Hadoop, NoSQL

PROFESSIONAL EXPERIENCE:

Data Architect

Confidential, Pennington, NJ

Responsibilities:

  • Created business capability models for multiple Compliance application modules and subject areas.
  • Created Conceptual Model and Defined Data Strategy for External and Operational Data.
  • Defined Data Strategy for Downstream Warehouse Users
  • Technical Mentor for Java and Database Development Team on Data and workflow Insights.
  • Delivered DDL Scripts for Physical DB Creation
  • Provided star schema solutions for Compliance workflow modules and warehouse requirements (a minimal illustrative sketch follows this list).
  • Influence the developer and business community by providing recommendations that foster data-driven decision-making and empower team members through the availability of data
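
For illustration, a minimal sketch of the kind of star schema delivered for compliance workflow reporting; all table and column names below are hypothetical placeholders, not the actual Confidential model:

    -- Hypothetical star schema sketch for a compliance workflow data mart (Oracle SQL).
    -- All names are illustrative placeholders; the real model is proprietary.
    CREATE TABLE dim_case_status (
        case_status_key  NUMBER        PRIMARY KEY,
        status_code      VARCHAR2(20)  NOT NULL,
        status_desc      VARCHAR2(100)
    );

    CREATE TABLE dim_date (
        date_key         NUMBER        PRIMARY KEY,  -- YYYYMMDD surrogate key
        calendar_date    DATE          NOT NULL,
        fiscal_quarter   VARCHAR2(6)
    );

    -- Fact grain: one row per compliance case status change.
    CREATE TABLE fact_compliance_case (
        case_status_key  NUMBER NOT NULL REFERENCES dim_case_status,
        date_key         NUMBER NOT NULL REFERENCES dim_date,
        case_id          NUMBER NOT NULL,            -- degenerate dimension
        days_open        NUMBER,
        review_count     NUMBER
    );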

Sr. Data Architect

Confidential, Durham, NC

Responsibilities:

  • Defined overall data architecture and design; developed the data strategy for Phase 1 data movement and data integration.
  • Created Concept and Logical Model for GIC Outbound Reference & Transaction data.
  • Drove the AIP process and delivery of the artifacts; responsible for technical and architecture quality. Worked with the delivery teams to develop the project strategy.
  • Identified key risks and mitigation steps; maintained an appropriate balance between the functional and technical backlog.
  • Defined overall data architecture and design; developed the data strategy for Phase 2 data movement and data integration.
  • Drove Data Analysis Effort for Data Movement and Built ETL Designs.
  • Created Kimball Business Process Matrix for GIC Business Events. Defined level of grain required and Star Schema concept model for data modeler to physicalize
  • Developed Data Strategies for Warehouse data movement and Data Integration Strategies for Outlook
  • With Data Design, helped Business and IT Operation to Reduce the overhead and errors associated with managing conference schedules and incoming broker requests
  • Provided a robust data foundation that meets the analytical and operational needs of multiple advisors within the Asset Management business unit.
  • Worked on the Tier1 ACE Harmony integration.
  • Created Star Schema for Counterparty Research Notes and Ratings
  • Collaborate with product managers, data scientists, business users and other engineers to define requirements and design solutions for analytics and data-driven products
  • Influence the company by providing recommendations that foster data-driven decision-making and empower team members through the availability of data
  • Created Concept Model for Counterparty approval and Reporting. Created data strategies for ODS and Warehouse.
  • Created Concept and Logical Tier1 Data model for Phase 1 Reference Data Distribution
  • Created Concept and Star Schema Logical Model for GIC Warehouse

Environment: PowerDesigner 8.0, Oracle, Oracle Exadata, TOAD, Salesforce, Tier1 ACE Event Management, etc.

Lead Data Architect/Modeler

Confidential, Charlotte, NC

Responsibilities:

  • Created the FSD for the CORE IBM BPM 8.0 rollout to streamline and consolidate core deal and workflow data acquisition and distribution through MODE.
  • Responsible for performing data analysis activities to capture data requirements clearly, and representing them in a formal and visual way through data models.
  • Created end to end system and data flow diagram.
  • Performed SOR analysis to develop and deliver data models supporting multiple MODE Business Intelligence / Data Warehouse projects.
  • Worked on MODE ODS Data Mart to Support Treasury and MIS Reporting Requirements.
  • Performed data analysis on the LIS/LPS origination and mortgage servicing applications.
  • Analyzed the existing system and prepared design documents for all improvements. Mentored the onsite and offshore teams on all new enhancements.
  • Developed blueprints of the system, database models, data flows, and source-to-target attribute mappings and logic (an illustrative mapping sketch follows this list).
  • Facilitated the integration of Mortgage Banking customer data with the Confidential Integrated Customer Data Warehouse data marts.
  • Conducted Data Model & User Education Training Session with Business and Operation team.
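
For illustration, a source-to-target attribute mapping of the kind described above, rendered as a load statement; stg_deal_workflow and ods_deal are hypothetical names standing in for the actual MODE ODS objects:

    -- Hypothetical source-to-target mapping expressed as an ETL load (Oracle SQL).
    -- stg_deal_workflow and ods_deal are placeholder names, not the real MODE objects.
    INSERT INTO ods_deal (deal_id, deal_status_cd, workflow_step, last_update_dt)
    SELECT s.deal_ref_nbr,              -- source DEAL_REF_NBR -> target DEAL_ID
           UPPER(TRIM(s.status)),       -- standardize the status code
           NVL(s.wf_step, 'UNKNOWN'),   -- default a missing workflow step
           TRUNC(s.last_chg_ts)         -- truncate timestamp to the date grain
    FROM   stg_deal_workflow s
    WHERE  s.load_batch_id = :batch_id; -- bind variable for the current ETL batch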

Environment: PowerDesigner 15, Oracle 11g, SQL, PL/SQL, stored procedures, TOAD, Visio, Teradata

Lead Data Architect/Modeler

Confidential, Wilmington, DE

Environment: Erwin Data Modeler 7.3.10, Oracle 10g R4, SQL, PL/SQL, stored procedures, TOAD, Visio

Responsibilities:

  • Transform business and system requirements into a Subject Area, Conceptual, and Logical data model.
  • Responsible for performing data analysis activities to capture data requirements clearly, completely, and correctly while, at the same time, representing them in a formal and visual way through data models.
  • Develop and Deliver Data Model to support multiple Business Intelligence / Data Warehouse projects, including build out of Dimensional Models, Integration and Staging Model.
  • Worked in a fast-paced environment that develops, implements, and maintains mission-critical information systems for the Business Intelligence community.
  • Responsible for assessing the data architecture of all Loss Mitigation data marts.
  • Envisioned and Created Mod Data Mart to Support Treasury Reporting Requirements.
  • Designed the staging, ODS, and star models for Default Workflow Application MIS reporting (a minimal staging-to-ODS sketch follows this list).
  • Created Logical and physical database design for Foreclosure Semantic Data Model
  • Created Concept, Logical, and Physical data model for MSP Loss Mitigation Mortgage Servicing Application.
  • Created Operational Data Store for VLS Mortgage Servicing System.
  • Worked on De-Normalization as needed.
  • Analyzed the existing system and prepared design documents for all improvements. Mentored the onsite and offshore teams on all new enhancements.
  • Developed blueprints of the system, database models, data flows, and source-to-target attribute mappings and logic.
  • Worked on the IBM Banking Data Warehouse Model (BDWM) and created the integration model for Mortgage Banking default applications (MSP, VLS, FORTRACS, etc.).
  • Facilitated to Integrate Mortgage Banking Customer data with Chase Integrated Customer Data Warehouse IBM Banking Data Model.
  • Conducted Data Model & User Education Training Session with Business and Operation team.
  • Created multiple data taxonomies and DTDs for strategic projects.
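
For illustration, a minimal staging-to-ODS upsert of the pattern designed for the default workflow MIS reporting; ods_loan_workout and stg_loan_workout are hypothetical names, not the actual servicing objects:

    -- Hypothetical staging-to-ODS upsert for a loss mitigation workflow (Oracle SQL).
    MERGE INTO ods_loan_workout tgt
    USING stg_loan_workout src
    ON (tgt.loan_id = src.loan_id)
    WHEN MATCHED THEN
      UPDATE SET tgt.workout_status   = src.workout_status,
                 tgt.last_activity_dt = src.last_activity_dt
    WHEN NOT MATCHED THEN
      INSERT (loan_id, workout_status, last_activity_dt)
      VALUES (src.loan_id, src.workout_status, src.last_activity_dt);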

Lead Data Architect/ Modeler

Confidential, Framingham, MA

Environment: Erwin Data Modeler 7.2.0, IBM VMM, Oracle 10g R4, PL/SQL, Procedures, iSeries AS400 and J2EE Technology

Responsibilities:

  • Transformed business and system requirements into conceptual and then logical data models using Erwin Data Modeler 7.2.0; converted logical data models into physical database designs and generated scripts for the Oracle database. Defined database and data model standards and created the data dictionary.
  • Sketched system designs, database models, and basic algorithms. Mentored the onsite and offshore teams on the CaCS data model and the IBM VMM data model. Worked on database denormalization as needed.
  • Derived logical and physical data models for the IBM VMM database (federated user repository); generated scripts for the Oracle database and data population.
  • Created Oracle database schema objects and set up data for the development, QA, and production database environments of CaCS and IBM VMM (Oracle 10g Release 4).
  • Performed iSeries data analysis and prepared data mapping documents for PCard and CCard data, along with notes.
  • Created wireframes for data flows per business requirements and defined the data abstractions for all major business functionality.
  • Prepared interface documents for real-time data integration between iSeries and the Oracle database. Created the data migration design document and defined the testing strategy.
  • Created ETL programs using Oracle PL/SQL for data migration (Oracle 10g Release 4, SQL, PL/SQL); designed Oracle external tables for data migration and created Linux shell scripts as needed (an illustrative external-table sketch follows this list). Designed and developed best-performing queries for the application.
  • Managed data population and migration architecture for the application (PL/SQL) across all environments.
  • Created test scripts for generating final reports.
  • Worked on Table Partitioning (Range and List) and deploying Local Indices on partitioned tables.
  • Implemented Performance Tuning by creating Indexes, executing EXPLAIN PLAN on SQL Scripts and using Optimizer Hints if necessary for faster retrieval
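
For illustration, a simplified sketch of the external-table approach to data migration noted above; the directory, file, and table names (data_dir, pcard_feed.csv, ext_pcard_txn, pcard_txn) are assumptions, not the actual objects:

    -- Hypothetical external table over an iSeries extract file, plus a PL/SQL load (Oracle SQL).
    CREATE TABLE ext_pcard_txn (
        card_nbr  VARCHAR2(19),
        txn_amt   NUMBER(12,2),
        txn_dt    VARCHAR2(10)      -- kept as text; converted during the load
    )
    ORGANIZATION EXTERNAL (
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY data_dir
        ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            FIELDS TERMINATED BY ','
            MISSING FIELD VALUES ARE NULL
        )
        LOCATION ('pcard_feed.csv')
    )
    REJECT LIMIT UNLIMITED;

    -- Set-based migration from the external table into the target schema.
    BEGIN
      INSERT /*+ APPEND */ INTO pcard_txn (card_nbr, txn_amt, txn_dt)
      SELECT card_nbr, txn_amt, TO_DATE(txn_dt, 'YYYY-MM-DD')
      FROM   ext_pcard_txn;
      COMMIT;
    END;
    /

The external table lets the flat-file extract be queried and loaded in a single set-based pass instead of row-by-row processing.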

Database Architect/ Developer

Confidential, Charlotte, NC

Environment: Oracle 10g Release 2, SQL, Erwin, J2EE Technology

Responsibilities:

  • Transform business and system requirements into a Conceptual and then logical data model. Convert logical data models to physical database designs.
  • Sketching up system design, database models and basic algorithms.
  • Revisited existing ETL processes, improving many stages by more than 10x, usually achieving the best improvement by tuning algorithms and application logic.
  • Data modeling; workflow definition; metadata management. Worked on De-Normalization as needed.
  • Analyzed the existing system and prepared design documents for all improvements. Mentored the onsite and offshore teams on all new enhancements.
  • Restored the LDM and PDM for the Oracle database and reinstated change management.
  • Designed scalable, high-performance solutions utilizing data modeling and physical implementation of the database, partitions, and tuning. Worked on and participated in project development throughout the entire SDLC lifecycle.
  • Redesigned database architecture, analyze and tune application performance by optimizing Oracle SQL, stored procedures and triggers.
  • Created ETL programs using Oracle PL/SQL for data migration (Oracle 10g Release 4, SQL, PL/SQL); designed Oracle external tables for data migration and created Linux shell scripts as needed. Designed and developed best-performing queries for the application.
  • Worked on Table Partitioning (Range and List) and deploying Local Indices on partitioned tables.
  • Implemented performance tuning by creating indexes, executing EXPLAIN PLAN on SQL scripts, and using optimizer hints where necessary for faster retrieval (an illustrative partitioning and tuning sketch follows this list).
  • Provided technical leadership to the offshore/onsite development team through project leadership and technical documentation.
  • Troubleshoot and provide technical solutions to database issues encountered by new and existing applications in the environment.
  • Developed a framework for copying production data to several schemas such as DEV, UAT, and QA. Developed a database framework for Alert.
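
For illustration, a sketch of the range partitioning, local indexing, and EXPLAIN PLAN tuning described above; txn_history and its partition names are hypothetical:

    -- Hypothetical range-partitioned table with a local index and a plan check (Oracle SQL).
    CREATE TABLE txn_history (
        txn_id   NUMBER,
        acct_id  NUMBER,
        txn_dt   DATE,
        txn_amt  NUMBER(12,2)
    )
    PARTITION BY RANGE (txn_dt) (
        PARTITION p2006_q1 VALUES LESS THAN (TO_DATE('2006-04-01', 'YYYY-MM-DD')),
        PARTITION p2006_q2 VALUES LESS THAN (TO_DATE('2006-07-01', 'YYYY-MM-DD')),
        PARTITION p_max    VALUES LESS THAN (MAXVALUE)
    );

    -- Local index: one index segment per partition, aligned with the partition key.
    CREATE INDEX ix_txn_history_acct ON txn_history (acct_id, txn_dt) LOCAL;

    -- Verify partition pruning and index usage before deploying the query.
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(t ix_txn_history_acct) */ acct_id, SUM(txn_amt)
    FROM   txn_history t
    WHERE  txn_dt >= TO_DATE('2006-01-01', 'YYYY-MM-DD')
    AND    txn_dt <  TO_DATE('2006-04-01', 'YYYY-MM-DD')
    GROUP BY acct_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

Because the date predicate matches the partition key, the optimizer can prune to a single partition and probe only that partition's local index segment.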
