
Sr. Data Architect/modeler Resume


Baltimore, MD

SUMMARY

  • Results-oriented IT professional with a strong background in database development and in designing data models for Online Transactional Processing (OLTP) and Online Analytical Processing (OLAP) systems across the Health Care, Financial, Insurance, and Retail domains.
  • Strong background in data modeling tools such as Erwin, IBM Data Architect, PowerDesigner, and MS Visio.
  • Expert knowledge of the SDLC (Software Development Life Cycle) and solid experience in business analysis: reviewing, analyzing, and evaluating business systems and user needs, business modeling, and document processing.
  • Extensive experience in Relational Data Modeling, Dimensional Data Modeling, Logical and Physical data model design, ER Diagrams, Forward and Reverse Engineering, publishing Erwin diagrams, analyzing data sources, and creating interface documents.
  • Designed and developed Data Marts following Star Schema and Snowflake Schema methodologies, using industry-leading data modeling tools such as Erwin.
  • Participated in JAD sessions, created Use Cases, workflows, and Power Point presentations.
  • Gathered and translated business requirements into technical designs, and developed the physical aspects of a specified design.
  • Gathered and documented Functional and Technical Design documents.
  • Extensive experience in data profiling and analysis.
  • Worked extensively on Data Governance.
  • Implemented metadata management to achieve regulatory compliance and quality in business intelligence.
  • Worked with ETL tools to extract data from relational databases and various file formats, transform it, and load it into target databases.
  • Excellent knowledge of SQL and experience coding PL/SQL packages and procedures.
  • Experience in SQL Performance Tuning and Optimization (Design, Memory, Application, IO) and using Explain plan, Tracing and TKPROF.
  • Experience in creating Materialized views, Views, Lookups for the Oracle warehouse.
  • Experience with the Erwin Model Manager.
  • A dedicated team player with excellent communication, organizational and interpersonal skills.
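As an illustration of the star-schema data mart design described above, here is a minimal sketch in Python with sqlite3 (all table and column names are hypothetical, not taken from any engagement on this resume): one fact table keyed to one dimension, with a typical OLAP rollup query.

```python
import sqlite3

# Hypothetical star schema: a sales fact table joined to a date dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_date (
        date_key INTEGER PRIMARY KEY,   -- surrogate key
        year     INTEGER,
        month    INTEGER
    );
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        amount   REAL
    );
""")
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, 2024, 1), (20240201, 2024, 2)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(20240101, 100.0), (20240101, 50.0), (20240201, 25.0)])

# Typical dimensional rollup: join fact to dimension, aggregate by month.
rows = cur.execute("""
    SELECT d.year, d.month, SUM(f.amount)
    FROM fact_sales f JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.year, d.month ORDER BY d.month
""").fetchall()
print(rows)   # [(2024, 1, 150.0), (2024, 2, 25.0)]
```

The same join-and-aggregate pattern underlies the Fact/Dimension work listed in the engagements below.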

TECHNICAL SKILLS

Data Modeling Tools: Erwin, ER Studio and Oracle Designer

Languages/Development Tools: R, T-SQL, SSIS, SSAS, MDX, SSRS, Master Data Services (MDS), Data Quality Services (DQS), Excel BI Components, ANSI SQL, Tableau.

Azure Cloud Technology: Unstructured, structured, and big data storage solutions including Azure Data Lake, Azure SQL, and Azure Parallel Data Warehouse; AI, Cognitive Services, and machine learning; migration of data from on-premises to cloud, as well as hybrid solutions using both.

AWS Cloud Technology: large-scale data migration

Google Cloud Platform (GCP): On Premise to Cloud migration

Traditional Enterprise Solutions: Full Confidential BI Stack comprised of SQL Server, SSIS, SSRS, SSAS MDS, DQS, Performance Point Server, Excel Power Pivot, Data Mining, and MDS Components; MOSS, SharePoint Services

Architecture: SOA, ESB, Kimball BI Methodology (star schema) data warehouses, Data Mining Models, Master Data Management, OLTP (Online Transactional Processing - Normalized Transactional Database), OLAP (Online Analytical Processing - Dimensional and Tabular Model), Tableau Data Model

Infrastructure: Windows all versions, Azure Active Directory, On-Premises Active Directory, ADAM, Identity Information Server, WhereScape

Databases: SQL Server, Azure SQL, Azure Data Warehouse, Oracle, Teradata, and MS Access

Project Management and Design: Visio, Visual SourceSafe, Team Foundation Server, Project Server, MS Project, Erwin, VSTS 2010 through 2017 Ultimate Editions, VSTS 2008 Tester and Database Edition, GIT

PROFESSIONAL EXPERIENCE

Confidential, Baltimore, MD

Sr. Data Architect/Modeler

Responsibilities:

  • Created Logical and Physical Data Models for Relational (OLTP) systems and Dimensional Data Models (OLAP) on a Star schema for Fact and Dimension tables using Erwin.
  • Gathered business requirements, working closely with business users, project leaders, and developers.
  • Analyzed the business requirements and designed Conceptual and Logical Data models.
  • Prepared ETL technical Mapping Documents along with test cases for each Mapping for future developments to maintain SDLC.
  • Assisted in the migration of the legacy DWH, based on the Kimball Bus Matrix design, into the current Data Vault 2.0 Enterprise DWH.
  • Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
  • Create and maintain data model standards, including master data management (MDM).
  • Involved in extracting data from various sources such as Oracle and SQL Server.
  • Designed and maintained the Enterprise Data Warehouse model following the Data Vault 2.0 design methodology, using the CA Erwin data modeling tool.
  • Extensively worked on early-stage business projects, discovery efforts, and engagements initiated by Business Relationship Managers to provide appropriate architecture deliverables, such as stakeholder analyses, capability analyses, risk/value analyses, or technical analyses.
  • Worked with Database Administrators, Business Analysts, and Content Developers to conduct design reviews and validate the developed models.
  • Guided data engineers on how to properly ingest batch data sets into the stage environment and how to load data into both the Raw Vault and the Business Vault.
  • Used Model Mart of Erwin for effective model management of sharing, dividing, and reusing model information and design for productivity improvement.
  • Performed data analysis and data profiling using complex SQL on various source systems and answered complex business questions by providing data to business users.
  • Generated ad-hoc SQL queries using joins, database connections, and transformation rules to fetch data from legacy DB2 and SQL Server database systems.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from source to target data environments using Informatica Data Quality, and developed working documents to support findings and assign specific tasks.
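The Raw Vault loading guidance described above can be sketched as follows (a hypothetical hub-load helper; the entity and column names are illustrative, not from this engagement). Hashing the business key into a surrogate hash key is the common Data Vault 2.0 convention:

```python
import hashlib
from datetime import datetime, timezone

def hub_row(business_key: str, record_source: str) -> dict:
    """Sketch of a Data Vault 2.0 hub load for a hypothetical customer hub:
    normalize the business key, hash it to a surrogate hash key (MD5 is the
    usual DV 2.0 convention), and stamp load metadata."""
    normalized = business_key.strip().upper()
    return {
        "hub_customer_hk": hashlib.md5(normalized.encode()).hexdigest(),
        "customer_bk": business_key,
        "load_dts": datetime.now(timezone.utc).isoformat(),
        "record_source": record_source,
    }

row = hub_row("CUST-001", "CRM_BATCH")
print(row["hub_customer_hk"])
```

Because the hash key is derived deterministically from the normalized business key, the same customer arriving from different batches maps to the same hub record.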

Environment: Erwin Data Modeler, Informatica, SQL Server, SQL, Oracle, DB2, MS Excel, MS Visio, Rational Rose, Tableau, Power BI, Epic Caboodle, Epic Clarity, Azure Data Lake, Azure Data Warehouse, Snowflake cloud, Python, NoSQL, WhereScape

Confidential

Lead Data Analyst/Data Modeler

Responsibilities:

  • Interacted with Subject Matter Experts, Project Managers, Developers, and end users in various JAD sessions to gather requirements.
  • Involved in all phases of the SDLC, from requirements, design, development, and testing through rollout to field users and support of the production environment.
  • Translated Business Requirements into Data Design Requirements used for driving innovative data designs that meet business objectives.
  • Designed the Enterprise Data Warehouse following the Data Vault 1.0 methodology.
  • Designed the various Data Vault entities using the IBM Data Architect tool.
  • Developed Business and Subject Area Logical Data Models, Physical Data Models, Physical Staging Area Data Models, and Data Flow Diagrams.
  • Extensively used Erwin to design Logical/Physical Data Models, perform Forward and Reverse Engineering, publish data models, apply DDL to databases, and restructure existing Data Models.
  • Wrote SQL queries and XQuery to profile MySQL, Hadoop, and MarkLogic databases.
  • Responsible for designing and introducing new (FACT and Dimension Tables) to the existing subject area Data Models.
  • Created Data Mapping documents detailing the transfer of data from Source to Target.
  • Created a Semantic Data Model to support Power BI reporting.
  • Tested the application to ensure proper functionality and data accuracy and to verify that modifications had no adverse impact on the integrated system environment.
  • Involved in the maintenance and support of existing applications with the users.
  • Involved in updating metadata repositories while detailing on use of applications and data transformation to facilitate impact analysis.
  • Involved in Performance Tuning of the database, which included Creating Indexes and Optimizing SQL Statements.
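The index-based tuning mentioned in the last bullet can be illustrated with a small sqlite3 sketch (hypothetical table and index names, for illustration only): the query plan changes from a full table scan to an index search once the index exists.

```python
import sqlite3

# Hypothetical orders table queried by a non-key column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")

def plan(sql: str) -> str:
    """Concatenate the detail column of EXPLAIN QUERY PLAN output."""
    return " ".join(r[3] for r in conn.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(q)   # without an index: a full table scan
conn.execute("CREATE INDEX ix_orders_customer ON orders(customer_id)")
after = plan(q)    # with the index: an index search on ix_orders_customer
print(before)
print(after)
```

The same before/after comparison is what tools like Oracle's Explain Plan or TKPROF (listed in the SUMMARY) provide for production workloads.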

Environment: Erwin, MySQL, Hadoop, MarkLogic, DbVisualizer, Aqua Data Studio, Data Vault 2.0, AWS, Microsoft Excel, Power BI.

Confidential, CT

Data Architect/Sr. Data Modeler

Responsibilities:

  • Responsible for delivering and coordinating data profiling, data-analysis, data-governance, data-models (conceptual, logical, physical), data-mapping, data-lineage, and reference data management.
  • Participated in meetings & JAD sessions to gather and collect requirements from the business end users as well as reporting requirements.
  • Worked with Business Analysts, SQL Developers, end users, and stakeholders to understand the requirements and sign off on the various deliverables.
  • Create team strategies and establish project scopes of work; communicate project deliverable periods and benchmarks to project sponsors.
  • Created process flows and data flow diagrams of the current and future systems.
  • Suggest architectural improvements, design, and integration solutions, and formulate methodologies to optimize database development.
  • Performed data profiling and exploration using Informatica IDQ.
  • Performed gap analysis for different data sources.
  • Set up a multi-user environment through a distributed version control system, GitHub.
  • Integrated the data modeling tool, IBM Data Architect (IDA), with the business glossary.
  • Created Source-to-Target mappings to include the transformations and data quality rules needed for EDW integration.
  • Created Data Marts to support enterprise reporting.
  • Maintained metadata, version controlling of the data model.
  • Utilized hybrid data model architecture to deliver project goals with limited budget.
  • Developed Data Exploration zone stores to manage raw data for exploratory analytics.
  • Integrated the physical data model with the business glossary.
  • Implemented data lineage using the Metadata Workbench tool.
  • Developed data marts to feed Actuarial statistical models.
  • Experience using the IBM BigInsights Hadoop distribution framework: Big Integrate, Big SQL, HDFS, Hive, Sqoop, Spark.
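The Source-to-Target mappings with data quality rules mentioned above can be sketched as a small Python structure (all field names and rules here are hypothetical, for illustration only):

```python
# Hypothetical source-to-target mapping: each target column names its
# source column and the transformation applied during the EDW load.
MAPPING = {
    "customer_name": ("cust_nm", str.title),
    "state_code":    ("st",      str.upper),
}

def apply_mapping(source_row: dict) -> dict:
    """Apply the mapping, then enforce a simple data quality rule."""
    target = {tgt: fn(source_row[src]) for tgt, (src, fn) in MAPPING.items()}
    # Illustrative DQ rule: state codes must be exactly two characters.
    assert len(target["state_code"]) == 2, "DQ violation: bad state code"
    return target

print(apply_mapping({"cust_nm": "jane doe", "st": "md"}))
# {'customer_name': 'Jane Doe', 'state_code': 'MD'}
```

In practice a tool such as Informatica IDQ expresses the same mapping and rules declaratively, but the structure of the deliverable is the same: source field, transformation, target field, plus quality constraints.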

Environment: IBM InfoSphere Data Architect, Hive, HDFS, Vertica 8, Oracle 11g, Big Integrate, Toad, SQL Server, GitHub, WhereScape

Confidential, Baltimore, MD

Sr. Data Modeler

Responsibilities:

  • Responsible for delivering and coordinating data-profiling, data-analysis, data-governance, data-models (conceptual, logical, physical), data-mapping, data-lineage, and reference data management.
  • Participated in meetings & JAD sessions to gather and collect requirements from the business end users as well as reporting requirements. Worked with Business Analysts, SQL Developers, end users and stakeholders to understand the requirements and signoff on the various deliverables.
  • Create team strategies and establish project scopes of work; communicate project deliverable time frames and benchmarks to project sponsors.
  • Created process flows and data flow diagrams of the current and future systems.
  • Suggest architectural improvements, design, and integration solutions, and formulate methodologies to optimize database development.
  • Performed Data profiling and exploration using Informatica IDQ.
  • Performed gap analysis for different data sources.
  • Created Source to Target mapping to include transformations and data quality rules needed for EDW Integration.
  • Created Data Marts to support enterprise reporting.
  • Maintained metadata, version controlling of the data model.
  • Utilized hybrid data model architecture to deliver project goals with limited budget.
  • Developed Data Exploration zone stores to manage raw data for exploratory analytics.
  • Created Materialized views & Summary tables for Reporting.
  • Integrated physical data model with business glossary.
  • Implemented data lineage using the Metadata Workbench tool.
  • Implemented metadata management as one part of data governance. Worked with the business users to populate business glossary.
  • Worked in Agile environment to help teamwork toward our goals.
  • Performed performance tuning on SQL queries for efficiency.
  • Migrated data from traditional database systems (Teradata) to Azure databases.
  • Designed and implemented database solutions in Azure SQL Data Warehouse and Azure SQL.
  • Migrated data from mainframe applications to the Amazon Web Services cloud and Hive platforms.

Environment: Erwin, LINQX, Hadoop Hive, JIRA, HDFS, Azure, Teradata, Oracle 11g, Oracle, Netezza, Big Integrate, SQL Server, AWS Cloud Computing Services, SharePoint

Confidential, Atlanta, GA

Data Analyst/Sr. Data Modeler

Responsibilities:

  • Held requirements-gathering meetings with both stakeholders and end users.
  • Engaged source data SMEs to understand the data sets.
  • Profiled source system data sets for integration.
  • Designed Logical and Physical data models using the ER Studio modeling tool.
  • Utilized the snowflake schema for the dimensional model, as the business requirement called for normalization to 3NF.
  • Worked with ETL teams to create source and target mappings.
  • Participated in performance management and tuning for stored procedures & queries.
  • Utilized JIRA to track allocated tasks and comply with the agile internal requirement.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
  • Interacted with Subject Matter Experts, Project Managers, Developers, and end users in various JAD sessions to gather requirements.
  • Involved in all phases of the SDLC, from requirements, design, development, and testing through rollout to field users and support of the production environment.
  • Translated Business Requirements into Data Design Requirements used for driving innovative data designs that meet business objectives.
  • Developed Business and Subject Area Logical Data Models, Physical Data Models, Physical Staging Area Data Models, and Data Flow Diagrams.
  • Extensively used Erwin to design Logical/Physical Data Models, perform Forward and Reverse Engineering, publish data models, apply DDL to databases, and restructure existing Data Models.
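The snowflake-schema normalization noted above (dimensions held in 3NF) can be sketched with sqlite3 (hypothetical tables, for illustration only): the category attribute lives in its own table rather than being denormalized into the product dimension as a star schema would.

```python
import sqlite3

# Hypothetical snowflake branch: product dimension normalized to 3NF,
# with category split into its own table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_category (
        category_key  INTEGER PRIMARY KEY,
        category_name TEXT
    );
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        category_key INTEGER REFERENCES dim_category(category_key)
    );
""")
conn.execute("INSERT INTO dim_category VALUES (1, 'Beverages')")
conn.execute("INSERT INTO dim_product VALUES (10, 'Coffee', 1)")

# Queries join through the normalized branch of the snowflake.
row = conn.execute("""
    SELECT p.product_name, c.category_name
    FROM dim_product p JOIN dim_category c USING (category_key)
""").fetchone()
print(row)   # ('Coffee', 'Beverages')
```

The trade-off versus a star schema is an extra join per normalized branch in exchange for non-redundant dimension storage, which is what the 3NF requirement called for.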

Environment: Erwin, MySQL, Hadoop, MarkLogic, DbVisualizer, Aqua Data Studio, Microsoft Excel, ER Studio, MS SQL Server 2012/2014, ServiceNow, Salesforce, MS Visio, Microsoft Visual Studio 2012/2013, Microsoft Team Foundation Server
