
Sr. Data Modeler Resume


Irving, Texas

PROFESSIONAL SUMMARY:

  • Senior Data Modeler/Analyst with 10 years of IT professional experience in data modeling, design, and data analysis, with conceptual, logical, and physical modeling for Online Transaction Processing and Online Analytical Processing (OLTP & OLAP) systems.
  • Experienced in data warehouse and database design, SQL, and PL/SQL programming.
  • 7+ years of data modeling experience using Erwin, ER/Studio, and PowerDesigner, covering dimensional modeling, relational modeling, the Ralph Kimball approach, Star/Snowflake modeling, data marts, OLTP, OLAP, fact and dimension tables, physical and logical data modeling, and Oracle Designer.
  • Strong background in business intelligence, reporting systems, data quality (including ETL), and full life-cycle development.
  • Experienced with most software development methodologies, including Waterfall, Iterative, Agile, and Rapid Application Development (RAD).
  • Extensive experience in data management, including data analysis, gap analysis, and data mapping.
  • Performed data mining, data profiling, data cleansing, and data validation to maintain data quality.
  • Experience gathering business requirements from business users, creating process flows and Data Flow Diagrams (DFDs), and creating source-to-target mapping documents; worked directly with the ETL team for data extraction and the Data Quality (DQ) team for validation.
  • Worked with the core functionalities of data modeling tools, such as Forward Engineering, Reverse Engineering, Complete Compare, and Merge.
  • Designed Star schema and Snowflake schema data models. Worked with MDM and reference data as per business need.
  • Worked with heterogeneous relational databases such as Teradata, Oracle, DB2, MS Access and SQL Server.
  • Experience working with third-party tools such as WinSQL and TOAD.
  • Worked with the Data Management team on exception data handling.
  • Developed reports using SSAS and SSRS on SQL Server. Sound experience with and understanding of SSAS, OLAP cubes, and their architecture.
  • Supported the team in resolving SQL Server Reporting Services and related issues.
  • Extensive experience in performance tuning databases for optimum performance.
  • Expert in backup/restore of databases for maximum efficiency.
  • Hands-on experience testing ETL mappings and Business Objects (BO) reports.
  • Experience with Unified Modeling Language (UML) and the complete Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC), and Bug Life Cycle (BLC).
  • Experienced in Denodo data virtualization development and implementations.
  • Proficient in all phases of the test life cycle, from test planning to defect tracking and managing the defect life cycle.
  • Interacted regularly with the development team, creative services, database designers, system administrators, and higher management to meet project milestone deadlines.
  • Worked in close coordination with the testing team in developing test plans and test cases from functional and business requirements.

TECHNICAL SKILLS:

Business/Data Modeling Tools: MS Visio, MS Excel, Rational Rose, Sybase PowerDesigner, Erwin 9.6.4/7.2/4.5/4.1.4/3.5.5/3.5.2, ER/Studio 7.1.1, IBM Rational System Architect, dimensional modeling, relational modeling, Ralph Kimball approach, Star/Snowflake modeling, data marts, OLTP, OLAP, fact and dimension tables, physical and logical data modeling, and Oracle Designer.

Databases: Oracle 11g/10g/9i/8i/8.0/7.0, DB2 8.0/7.0/6.0, MS SQL Server 2014/2012/2008, Teradata 12.1.13, Netezza, Sybase 12.x/11.x, MS Access 7.0/2000, XML, PL/SQL, SQL*Plus, SQL*Loader

Data Warehousing: Informatica PowerCenter 8.0, SSIS, PowerMart 6.1, Warehouse Designer, PowerConnect for SAP/Oracle Apps/Siebel/DB2, IDQ, PowerExchange, PowerAnalyzer, ETL, data marts, Siebel 7.0/2000/5.5, OLAP, OLTP, Autosys

Tools: WinSQL, TOAD, Autosys, Teradata SQL Assistant, Clear Quest, HPALM, Rally, Rational Requisite Pro, COGNOS

Environment: Windows 7/Vista/2000/XP/2003, UNIX 5.2/4.3, Sun Solaris 7, Windows NT 4.0

PROFESSIONAL EXPERIENCE:

Confidential, Irving, Texas

Sr. Data Modeler

Responsibilities:

  • Responsible for data warehousing, data modeling, data governance, data architecture standards, methodologies, guidelines and techniques
  • Partnered with various business stakeholders and technology leaders to gather requirements and convert them into scalable technical and system requirement documents.
  • Designed a rule engine to handle complicated data conversion requirements when syncing data between multiple POS systems and the centralized ERP system.
  • Designed the data lake, master data, security, data hub, and data warehouse/data mart layers.
  • Created logical models according to the requirements and physicalized them into physical models.
  • Effectively articulated reasoning for data model design decisions and strategically incorporated team member feedback to produce the highest quality data models.
  • Worked with project and application teams to ensure that they understand and fully comply with data quality standards, architectural guidelines and designs.
  • Performed Reverse engineering of the source systems using Oracle Data modeler.
  • Involved in capturing Data Lineage, Table and Column Data Definitions, Valid Values and others necessary information in the data models.
  • Identified Facts & Dimensions Tables and established the Grain of Fact for Dimensional Models.
  • Generated the DDL for the target data model and attached it to the Jira ticket for deployment in different environments.
  • Tuned DB queries/processes and improved performance.
  • Reverse engineered Crystal Reports (Command Performance) and SSRS reports to identify logic/business rules for the Driver's Performance Metrics, Customer Order Performance, order management, and daily sales. Created a data mart based on multiple POS systems for Power BI dashboards/reports.
  • Worked on data loads using Azure Data Factory with the external table approach.
  • Involved in creating Pipelines and Datasets to load the data onto data warehouse.
  • Worked closely with ETL SSIS developers to explain the complex transformations and their logic.
  • Created ETL jobs and custom transfer components to move data from transaction systems to a centralized area (Azure SQL Data Warehouse) to meet deadlines.
  • Extensively used SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate, and Conditional Split, along with the Execute SQL Task, Script Task, and Send Mail Task.
  • Developed SSIS packages to load data from various source systems to Data Warehouse.

Environment: Oracle Data Modeler, Visio, Microsoft Outlook, Adobe PDF, DQ Analyzer, SQL Server, Azure Data Factory, Power BI, Microsoft Teams, Microsoft Visual Studio.

Confidential, Hopkins, MN

Sr. Data Modeler/Sr. Data Analyst

Responsibilities:

  • Facilitated a project kickoff session with the business and SMEs to understand the requirements; the session also involved the Project Manager, Development Lead, Data Modeler, Data Mapper, and Architect.
  • Owned and managed all changes to the data models. Created data models, solution designs and data architecture documentation for complex information systems.
  • Effectively articulated reasoning for data model design decisions and strategically incorporated team member feedback to produce the highest quality data models.
  • Worked with project and application teams to ensure that they understand and fully comply with data quality standards, architectural guidelines and designs.
  • Participated in brainstorming sessions with developers and DBAs to discuss partitioning and indexing schemes for the physical model.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys), and physical database (capacity planning, object creation, and aggregation strategies) per business requirements using Erwin. Created the conceptual, logical, and physical data models.
  • Reviewed the Conceptual EDW (Enterprise Data Warehouse) Data Model with Business Users, App Dev. and Information Architects to make sure all the requirements are fully covered.
  • Analyzed the existing logical model of ODS to understand relationships between different entities.
  • Worked on Requirements Traceability Matrix to trace business requirements to Logical Model.
  • Reviewed the Logical Model with Application Developers, ETL Team, DBAs and Testing Team to provide information about the Data Model and business requirements.
  • Normalized the tables/relationships to arrive at effective Relational Schemas without any redundancies.
  • Created Snowflake Schemas by normalizing the dimension tables as appropriate, and creating a Sub Dimension named Demographic as a subset to the Customer Dimension.
  • Developed virtual data marts using the Denodo data virtualization tool.
  • Worked on a POC for Denodo data virtualization; created base views, derived views, and interface views.
  • Extensively used Metadata & Data Dictionary Management; Data Profiling; Data Mapping.
  • Applied Data Governance rules (primary qualifier, class words and valid abbreviation in Table name and Column names).
  • Created schema objects like Indexes, Views and Sequences. Tuning and optimization of SQL Queries.
  • Involved in dimensional modeling of the data warehouse to design the business process.
  • Performed data mining, data profiling, data cleansing, and data validation to maintain data quality.
  • Performed data model comparisons between the data models in the data mart and the database to verify they match. Once the data model was completed, created the source-to-target mapping document.
  • Followed up with teams as the code was deployed in different environments to identify any issues.
  • Involved in the process design documentation of the data warehouse dimensional upgrades and in the design of report layouts to maximize usability and business relevance.
  • Made sure all tables had audit columns; created lookup tables for any code columns needing descriptions. Generated the DDL by forward engineering.
  • The DDL can be generated either by comparing against the backup model or by using the Generate Database option.
  • Deployed the DDL in the development environment and required database.
  • Saved the DDL in SharePoint and sent the DDL folder path to the DBAs for deployment in other environments such as QA, External, and Production.
  • Generated the data dictionary and ER diagram PDFs and saved them in the project folder on SharePoint.
  • Shared the project-related documents and the change log with the teams involved in the project.
  • Good understanding of Teradata SQL Assistant, Teradata Administrator, and data load/export utilities such as BTEQ, FastLoad, MultiLoad, and FastExport.
  • Used Agile Central (Rally) to enter tasks with visibility to the whole team and the Scrum Master.
  • Analyzed JSON files using XMLSpy. Involved in integrating the JSON files with the API.
  • Worked collaboratively with ETL team and QA team to ensure system deliverables align with business requirements with measurable results.
  • Involved in creating test data for evaluating boundary conditions in testing.
  • Involved in fixing the defects raised by the QA team.
  • Worked closely with the ETL team and QA team for end-to-end testing.
  • Provided production support by monitoring the processes running daily.
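The snowflaked design mentioned above (a Demographic sub-dimension normalized out of the Customer dimension) can be sketched minimally as follows; all table and column names here are illustrative assumptions, not the actual project's schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Snowflake schema: demographic attributes are normalized out of the
# Customer dimension into their own sub-dimension table.
cur.executescript("""
CREATE TABLE dim_demographic (
    demographic_key INTEGER PRIMARY KEY,
    age_band        TEXT,
    income_band     TEXT
);
CREATE TABLE dim_customer (
    customer_key    INTEGER PRIMARY KEY,
    customer_name   TEXT,
    demographic_key INTEGER REFERENCES dim_demographic(demographic_key)
);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    sale_amount  REAL
);
""")
cur.execute("INSERT INTO dim_demographic VALUES (1, '25-34', '50-75K')")
cur.execute("INSERT INTO dim_customer VALUES (10, 'Acme', 1)")
cur.execute("INSERT INTO fact_sales VALUES (10, 99.5)")

# A query now walks fact -> dimension -> sub-dimension.
row = cur.execute("""
    SELECT c.customer_name, d.age_band, f.sale_amount
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_demographic d ON d.demographic_key = c.demographic_key
""").fetchone()
print(row)  # ('Acme', '25-34', 99.5)
```

The trade-off is the classic one: the sub-dimension removes redundancy in the Customer dimension at the cost of one extra join in reporting queries.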

Environment: Rally, Mainframe, DB2, Teradata Developer 15.00, XMLSpy, Erwin 9.64, DataStage, Microsoft Outlook 2013, Excel, Visio, Access, SoapUI, Adobe PDF, DQ Analyzer, Toad, IBM Data Studio, DbVisualizer, Cognos.

Confidential, Kansas City, MO

Data Analyst/Data Modeler

Responsibilities:

  • Worked on business requirement gathering with subject matter experts and product owners to identify and understand requirements.
  • Defined key facts and dimensions necessary to support the business requirements.
  • Participated in JAD sessions with stakeholders to ensure accurate requirements were captured.
  • Analyzed the source data coming from various data sources like Mainframe & Oracle.
  • Created Source layouts for Big data HDFS environment.
  • Facilitated a project kickoff session with the business and SMEs to understand the requirements; the session also involved the Project Manager, Development Lead, Data Modeler, Data Mapper, and Architect.
  • Responsible for the Analysis, Design and Modeling with Erwin
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys), and physical database (capacity planning, object creation, and aggregation strategies) per business requirements using Erwin.
  • Decided the datatype and length of columns by discussing with teams the data expected in each column; this also depends on the database platform.
  • Designed Logical and physical data models (Hive and SQL Server)
  • Created schema objects like Indexes, Views and Sequences.
  • Created relationships and defined parent and child tables.
  • Created table-level and column-level constraints and default values.
  • Created lookup tables for any columns needing descriptions.
  • Tuned and optimized SQL queries.
  • Performed data model comparisons between the data models in the data mart and the database to verify they match. Once the data model was completed, created the source-to-target mapping document.
  • Reviewed the source-to-target mapping document with ETL teams.
  • Manipulated, cleansed, and processed data using Excel, Access, and SQL.
  • Performed Data Validations using SQL developer.
  • Resolved the data related issues such as: assessing data quality, data consolidation, evaluating existing data sources
  • Worked closely with the Data Architect to review all the conceptual, logical, and physical database design models with respect to functions, definitions, maintenance, review, and support; supported the data analysis, data quality, and ETL design that feeds the logical data models of the new EDW.

Environment: Erwin 7.3, Excel, Visio, Microsoft Outlook 2010, Adobe PDF, DQ Analyzer, SQL Server, Mainframe, DB2, Hadoop, Informatica.

Confidential, Irving, TX

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Responsible for analysis, design, and modeling; worked on Star schema data modeling.
  • Created Logical and Physical models for Staging, Transition and Production Warehouses using ER/Studio.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys), and physical database (capacity planning, object creation, and aggregation strategies) per business requirements using ER/Studio.
  • Performed reverse engineering of the source systems using ER/Studio.
  • Performed forward engineering using ER/Studio to generate the DDL of the target data model.
  • Extensive use of SQL for accessing and manipulating database systems and Profiling.
  • Created events and tasks in the work flows using workflow manager.
  • Created schema objects like Indexes, Views and Sequences.
  • Performed data mining, data profiling, data cleansing, and data validation to maintain data quality.
  • Performed data analysis of the source data coming from point of sales systems (POS) and legacy systems.
  • Involved in dimensional modeling of the data warehouse to design the business process.
  • Worked with Claims data, member’s data and provider’s data.
  • Worked with PPO and HMO care plans.
  • Created the ETL data mapping documents between source systems and the target data warehouse.
  • Worked with the OLTP database to generate the DDL.
  • Normalized tables to 1st, 2nd, and 3rd normal form.
  • Performed data mapping using business rules and data transformation logic for ETL purposes.
  • Maintained the data dictionary and repository of models; performed model merges for all the tracks.
  • Involved in migrating the data model from one database to an Oracle database and prepared an Oracle staging model.
  • Designed star schemas and bridge tables to handle slowly changing dimensions.
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades.
  • Involved in Metadata management, where all the table specifications were listed and implemented the same in Ab Initio metadata hub as per data governance.
  • Validated the data during UAT testing.
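The normalization work described above (1st through 3rd normal form on claims/member data) can be illustrated with a minimal sketch; the claim/provider tables and columns below are hypothetical examples, not the project's real schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# A denormalized claim row would repeat provider_name on every claim:
# provider_name depends on provider_id, not on the claim key -- a
# transitive dependency that violates 3NF. The 3NF fix below moves
# provider attributes into their own table keyed by provider_id.
cur.executescript("""
CREATE TABLE provider (
    provider_id   INTEGER PRIMARY KEY,
    provider_name TEXT NOT NULL
);
CREATE TABLE claim (
    claim_id    INTEGER PRIMARY KEY,
    member_id   INTEGER NOT NULL,
    provider_id INTEGER NOT NULL REFERENCES provider(provider_id),
    amount      REAL NOT NULL
);
""")
cur.execute("INSERT INTO provider VALUES (7, 'Dr. Smith')")
cur.execute("INSERT INTO claim VALUES (1, 100, 7, 250.0)")
cur.execute("INSERT INTO claim VALUES (2, 101, 7, 80.0)")

# The provider name is stored once; updating it cannot create anomalies.
row = cur.execute("""
    SELECT p.provider_name, COUNT(*), SUM(c.amount)
    FROM claim c JOIN provider p ON p.provider_id = c.provider_id
    GROUP BY p.provider_name
""").fetchone()
print(row)  # ('Dr. Smith', 2, 330.0)
```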

Environment: ER/Studio, Ab Initio, Oracle 9i/10g, DB2 8.0, PL/SQL, MS SQL Server 2012, SQL Developer, Toad, IBM Rational ClearQuest, ClearCase, CutePDF.

Confidential, Chicago, IL

Data Analyst/Data Modeler

Responsibilities:

  • Requirement gathering and Business Analysis to build a large analysis warehouse.
  • Performed Data Analysis to primarily identify Data Sets, Source Data, Source Meta Data, Data Definitions and Data Formats
  • Created logical & physical data models and Meta Data to support the requirements
  • Analyzed requirements to develop design concepts and technical approaches.
  • Identified business requirements by reviewing manual reports.
  • Worked with POS data for sales, transactions, and sale quantities.
  • Worked with the labor tracker team to determine labor time, salaries, and number of labor hours.
  • Worked with different retail departments to find projected and actual values for salaries and sales.
  • Converted manual reports to automated documents.
  • Worked with the OLTP system to determine daily transactions, the types of transactions that occurred, and the amount of resources used. Created a data dictionary/metadata for the data elements.
  • Walked through the Logical Data Models of all source systems for data quality analysis
  • Created the ETL data mapping documents between source systems and the target data warehouse.
  • Facilitated a project kickoff session with the business and SMEs to understand the requirements; the session also involved the Project Manager, Development Lead, Data Modeler, Data Mapper, and Architect.
  • Developed and designed Data marts extensively using Star Schema.
  • Identified the impact of existing models with respect to the new BI framework.
  • Designed conceptual and logical models for the data warehouse and data marts to support the BI reporting repository.
  • Assisted in creating source-to-target mappings from reporting data sources to the data warehouse.
  • Defined facts and dimensions for analytic reports for the data marts.
  • Used Teradata utilities such as TPT (Teradata Parallel Transporter), FLOAD (FastLoad), and MLOAD (MultiLoad) for handling various tasks.
  • Defined data mappings for the data marts.
  • Analyzed existing data sources and reports.

Environment: Erwin, Teradata, DB2, Oracle 9i, SQL Server 2008, Teradata SQL Assistant, UNIX, SSRS, MicroStrategy, DataStage, Microsoft Excel, Windows XP.

Confidential, Grand Rapids, MI

Systems Analyst/Data Modeling

Responsibilities:

  • Gathered requirements from project teams.
  • Provided estimates to project managers for the tasks to be completed.
  • Discussed the requirements with the architects and project teams.
  • Explained our approach plan to the other teams on the project.
  • Sent out notifications to all the application teams whenever there would be a change to existing tables.
  • Once all teams gave the green light, started working on the data modeling task.
  • Identified the list of tables required for the project, both existing and new.
  • Obtained the DDL for the required existing tables and their related tables.
  • Reverse engineered the tables into the Erwin data modeling tool.
  • Created an OLTP ER diagram by reverse engineering. Created a backup model before making any changes to the project.
  • Added columns to existing tables following the naming standards, adding class words as suffixes to the column names. Added meaningful comments to tables and columns with examples.
  • Normalized tables to 1st, 2nd, and 3rd normal form.
  • Worked on membership and claims data. Worked on agents and websites providing member info.
  • Decided the datatype and length of columns by discussing with teams the data expected in each column; this also depends on the database platform. Checked whether columns were mandatory and needed to be part of the primary key.
  • Added the table owners and checked the Generate box. Created relationships and defined parent and child tables.
  • Created indexes on required columns to improve performance.
  • Created table-level and column-level constraints and default values. Created synonyms and grants as required.
  • Allocated tablespaces to the indexes and primary keys.
  • Created partitions for tables that needed to be purged periodically.
  • Created history tables for the OLAP teams and new tables as required for the project.
  • The history tables were Type 2 SCDs loaded via triggers.
  • Made sure all tables had audit columns; created lookup tables for any code columns needing descriptions.
  • Created Teradata SQL scripts using OLAP functions like RANK() to improve query performance while pulling data from large tables.
  • Designed, developed, and unit tested SQL views using Teradata SQL to load data from source to target.
  • Defined the list codes and code conversions between the source systems and the data mart, and was actively involved in extensive data analysis on Teradata and Oracle systems, querying and writing SQL.
  • Generated the DDL by forward engineering.
  • Performed data mining, data profiling, data cleansing, and data validation to maintain data quality.
  • The DDL can be generated either by comparing against the backup model or by using the Generate Database option.
  • Deployed the DDL in the development environment and required database.
  • Saved the DDL in SharePoint and sent the DDL folder path to the DBAs for deployment in other environments such as QA, External, and Production.
  • Generated the data dictionary and ER diagram PDFs and saved them in the project folder on SharePoint.
  • Shared the project-related documents and the change log with the teams involved in the project.
  • Conducted model review sessions.
  • Once the data model was completed, created the source-to-target mapping document.
  • Reviewed the source-to-target mapping document with ETL teams.
  • Followed up with teams as the code was deployed in different environments to identify any issues.
  • Populated and refreshed Teradata tables using the FastLoad, MultiLoad, and FastExport utilities for user acceptance testing.
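The Type 2 SCD history tables loaded with triggers, mentioned above, can be sketched minimally as follows; this is an illustrative SQLite example, and the member/plan table names and columns are hypothetical, not the project's actual design:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# A "current" table plus a Type 2 SCD history table populated by an
# UPDATE trigger: every plan change writes the old value to history.
cur.executescript("""
CREATE TABLE member (
    member_id INTEGER PRIMARY KEY,
    plan_code TEXT NOT NULL
);
CREATE TABLE member_hist (
    member_id INTEGER,
    plan_code TEXT,
    closed_on TEXT
);
-- On every plan change, record a history row holding the old value.
CREATE TRIGGER trg_member_scd2
AFTER UPDATE OF plan_code ON member
BEGIN
    INSERT INTO member_hist VALUES
        (OLD.member_id, OLD.plan_code, DATE('now'));
END;
""")
cur.execute("INSERT INTO member VALUES (1, 'HMO')")
cur.execute("UPDATE member SET plan_code = 'PPO' WHERE member_id = 1")

# The current table holds 'PPO'; the trigger preserved 'HMO' in history.
hist = cur.execute("SELECT member_id, plan_code FROM member_hist").fetchall()
print(hist)  # [(1, 'HMO')]
```

A production Type 2 design would typically carry effective-from/effective-to dates or a current-row flag on the dimension itself; the trigger-fed history table shown here is the simpler variant the bullet describes.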

Environment: Oracle 11g, Erwin 7.3, Toad, Excel, Visio, Microsoft Outlook 2010, Adobe PDF, DQ Analyzer, Teradata, Mainframe, PL/SQL Developer.

Confidential

Data Analyst

Responsibilities:

  • Gathered user requirements.
  • Identified high-level requirements for developing and documenting detailed business requirements
  • Involved in meetings with SME (subject matter experts) and users for requirements gathering.
  • Revised and updated the technical feasibility documents and created sequence diagrams.
  • Responsible for the Analysis, Design and Modeling.
  • Worked with the Microsoft office tools for mapping and multiple other purposes.
  • Utilized RUP to create use cases, activity, class diagrams and workflow process diagrams.
  • Responsible for providing database solutions using Oracle database.
  • Proficient use of T-SQL/PL/SQL in creating tables, views, triggers, and stored procedures.
  • Created reports using Crystal Reports and Oracle Forms/Reports.
  • Provided technical/functional support to end users.

Environment: Oracle 8.0/9.0, Visual Basic 5.0, T-SQL, TOAD, PL/SQL Developer, UNIX shell scripting, Visio.
