
Sr. Data Modeler/Sr. Data Analyst Resume

Hopkins, MN

PROFESSIONAL SUMMARY:

  • Over 9 years of experience in IT as a Data Modeler, Data Analyst and Business Analyst in the Healthcare, Finance, Retail and Telecommunications industries, across all stages of the Software Development Life Cycle (SDLC).
  • Experienced in Data Warehouse and Database design, SQL, and PL/SQL programming.
  • Hands-on Business System Analysis experience across the full project life cycle using methodologies such as Agile, Scrum, Waterfall, RUP and hybrid methods.
  • Experience in conducting GAP analysis, Traceability Matrix, User Acceptance Testing (UAT), SWOT analysis, cost-benefit analysis and ROI analysis.
  • Experience in defining scope and objectives, researching and root-cause analysis; analyzing business and user requirements for complex projects.
  • 6+ years of Data Modeling experience using Erwin, ER Studio and Power Designer: Dimensional Modeling, Relational Modeling, the Ralph Kimball approach, Star/Snowflake Modeling, Data marts, OLTP, OLAP, Fact & Dimension tables, Physical & Logical data modeling and Oracle Designer.
  • Extensive experience in Business Process Modeling using Visual modeling tools.
  • Performed Data mining, data profiling, data cleansing, data validation to maintain data quality.
  • Experience in gathering business requirements from business/user, creating Process Flows and Data Flow Diagram (DFD) and creating Source to Target mapping documents. Working directly with ETL team for data extraction, Data Quality (DQ) team for validation.
  • Data Analyst with diverse experience in all stages of the Software Development Life Cycle with emphasis on Business Analysis, Data Analysis, Gap Analysis and Quality Assurance.
  • Worked with the core functionalities of data modeling tools such as Forward Engineering, Reverse Engineering, Complete Compare and Merging.
  • Designed star schema and snowflake schema data models.
  • Worked with MDM and Reference Data as per business need.
  • Worked with heterogeneous relational databases such as Teradata, Oracle, DB2, MS Access and SQL Server.
  • Experience working with third-party tools like WinSQL and TOAD.
  • Worked with Data Management team for exceptional data handling.
  • Developed reports with SSAS & SSRS on SQL Server; sound experience with and understanding of SSAS, OLAP cubes and architecture.
  • Supported team in resolving SQL Reporting services and related issues.
  • Extensive experience in performance tuning databases for optimum performance.
  • Expert in backup/restore of databases for maximum efficiency.
  • Hands on experience in testing ETL mappings and Business Object (BO) reports.
  • Experience in complete Unified Modeling Language (UML), Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC) and Bug Life Cycle (BLC).
  • Proficient in all cycles of test life cycle from test planning to defect tracking and managing defect lifecycle.
  • Interacted regularly with the development team, creative services, database designers, system administrators and higher management to meet project milestone deadlines.
  • Worked in close coordination with the testing team in developing test plans and test cases from functional and business requirements.

TECHNICAL SKILLS:

Business/Data Modeling Tools: MS Visio, MS Excel, Rational Rose, Sybase Power Designer, Erwin 9.6.4/7.2/4.5/4.1.4/3.5.5/3.5.2, ER Studio 7.1.1, IBM Rational System Architect, Dimensional Modeling, Relational Modeling, Ralph Kimball approach, Star/Snowflake Modeling, Data marts, OLTP, OLAP, Fact & Dimension tables, Physical & Logical data modeling and Oracle Designer.

Databases: Oracle 11g/10g/9i/8i/8.0/7.0, DB2 8.0/7.0/6.0, MS SQL Server 2014/2012/2008, Teradata 12.1.13, Netezza, Sybase 12.x/11.x, MS Access 7.0/2000, XML, PL/SQL, SQL*Plus, SQL*Loader

Data Warehousing: Informatica PowerCenter 8.0, SSIS, Power Mart 6.1, Warehouse Designer, PowerConnect for SAP/Oracle Apps/Siebel/DB2, IDQ, PowerExchange, PowerAnalyzer, ETL, Data marts, Siebel 7.0/2000/5.5, OLAP, OLTP, Autosys

Tools: WinSQL, TOAD, Autosys, Teradata SQL Assistant, Clear Quest, HPALM, Rally, Rational Requisite Pro, COGNOS

Environment: Windows 7/Vista/2000/XP/2003, UNIX 5.2/4.3, Sun Solaris 7, Windows NT 4.0

PROFESSIONAL EXPERIENCE:

Confidential, Hopkins, MN

Sr. Data Modeler/Sr. Data Analyst

Responsibilities:

  • Gathered requirements from the Business Analysts.
  • Worked with architects to create the technical document (SAS Document).
  • Facilitated the project kickoff session with the Business and SMEs to understand the requirements, which also involved the Project Manager, Development Lead, Data Modeler, Data Mapper, Architect, etc.
  • Profiled the data using SQL to support a clear understanding of the data.
  • Analyzed data in the source and target databases by running SQL queries; worked with DML, DDL and joins.
  • Performed validation of Source Files.
  • Responsible for the Analysis, Design and Modeling with Erwin.
  • Created Logical and Physical models for the Staging, Transition and Production warehouses using Erwin.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes and candidate keys) and physical database (capacity planning, object creation and aggregation strategies) per business requirements using Erwin.
  • Created schema objects like Indexes, Views and Sequences.
  • Involved in dimensional modeling of the data warehouse to design the business process.
  • Tuning and optimization of SQL Queries.
  • Performed data mining, data profiling, data cleansing and data validation to maintain data quality.
  • Compared the data models in the data mart against the database to verify they matched. Once the data model was complete, created the source-to-target mapping document.
  • Followed up with teams as the code was deployed in different environments to identify any issues.
  • Involved in the process design documentation of the Data Warehouse dimensional upgrades and in the design of report layouts to maximize usability and business relevance.
  • Ensured all tables had audit columns.
  • Created lookup tables for any columns that needed to be described.
  • Generated the DDL by forward engineering; the DDL can be generated either by comparing against the backup model or by using the generate database option.
  • Deployed the DDL in the development environment and required database.
  • Saved the DDL in SharePoint and sent the DDL folder path to the DBAs to be deployed in the other environments (QA, External and Production).
  • Generated the data dictionary and ER diagram PDF and saved them in the project folder on SharePoint.
  • Shared the project-related documents and the change log with the teams involved in the project.
  • Good understanding of Teradata SQL Assistant, Teradata Administrator and data load/export utilities such as BTEQ, FastLoad, MultiLoad and FastExport.
  • Used Agile Central (Rally) to enter tasks, which are visible to the whole team and the Scrum Master.
  • Analyzed the JSON files using XMLSpy; involved in integrating the JSON files with the API.
  • Worked collaboratively with ETL team and QA team to ensure system deliverables align with business requirements with measurable results.
  • Involved in creating test data for evaluating in-boundary and out-of-boundary test conditions.
  • Involved in fixing the defects raised by the QA Team.
  • Worked closely with the ETL team and QA Team for End-to-End Testing.
  • Provided production support by monitoring the processes running daily.
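A minimal sketch of the audit-column and lookup-table conventions described above; the table and column names are hypothetical, chosen only to illustrate the pattern:

```sql
-- Hypothetical staging table illustrating the audit-column convention:
-- every table carries create/update metadata for traceability.
CREATE TABLE member_stg (
    member_id       INTEGER      NOT NULL,
    member_name     VARCHAR(100),
    status_cd       CHAR(2),             -- coded value, described in the lookup table
    -- standard audit columns
    create_ts       TIMESTAMP    NOT NULL,
    create_user_id  VARCHAR(30)  NOT NULL,
    update_ts       TIMESTAMP,
    update_user_id  VARCHAR(30),
    PRIMARY KEY (member_id)
);

-- Lookup table describing the coded column, per the bullet above.
CREATE TABLE member_status_lkp (
    status_cd    CHAR(2)      NOT NULL PRIMARY KEY,
    status_desc  VARCHAR(100) NOT NULL
);
```

Forward engineering in Erwin emits DDL of this shape from the physical model, which is then deployed to the development database and handed off to the DBAs.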

Environment: Rally, Mainframe, DB2, Teradata Developer 15.00, XMLSpy, Erwin 9.6.4, DataStage, Microsoft Outlook 2010, Excel, Visio, Access, SoapUI, Adobe PDF, DQ Analyzer, Toad, IBM Data Studio, DbVisualizer, Cognos.

Confidential, Kansas City, MO

Data Analyst/Data Modeler

Responsibilities:

  • Worked on business requirement gathering with subject matter experts and product owners to identify and understand requirements.
  • Defined the key facts and dimensions necessary to support the business requirements.
  • Participated in JAD sessions with stakeholders to ensure accurate requirements were captured.
  • Analyzed the source data coming from various data sources like Mainframe & Oracle.
  • Created Source layouts for Big data HDFS environment.
  • Facilitated project kickoff session with Business and SMEs to understand the requirements which also involved Project Manager, Development Lead, Data Modeler, Data Mapper, Architect etc.
  • Responsible for the Analysis, Design and Modeling with Erwin.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes and candidate keys) and physical database (capacity planning, object creation and aggregation strategies) per business requirements using Erwin.
  • Decided the datatype and length of columns by discussing the expected data with the teams; this also depended on the database platform.
  • Designed Logical and physical data models (Hive and SQL Server)
  • Created schema objects like Indexes, Views and Sequences.
  • Created the relationships and defined parent and child tables.
  • Created table-level and column-level constraints and default values.
  • Created lookup tables for any columns that needed to be described.
  • Tuning and optimization of SQL Queries
  • Compared the data models in the data mart against the database to verify they matched. Once the data model was complete, created the source-to-target mapping document.
  • Reviewed the source-to-target mapping document with the ETL teams.
  • Manipulated, cleansed and processed data using Excel, Access and SQL.
  • Performed Data Validations using SQL developer.
  • Resolved data-related issues by assessing data quality, consolidating data and evaluating existing data sources.
  • Worked closely with the Data Architect to review all conceptual, logical and physical database design models with respect to function, definition, maintenance, review and support, along with the data analysis, data quality and ETL design that feed the logical data models of the new EDW.

Environment: Erwin 7.3, Excel, Visio, Microsoft Outlook 2010, Adobe PDF, DQ Analyzer, SQL Server, Mainframe, DB2, Hadoop, PL/SQL Developer.

Confidential, Irving, TX

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Responsible for the Analysis, Design and Modeling; worked on star schema data modeling.
  • Created Logical and Physical models for the Staging, Transition and Production warehouses using ER/Studio.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes and candidate keys) and physical database (capacity planning, object creation and aggregation strategies) per business requirements using ER/Studio.
  • Performed Reverse engineering of the source systems using ER studio.
  • Performed forward engineering using ER studio to generate the DDL of the target data model.
  • Extensive use of SQL for accessing and manipulating database systems and Profiling.
  • Created events and tasks in the work flows using workflow manager.
  • Created schema objects like Indexes, Views and Sequences.
  • Performed data mining, data profiling, data cleansing and data validation to maintain data quality.
  • Performed data analysis of the source data coming from point of sales systems (POS) and legacy systems.
  • Involved in dimensional modeling of the data warehouse to design the business process.
  • Worked with claims data, member data and provider data.
  • Worked with PPO and HMO care plans.
  • Created the ETL data mapping documents between source systems and the target data warehouse.
  • Worked with the OLTP database to generate the DDL.
  • Normalized tables to 1st, 2nd and 3rd Normal Form.
  • Performed data mapping using business rules and data transformation logic for ETL purposes.
  • Maintained the Data Dictionary and the repository of models, and performed model merges for all the tracks.
  • Involved in migrating the data model from one database to an Oracle database and prepared an Oracle staging model.
  • Designed star schemas and bridge tables to control slowly changing dimensions.
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades.
  • Involved in Metadata management, where all the table specifications were listed and implemented the same in Ab Initio metadata hub as per data governance.
  • Validated the data during UAT testing.
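The star schema and bridge-table design described above can be sketched as a small DDL fragment; all table and column names here are hypothetical, since the actual healthcare model is not shown in this resume:

```sql
-- Hypothetical star schema fragment: a fact table joined to conformed
-- dimensions via surrogate keys, plus a bridge table for a many-to-many.
CREATE TABLE dim_member (
    member_key   INTEGER PRIMARY KEY,   -- surrogate key
    member_id    VARCHAR(20),           -- natural key from the source system
    member_name  VARCHAR(100)
);

CREATE TABLE dim_provider (
    provider_key  INTEGER PRIMARY KEY,
    provider_id   VARCHAR(20),
    provider_name VARCHAR(100)
);

CREATE TABLE fact_claim (
    claim_key    INTEGER PRIMARY KEY,
    member_key   INTEGER REFERENCES dim_member (member_key),
    provider_key INTEGER REFERENCES dim_provider (provider_key),
    claim_amt    DECIMAL(12,2)
);

-- Bridge table resolving a many-to-many between claims and diagnoses,
-- keeping the fact table at a single grain.
CREATE TABLE bridge_claim_diagnosis (
    claim_key     INTEGER REFERENCES fact_claim (claim_key),
    diagnosis_key INTEGER NOT NULL,
    PRIMARY KEY (claim_key, diagnosis_key)
);
```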

Environment: ER Studio, Ab Initio, Oracle 9i/10g, DB2 8.0, PL/SQL, MS SQL Server 2012, SQL Developer, Toad, IBM Rational ClearQuest, ClearCase, CutePDF.

Confidential, Chicago, IL

Data Analyst/Data Modeler

Responsibilities:

  • Requirement gathering and Business Analysis to build a large analysis warehouse.
  • Performed Data Analysis to primarily identify Data Sets, Source Data, Source Meta Data, Data Definitions and Data Formats
  • Created logical & physical data models and Meta Data to support the requirements
  • Analyzed requirements to develop design concepts and technical approaches.
  • Identified business requirements by verifying manual reports.
  • Worked with POS data for sales, transactions and sale quantities.
  • Worked with the labor tracker team to determine labor time, salaries and number of labor hours.
  • Worked with different retail departments to determine projected and exact values of salaries and sales.
  • Converted manual reports to automated documents.
  • Worked with the OLTP system to find the daily transactions, the types of transactions that occurred and the amount of resources used.
  • Prepared the Data Dictionary/Metadata for the data elements.
  • Walked through the Logical Data Models of all source systems for data quality analysis
  • Created the ETL data mapping documents between source systems and the target data warehouse.
  • Facilitated project kickoff session with Business and SMEs to understand the requirements which also involved Project Manager, Development Lead, Data Modeler, Data Mapper, Architect etc.
  • Developed and designed Data marts extensively using Star Schema.
  • Identified the impact of existing models with respect to the new BI framework.
  • Designed conceptual and logical models for the data warehouse and data marts to support the BI reporting repository.
  • Assisted in creating source-to-target mappings from the reporting data sources to the Data Warehouse.
  • Defined facts and dimensions for analytic reports for the Data Marts.
  • Used Teradata utilities such as TPT (Teradata Parallel Transporter), FastLoad and MultiLoad for handling various tasks.
  • Defined data mappings for the Data Marts.
  • Analyzed existing data sources and reports.
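A source-to-target mapping of the kind described above typically reduces to a transformation rule of this shape; the staging and mart names below are hypothetical placeholders:

```sql
-- Hypothetical mapping rule: load POS transactions from staging into a
-- sales data mart, applying the documented transformation logic
-- (natural-to-surrogate key lookup, type cast, daily aggregation).
INSERT INTO sales_mart.fact_daily_sales (store_key, sale_date, txn_cnt, sales_amt)
SELECT d.store_key,
       CAST(s.txn_dt AS DATE),
       COUNT(*),                          -- number of transactions that day
       SUM(s.sale_amt)                    -- total sales amount
FROM   staging.pos_txn s
JOIN   sales_mart.dim_store d
       ON d.store_id = s.store_id         -- surrogate-key lookup against the dimension
GROUP BY d.store_key, CAST(s.txn_dt AS DATE);
```

In practice each column of the mapping document carries one such source expression, transformation rule, and target column.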

Environment: Erwin, Teradata, DB2, Oracle 9i, SQL Server 2008, Teradata SQL Assistant, UNIX, SSRS, MicroStrategy, DataStage, Microsoft Excel, Windows XP.

Confidential, Grand Rapids, MI

Systems Analyst/Data Modeling

Responsibilities:

  • Gathered requirements from project teams.
  • Provided estimates to project managers for tasks to be completed.
  • Discussed the requirements with the architects and project teams.
  • Explained our approach plan to the other teams on the project.
  • Sent out notifications to all application teams when there would be a change to existing tables.
  • Once we got the green light from all the teams, started working on the data modeling task.
  • Identified the list of tables required for the project, both existing and new.
  • Obtained the DDL for the required existing tables and their related tables.
  • Reverse engineered the tables into the Erwin data modeling tool.
  • Created an OLTP ER diagram by reverse engineering.
  • Created a backup model before making any changes for the project.
  • Added columns to existing tables following the naming standards, appending class words as suffixes to the column names.
  • Added meaningful comments, with examples, to tables and columns.
  • Normalized tables to 1st, 2nd and 3rd Normal Form.
  • Worked on membership and claims data.
  • Worked on agents and websites providing member info.
  • Decided the datatype and length of columns by discussing the expected data with the teams; this also depended on the database platform.
  • Checked whether columns were mandatory and needed to be part of the primary key.
  • Added the table owners and checked the Generate box.
  • Created the relationships and defined parent and child tables.
  • Created indexes on required columns to improve performance.
  • Created table-level and column-level constraints and default values.
  • Created synonyms and grants for the required teams.
  • Allocated tablespaces to the indexes and primary keys.
  • Created partitions where tables needed to be purged periodically.
  • Created history tables for the OLAP teams, and new tables as required for the project.
  • The history tables were Type 2 SCDs loaded via triggers.
  • Ensured all tables had audit columns.
  • Created lookup tables for any columns that needed to be described.
  • Created Teradata SQL scripts using OLAP functions like RANK() to improve query performance while pulling data from large tables.
  • Designed, developed and unit tested SQL views using Teradata SQL to load data from source to target.
  • Defined the list codes and code conversions between the source systems and the Data Mart, and was actively involved in extensive data analysis on the Teradata and Oracle systems, querying and writing in SQL.
  • Generated the DDL by forward engineering.
  • Performed data mining, data profiling, data cleansing and data validation to maintain data quality.
  • The DDL can be generated either by comparing against the backup model or by using the generate database option.
  • Deployed the DDL in the development environment and required database.
  • Saved the DDL in SharePoint and sent the DDL folder path to the DBAs to be deployed in the other environments (QA, External and Production).
  • Generated the data dictionary and ER diagram PDF and saved them in the project folder on SharePoint.
  • Shared the project-related documents and the change log with the teams involved in the project.
  • Conducted model review sessions.
  • Once the data model was complete, created the source-to-target mapping document.
  • Reviewed the source-to-target mapping document with the ETL teams.
  • Followed up with teams as the code was deployed in different environments to identify any issues.
  • Populated or refreshed Teradata tables using the FastLoad, MultiLoad and FastExport utilities for user acceptance testing.
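A Teradata RANK() query of the kind mentioned above might look like the following; the table and column names are hypothetical, but the QUALIFY clause is the standard Teradata idiom for filtering directly on a window-function result:

```sql
-- Hypothetical Teradata query: rank stores by sales within each month
-- and keep only the top 10, avoiding a self-join or derived table
-- when pulling from a large table.
SELECT store_id,
       sale_month,
       total_sales,
       RANK() OVER (PARTITION BY sale_month
                    ORDER BY total_sales DESC) AS sales_rank
FROM   monthly_store_sales
QUALIFY sales_rank <= 10;    -- Teradata-specific filter on the OLAP result
```

On databases without QUALIFY, the same result requires wrapping the ranked query in a derived table and filtering in an outer WHERE clause.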

Environment: Oracle 11g, Erwin 7.3, Toad, Excel, Visio, Microsoft Outlook 2010, Adobe PDF, DQ Analyzer, Teradata, Mainframe, PL/SQL Developer.

Confidential

Data Analyst

Responsibilities:

  • Gathered user requirements.
  • Identified high-level requirements for developing and documenting detailed business requirements
  • Involved in meetings with SME (subject matter experts) and users for requirements gathering.
  • Revised and updated the technical feasibility documents and created sequence diagrams.
  • Responsible for the Analysis, Design and Modeling.
  • Worked with the Microsoft office tools for mapping and multiple other purposes.
  • Utilized RUP to create use cases, activity, class diagrams and workflow process diagrams.
  • Responsible for providing database solutions using Oracle database.
  • Proficient use of T-SQL and PL/SQL in creating tables, views, triggers and stored procedures.
  • Created reports using Crystal Reports and Oracle Forms/Reports.
  • Provided technical/Functional support to end-user.
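A PL/SQL stored procedure of the kind listed above could be as simple as the following lookup wrapper; the procedure, table, and column names are hypothetical:

```sql
-- Hypothetical PL/SQL stored procedure: a lookup wrapper with
-- basic error handling via a named exception.
CREATE OR REPLACE PROCEDURE get_member_name (
    p_member_id IN  NUMBER,
    p_name      OUT VARCHAR2
) AS
BEGIN
    SELECT member_name
    INTO   p_name
    FROM   member
    WHERE  member_id = p_member_id;
EXCEPTION
    WHEN NO_DATA_FOUND THEN
        p_name := NULL;   -- caller decides how to handle a missing member
END get_member_name;
/
```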

Environment: TOAD, PL/SQL Developer, UNIX Shell Scripting, Visio.
