
Data Modeler & Data Analyst Resume

Richmond, VA

SUMMARY

  • To obtain a challenging position as a Data Analyst with an emphasis on Data Warehousing, utilizing my qualifications and experience in Data Modeling.
  • 5+ years of professional experience in data modeling, data profiling, design and data analysis with Conceptual, Logical and Physical Modeling for OLTP & OLAP.
  • Extensive experience interacting with system users to gather business requirements and involved in developing projects. Chaired and conducted JAD sessions with business and technical teams. Also experienced in process improvement.
  • Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatches.
  • Vast experience working in data management, including data analysis and data mapping.
  • Worked extensively with Relational Modeling and Dimensional Modeling.
  • Experience in Data Transformation, Data Loading, Data Modeling, Metadata, Master data management and Performance Tuning.
  • Worked with various RDBMS like Oracle 9i/10g/11g, SQL Server 2005/2008, DB2, Teradata and Snowflake.
  • Well versed in Normalization/Denormalization techniques for optimum performance in relational and dimensional database environments.
  • Efficient in Dimensional Data Modeling, identifying Facts and Dimensions, Star Schema and Snowflake Schema (see the star-schema sketch after this list).
  • Exposure to Top-Down (Inmon) and Bottom-Up (Kimball) approaches.
  • Extensive experience in Data Analysis and ETL techniques for loading high volumes of data and ensuring a smooth structural flow of data.
  • Adept at writing Data Mapping Documents, Data Transformation Rules and maintaining Data Dictionary and Interface requirements documents.
  • Worked on integration and implementation of projects and products, database creation, modeling, and calculation of object sizes, tablespaces and database sizes.
  • Experience with creating reports using Tableau.
  • Worked closely with Data Architect in creating data models.
  • Coordinated and prioritized outstanding defects and system requests based on business requirements. Acted as a liaison between the development team and the management team to resolve any conflicts in terms of requirements.
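
The dimensional modeling noted above can be summarized in a minimal star-schema sketch. The fact and dimension tables below are purely illustrative; all names are hypothetical and not drawn from any specific engagement.

    -- Illustrative star schema: one fact table referencing conformed dimensions.
    CREATE TABLE dim_customer (
        customer_key   INTEGER      PRIMARY KEY,   -- surrogate key
        customer_id    VARCHAR(20)  NOT NULL,      -- natural/business key
        customer_name  VARCHAR(100),
        segment        VARCHAR(30)
    );

    CREATE TABLE dim_date (
        date_key       INTEGER      PRIMARY KEY,   -- e.g. 20240131
        calendar_date  DATE         NOT NULL,
        month_name     VARCHAR(10),
        fiscal_quarter VARCHAR(6)
    );

    CREATE TABLE fact_sales (
        sales_key      INTEGER      PRIMARY KEY,
        customer_key   INTEGER      NOT NULL REFERENCES dim_customer (customer_key),
        date_key       INTEGER      NOT NULL REFERENCES dim_date (date_key),
        quantity       INTEGER,
        sales_amount   DECIMAL(12,2)
    );

Surrogate keys on the dimensions keep the fact table narrow and insulate it from changes in the natural business keys.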

TECHNICAL SKILLS

Database: Snowflake, Teradata, Oracle, SQL Server.

Data Warehouse Concepts: Star Schema, Snowflake schema, OLTP, OLAP

Operating Systems: UNIX, Linux, Windows 10/8

Design: Entity - Relationships, Data Flow Diagrams.

Languages: SQL, PL/SQL, T-SQL.

Tools: Erwin 9.x, MS-Visio, Microsoft Access, Informatica, TOAD, Business Objects, Tableau.

Desktop Software: Microsoft Word, Excel and PowerPoint.

PROFESSIONAL EXPERIENCE

Data Modeler & Data Analyst

Confidential, Richmond, VA

Responsibilities:

  • Worked with business requirement analysts and Business SMEs to understand the requirements and identify available data sources for the business unit.
  • Worked on identifying existing source extracts and defining new extracts to be captured for the EDW.
  • Performed Detailed Data Analysis (DDA), Data Quality Analysis (DQA) and Data Profiling on source data, and documented data quality issues.
  • Dealt with different data sources ranging from flat files and Excel to Oracle and Mainframes.
  • Experienced in working with Agile methodology.
  • Understood and translated source code values.
  • Created pivot tables for source data analysis purposes.
  • Supported the Data Steward in understanding potential enterprise key data elements and Master Data Management.
  • Worked with OLAP, ETL, data warehousing and modeling tools.
  • Identified enterprise-level data elements and supported the definition of common data definitions.
  • Experience working with Metadata repositories (MDR).
  • Onboarded the datasets and created the business views based on the NPI and Credit data for the business users, across all environments.
  • Worked on 13 sub-domains in Card such as Risk, Fraud, Loss Mitigation, Fulfillment, etc.
  • Cleaned up unwanted data from the production datasets.
  • Worked for multiple LOBs (Card, Bank, Digital) and multiple sub domains in each LOB.
  • Created the schemas on Snowflake and granted the requested access (a Snowflake DDL sketch follows this list).
  • Assigned the batch roles at the schema level and on the datasets and business views.
  • Resolved DB space issues in the production containers.
  • Onboarded the Public Information, Non-Public Information and Credit business views across all environments for all LOBs.
  • Created customized views from different datasets based on the business requirements.
  • Worked on user role creation and granted the roles on the appropriate user views.
  • Worked on HP Service Manager and ServiceNow for onboarding datasets or making changes in production.
  • Created ECOs in ServiceNow, tied to incident numbers, for resolving issues in production.
  • Granted RUDI access to the batch user roles for loading the data.
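
Illustrative only: a minimal Snowflake DDL sketch of the kind of schema, business-view and role-grant work described above. All object and role names are hypothetical.

    -- Create a schema for one sub-domain and a business view over an onboarded dataset.
    CREATE SCHEMA IF NOT EXISTS CARD_RISK;

    -- Business view exposing only non-NPI columns to business users.
    CREATE OR REPLACE SECURE VIEW CARD_RISK.ACCOUNT_SUMMARY_VW AS
    SELECT account_id,
           open_date,
           credit_limit,
           risk_segment
    FROM   CARD_RISK.ACCOUNT_RAW;   -- onboarded dataset (hypothetical)

    -- Grant schema usage and read access on the view to a batch role.
    GRANT USAGE  ON SCHEMA CARD_RISK                    TO ROLE CARD_BATCH_ROLE;
    GRANT SELECT ON VIEW   CARD_RISK.ACCOUNT_SUMMARY_VW TO ROLE CARD_BATCH_ROLE;

    -- Batch load roles additionally need read/update/delete/insert ("RUDI") on the base table.
    GRANT SELECT, INSERT, UPDATE, DELETE ON TABLE CARD_RISK.ACCOUNT_RAW TO ROLE CARD_BATCH_ROLE;

Secure views keep the underlying dataset hidden while the grants control which roles can read the business view or load the base table.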

Environment: Data Modeling, Erwin 9.6, Snowflake, Teradata, SQL Workbench, ServiceNow, HPSM, Nebula, MS Office, E-R Studio, PL/SQL.

Data Modeler & Data Analyst

Confidential, Dallas, TX

Responsibilities:

  • Worked closely with the Data Architect to review all conceptual, logical and physical database design models with respect to function, definition and maintenance, and to support the data analysis, data quality and ETL design that feeds the logical data models.
  • Resolved data-related issues such as assessing data quality, consolidating data and evaluating existing data sources.
  • Created data mapping documents mapping Logical Data Elements to Physical Data Elements and Source Data Elements to Destination Data Elements.
  • Developed and ran UNIX shell scripts.
  • Involved in working on different layers of the Business Intelligence infrastructure.
  • Performed data cleansing and scrubbing to ensure data quality.
  • Worked on integration workflows and load processes.
  • Dealt with Oracle Hyperion Financial Management tool for financial consolidation and reporting.
  • Worked extensively in Data consolidation.
  • Worked with pivot table creation for data analysis.
  • Used Informatica for Data Integration purposes.
  • Worked extensively in the T-SQL environment to run queries and explore the databases (a profiling sketch follows this list).
  • Defined key facts and dimensions necessary to support the business requirements.
  • Worked with OLAP, ETL, data warehousing and modeling tools.
  • Met with user groups to analyze requirements and propose changes in design and specifications.
  • Created visual analytics for large sales and marketing data using Tableau, to assure integrity and identify the root cause of data inconsistencies.
  • Analyzed various reports, dashboards and scorecards in MicroStrategy and recreated them using Tableau Desktop and Tableau Server.
  • Provided support to Tableau users and wrote custom SQL to support business requirements.
  • Worked closely with the reporting team on deploying Tableau reports and publishing them to the Tableau and SharePoint servers.
  • Performed Detailed Data Analysis (DDA), Data Quality Analysis (DQA) and Data Profiling on source data.
  • Performed flat file conversion in the data warehouse scenario.
  • Created Fact and Dimension tables in the data mart.
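
Illustrative only: a short T-SQL profiling sketch of the kind of DDA/DQA queries described above. The staging table and column names are hypothetical.

    -- Row counts, null rates and distinct business keys in a source staging table.
    SELECT COUNT(*)                                             AS total_rows,
           COUNT(DISTINCT customer_id)                          AS distinct_customers,
           SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END) AS null_customer_ids,
           SUM(CASE WHEN email       IS NULL THEN 1 ELSE 0 END) AS null_emails
    FROM   stg.customer_source;

    -- Business keys that appear more than once (candidate duplicates).
    SELECT customer_id, COUNT(*) AS occurrences
    FROM   stg.customer_source
    GROUP  BY customer_id
    HAVING COUNT(*) > 1;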

Environment: Oracle, UNIX, Informatica, Tableau, MS Office, E-R Studio, PL/SQL, and PuTTY.

Data Modeler

Confidential, San Antonio, TX

Responsibilities:

  • Worked as a Data Modeler to generate data models and developed a relational database system.
  • Identified and compiled common business terms for the new policy generating system and worked on contract Subject Area.
  • Maintained the stage and production conceptual, logical, and physical data models along with related documentation for a large data warehouse project.
  • Involved in Logical and Physical database design and development, Normalization and Data Modeling using Erwin and SQL Server Enterprise Manager.
  • Investigated data sources to identify new data elements needed for data integration.
  • Performed data analysis using SQL queries on source systems to identify data discrepancies and determine data quality.
  • Served as a resource for analytical services utilizing SQL Server.
  • Created SQL queries using SQL Navigator and created various database objects: stored procedures, tables and views.
  • Used Erwin to create report templates. Maintained and changed the report templates as needed to generate varying data dictionary formats as contract deliverables.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Created DataStage jobs (ETL processes) for constantly populating the data warehouse from different source systems.
  • Wrote SQL scripts for creating tables, sequences, triggers, views and materialized views (a DDL sketch follows this list).
  • Designed Data Flow Diagrams, E/R Diagrams and enforced all referential integrity constraints.
  • Developed and maintained data models, data dictionaries, data maps and other artifacts across the organization, including the conceptual and physical models, as well as the metadata repository.
  • Performed extensive Data Validation and Data Verification against the Data Warehouse, and debugged the SQL statements and stored procedures for business scenarios.
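
Illustrative only: a minimal Oracle DDL sketch of the kind of table, sequence, trigger and materialized-view scripts described above. All object names are hypothetical.

    -- Sequence-keyed table with a before-insert trigger to populate the surrogate key.
    CREATE SEQUENCE policy_seq START WITH 1 INCREMENT BY 1;

    CREATE TABLE policy (
        policy_id     NUMBER        PRIMARY KEY,
        policy_number VARCHAR2(20)  NOT NULL,
        effective_dt  DATE          NOT NULL,
        premium_amt   NUMBER(12,2)
    );

    CREATE OR REPLACE TRIGGER policy_bi_trg
    BEFORE INSERT ON policy
    FOR EACH ROW
    BEGIN
        IF :NEW.policy_id IS NULL THEN
            :NEW.policy_id := policy_seq.NEXTVAL;
        END IF;
    END;
    /

    -- Pre-aggregated reporting view, refreshed on demand.
    CREATE MATERIALIZED VIEW policy_monthly_mv
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    AS
    SELECT TRUNC(effective_dt, 'MM') AS policy_month,
           COUNT(*)                  AS policy_count,
           SUM(premium_amt)          AS total_premium
    FROM   policy
    GROUP  BY TRUNC(effective_dt, 'MM');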

Environment: Erwin 9.5, SQL, Oracle 11g, ETL, OLAP, OLTP, MS Visio 15.0, XML, ER Diagrams.

Data Modeler

Confidential

Responsibilities:

  • Involved in the entire System study, analysis and Design.
  • Created Entity relationship diagrams, Function relationship diagrams, data flow diagrams and enforced all referential integrity constraints using Oracle Designer.
  • Developed the logical and physical data model for the proposed solution.
  • Worked as a DBA to create a best-fit physical data model from the logical data model.
  • Created and updated the tables in the database as per the logical data model.
  • Created 3NF business area data models with de-normalized physical implementations, and performed data and information requirements analysis using the Erwin tool.
  • Worked on data deletion per user requests, removed duplicate data and maintained data quality (a de-duplication sketch follows this list).
  • Extensively used star schema methodologies in building and designing the logical data models into dimensional models.
  • Used Erwin Database Generation for generating DDL, stored procedure and trigger code for the target database.
  • Defined best practices for project support and detailed documentation.
  • Ensured delivery of results per pre-defined quality parameters and customer satisfaction metrics, which involved the right kind of data analysis that in turn leads to closed-ended action plans.
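
Illustrative only: a minimal Oracle de-duplication sketch of the kind of duplicate-removal work described above; the table and business key are hypothetical.

    -- Remove duplicate rows sharing the same business key, keeping one row per key.
    DELETE FROM customer_stg a
    WHERE  a.ROWID NOT IN (
             SELECT MIN(b.ROWID)
             FROM   customer_stg b
             GROUP  BY b.customer_id   -- business key (illustrative)
           );

MIN(ROWID) keeps one arbitrary physical row per business key; in practice the surviving row would be chosen by a business rule.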

Environment: Oracle, Erwin 7.38, SQL, TOAD, VBA, MS Excel
