
Sr. Data Modeler/Data Analyst Resume


Chicago, IL

SUMMARY:

  • 11+ years of professional experience in Data Modeling, Business Data Analysis, and design of OLTP and OLAP systems, with excellent communication, presentation, team-support, database, and interfacing skills.
  • Excellent Software Development Life Cycle (SDLC) experience, with good working knowledge of testing methodologies, disciplines, tasks, resources, and scheduling.
  • Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Proficient in interacting with users, analyzing client business processes, documenting business requirements, performing design analysis and developing design specifications.
  • Strong experience in Data modeling - Conceptual, Logical/Physical, Relational and Multi-dimensional modeling, Data analysis for Decision Support Systems (DSS), Data Transformation (ETL) and Reporting.
  • Involved in JAD sessions with developers, user interface personnel and end users.
  • Experienced in identifying entities, attributes, metrics, and relationships; also assigning keys and optimizing the model.
  • Proficient in data mart design, creation of cubes, identifying facts & dimensions, star & snowflake schemas, and canonical models.
  • Expertise in Data modeling for RDBMS and NoSQL data modeling; Data Warehouse/ Data Mart development, ER modeling, Canonical Modeling, Dimensional Modeling, Data Analysis for Online Transaction Processing (OLTP) and Data Warehousing (OLAP)/ Business Intelligence (BI) applications.
  • Well versed in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments, with normalization performed up to 3NF.
  • Involved in Data Architecture, Data profiling, Data analysis, data mapping and Data architecture artifacts design.
  • Good understanding of AWS, big data concepts and Hadoop ecosystem.
  • Experienced in various Teradata utilities such as FastLoad, MultiLoad, BTEQ, and Teradata SQL Assistant.
  • Expert in writing SQL queries and optimizing the queries in Oracle, SQL Server and Teradata.
  • Proficiency in extracting data and creating SAS datasets from various sources like Oracle database, Access database, and flat files using Import techniques.
  • Experience in dealing with different data sources ranging from flat files, Oracle, Sybase, and SQL server.
  • Strong experience in using MS Excel and MS Access to load and analyze data based on business needs.
  • Conducted data analysis, mapping, transformation, and data modeling, applying data-warehouse concepts.
  • Proficient in data governance, data quality, metadata management, master data management.
  • Hands on experience in working with Tableau Desktop, Tableau Server and Tableau Reader in various versions.
  • Expertise in performing User Acceptance Testing (UAT) and conducting end user training sessions.
  • Experience in testing Business Intelligence reports generated by various BI Tools like Cognos and Business Objects.
  • Knowledgeable and experienced in producing tables, reports, graphs, and listings using various procedures, and in handling large databases to perform complex data manipulations.
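As a brief illustration of the star-schema modeling called out above (identifying facts and dimensions), here is a minimal sketch in SQL run through Python's sqlite3. All table and column names are hypothetical examples, not tables from any actual project.

```python
import sqlite3

# Minimal star-schema sketch: one fact table surrounded by dimensions.
# All names here are hypothetical illustrations.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,   -- surrogate key
    product_name TEXT,
    category     TEXT
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,  -- e.g. 20240115
    calendar_date TEXT,
    year          INTEGER
);
CREATE TABLE fact_sales (               -- grain: one row per product per day
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    units_sold  INTEGER,
    revenue     REAL
);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO dim_date VALUES (20240115, '2024-01-15', 2024);
INSERT INTO fact_sales VALUES (1, 20240115, 10, 100.0), (2, 20240115, 5, 75.0);
""")

# A typical star-schema query: join the fact to a dimension and aggregate.
rows = cur.execute("""
SELECT d.category, SUM(f.revenue)
FROM fact_sales f
JOIN dim_product d ON d.product_key = f.product_key
GROUP BY d.category
""").fetchall()
print(rows)  # [('Hardware', 175.0)]
```

The fact table holds measures at a declared grain; dimensions carry the descriptive attributes that queries slice by.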

TECHNICAL SKILLS:

Data Modeling Tools: Erwin Data Modeler 9.7/9.6, Erwin Model Manager, ER Studio v17, and Power Designer.

Programming Languages: SQL, PL/SQL, HTML5, XML and VBA.

Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.

Cloud Platforms: AWS (EC2, S3, Redshift) & MS Azure

OLAP Tools: Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports 9

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.

Operating System: Windows, Unix, Sun Solaris

ETL/Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend, and Pentaho.

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model

PROFESSIONAL EXPERIENCE:

Confidential - Chicago, IL

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Worked as a Sr. Data Modeler/Analyst to generate data models using Erwin and support their subsequent deployment to the Enterprise Data Warehouse.
  • Participated in design discussions and assured functional specifications are delivered in all phases of SDLC in an Agile Environment.
  • Defined facts and dimensions and designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in Erwin.
  • Identified and documented data sources and transformation rules required to populate and maintain data Warehouse content.
  • Assisted in designing logical models (relationships, cardinality, attributes, candidate keys) per business requirements using Erwin Data Modeler.
  • Used forward engineering to create a physical data model with DDL that best suits the requirements from the Logical Data Model.
  • Wrote DDL and DML statements for creating, altering tables and converting characters into numeric values.
  • Developed the MDM integration plan and hub architecture for customers, products, and vendors, and designed the MDM solution for three domains.
  • Extensively used Star and Snowflake Schema methodologies.
  • Used Normalization (1NF, 2NF & 3NF) and De-normalization techniques for effective performance in OLTP and OLAP systems.
  • Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Worked on Performance Tuning of the database which includes indexes, optimizing SQL Statements.
  • Worked on data loads using Azure Data Factory with an external-table approach.
  • Involved in Installing, Configuring Hadoop Eco-System, Cloudera Manager using CDH3, CDH4 Distributions.
  • Involved in development and implementation of SSIS and SSAS application solutions for various business units across the organization.
  • Designed and implemented a Data Lake to consolidate data from multiple sources, using Hadoop-stack technologies such as Sqoop and Hive/HQL.
  • Designed ER diagrams (physical and logical, using Erwin), mapped the data into database objects, and identified the facts and dimensions from the business requirements.
  • Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Involved in extracting, cleansing, transforming, integrating and loading data into different Data Marts using Data Stage Designer.
  • Performed Data Analysis, Data Migration and data profiling using complex SQL on various sources systems.
  • Generated ad-hoc and management-specific reports for the business using SSRS and Excel, writing the SQL queries behind them.
  • Used Windows Azure SQL Reporting Services to create reports with tables, charts, and maps.
  • Worked with the data compliance and data governance teams to maintain data models, metadata, and data dictionaries.

Environment: Erwin 9.7, Agile, Ralph Kimball, MDM, 3NF, OLAP, OLTP, Azure, Hadoop 3.0, Hive 2.3, SSRS, SSIS
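The DDL/DML work described in this role (forward-engineering a physical table and converting character data to numeric values during the load) can be sketched as follows; all table and column names are hypothetical, and the SQL is run here via Python's sqlite3 for portability.

```python
import sqlite3

# Hedged sketch: a staging table as landed from the source (amounts as text),
# a forward-engineered physical table with numeric types, and the DML that
# converts characters to numbers while loading. Names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE stg_orders (order_id TEXT, amount_txt TEXT)")
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                [("1001", "250.75"), ("1002", "99.10")])

# Physical table generated from the logical model, with proper types.
cur.execute("""
CREATE TABLE dw_orders (
    order_id INTEGER PRIMARY KEY,
    amount   REAL NOT NULL
)""")

# DML: CAST character columns to numeric while populating the target.
cur.execute("""
INSERT INTO dw_orders (order_id, amount)
SELECT CAST(order_id AS INTEGER), CAST(amount_txt AS REAL)
FROM stg_orders
""")

total = cur.execute("SELECT SUM(amount) FROM dw_orders").fetchone()[0]
print(round(total, 2))  # 349.85
```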

Confidential - Philadelphia, PA

Sr. Data Modeler

Responsibilities:

  • Worked as a Data Modeler to generate Data Models and developed relational database system.
  • Identified and compiled common business terms for the new policy-generating system and worked on the contract subject area.
  • Interacted with Business Analysts to gather the user requirements and participated in data modeling JAD sessions.
  • Extensively used Agile methodology as the Organization Standard to implement the data Models.
  • Maintained the stage and production conceptual, logical, and physical data models along with related documentation for a large data warehouse project.
  • Involved in logical and Physical Database design & development, Normalization and Data modeling using Erwin and SQL Server Enterprise manager.
  • Performed data analysis using SQL queries on source systems to identify data discrepancies and determine data quality.
  • Created AWS S3 buckets, managed their access policies, and utilized S3 buckets for storage.
  • Served as a resource for analytical services utilizing SQL Server and TOAD/Oracle.
  • Created SQL queries using TOAD and SQL Navigator, and created various database objects: stored procedures, tables, and views.
  • Used Erwin to create report templates. Maintained and changed the report templates as needed to generate varying data dictionary formats as contract deliverables.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Created Data stage jobs (ETL Process) for populating the data into the Data warehouse constantly from different source systems.
  • Analyzed and designed the business rules for data cleansing required by the staging, OLTP, and OLAP databases.
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Responsible for troubleshooting issues in the execution of MapReduce jobs by inspecting and reviewing log files in AWS S3.
  • Identified and documented data sources and transformation rules required to populate and maintain data warehouse content.
  • Responsible for indexing the tables in the data warehouse and performed data modeling within information areas across the enterprise including data cleansing and data quality.
  • Created a high-level, industry-standard, generalized data model and converted it into logical and physical models at later stages of the project using Erwin and Visio.
  • Generated XML exports from Erwin to be loaded into the metadata repository (MDR).
  • Conducted design walk through sessions with Business Intelligence team to ensure that reporting requirements are met for the business.
  • Wrote SQL scripts for creating tables, Sequences, Triggers, views and materialized views.
  • Designed Data Flow Diagrams, E/R Diagrams and enforced all referential integrity constraints.
  • Developed and maintained data models, data dictionaries, data maps, and other artifacts across the organization, including the conceptual and physical models and the metadata repository.
  • Performed extensive Data Validation, Data Verification against Data Warehouse and performed debugging of the SQL-Statements and stored procedures for business scenarios.

Environment: Erwin 9.5, SQL, Oracle 11g, ETL, OLAP, OLTP, MS Visio v15.0, XML, ER Diagrams, TOAD, AWS, Agile
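The data validation described in this role (SQL run against staging and warehouse tables to verify load results) can be sketched with a hypothetical example, run here via Python's sqlite3: compare row counts between source and target, then isolate the keys that failed to land.

```python
import sqlite3

# Hedged sketch of staging-vs-warehouse validation. Table names are
# hypothetical; one row is deliberately missing from the warehouse copy.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_policy (policy_id INTEGER, premium REAL);
CREATE TABLE dw_policy  (policy_id INTEGER, premium REAL);
INSERT INTO stg_policy VALUES (1, 100.0), (2, 200.0), (3, 300.0);
INSERT INTO dw_policy  VALUES (1, 100.0), (2, 200.0);  -- one row missing
""")

# Row-count reconciliation between the layers.
stg_rows, dw_rows = cur.execute("""
SELECT (SELECT COUNT(*) FROM stg_policy),
       (SELECT COUNT(*) FROM dw_policy)
""").fetchone()
print(stg_rows, dw_rows)  # 3 2 -> mismatch, investigate the load

# Identify exactly which keys never reached the warehouse.
missing = cur.execute("""
SELECT policy_id FROM stg_policy
EXCEPT
SELECT policy_id FROM dw_policy
""").fetchall()
print(missing)  # [(3,)]
```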

Confidential - Plano, TX

Data Analyst /Data Modeler

Responsibilities:

  • Connected to Redshift through Tableau to extract live data for real time analysis.
  • Identified and documented data sources and the transformation rules required to populate and maintain data warehouse content.
  • Generated DDLs extensively and made them available to the DBA for execution.
  • Trained a couple of colleagues on the Spotfire tool and guided them in creating Spotfire visualizations.
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Developed a data mart for the base data in Star and Snowflake schemas as part of developing the data warehouse.
  • Analyzed the physical data model to understand the relationship between existing tables.
  • Extensively used ER Studio as the main modeling tool, along with MS Visio.
  • Designed data process flows using Informatica to source data into Statements database on Oracle platform.
  • Involved in data profiling to detect and correct inaccurate data and maintain data quality.
  • Performed normalization of the existing OLTP systems (to 3NF) to speed up DML statement execution.
  • Performed data cleaning and data manipulation activities using NZSQL utility.
  • Developed triggers, stored procedures, functions, and packages using cursor and ref cursor concepts in PL/SQL.
  • Worked on creating filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Designed and developed universes and mapped them to SAP report fields.
  • Designed a star schema for sales data involving shared (conformed) dimensions for other subject areas using ER/Studio.
  • Used Informatica & SAS to extract, transform, and load source data from transaction systems.
  • Generated various reports using SQL Server Report Services (SSRS) for business analysts and the management team.
  • Designed and Developed Use Cases, Activity Diagrams, and Sequence Diagrams using Unified Modeling Language (UML)
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable
  • Worked on Unit Testing for three reports and created SQL Test Scripts for each report as required

Environment: ER/Studio 10.2, SAS, SSIS, SSRS, PL/SQL, OLTP, 3NF, DDL.
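The conformed (shared) dimension design mentioned in this role can be sketched with a hypothetical example, run via Python's sqlite3: one date dimension reused by two fact tables from different subject areas, so their measures line up on identical date attributes.

```python
import sqlite3

# Hedged sketch of a conformed dimension: dim_date is shared by two fact
# tables, so sales and returns can be compared by the same month attribute.
# All table names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales   (date_key INTEGER REFERENCES dim_date(date_key), revenue REAL);
CREATE TABLE fact_returns (date_key INTEGER REFERENCES dim_date(date_key), refunds REAL);
INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO fact_sales   VALUES (20240101, 500.0), (20240201, 700.0);
INSERT INTO fact_returns VALUES (20240101, 50.0);
""")

# Both facts aggregate against the SAME dimension, so the month values agree.
rows = cur.execute("""
SELECT d.month, SUM(s.revenue)
FROM dim_date d JOIN fact_sales s ON s.date_key = d.date_key
GROUP BY d.month ORDER BY d.month
""").fetchall()
print(rows)     # [(1, 500.0), (2, 700.0)]

returns = cur.execute("""
SELECT d.month, SUM(r.refunds)
FROM dim_date d JOIN fact_returns r ON r.date_key = d.date_key
GROUP BY d.month
""").fetchall()
print(returns)  # [(1, 50.0)]
```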

Confidential - McLean, VA

Data Analyst

Responsibilities:

  • Created reports using MS Access and Excel, applying filters to retrieve the best results.
  • Analyzed business requirements, system requirements, and data mapping requirement specifications.
  • Wrote and executed unit, system, integration and UAT scripts in a data warehouse project.
  • Performed the SQL Tuning and optimized the database and created the technical documents.
  • Imported Excel sheets, CSV and delimited data, and ODBC-compliant data sources into the database for data extraction, data processing, and business needs, using advanced Excel features.
  • Designed and optimized the SQL queries and exported the data into database server.
  • Worked with Business Analysts to create hierarchies and design reports using OBIEE (Oracle Business Intelligence Enterprise Edition).
  • Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
  • Generated and documented Metadata while designing OLTP and OLAP system environment.
  • Worked on the migration of data from subsystem database to system database using PL/SQL.
  • Developed and maintained data solutions that utilize SQL, Microsoft SQL Server Reporting Services.
  • Wrote SQL Stored Procedures and Views, and coordinate and perform in-depth testing of new and existing systems.
  • Worked primarily on SQL Server, creating Store Procedures, Functions, Triggers, Indexes and Views using T-SQL
  • Performed analysis and presented results using SQL, SSIS, Excel, and Visual Basic scripts.
  • Identified source databases and created the dimensional tables and checked for data quality using complex SQL queries.
  • Developed Unix shell programs and scripts to maximize productivity and resolve issues.
  • Created data visualization reports in Tableau for claims data and generated dashboards delivering the project's insights graphically.
  • Created pivot tables, graphs, charts, macros in MS Excel and built Tableau dashboards.

Environment: MS Access, MS Excel, SQL, OLTP, OLAP, T-SQL, SSIS
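The SQL tuning work described in this role can be sketched with a hypothetical example, demonstrated here with Python's sqlite3 (the same index-then-verify workflow applies in SQL Server or Oracle, where the plan is inspected via their own tools): add an index on the filter column and confirm from the query plan that the optimizer uses it.

```python
import sqlite3

# Hedged tuning sketch: compare the query plan before and after indexing.
# Table, column, and index names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE claims (claim_id INTEGER, member_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                [(i, i % 100, float(i)) for i in range(1000)])

# Without an index, the filter forces a full-table scan.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM claims WHERE member_id = 42"
).fetchall()[0][-1]
print(plan_before)   # e.g. 'SCAN claims'

# Add an index on the filter column, then re-check the plan.
cur.execute("CREATE INDEX ix_claims_member ON claims(member_id)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM claims WHERE member_id = 42"
).fetchall()[0][-1]
print(plan_after)    # e.g. 'SEARCH claims USING INDEX ix_claims_member (member_id=?)'
```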

Confidential - Cary, NC

Data Analyst/ Data Modeler

Responsibilities:

  • Worked with Business users for requirements gathering, business analysis and project coordination.
  • Worked closely with various business teams in gathering the business requirements.
  • Performed data cleansing and data migration for accurate reporting.
  • Translated business concepts into XML vocabularies by designing XML Schemas with UML.
  • Participated in all phases of data mining, data collection, data cleaning, developing models, validation, and visualization and performed Gap analysis.
  • Interacted with Business Analysts to gather the user requirements and participated in data modeling JAD sessions.
  • Performed Data mapping, logical data modeling, data mining, created class diagrams and ER diagrams and used SQL queries to filter data.
  • Created Schema objects like Indexes, Views, and Sequences, triggers, grants, roles, Snapshots.
  • Performed data analysis and data profiling using complex SQL on various sources systems
  • Worked on SQL queries in a dimensional data warehouse as well as a relational data warehouse.
  • Defined facts, dimensions and designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
  • Involved with Business Analysts team in requirements gathering and in preparing functional specifications and changing them into technical specifications.
  • Coordinated with DBAs and generated SQL codes from data models.
  • Part of team conducting logical data analysis and data modeling JAD sessions, communicated data-related standards.
  • Developed detailed ER diagrams and data flow diagrams using modeling tools, following the SDLC structure.
  • Created 3NF business-area data models with de-normalized physical implementations, and performed data and information requirements analysis.
  • Used SQL tools to run SQL queries and validate the data loaded into the target tables.
  • Created tables, views, sequences, indexes, constraints and generated SQL scripts for implementing physical data model.
  • Created dimensional model for reporting system by identifying required dimensions and facts using Erwin.
  • Worked and extracted data from various database sources like DB2, CSV, XML and Flat files into the Data Stage.
  • Generated ad-hoc reports in Excel Power Pivot and shared them with decision makers via Power BI for strategic planning.

Environment: XML, SQL, Erwin R9.6, 3NF, DB2, CSV, MS Excel 2014, Power BI, Flat Files, JAD.
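The 3NF-logical-with-denormalized-physical pattern described in this role can be sketched with a hypothetical example, run via Python's sqlite3: customer attributes are held once in the normalized design, then the join is pre-computed into a flat reporting table, trading redundancy for read performance.

```python
import sqlite3

# Hedged sketch: 3NF source tables flattened into a de-normalized
# reporting table. All table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
-- 3NF: each customer attribute is stored exactly once.
CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE orders   (order_id INTEGER PRIMARY KEY,
                       customer_id INTEGER REFERENCES customer(customer_id),
                       amount REAL);
INSERT INTO customer VALUES (1, 'Acme', 'Midwest');
INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 125.0);

-- De-normalized physical implementation: the join is pre-computed so
-- reporting queries avoid it entirely.
CREATE TABLE rpt_orders AS
SELECT o.order_id, o.amount, c.name AS customer_name, c.region
FROM orders o JOIN customer c ON c.customer_id = o.customer_id;
""")

rows = cur.execute(
    "SELECT region, SUM(amount) FROM rpt_orders GROUP BY region").fetchall()
print(rows)  # [('Midwest', 375.0)]
```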

Confidential - Indianapolis, IN

Data Analyst

Responsibilities:

  • Performed Data analysis and Data profiling using complex SQL on various sources systems.
  • Developed SAS macros for data cleaning, reporting, and supporting routine processing.
  • Created SQL scripts to find data quality issues and to identify keys, data anomalies, and data validation issues.
  • Actively involved in writing T-SQL programs implementing stored procedures, functions, cursors, and views for different tasks.
  • Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirement.
  • Used MS Visio for business flow diagrams and defined the workflow.
  • Wrote SQL Stored Procedures and Views, and coordinate and perform in-depth testing of new and existing systems.
  • Experienced in developing business reports by writing complex SQL queries using views, volatile tables.
  • Extensively used SAS procedures such as MEANS and FREQ, plus other statistical calculations, for data validation.
  • Performed Data Analysis and extensive Data validation by writing several complex SQL queries.
  • Involved in design and development of standard and ad-hoc reporting using SQL/SSRS
  • Identified source databases, created the dimensional tables, and checked for data quality using complex SQL queries.
  • Responsible for data lineage, maintaining data dictionary, naming standards and data quality.
  • Performed Data Manipulation using MS Excel Pivot Sheets and produced various charts for creating the mock reports.
  • Used SQL Server and MS Excel on a daily basis to manipulate data for business intelligence reporting needs.
  • Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
  • Extracted data from different sources like Oracle and text files using SAS/Access, SAS SQL procedures and created SAS datasets.
  • Conducted data mining and data modeling in coordination with finance manager.
  • Developed SQL Joins, SQL queries, tuned SQL, views, test tables, scripts in development environment.

Environment: SQL, SAS macros, T-SQL, MS Visio 2010, MS Excel 2010, SQL Server 2010
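The data-quality SQL described in this role (finding duplicate keys, anomalies, and validation issues) can be sketched with a hypothetical example, run via Python's sqlite3: one query surfaces duplicate key values, another counts nulls in a column expected to be populated.

```python
import sqlite3

# Hedged data-quality sketch. Table and column names are hypothetical;
# the inserted rows deliberately contain one duplicate key and one null.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE members (member_id INTEGER, ssn TEXT)")
cur.executemany("INSERT INTO members VALUES (?, ?)",
                [(1, '111'), (2, '222'), (2, '222'), (3, None)])

# Duplicate-key check: any member_id appearing more than once is an anomaly.
dupes = cur.execute("""
SELECT member_id, COUNT(*) FROM members
GROUP BY member_id HAVING COUNT(*) > 1
""").fetchall()
print(dupes)   # [(2, 2)]

# Null check on a column the model expects to be populated.
nulls = cur.execute(
    "SELECT COUNT(*) FROM members WHERE ssn IS NULL").fetchone()[0]
print(nulls)   # 1
```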
