Sr. Data Analyst/Data Modeler Resume

Atlanta, GA

PROFILE SUMMARY:

  • 7+ years of professional experience in Data Modeling, Business Data Analysis, and design of OLTP systems.
  • Proficient in data mart design, creation of cubes, identifying facts & dimensions, star & snowflake schemas, and canonical models.
  • Solid hands-on experience administering data model repositories and documenting metadata in portals, using tools such as Erwin, ER/Studio, and PowerDesigner.
  • Strong working experience with Agile, Scrum, Kanban, and Waterfall methodologies.
  • Capture, validate and publish metadata in accordance with enterprise data governance policies.
  • Responsible for data governance rules and standards to maintain the consistency of the business element names in the different data layers.
  • Extensive experience in in-depth data analysis across different databases and data structures.
  • Extensive experience in development of T-SQL, Oracle PL/SQL Scripts, Stored Procedures and Triggers for business logic implementation.
  • Proficient in Normalization/De-normalization techniques in relational/dimensional database environments, and have normalized data up to 3NF.
  • Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS) tools.
  • Experience in backend programming including schema and table design, stored procedures, Triggers, Views, and Indexes.
  • Excellent MS Excel skills, with proficiency in VLOOKUPs and Pivot Tables and an understanding of VBA macros.
  • Extensive experience using R packages like ggplot2.
  • Experienced in data analysis, mapping and transformation, data modeling, and data warehouse concepts.
  • Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
  • Software Development Life Cycle (SDLC) experience including Requirements, Specifications Analysis/Design, and Testing.
  • Excellent experience creating cloud-based solutions and architecture using Amazon Web Services (Amazon EC2, Amazon S3, Amazon RDS) and Microsoft Azure.
  • Experience in designing star schema, Snowflake schema for Data Warehouse, ODS architecture by using tools like Erwin Data Modeler, Power Designer, Embarcadero E-R Studio and Microsoft Visio.
  • Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing and analysis of big data.
  • Good understanding and hands on experience in setting up and maintaining NoSQL Databases like Cassandra and HBase.
  • Experience in data transformation, data mapping from source to target database schemas, and data cleansing procedures.
  • Excellent knowledge of creating reports in SAP BusinessObjects, including web reports for multiple data providers.
  • Excellent experience in writing and executing unit, system, integration and UAT scripts.
  • Extensive experience using ER modeling tools such as Erwin and ER/Studio, as well as Teradata and BTEQ.
  • Hands on experience in SQL queries and optimizing the queries in Oracle, SQL Server, DB2, and Netezza.
  • Experience in working with RDBMS like Oracle, Microsoft SQL Server and Teradata.
  • Experience in data analysis to track data quality and for detecting/correcting inaccurate data from the databases.
  • Good knowledge of SQL queries and of creating database objects such as stored procedures, triggers, packages, and functions using SQL and PL/SQL for implementing business logic.
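As an illustration of the normalization work described above, the sketch below decomposes a denormalized order table toward 3NF. It is a minimal, hypothetical example: it uses SQLite in Python as a stand-in for the production RDBMS, and the table and column names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized source: customer attributes repeat on every order row.
cur.execute("CREATE TABLE orders_flat (order_id INTEGER, cust_name TEXT, cust_city TEXT, amount REAL)")
cur.executemany("INSERT INTO orders_flat VALUES (?,?,?,?)", [
    (1, "Acme", "Atlanta", 100.0),
    (2, "Acme", "Atlanta", 250.0),
    (3, "Beta", "Austin", 75.0),
])

# 3NF decomposition: non-key attributes (city) depend only on the customer key.
cur.execute("CREATE TABLE customer (cust_id INTEGER PRIMARY KEY, cust_name TEXT UNIQUE, cust_city TEXT)")
cur.execute("INSERT INTO customer (cust_name, cust_city) SELECT DISTINCT cust_name, cust_city FROM orders_flat")
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, cust_id INTEGER REFERENCES customer(cust_id), amount REAL)")
cur.execute("""INSERT INTO orders
               SELECT f.order_id, c.cust_id, f.amount
               FROM orders_flat f JOIN customer c ON c.cust_name = f.cust_name""")

# The normalized tables still answer the original questions via a join.
rows = cur.execute("""SELECT c.cust_name, COUNT(*), SUM(o.amount)
                      FROM orders o JOIN customer c USING (cust_id)
                      GROUP BY c.cust_name ORDER BY c.cust_name""").fetchall()
print(rows)
```

The same decomposition would be expressed as `CREATE TABLE ... AS SELECT DISTINCT` DDL in Oracle or SQL Server.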

TECHNICAL SKILLS:

Data Modeling Tools: Erwin Data Modeler 9.7/9.6, Erwin Model Manager, ER Studio v17, and Power Designer.

Cloud Platforms: AWS (EC2, S3, Redshift) and MS Azure

Programming Languages: SQL, PL/SQL, HTML5, XML and VBA.

OLAP Tools: Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports 9

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.

Operating System: Windows, Unix, Sun Solaris

ETL/Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend, and Pentaho.

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model

Big Data technologies: HBase 1.2, HDFS, Sqoop 1.4, Spark, Hadoop 3.0, Hive 2.3, EC2, S3 Bucket, AMI, RDS

Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.

WORK EXPERIENCE:

Confidential, Atlanta, GA

Sr. Data Analyst/Data Modeler

Responsibilities:

  • As a Sr. Data Analyst/Data Modeler, I was responsible for all data-related aspects of the project.
  • Translated business and data requirements into data models in support of Enterprise Data Models and Analytical systems.
  • Created Logical & Physical Data Models for relational (OLTP) and dimensional tables using Erwin.
  • Worked on master data (entities and attributes) and captured how data is interpreted by users in various parts of the organization.
  • Conducted JAD sessions with management, vendors, users and other stakeholders for open and pending issues to develop specifications.
  • Performed rigorous data analysis, data discovery, and data profiling.
  • Performed GAP analysis to analyze the difference between the system capabilities and business requirements.
  • Used Agile Method for daily scrum to discuss the project related information.
  • Maintained Data Mapping documents, Bus Matrix and other Data Design artifacts that define technical data specifications and transformation rules.
  • Analyzed database performance factors and trends pertaining to very large database design.
  • Collaborated with DBAs to implement mitigating physical modeling solutions.
  • Provided data structures optimized for information entry and retrieval.
  • Worked on normalization & De-normalization techniques, normalized the data into 3rd Normal Form (3NF).
  • Performed detailed data analysis to analyze the duration of claim processes.
  • Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
  • Used reverse engineering to connect to existing database and create graphical representation (E-R diagram).
  • Created Hive external tables to stage data and then moved the data from staging to main tables.
  • Created and maintained the metadata (data dictionary) for the data models.
  • Designed OLTP system environment and maintained documentation of Metadata.
  • Worked with Data Architects to design and develop the architecture for a data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Implemented data models and loaded data on AWS Redshift and RDS.
  • Monitored data quality and ensured data integrity was maintained for the effective functioning of the department.
  • Used Hive to analyze data ingested into HBase via Hive-HBase integration and computed various metrics for reporting on the dashboard.
  • Extensively involved in PL/SQL programming: Stored Procedures, Functions, Packages, and Triggers.
  • Managed database design and implemented a comprehensive Star-Schema with shared dimensions.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Primarily used R packages for the data mining tasks.
  • Provided expertise on data storage structures and data mining.
  • Created and published regularly scheduled and ad hoc reports as needed.
  • Involved in data lineage and Informatica ETL source to target mapping development, complying with data quality and governance standards.
  • Wrote and executed unit, system, integration and UAT scripts.
  • Assisted in defining business requirements and created BRD (Business Requirements Document) and functional specifications documents.
  • Analyzed and interpreted data results using statistical techniques for ongoing reports.
  • Created ad hoc reports with sensitive data pulled from Microsoft Excel, mining more than 40,000 lines of data per report.
  • Contributed to all phases of data mining: data collection, data cleaning, developing models, validation, and visualization.
  • Involved in data pattern recognition and data cleaning.
  • Supported development team & QA team during process design and during performance tuning, Test Strategy and test case development.
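The star schema with shared (conformed) dimensions mentioned above can be sketched minimally as follows. This is a hypothetical illustration using SQLite in Python rather than the production warehouse, and the dimension/fact names are invented: the point is that two fact tables joining the same `dim_date` roll up on identical grain.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One conformed date dimension shared by two fact tables.
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, cal_date TEXT, cal_month TEXT);
CREATE TABLE fact_claims (date_key INTEGER REFERENCES dim_date(date_key), claim_amount REAL);
CREATE TABLE fact_payments (date_key INTEGER REFERENCES dim_date(date_key), paid_amount REAL);
""")
cur.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                [(20240101, "2024-01-01", "2024-01"), (20240201, "2024-02-01", "2024-02")])
cur.executemany("INSERT INTO fact_claims VALUES (?,?)",
                [(20240101, 500.0), (20240101, 300.0), (20240201, 200.0)])
cur.executemany("INSERT INTO fact_payments VALUES (?,?)",
                [(20240101, 450.0), (20240201, 150.0)])

# Because the dimension is conformed, claims and payments aggregate on the
# same month attribute, so the two subject areas stay comparable.
monthly = cur.execute("""
    SELECT d.cal_month, SUM(f.claim_amount)
    FROM fact_claims f JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.cal_month ORDER BY d.cal_month""").fetchall()
print(monthly)
```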

Environment: OLTP, Erwin v9.7, Agile, Oracle 12c, Big Data 3.0, SQL, XML, Apache Hive, AWS, Python 3.6, Redshift, PL/SQL, 3NF, SSAS, Metadata, ETL, HBase.

Confidential, Newport Beach, CA

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Heavily involved in a Data Analyst/Modeler role, reviewing business requirements and composing source-to-target data mapping documents.
  • Gathered all the analysis report prototypes from business analysts belonging to different business units.
  • Interacted with Business Analysts to gather the user requirements and participated in data modeling JAD sessions.
  • Created 3NF business area data modeling with de-normalized physical implementation; data and information requirements analysis.
  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Extensively used Agile methodology as the Organization Standard to implement the data Models.
  • Ensured the feasibility of the logical and physical design models.
  • Designed logical and physical data models, Reverse engineering, Complete compare for Oracle and SQL server objects using E/R Studio.
  • Developed Data Migration and Cleansing rules for the Integration Architecture using OLTP.
  • Actively participated in JAD sessions involving the discussion of various reporting needs.
  • Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
  • Interacted with Business Analyst, SMEs to understand Business needs and functionality for various project solutions.
  • Planned and defined system requirements to Use Case, Use Case Scenario and Use Case Narrative using the UML (Unified Modeling Language) methodologies.
  • Reverse Engineering the existing data marts and identified the Data Elements (in the source systems), Dimensions, Facts and Measures required for reports.
  • Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
  • Validated and updated the appropriate LDMs to process mappings, screen designs, use cases, the business object model, and the system object model as they evolved and changed.
  • Conducted design reviews with the business analysts and content developers to create a proof of concept for the reports.
  • Wrote PL/SQL statements, stored procedures, and triggers for extracting as well as writing data.
  • Worked extensively on Data Migration by using SSIS.
  • Developed rule sets for data cleansing and actively participated in data cleansing and anomaly resolution of the legacy application.
  • Defined facts, dimensions and designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using E/R Studio.
  • Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
  • Developed Data mapping, Transformation and Cleansing rules for the Data Management involving OLTP and ODS.
  • Used statistical data methods to analyze data and generate useful business reports while working with quality assurance.
  • Responsible for data governance rules and standards to maintain the consistency of the business element names in the different data layers.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Created SSIS package for daily email subscriptions to alert Tableau subscription failure using the ODBC driver and PostgreSQL database.
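The data masking mappings described above (masking sensitive data between production and test) can be illustrated with a minimal sketch. This is a hypothetical example, not the actual mapping logic: the column names and salt are invented, and a deterministic hash stands in for the masking transformation so that masked keys still join consistently across test tables.

```python
import hashlib

def mask_value(value: str, salt: str = "nonprod") -> str:
    """Deterministically mask a sensitive value; same input -> same token,
    so referential joins in the test environment still line up."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "X" + digest[:8]

prod_rows = [{"ssn": "123-45-6789", "state": "CA"},
             {"ssn": "987-65-4321", "state": "OH"}]

# Masking mapping: ssn is masked, non-sensitive columns pass through unchanged.
test_rows = [{"ssn": mask_value(r["ssn"]), "state": r["state"]} for r in prod_rows]
print(test_rows)
```

In practice this transformation would live in an SSIS Derived Column or Script Component rather than standalone Python.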

Environment: PL/SQL, E/R Studio v17, MS SQL 2016, OLTP, ODS, SSIS, Tableau, ODBC, Transact-SQL, TOAD, Teradata r15, SQL Assistant

Confidential, New Albany, OH

Data Modeler

Responsibilities:

  • Worked as a Sr. Data Modeler to generate Data Models using E/R Studio.
  • Documented Technical & Business User Requirements during requirements gathering sessions.
  • Used Agile Method for daily scrum to discuss the project related information.
  • Translated conceptual data models into logical data models, held JAD sessions, and communicated data-related issues and standards.
  • Responsible for defining Conceptual Data Model (CDM), Logical Data Model (LDM) and Physical Data Model (PDM) for OLTP systems.
  • Designed a STAR schema for sales data involving shared dimensions (Conformed) for other subject areas using E/R Studio.
  • Implemented a Snowflake schema to reduce redundancy in the database.
  • Constructed complex SQL queries with sub-queries and inline views as per the functional needs in the Business Requirements Document (BRD).
  • Analyzed, retrieved and aggregated data from multiple datasets to perform data mapping.
  • Involved in designing and developing SQL server objects such as Tables, Views, Indexes (Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL.
  • Developed scripts that automated DDL and DML statements used in creations of databases, tables, constraints, and updates.
  • Involved in modeling business processes through UML diagrams.
  • Developed Logical and Physical Data models for the loan Systems by using E/R Studio.
  • Conversed with Business Analyst and developers to gather information about the data models (Data Definition) to place the data dictionary in place.
  • Supported the DBA in physically implementing the tables in both Oracle databases.
  • Created DDL for lookup data, online/real-time data (OLTP), and downstream data distribution (OLAP), along with SQL components such as Stored Procedures, Triggers, Functions, Indexes, and ETL packages conforming to the defined LDM and PDM.
  • Established the workflow process and created workflow diagrams using Microsoft Visio.
  • Developed the Conceptual and Logical Data Models and transformed them into schemas using E/R Studio.
  • Performed forward engineering of data models, reverse engineering of existing data models, and data model updates.
  • Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Involved in extracting, cleansing, transforming, integrating and loading data into different Data Marts using Data Stage Designer.
  • Developed SQL Queries to fetch complex data from different tables in remote databases using joins, database links and bulk collects.
  • Worked on Data governance, data quality, data lineage establishment processes.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Assisted the project with analytical techniques including data modeling, data mining, regression, and hypothesis testing to derive output from large data sets.
  • Produced data team's weekly newsletters by summarizing data mining results and presenting visualization.
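The "complex SQL queries with sub-queries and inline views" bullet above can be sketched with a small, hypothetical example: an inline view (a derived table in the FROM clause) computes per-branch averages, and the outer query keeps only loans above their branch's average. SQLite in Python stands in for the production SQL Server, and the `loan` table is invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE loan (loan_id INTEGER, branch TEXT, amount REAL)")
cur.executemany("INSERT INTO loan VALUES (?,?,?)", [
    (1, "East", 1000.0), (2, "East", 3000.0),
    (3, "West", 2000.0), (4, "West", 500.0),
])

# Inline view `b` aggregates once per branch; the join applies that
# aggregate as a row-level filter without a correlated sub-query per row.
rows = cur.execute("""
    SELECT l.loan_id, l.branch, l.amount
    FROM loan l
    JOIN (SELECT branch, AVG(amount) AS avg_amt
          FROM loan GROUP BY branch) b
      ON b.branch = l.branch
    WHERE l.amount > b.avg_amt
    ORDER BY l.loan_id""").fetchall()
print(rows)
```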

Environment: E/R Studio v13, SQL, SQL server 2014, Transact-SQL, DDL, DML, UML diagrams, OLTP, Microsoft Visio 2014

Confidential

Data Analyst

Responsibilities:

  • Performed Data analysis and Data profiling using complex SQL on various sources systems.
  • Developed SAS macros for data cleaning, reporting, and supporting routine processing.
  • Created SQL scripts to find Data quality issues and to identify keys, Data anomalies, and Data validation issues.
  • Actively involved in T-SQL programming, implementing Stored Procedures, Functions, Cursors, and Views for different tasks.
  • Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirement.
  • Used MS Visio for business flow diagrams and defined the workflow.
  • Wrote SQL Stored Procedures and Views, and coordinate and perform in-depth testing of new and existing systems.
  • Developed business reports by writing complex SQL queries using views and volatile tables.
  • Extensively used SAS procedures such as MEANS and FREQ, and other statistical calculations, for data validation.
  • Performed Data Analysis and extensive Data validation by writing several complex SQL queries.
  • Involved in design and development of standard and ad-hoc reporting using SQL/SSRS
  • Identified source databases and created the dimensional tables and checked for data quality using complex SQL queries.
  • Responsible for data lineage, maintaining data dictionary, naming standards and data quality.
  • Performed Data Manipulation using MS Excel Pivot Sheets and produced various charts for creating the mock reports.
  • Used SQL Server and MS Excel on a daily basis to manipulate data for business intelligence reporting needs.
  • Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
  • Extracted data from different sources like Oracle and text files using SAS/Access, SAS SQL procedures and created SAS datasets.
  • Conducted data mining and data modeling in coordination with finance manager.
  • Developed SQL Joins, SQL queries, tuned SQL, views, test tables, scripts in development environment.
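The data quality checks described above (duplicate keys, NULL anomalies) typically reduce to a handful of SQL patterns. The sketch below is a hypothetical illustration using SQLite in Python; the `member` table and its columns are invented, but the GROUP BY/HAVING duplicate-key check and the NULL completeness check are the standard forms.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE member (member_id INTEGER, email TEXT)")
cur.executemany("INSERT INTO member VALUES (?,?)", [
    (1, "a@x.com"), (2, None), (2, "b@x.com"), (3, "c@x.com"),
])

# Duplicate-key check: candidate key member_id should be unique.
dupes = cur.execute("""
    SELECT member_id, COUNT(*) AS n
    FROM member GROUP BY member_id HAVING COUNT(*) > 1""").fetchall()

# Completeness check: required column email should never be NULL.
missing = cur.execute("SELECT COUNT(*) FROM member WHERE email IS NULL").fetchone()[0]

print(dupes, missing)
```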

Environment: SQL, SAS macros, T-SQL, MS Visio 2010, MS Excel 2010, SQL Server 2010
