Sr. Data Analyst/Data Modeler Resume

Eden Prairie, MN


  • Over 7 years of experience as a Data Modeler/Data Analyst with high proficiency in requirements gathering and data modeling, including design and support of various applications in OLTP, Data Warehousing, OLAP and ETL environments.
  • Experience in Big Data and NoSQL databases such as Cassandra, with technical expertise in Hadoop.
  • Good understanding of AWS, big data concepts and Hadoop ecosystem.
  • Extensive ETL testing experience using Informatica, Talend, Pentaho.
  • Experienced in SQL with good knowledge of PL/SQL programming, including development of stored procedures and triggers; working knowledge of DataStage, DB2, UNIX, Cognos, MDM, Hadoop and Pig.
  • Logical and physical database design, including tables, constraints and indexes, using Erwin, ER/Studio, Toad Data Modeler and SQL Developer Data Modeler.
  • Experience in the Big Data Hadoop ecosystem in ingestion, storage, querying, processing and analysis of big data.
  • Experience in Dimensional Data Modeling, Star/Snowflake schema, FACT & Dimension tables.
  • Expertise in Relational Data Modeling (3NF) and Dimensional Data Modeling.
  • Proficient in System Analysis, ER/Dimensional Data Modeling, Database design and implementing RDBMS specific features.
  • Experience in integration of various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL, COBOL, XML and flat files into a Netezza database.
  • Expertise in Physical Modeling for multiple platforms such as Oracle/Teradata/ SQL Server/DB2.
  • Designed data marts using Ralph Kimball and Bill Inmon dimensional data modeling techniques.
  • Proficient in writing DDL and DML commands using SQL Developer and Toad.
  • Experience in Oracle SQL and PL/SQL, including database objects such as stored procedures and stored functions.
  • In-depth and thorough knowledge of development and design with RDBMS/OLTP and dimensional modeling using the data modeling tools Erwin and Sybase PowerDesigner.
  • Involved in analysis, development and migration of stored procedures, triggers, views and other related database objects, with solid experience in MS SQL Server, Sybase 10.12, Sybase 4.2, Oracle and IBM DB2.
  • Good knowledge of normalization and de-normalization techniques for optimum performance in relational and dimensional database environments.
  • Strong experience working with conceptual, logical and physical data modeling considering Metadata standards.
  • Extensive experience in development of T-SQL, Oracle PL/SQL Scripts, Stored Procedures and Triggers and Packages for business logic implementation.
  • Experienced in performance analysis; created partitions, indexes and aggregate tables where necessary.
  • Experience with DBA tasks involving database creation, performance tuning, creation of indexes, creating and modifying table spaces for optimization purposes.
  • Performed extensive data profiling and analysis to detect and correct inaccurate data in the databases and to track data quality.
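
The dimensional-modeling work summarized above (star schemas with fact and dimension tables) can be sketched minimally as follows; the table and column names are hypothetical, and SQLite stands in for the warehouse platforms listed.

```python
import sqlite3

# Minimal star schema: one fact table joined to its dimension tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE fact_sales  (
    date_key     INTEGER REFERENCES dim_date(date_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    quantity     INTEGER,
    amount_cents INTEGER
);
INSERT INTO dim_date    VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales  VALUES (20240101, 1, 3, 2997), (20240101, 1, 1, 999);
""")

# A typical star join: aggregate the fact table, slicing by dimension attributes.
row = con.execute("""
    SELECT d.year, p.product_name, SUM(f.amount_cents) AS total_cents
    FROM fact_sales f
    JOIN dim_date    d ON d.date_key    = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.product_name
""").fetchone()
print(row)  # (2024, 'Widget', 3996)
```

Keeping measures in the fact table and descriptive attributes in the dimensions is what lets such queries slice by any attribute without touching the grain of the facts.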


Project Execution Methodologies: Ralph Kimball & Bill Inmon data warehousing methodology, Rational Unified Process (RUP), Agile, Rapid Application Development (RAD), Joint Application Development (JAD)

Programming Languages: SQL, PL/SQL, UNIX Shell Scripting, Perl

Databases: Netezza, MS SQL Server, Oracle, MS Access 2016, IBM DB2.

Cloud Platform: AWS, Amazon Redshift

Big Data: Hadoop 3.0, HDFS, Hive 2.3, Pig, HBase 1.2, Sqoop, Flume 1.8.

Data Modeling Tools: Erwin 9.7/9.6, Sybase Power Designer, Oracle Designer, ER/Studio V17

Reporting Tools: Crystal Reports, SSRS, Business Objects and Cognos.

Languages: SQL, PL/SQL, T-SQL, HTML5, XML



Confidential, Eden Prairie, MN

Sr. Data Analyst/Data Modeler


  • Gathered and translated business requirements into detailed production-level technical specifications covering new features and enhancements to existing business functionality.
  • Worked in an Agile team using Scrum methodology and participated in backlog prioritization with product owner and Scrum master.
  • Developed a generic model for predicting repayment of debt owed in the healthcare, large commercial, and government sectors.
  • Created Logical and Physical Data Models on Relational (OLTP) and Dimensional (OLAP) Star schemas for Fact and Dimension tables using Erwin.
  • Created and maintained data model standards, including master data management (MDM).
  • Implemented normalization techniques and built the tables per the requirements given by the business users.
  • Worked with normalization and de-normalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon approaches.
  • Developed and presented Business Intelligence reports and product demos to the team using SSRS (SQL Server Reporting Services).
  • Used Windows Azure SQL Reporting Services to create reports with tables, charts and maps.
  • Used Azure Reporting Services to upload and download reports.
  • Worked with the MDM systems team on technical aspects and report generation.
  • Used Erwin for reverse engineering to connect to the existing database and ODS, creating graphical representations in the form of entity relationships to elicit more information.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Constructed complex SQL queries with sub-queries, inline views as per the functional needs in the Business Requirements Document (BRD).
  • Involved in designing and developing SQL server objects such as Tables, Views, Indexes (Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL.
  • Delivered dimensional data models using Erwin to bring in the Employee and Facilities domain data into the oracle data warehouse.
  • Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
  • Performed detailed data analysis to analyze the duration of claim processes and created the cubes with Star Schemas using facts and dimensions through SQL Server Analysis Services (SSAS).
  • Performed Data Analysis and Data Profiling using complex SQL queries on various source systems including Oracle and Teradata.
  • Participated in JAD sessions to define business requirements and translate them into technical specifications.
  • Responsible for designing a Data dictionary & Business Glossary.
  • Designed visualization dashboards using Tableau Desktop and published them to Tableau Server and Tableau Reader.
  • Created DDL scripts using Erwin and source to target mappings to bring the data from source to the warehouse.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
  • Worked with various Teradata tools and utilities like Teradata Viewpoint, Multiload, Teradata Administrator, BTEQ and other Teradata Utilities.
  • Created stored procedures and functions using Dynamic SQL and T- SQL.
  • Developed complex Stored Procedures for SSRS (SQL Server Reporting Services) and created database objects like tables, indexes, synonyms, views, materialized views etc.
  • Worked closely with the Data Stewards to ensure correct and relevant data is captured in the data warehouse as part of data quality checks.

Environment: Erwin 9.7, SQL, Oracle 12c, Teradata 15, MS SQL Server 2016, Multiload, BTEQ, OLTP, OLAP, SSRS, HBase 1.2, MongoDB, Cassandra 3.11, MDM, AWS, EC2, PL/SQL
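
The complex SQL described above (sub-queries and inline views driven by the BRD) might look like this minimal sketch; the claims schema is hypothetical, and SQLite stands in for the listed platforms.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, member_id INTEGER, amount INTEGER);
INSERT INTO claims VALUES (1, 10, 500), (2, 10, 700), (3, 11, 200);
""")

# Inline view computes per-member totals; the outer query keeps members
# whose total exceeds the overall average claim (a scalar sub-query).
rows = con.execute("""
    SELECT member_id, member_total
    FROM (SELECT member_id, SUM(amount) AS member_total
          FROM claims GROUP BY member_id) t
    WHERE member_total > (SELECT AVG(amount) FROM claims)
    ORDER BY member_id
""").fetchall()
print(rows)  # [(10, 1200)]
```

The same shape (inline view for aggregation, scalar sub-query for the threshold) ports directly to Oracle, Teradata or T-SQL.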

Confidential, Research Triangle Park, NC

Data Analyst/Data Modeler


  • Participated in all phases of project including Requirement gathering, Analysis, Design, Coding, Testing, Documentation and warranty period.
  • Worked with Business users for requirements gathering, business analysis and project coordination.
  • Translated business concepts into XML vocabularies by designing XML Schemas with UML.
  • Participated in design discussions and assured functional specifications are delivered in all phases of SDLC in an Agile Environment.
  • Worked with the data compliance and data governance teams to maintain data models, metadata and data dictionaries, and to define source fields and their definitions.
  • Developed a Conceptual Model and Logical Model using Erwin based on requirements analysis.
  • Created various Physical Data Models based on discussions with DBAs and ETL developers.
  • Defined facts and dimensions and designed the data marts using Ralph Kimball's Dimensional Data Mart modeling methodology in Erwin.
  • Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using Erwin.
  • Worked on data mapping process from source system to target system.
  • Created dimensional model for the reporting system by identifying required facts and dimensions using Erwin.
  • Extensively used Star and Snowflake Schema methodologies.
  • Developed and maintained data Dictionary to create Metadata Reports for technical and business purpose.
  • Worked on performance tuning of the database, including indexing and optimizing SQL statements.
  • Used Erwin's Model Mart for effective model management, enabling sharing, dividing and reusing model information and designs to improve productivity.
  • Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
  • Wrote complex SQL queries and optimized them to pull the required data from multiple data sources effectively.
  • Processed and cleansed data using Excel and SQL queries.
  • Validated data changes and the underlying logic, formulas and calculations of newly created fields.
  • Prepared data dictionaries and source-to-target mapping documents to ease the ETL process and users' understanding of the data warehouse objects.
  • Used Normalization (1NF, 2NF & 3NF) and Denormalization techniques for effective performance in OLTP and OLAP systems.
  • Created PL/SQL procedures in order to aid business functionalities like bidding and allocation of inventory to the shippers.
  • Worked with Business analysts for requirements gathering, business analysis, testing, metrics and project coordination.
  • Performed Data profiling, data mining and identified the risks involved with data integration to avoid time delays in the project.
  • Wrote DDL and DML statements for creating, altering tables and converting characters into numeric values.
  • Created Project Plan documents, Software Requirement Documents, Environment Configuration and UML diagrams.
  • Worked directly with project managers, SMEs and various stakeholders on project budgets, billing hours and schedules.
  • Performed User Acceptance Testing, Device and Performance testing, Functional and Regression testing.
  • Created data flow, process documents and ad-hoc reports to derive requirements for existing system enhancements.

Environment: Erwin 9.6, EDW, Informatica, Oracle 12c, SQL, DB2, OLAP, OLTP, PL/SQL, Teradata 15
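
The normalization work above (1NF through 3NF for OLTP) can be illustrated with a hypothetical denormalized feed split into third normal form; names and values are made up for the sketch, with SQLite standing in for the listed databases.

```python
import sqlite3

# A denormalized feed repeats shipper attributes on every order row (not 3NF):
feed = [
    (1, 'ACME',   'Dallas', 100),
    (2, 'ACME',   'Dallas', 250),
    (3, 'Globex', 'Austin',  75),
]

con = sqlite3.connect(":memory:")
# 3NF decomposition: city depends only on the shipper, so shipper attributes
# move to their own table, referenced by the orders table.
con.executescript("""
CREATE TABLE shipper (shipper_name TEXT PRIMARY KEY, city TEXT);
CREATE TABLE orders  (order_id INTEGER PRIMARY KEY,
                      shipper_name TEXT REFERENCES shipper(shipper_name),
                      amount INTEGER);
""")
for order_id, name, city, amount in feed:
    con.execute("INSERT OR IGNORE INTO shipper VALUES (?, ?)", (name, city))
    con.execute("INSERT INTO orders VALUES (?, ?, ?)", (order_id, name, amount))

n_shippers = con.execute("SELECT COUNT(*) FROM shipper").fetchone()[0]
n_orders   = con.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(n_shippers, n_orders)  # 2 3
```

Each shipper's city is now stored once instead of once per order, which is exactly the update-anomaly protection 3NF buys in an OLTP system.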

Confidential, Merrimack, NH

Sr. Data Analyst/Data Modeler


  • Used Agile software development methodology in defining the problem, gathering requirements, development iterations, business modeling and communicating with the technical team for development of the system.
  • Developed normalized Logical and Physical database models to design an OLTP system for insurance applications.
  • Developed a Conceptual Model using Erwin based on requirements analysis.
  • Created a dimensional model for the reporting system by identifying required dimensions and facts using Erwin.
  • Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model.
  • Involved in the analysis of the existing claims processing system, mapping phase according to functionality and data conversion procedure.
  • Performed normalization of the existing OLTP systems (3NF) to speed up DML statement execution.
  • Performed data modeling in Erwin and designed target data models for the enterprise data warehouse (Teradata).
  • Created and maintained the Logical Data Model for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules and glossary terms.
  • Developed the required data warehouse model using Star schema for the generalized model.
  • Performed data validation, data cleansing, data integrity and data quality checks before delivering data to operations and to business and financial analysts, using Oracle and Teradata.
  • Performed Oracle installations, upgrades and migrations; designed logical/physical architecture; and handled tuning, capacity planning, database access, security and auditing.
  • Applied strong conceptual and logical data modeling skills in JAD sessions for requirements gathering, creating data mapping documents and writing functional specifications.
  • Developed stored procedures, SQL joins and SQL queries for data retrieval and analysis, and exported the data into CSV and Excel files.
  • Used Microsoft Excel tools like pivot tables, graphs, charts, solver to perform quantitative analysis.
  • Collaborated with ETL, BI and DBA teams to analyze and provide solutions to data issues and other challenges while implementing the OLAP model.
  • Cleansed and organized various tables into a presentable form to aid understanding of the existing models.
  • Generated various presentable reports and documentation using report designer and pinned reports in Erwin.

Environment: Erwin 9.4, OLTP, Teradata 14, Oracle 11g, Microsoft Excel, OLAP
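
Pre-delivery data validation and quality checking like that described above can be sketched as a set of SQL rules that each count offending rows; the table, rules and values here are illustrative only.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE policy (policy_id INTEGER, premium INTEGER, state TEXT);
INSERT INTO policy VALUES (1, 500, 'NH'), (2, -10, 'MA'), (1, 500, 'NH'), (3, NULL, 'VT');
""")

# Each rule counts violating rows; a clean feed returns 0 for every check.
checks = {
    "duplicate_keys": """SELECT COUNT(*) FROM
                         (SELECT policy_id FROM policy
                          GROUP BY policy_id HAVING COUNT(*) > 1)""",
    "null_premium": "SELECT COUNT(*) FROM policy WHERE premium IS NULL",
    "negative_premium": "SELECT COUNT(*) FROM policy WHERE premium < 0",
}
failures = {name: con.execute(sql).fetchone()[0] for name, sql in checks.items()}
print(failures)  # {'duplicate_keys': 1, 'null_premium': 1, 'negative_premium': 1}
```

Keeping the rules as data (a dictionary of named queries) makes it easy to add checks and to report which rule a feed failed.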

Confidential, Dallas, TX

Sr. Data Analyst


  • Worked with business requirements analysts/subject matter experts to identify and understand requirements. Conducted user interviews and data analysis review meetings.
  • Wrote PL/SQL statements, stored procedures and triggers in DB2 to extract and write data.
  • Performed thorough data analysis for the purpose of overhauling the database using SQL Server.
  • Prepared complex T-SQL queries, views and stored procedures to load data into staging area.
  • Wrote and executed unit, system, integration and UAT scripts for data warehouse projects.
  • Worked extensively with the DBA and reporting teams to improve report performance through appropriate indexing and partitioning.
  • Extensively used SQL, T-SQL and PL/SQL to write stored procedures, functions, packages and triggers.
  • Worked on data analysis, data profiling, data modeling and data governance, identifying data sets, source data, source metadata, data definitions and data formats.
  • Designed and Developed PL/SQL procedures, functions and packages to create Summary tables.
  • Worked in importing and cleansing of data from various sources like Teradata, flat files, SQL Server with high volume data.
  • Prepared process flow/activity diagrams for the existing system using MS Visio and re-engineered the design based on business requirements.
  • Designed and developed Use Cases, Activity Diagrams, Sequence Diagrams, using UML and Business Process Modeling.
  • Used advanced MS Excel functions such as VLOOKUP and pivot tables to identify issues in the data and helped with further modifications to build new versions.
  • Wrote SQL scripts to run ad-hoc queries, PL/SQL scripts, Stored Procedures & Triggers and prepare reports to the management.
  • Manipulated, cleansed and processed data using Excel, Access and SQL.
  • Created reports analyzing a large-scale database utilizing Microsoft Excel analytics within the legacy system.
  • Created reports from several discovered patterns, using Microsoft Excel pivot tables to analyze the pertinent data.
  • Performed Data Analysis and Data validation by writing complex SQL queries.
  • Developed the financing reporting requirements by analyzing the existing business objects reports.

Environment: PL/SQL, DB2, T-SQL, SQL, Teradata 14, MS Visio 2012, MS Excel 2012, MS Access 2012
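
Loading a staging area and then transforming into typed target tables, as described above, can be sketched like this; the staging/target tables and the numeric filter are hypothetical, with SQLite standing in for SQL Server/DB2.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE stg_txn (txn_id TEXT, amount TEXT);              -- raw, untyped staging
CREATE TABLE txn     (txn_id INTEGER PRIMARY KEY, amount INTEGER);
INSERT INTO stg_txn VALUES ('1', '100'), ('2', '  250 '), ('3', 'bad');
""")

# Typical staging-to-target load: trim, cast, and filter out rows whose
# amount is not numeric (a crude leading-digit check for the sketch).
con.execute("""
    INSERT INTO txn
    SELECT CAST(txn_id AS INTEGER), CAST(TRIM(amount) AS INTEGER)
    FROM stg_txn
    WHERE TRIM(amount) GLOB '[0-9]*'
""")
loaded = con.execute("SELECT COUNT(*), SUM(amount) FROM txn").fetchone()
print(loaded)  # (2, 350)
```

Staging everything as text first, then casting on the way to the target, is what lets bad source rows be rejected (or quarantined) instead of failing the whole load.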


Data Analyst


  • Worked with the business analysts to understand the project specification and helped them to complete the specification.
  • Worked on data analysis, data profiling and data governance, identifying data sets, source data, source metadata, data definitions and data formats.
  • Used MS Access, MS Excel, Pivot tables and charts, MS PowerPoint, MS Outlook, MS Communicator and User Base to perform responsibilities.
  • Extracted data using SSIS from DB2, XML, Excel and flat files, performed transformations and populated the data warehouse.
  • Wrote Teradata SQL queries and created tables and views following Teradata best practices.
  • Prepared Business Requirement Documentation and Functional Documentation.
  • Primarily responsible for coordinating between the project sponsor and stakeholders.
  • Conducted JAD sessions with different stakeholders, such as editorial and design teams.
  • Applied extensive SQL experience in querying, data extraction and data transformations.
  • Experienced in developing business reports by writing complex SQL queries using views, macros, volatile and global temporary tables.
  • Developed numerous reports to capture the transactional data for the business analysis.
  • Collaborated with a team of Business Analysts to ascertain capture of all requirements.
  • Involved in writing complex SQL queries using correlated sub queries, joins, and recursive queries.
  • Designed reports in Access and Excel using advanced functions including, but not limited to, pivot tables and formulas.
  • Used SQL and PL/SQL to validate the data going into the data warehouse.
  • Wrote complex SQL and PL/SQL testing scripts for backend testing of the data warehouse application; expert in writing complex SQL/PL-SQL scripts querying Teradata and Oracle.
  • Extensively tested the Business Objects report by running the SQL queries on the database by reviewing the report requirement documentation.
  • Implemented the Data Cleansing using various transformations.
  • Used DataStage Director for running jobs and monitoring performance statistics.
  • Designed and implemented basic scripts for testing and report/data validation.
  • Ensured the compliance of the extracts to the Data Quality Center initiatives.
  • Wrote multiple SQL queries to analyze the data and presented the results using Excel, Access, and Crystal reports.
  • Gathered and documented the Audit trail and traceability of extracted information for data quality.

Environment: MS Access 2010, MS Excel 2010, MS Power Point, MS Outlook, SSIS, XML, Teradata 12, SQL, PL/SQL, Oracle 10g
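
The complex queries mentioned above (correlated sub-queries, joins and recursive queries) can be illustrated with a hypothetical reporting hierarchy walked by a recursive common table expression; SQLite stands in here for Teradata/Oracle.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE org (emp_id INTEGER PRIMARY KEY, manager_id INTEGER);
INSERT INTO org VALUES (1, NULL), (2, 1), (3, 2), (4, 2);
""")

# Recursive CTE: start at the root (no manager) and walk the reporting
# chain downward, tracking each employee's depth in the hierarchy.
rows = con.execute("""
    WITH RECURSIVE chain(emp_id, depth) AS (
        SELECT emp_id, 0 FROM org WHERE manager_id IS NULL
        UNION ALL
        SELECT o.emp_id, c.depth + 1
        FROM org o JOIN chain c ON o.manager_id = c.emp_id
    )
    SELECT emp_id, depth FROM chain ORDER BY emp_id
""").fetchall()
print(rows)  # [(1, 0), (2, 1), (3, 2), (4, 2)]
```

The same pattern handles any self-referencing table (bill of materials, account rollups, org charts) without a fixed number of self-joins.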
