
Sr. Data Analyst Data Modeler Resume

Columbus, OH


  • 9+ years of experience in Data Modeling, Business and Data Analysis, production support, Database Management, strategic analysis, requirements gathering, data mapping and data profiling.
  • Expert in Conceptual, Logical and Physical Data Modeling for various platforms including Oracle, DB2, Teradata, PostgreSQL, SQL Server.
  • Hands-on experience as a procedural DBA using the Oracle toolset (PL/SQL, SQL, Performance Tuning).
  • Experience in ETL techniques, analysis and reporting, including working experience with Tableau for reporting and with Informatica and Ab Initio for ETL.
  • Experience in integration of Salesforce and SQL Server using SQL Server Integration Services (SSIS).
  • Developed complex database objects such as Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
  • Developed materialized views for data replication in distributed environments.
  • Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing and analysis of big data.
  • Experienced in technical consulting and end-to-end delivery with data modeling, data governance, and the design, development and implementation of solutions.
  • Knowledge and working experience on big data tools like Hadoop, Azure Data Lake, AWS Redshift.
  • Experience in Business Intelligence (BI) project development and implementation using the MicroStrategy product suite.
  • Expertise in Relational Data Modeling (3NF) and Dimensional Data Modeling.
  • Experience in developing MapReduce Programs using Apache Hadoop for analyzing the big data as per the requirement.
  • Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2 & Teradata.
  • Strong background in Data Modeling tools such as ERwin, PowerDesigner and MS Visio.
  • Experience in leading cross-functional, culturally diverse teams to meet strategic, tactical and operational goals and objectives.
  • Extensive experience in Relational Data Modeling, Logical data model/Physical data models Designs, ER Diagrams, Forward and Reverse Engineering, Publishing ERWIN diagrams, analyzing data sources and creating interface documents.
  • Experience in working with business intelligence and data warehouse software, including SSAS, Pentaho, Cognos, OBIEE, Greenplum Database, Amazon Redshift and Azure Data Warehouse.
  • Excellent experience in developing Stored Procedures, Triggers, Functions, Packages, Inner Joins & Outer Joins, views using TSQL/PLSQL.
  • Experience in designing error and exception handling procedures to identify, record and report errors.
  • Excellent experience in writing and executing unit, system, integration and UAT scripts in data warehouse projects.
  • Excellent knowledge of creating reports in SAP BusinessObjects, including Webi reports with multiple data providers.
  • Experience in data transformation and data mapping from source to target database schemas, as well as data cleansing.
  • Experience with the Ralph Kimball and Bill Inmon data warehousing approaches.
  • Experience in migration of data from Excel, DB2, Sybase, flat files, Teradata, Netezza and Oracle to MS SQL Server using the BCP and DTS utilities, and in extracting, transforming and loading data.
  • Experience in using Oracle, SQL*PLUS, and SQL*Loader.
  • Experience in automating and scheduling Informatica jobs using UNIX shell scripting, configuring Korn shell jobs for Informatica sessions.
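The star-schema and snowflake concepts listed above (fact and dimension tables joined for aggregation) can be sketched concretely. The following is a minimal illustration using Python's built-in sqlite3 module; all table and column names are hypothetical, not taken from any actual engagement:

```python
import sqlite3

# Minimal star schema: one fact table with foreign keys to two dimensions.
# All names here are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales VALUES (1, 20240101, 1, 99.50), (2, 20240101, 1, 0.50);
""")

# Typical dimensional query: join the fact to its dimensions and aggregate.
cur.execute("""
SELECT d.year, p.product_name, SUM(f.amount) AS total
FROM fact_sales f
JOIN dim_date d    ON d.date_key = f.date_key
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY d.year, p.product_name
""")
rows = cur.fetchall()
```

The same shape carries over to Oracle, Teradata or SQL Server; only the DDL dialect changes.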


Sr. Data Analyst Data Modeler

Confidential - Columbus, OH


  • Conducted JAD sessions and gathered information from Business Analysts, Developers, end users and stakeholders to determine the requirements of the various systems.
  • Analyzed Business Requirements and created Data and System Flow Diagrams as a visual aid to Developers and other stakeholders to enable them to quickly and easily understand the project architecture and the processes that should be implemented.
  • Analyzed source data arriving in various layouts such as XML, JSON, COBOL and fixed-width flat files, and converted the layouts into an easy-to-consume format.
  • Profiled source data using the Informatica Data Quality tool and custom queries to discover data patterns and anomalies and to understand the business by relating the Business Requirements to the source data.
  • Documented all observations from data profiling, highlighting the data issues discovered, recording the queries that led to their discovery, and giving recommendations to the Subject Matter Experts, Developers and all other stakeholders.
  • Created and maintained the Conceptual, Logical and Physical Data Models for all projects I was involved in.
  • Used IBM InfoSphere Data Architect's reverse engineering, forward engineering and Complete Compare functions to create, update and resolve issues with existing data models.
  • Created and implemented naming standards and domains in IBM InfoSphere Data Architect to give the entire Enterprise Data Management team a set of enterprise standards.
  • Set up a GitHub repository for the team and educated the team on using GitHub for version control and for sharing models and queries.
  • Created Source to Target Mapping Documents.
  • Created and documented the models, data workflow diagrams, sequence diagrams, activity diagrams and field mappings of all existing systems.
  • Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from the source and SQL Server database systems.
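The profiling work described above (null counts, distinct values, anomalies surfaced with custom queries) follows a common pattern. Below is a minimal, self-contained sketch in Python/sqlite3; the table, columns and sample rows are hypothetical stand-ins for real source data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical source table seeded with two data-quality issues:
# a NULL email and inconsistent casing in the state column.
cur.executescript("""
CREATE TABLE src_customer (id INTEGER, email TEXT, state TEXT);
INSERT INTO src_customer VALUES
    (1, 'a@example.com', 'OH'),
    (2, NULL,            'OH'),
    (3, 'c@example.com', 'oh');
""")

def profile_column(table, column):
    """Basic column profile: row count, null count, distinct non-null values."""
    cur.execute(f"""
        SELECT COUNT(*),
               SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END),
               COUNT(DISTINCT {column})
        FROM {table}
    """)
    total, nulls, distinct = cur.fetchone()
    return {"rows": total, "nulls": nulls, "distinct": distinct}

email_profile = profile_column("src_customer", "email")
state_profile = profile_column("src_customer", "state")
# state_profile reports 2 distinct values ('OH' vs 'oh'): exactly the kind
# of anomaly that would be documented and flagged to SMEs.
```

A tool such as Informatica Data Quality automates and scales this, but the underlying checks are the same count/null/distinct queries.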

Confidential, OH

Sr. Data Analyst Modeler

  • Performed Relational and Dimensional Data Modeling, creating Logical and Physical database designs and ER Diagrams with all related entities and relationships based on the rules provided by the business manager, using ERwin r9.6.
  • Analyzed functional and non-functional data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks.
  • Performed financial tests to ensure compliance with CCAR, Basel, Dodd-Frank and Sarbanes-Oxley using SQL, Oracle, SAS, DB2, Teradata and MS Access; proficient with business intelligence tools such as SSIS, SSRS, TOAD, SAS Enterprise Guide, Teradata SQL Assistant, VBA, Tableau and Actimize.
  • Worked on data visualization with tools such as Excel and QlikView.
  • Imported and cleansed high-volume data from various sources such as DB2, Oracle and flat files onto SQL Server.
  • Wrote scripts against Oracle, SQL Server and Netezza databases to extract data for reporting and analysis.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Developed automated data pipelines in Python from various external data sources (web pages, APIs, etc.) to the internal data warehouse (SQL Server, AWS), then exported the data to reporting tools such as Datorama.
  • Used Informatica PowerCenter for extraction, transformation and loading (ETL) of data from heterogeneous source systems.
  • Studied and reviewed the application of the Kimball data warehouse methodology as well as the SDLC across various industries to work successfully with data-handling scenarios.
  • Connected to Redshift through Tableau to extract live data for real-time analysis.
  • Worked with normalization and de-normalization concepts and design methodologies such as Ralph Kimball's and Bill Inmon's data warehouse approaches.
  • Used SQL tools to run SQL queries and validate the data loaded into the target tables.
  • Developed normalized Logical and Physical database models to design OLTP system for finance applications.
  • Created dimensional model for reporting system by identifying required dimensions and facts using Erwin r8.0.
  • Extensively used ERwin to develop data models using star schema methodologies.
  • Collaborated with other data modeling team members to ensure design consistency and integrity.
  • Involved in planning, defining and designing the database using ERwin based on business requirements, and provided documentation.
  • Validated report data by writing SQL queries in PL/SQL Developer against the ODS.
  • Involved in user training sessions and assisting in UAT (User Acceptance Testing).
  • Developed advanced SQL queries to extract, manipulate and/or calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Netezza.
  • Excellent experience with the Teradata Appliance Backup Utility (ABU) and ARC to back up data to and from Teradata nodes.
  • Developed mappings/sessions using Informatica Power Center 8.6 for data loading.
  • Designed the database in Teradata and worked with the Data Architect to implement it in the data model.
  • Modeled new tables and added them to the existing data model using Erwin as part of data modeling.
  • Implemented Referential Integrity using primary key and foreign key relationships.
  • Worked with the Business Analyst, QA team in their testing and DBA for requirements gathering, business analysis, testing and project coordination.
  • Worked with Database Administrators, Business Analysts and Content Developers to conduct design reviews and validate the developed models.
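The validation bullets above (checking data loaded into target tables against the ODS with SQL) typically compare row counts and control totals between source and target. A minimal sketch in Python/sqlite3, with hypothetical ODS and reporting tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical ODS (source) and reporting (target) tables after a load.
cur.executescript("""
CREATE TABLE ods_txn (txn_id INTEGER, amount REAL);
CREATE TABLE rpt_txn (txn_id INTEGER, amount REAL);
INSERT INTO ods_txn VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO rpt_txn VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def reconcile(source, target):
    """Compare row count and total amount between source and target tables."""
    checks = {}
    for name, table in (("source", source), ("target", target)):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
        checks[name] = cur.fetchone()
    return checks["source"] == checks["target"], checks

ok, checks = reconcile("ods_txn", "rpt_txn")
# ok is True when counts and control totals match; a mismatch points
# at a load defect to investigate with more granular queries.
```

Count/total reconciliation is only a first pass; row-level comparisons (e.g. keyed MINUS/EXCEPT queries) would follow when the totals disagree.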
