
Sr. Data Modeler/Data Analyst Resume

Austin, TX


  • 8+ years of industry experience with a solid understanding of Data Modeling, Data Analysis, evaluating data sources, Data Warehouse/Data Mart design, ETL, BI, data visualization, OLAP, OLTP, and client/server applications.
  • Experience driving cross - functional analytics projects from beginning to end: question formation, data model design, exploratory data analysis (EDA), validation, analysis, visualization, and presentation.
  • Strong experience in ER & Dimensional Data Modeling to deliver normalized ER and Star/Snowflake schemas using Erwin, ER Studio, Sybase PowerDesigner, SQL Server Enterprise Manager, and Oracle Designer.
  • Proven skills in designing and maintaining the Detail Design Document (DDD), Business Requirement Document (BRD), Data Requirement Document (DRD), Data Flow Diagram (DFD), Data Management Plan, Data Dictionary, Metadata Model, Logical and Physical Data Models, full DDL, alter DDL, and insert statements for all applications.
  • Hands on experience in Normalization and De-Normalization techniques for optimum performance in relational and dimensional database environments and experience with modeling using Erwin in both forward and reverse engineering processes.
  • Extensive knowledge of Big Data technologies including Hadoop, Hive, Sqoop, HDFS, and NoSQL databases such as MongoDB and Cassandra, as well as other emerging technologies.
  • Sound knowledge of Data Analysis, Data Validation, Data Cleansing, Data Verification, and identifying data mismatches.
  • Worked with Amazon Web Services (AWS) on a multitude of applications focusing on high availability, fault tolerance, and auto-scaling, with good experience and knowledge of AWS Redshift, RDS, S3, and EMR.
  • Excellent SQL programming skills and developed Stored Procedures, Triggers, Functions, Packages using SQL, PL/SQL.
  • Experience in data transformation and data mapping from source to target database schemas, as well as data cleansing.
  • Extensive experience in Text Analytics, generating data visualizations using R, Python and creating dashboards using tools like Tableau.
  • Expert in building Enterprise Data Warehouses and data warehouse appliances from scratch using both the Kimball and Inmon approaches.
  • Extensive experience with ETL and reporting tools such as DataStage, Informatica, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), Tableau, Power BI, and MicroStrategy.
  • Proficient in Software Development Life Cycle (SDLC), Project Management methodologies, and Microsoft SQL Server database management and working with Agile and Waterfall data modeling methodologies.
  • Extensive experience in data visualization, producing tables, graphs, and listings using tools such as Tableau and Power BI.
  • Solid understanding of Data Governance, Meta Data, Data Management and control; Adept in Data warehouse and Data mart architecture.
  • Strong background in data processing and data analysis, with hands-on experience in MS Excel, MS Access, Unix, and Windows Servers.


Data Modeling Tools: ERwin, ER Studio, and Power Designer.

Cloud Platform: AWS (S3 Bucket, RDS and Amazon Redshift) and MS Azure (Azure SQL, Azure DW, Azure ADF and Storage Blob).

Reporting Tools: SSRS, Power BI, Tableau and Microstrategy

Databases: Oracle 12c/11g/10g, Teradata R15/R14, MS SQL Server 2016/2014, DB2, MongoDB and Cassandra.

Programming Languages: SQL, T-SQL, PL/SQL, R and Python.

Operating System: Windows, Unix, Sun Solaris.

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model.

Big-Data Tools: Hadoop, Hive, HDFS, Sqoop, and Spark


Confidential - Austin, TX

Sr. Data Modeler/Data Analyst


  • Served in a Data Modeling role to review business requirements and compose source-to-target data mapping documents; involved in relational and dimensional data modeling, creating logical and physical designs of the database and ER diagrams using Erwin.
  • Involved in story-driven Agile development methodology, actively participated in daily scrum meetings, and used the methodology as the organization standard to implement the data models.
  • Created data models for AWS Redshift and Hive from dimensional data models; worked on data modeling and advanced SQL with columnar databases on AWS, and drove the technical design of AWS solutions by working with customers to understand their needs.
  • Worked on development of data warehouse, Data Lake, and ETL systems using relational and non-relational tools such as SQL and NoSQL.
  • Generated various dashboards as per requirements, which were used by management to make key business decisions; developed and maintained a data dictionary to create metadata reports for technical and business purposes, and wrote Python scripts to parse JSON documents and load the data into the database.
  • Involved in requirement gathering and database design and implementation of star-schema, snowflake schema/dimensional data warehouse using ERwin.
  • Involved in extensive Data validation by writing several complex SQL queries and involved in back-end testing and worked with data quality issues.
  • Migrated reference data from existing product into Informatica MDM hub and involved in several facets of MDM implementations including Data Profiling, Metadata acquisition and data migration.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis; improved SQL query performance using explain plans, hints, and indexes, and created DDL scripts for the database.
  • Created PL/SQL procedures and triggers. Worked on Normalization and De-Normalization techniques for both OLTP and OLAP systems; worked with data investigation, discovery, and mapping tools to scan every data record from many sources; performed data mapping between source and target systems, logical data modeling, class diagrams, and ER diagrams, and used SQL queries to filter data.
  • Participated in data acquisition with the Data Engineer team to extract historical and real-time data using Hadoop MapReduce and HDFS; developed Hive queries for analysis and, after processing the data, exported the result set from Hive to MySQL using Sqoop.
  • Designed DataStage ETL jobs for extracting data from heterogeneous source systems, transforming it, and finally loading it into the Data Marts.
  • Created and published multiple dashboards and reports using Tableau server, Power BI and Microstrategy.
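
The JSON-parsing load step mentioned above can be sketched roughly as below. This is an illustrative assumption, not the original script: SQLite stands in for the actual target database, and the table and field names are invented for the example.

```python
import json
import sqlite3

# Hypothetical sketch of parsing a JSON document and loading it into a
# database table. Schema and data are illustrative assumptions.
records = json.loads('''[
  {"customer_id": 1, "name": "Acme", "region": "TX"},
  {"customer_id": 2, "name": "Globex", "region": "NY"}
]''')

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, "
    "name TEXT, region TEXT)"
)
conn.executemany(
    "INSERT INTO customer (customer_id, name, region) VALUES (?, ?, ?)",
    [(r["customer_id"], r["name"], r["region"]) for r in records],
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM customer").fetchone()[0]
print(count)  # 2
```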

Environment: ERwin, Microsoft SQL Server, AWS S3, Redshift, RDS, MySQL, SAS, MDM, HDFS, HBase, HiveQL, Sqoop, OLTP, OLAP, Metadata, MS Excel, QlikView, Tableau, Power BI, MicroStrategy, SQL, T-SQL, Python, Spark and PL/SQL.

Confidential - New York, NY

Sr. Data Modeler / Data Analyst


  • Interacted with technical and non-technical business users to identify business requirements and translate them into data requirements to build the data model design.
  • Involved in logical and physical design, transforming logical models into physical implementations; created Entity/Relationship Diagrams, grouped and created the tables, validated the data, and identified PKs for lookup tables.
  • Worked on the Software Development Life Cycle (SDLC), testing methodologies, resource management, and scheduling of tasks; used the Agile method, with daily scrums to discuss project-related information.
  • Utilized Erwin's forward/reverse engineering tools and target database schema conversion process.
  • Collaborated with ETL/Informatica teams to source data and perform data analysis to identify gaps; involved in loading the data from source tables to Operational Data Source tables using transformation and cleansing logic.
  • Presented the dashboard to business users and cross-functional teams, defined KPIs (Key Performance Indicators), and identified data sources.
  • Involved in data migration using SQL, SQL Azure, Azure Storage, Azure Data Factory, SSIS, and PowerShell; created processes to load data from Azure Storage blobs to Azure SQL and from web APIs to Azure SQL, and scheduled web jobs for daily loads.
  • Defined validation rules in the MDM system by analyzing Excel-sheet part master data and input from the business users; worked with the MDM systems team on technical aspects and report generation.
  • Cleaned and processed third-party spend data into manageable deliverables in the required format using Excel macros and Python libraries such as NumPy, SQLAlchemy, and matplotlib.
  • Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
  • Designed the data marts using Ralph Kimball's Dimensional Data Mart modeling methodology in Erwin, and created DDL scripts and source-to-target mappings to bring the data from the sources to the warehouse.
  • Worked with various databases such as Oracle and SQL Server; performed computations, log transformations, feature engineering, and data exploration to identify insights and conclusions from complex data using R programming in RStudio.
  • Developed automated data pipelines from various external data sources (web pages, APIs, etc.) and used Spark and Spark SQL for data integration and manipulation.
  • Loaded data into Hive Tables from Hadoop Distributed File System (HDFS) to provide SQL access on Hadoop data and Created tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
  • Involved in Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad, and TPump) on both Windows and mainframe platforms.
  • Performed data analysis and data profiling using complex SQL queries on various source systems, including Oracle; monitored data quality and maintained data integrity to ensure effective functioning of the department.
  • Generated a detailed report after validating the graphs using R and adjusting the variables to fit the model; worked on Tableau for insight reporting and data visualization, and extracted data from IBM Cognos to create automated visualization reports and dashboards in Tableau.
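
The NumPy-based spend-data cleansing step can be sketched as below. The input values and column semantics are assumptions for illustration; the real feeds arrived as third-party files, not this literal list.

```python
import numpy as np

# Illustrative sketch of cleansing raw spend figures: coerce strings to
# floats, drop unparseable records, then summarize the clean values.
raw = ["1200.50", "n/a", "980", "", "450.25"]

def to_float(value):
    """Coerce a raw string to float, returning NaN for unparseable values."""
    try:
        return float(value)
    except ValueError:
        return float("nan")

spend = np.array([to_float(v) for v in raw])
clean = spend[~np.isnan(spend)]          # keep only records that parsed
total = round(float(clean.sum()), 2)
print(total)  # 2630.75
```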

Environment: Agile, Erwin 9.7, Python, Azure SQL, Azure DW, Storage Blob, R, SQL, Teradata, Informatica, Tableau, Microstrategy, Hive, HDFS, Sqoop, MDM, Spark, Oracle 12c, PL/SQL, SQL Server, SSIS and SSRS.

Confidential - Columbus, GA

Sr. Data Modeler/ Data Analyst/ Data Warehousing


  • Understood the business process, gathered business requirements, and determined impact analysis based on ERP; created logical and physical data models and metadata to support the requirements, and analyzed requirements to develop design concepts and technical approaches, verifying them against manual reports.
  • Involved in fixing invalid mappings; tested stored procedures and functions; performed unit and integration testing of Informatica sessions, batches, and the target data.
  • Designed both 3NF data models for ODS, OLTP systems and dimensional data models using star and snowflake Schema.
  • Worked on data integration and workflow application on SSIS platform and responsible for testing all new and existing ETL data warehouse components.
  • Reverse engineered the data models, identified the data elements in the source systems, and added new data elements to the existing data models; used SQL for querying the database in a UNIX environment.
  • Involved in all steps and scope of the project's reference data approach to MDM; created the Data Dictionary and data mapping from sources to the target in the MDM Data Model.
  • End-to-end process involvement, from gathering client business requirements to developing the dashboard in Tableau and publishing the dashboard to the server.
  • Developed data migration and cleansing rules for the integration architecture (OLTP, ODS, DW); performed business area analysis and logical and physical data modeling for a Data Warehouse using the Bill Inmon methodology, and designed a Data Mart application using Ralph Kimball's Star Schema dimensional methodology.
  • Extracted/Transformed/Loaded (ETL) design and implementation in areas related to Teradata utilities such as Fast Export and MLOAD for handling numerous tasks.
  • Implemented functional requirements using Base SAS, SAS/Macros, SAS/QL, UNIX, Oracle, and DB2, coding SAS programs with Base SAS and SAS/Macros for ad hoc jobs requested by users; upgraded SQL Server databases, handled monitoring and performance tuning, and developed reports using Crystal Reports with T-SQL, MS Excel, and Access.
  • Involved in several facets of MDM implementations including Data Profiling, Metadata acquisition and data migration
  • Migrated SQL Server 2005 databases to SQL Server 2008/2008 R2, and also migrated databases to IBM DB2.
  • Worked on multiple Data Marts in an Enterprise Data Warehouse (EDW) project; involved in designing OLAP data models, making extensive use of slowly changing dimensions (SCDs).
  • Worked on all activities related to the development, implementation, administration, and support of ETL processes for large-scale Data Warehouses using SQL Server SSIS.
  • Developed automated procedures to produce data files using SQL Server Integration Services (SSIS); performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Netezza.
  • Developed ER and dimensional models using ER Studio's advanced features, and created physical and logical data models in ER Studio.
  • Used SQL Profiler for monitoring and troubleshooting performance issues in T-SQL code and stored procedures.
  • Implemented Agile methodology for building an internal application.
  • Extracted data from Oracle, Teradata, Netezza, SQL Server, and DB2 databases using Informatica to load it into a single repository for data analysis, and used SQL on a wide scale for analysis, performance tuning, and testing.
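
The Type 2 slowly changing dimension handling mentioned above follows a standard pattern: when a tracked attribute changes, the current dimension row is closed out and a new versioned row is opened. The sketch below is a minimal in-memory illustration with invented column names, not the production implementation.

```python
from datetime import date

# Hedged sketch of a Type 2 SCD update on a customer dimension.
# Column names (valid_from, valid_to, is_current) are assumptions.
dim_customer = [
    {"customer_id": 7, "city": "Columbus", "valid_from": date(2015, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim, customer_id, new_city, as_of):
    """Close the current row and insert a new version if the city changed."""
    current = next(r for r in dim
                   if r["customer_id"] == customer_id and r["is_current"])
    if current["city"] == new_city:
        return  # no change, nothing to version
    current["valid_to"] = as_of
    current["is_current"] = False
    dim.append({"customer_id": customer_id, "city": new_city,
                "valid_from": as_of, "valid_to": None, "is_current": True})

apply_scd2(dim_customer, 7, "Atlanta", date(2016, 6, 1))
print(len(dim_customer))  # 2
```

This preserves full history: point-in-time queries filter on the validity dates, while current-state queries filter on the `is_current` flag.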

Environment: ER Studio, SQL Server 2012, SQL Server Analysis Services 2008, SSIS, SSRS 2008, Oracle 10g, Business Objects XI, Rational Rose, Tableau, ERP, Netezza, Teradata, Excel, Informatica MDM, Pivot tables, DB2, Datastage, MS Office, MS Visio, SQL, T-SQL, UNIX, Agile, SAS, MDM, Shell Scripting, Crystal Reports 9.

Confidential - Eden Prairie, MN

Sr. Data Analyst/Data Modeler


  • Worked as a Data Analyst/Modeler to generate data models using SAP PowerDesigner and developed relational database systems.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Conducted user interviews and gathered requirements, analyzing them using Rational Rose, RequisitePro, and RUP.
  • Developed logical data models and physical database designs, and generated database schemas using SAP PowerDesigner.
  • Analyzed the business requirements of the project by studying the Business Requirement Specification document.
  • Extensively worked with the SAP PowerDesigner data modeling tool to design the data models.
  • Created ER diagrams using PowerDesigner for relational and dimensional data modeling.
  • Involved in the data mapping document from source to target and in data quality assessments of the source data.
  • Responsible for data profiling and data quality checks to satisfy the report requirements.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
  • Created SQL tables with referential integrity and developed queries using SQL, SQL*PLUS and PL/SQL.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Designed the database tables and created table- and column-level constraints using the suggested naming conventions for constraint keys.
  • Reverse engineered the existing database structure to understand the existing data models, so that any corporate changes would stay synchronized with the current model.
  • Involved in Normalization / De-normalization, Normal Form and database design methodology.
  • Conducted JAD sessions with the SMEs, stakeholders, and other management teams to finalize the User Requirement Documentation.
  • Wrote T-SQL statements for data retrieval, and was involved in performance tuning of T-SQL queries and stored procedures.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
  • Handled performance requirements for databases in OLTP and OLAP models, and used Excel sheets, flat files, and CSV files to generate Tableau ad hoc reports.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
  • Involved in data profiling and performed data analysis based on the requirements, which helped catch many sourcing issues upfront.
  • Developed data mapping, data governance, transformation, and cleansing rules for data management involving OLTP, ODS, and OLAP.
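
The data profiling and quality checks described above typically reduce to column-level metrics such as null rates and duplicate-key detection. A minimal sketch, on invented sample rows:

```python
from collections import Counter

# Illustrative profiling pass: count nulls in a column and find
# duplicate business keys. Rows and field names are assumptions.
rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@x.com"},   # duplicate key
    {"id": 3, "email": "c@x.com"},
]

null_emails = sum(1 for r in rows if r["email"] is None)
key_counts = Counter(r["id"] for r in rows)
duplicate_keys = [k for k, n in key_counts.items() if n > 1]

print(null_emails, duplicate_keys)  # 1 [2]
```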

Environment: SAP, PowerDesigner 16.6, OLTP, OLAP, T-SQL, SSIS, SQL Server, SQL, PL/SQL, Rational Rose, ODS

Confidential - Portland OR

Data Analyst


  • Worked as Data Analyst for requirements gathering, business analysis and project coordination.
  • Responsible for the analysis of business requirements and the design and implementation of the business solution.
  • Performed Data Analysis and Data validation by writing SQL queries using SQL assistant.
  • Translated business concepts into XML vocabularies by designing XML Schemas with UML
  • Gathered business requirements through interviews, surveys with users and Business analysts.
  • Worked on data mining and data validation to ensure the accuracy of the data between the warehouse and source systems.
  • Developed SQL Queries to fetch complex data from different tables in databases using joins, database links.
  • Involved in designing and developing SQL server objects such as Tables, Views, Indexes (Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL.
  • Participated in JAD sessions, gathering information from Business Analysts, end users, and other stakeholders to determine the requirements.
  • Performed data analysis of existing databases to understand the data flow and the business rules applied to different databases, using SQL.
  • Performed data analysis and data profiling using complex SQL on various source systems, and answered complex business questions by providing data to business users.
  • Performed detailed data analysis and identified the key facts and dimensions necessary to support the business requirements.
  • Generated Data Dictionary reports for publishing on the internal site and gave access to different users.
  • Used MS Visio and Rational Rose to represent system under development in a graphical form by defining use case diagrams, activity and workflow diagrams.
  • Wrote complex SQL, PL/SQL procedures, functions, and packages to validate data and support the testing process.
  • Worked on generating and documenting metadata while designing OLTP and OLAP system environments.
  • Worked in data management performing data analysis, gap analysis, and data mapping.
  • Established a business analysis methodology around the RUP (Rational Unified Process).
  • Developed stored procedures in SQL Server to standardize DML transactions such as inserts, updates, and deletes against the database.
  • Created SSIS packages to load data from flat files, Excel, and Access into SQL Server using connection managers.
  • Developed all required stored procedures, user-defined functions, and triggers using T-SQL.
  • Produced various types of reports using SQL Server Reporting Services (SSRS).
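
The flat-file-to-table load that the SSIS packages performed can be sketched in miniature as below. SQLite and the staging table name are stand-in assumptions; the real target was SQL Server via SSIS connection managers.

```python
import csv
import io
import sqlite3

# Hypothetical sketch of loading a delimited flat file into a staging
# table. File contents and schema are illustrative assumptions.
flat_file = io.StringIO("order_id,amount\n100,25.00\n101,75.50\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")

reader = csv.DictReader(flat_file)
conn.executemany(
    "INSERT INTO stg_orders (order_id, amount) VALUES (?, ?)",
    [(int(r["order_id"]), float(r["amount"])) for r in reader],
)
conn.commit()

total = conn.execute("SELECT SUM(amount) FROM stg_orders").fetchone()[0]
print(total)  # 100.5
```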

Environment: SQL, SQL server, PL/SQL, MS Visio, Rational Rose, SSIS, T-SQL, SSRS
