
Sr. Data Architect/Data Modeler Resume

Mentor, OH


  • Over 9 years of experience as a Sr. Data Architect/Modeler and Data Analytics professional in System Analysis, Data Architecture and Development, Testing and Deployment of business applications.
  • Experience in analyzing data using Hadoop Ecosystem including HDFS, Hive, Spark, Spark Streaming, Elastic Search, Kibana, Kafka, HBase, Zookeeper, PIG, Sqoop and Flume.
  • Hands-on experience in architecting and data modeling for AWS Redshift, AWS RDS (Oracle and PostgreSQL) and AWS Aurora.
  • Good understanding of and hands-on experience with AWS S3, EC2 and EMR.
  • Experience in designing visualizations using Tableau and in publishing and presenting dashboards and storylines on web and desktop platforms.
  • Experienced in generating and documenting metadata while designing OLTP and OLAP system environments.
  • Experience in writing SQL queries and optimizing the queries in Oracle, SQL Server, Netezza, Confidential and Big Data.
  • Experience in developing MapReduce programs using Apache Hadoop to analyze big data as per requirements.
  • Hands on experience in Normalization and De-Normalization techniques up to 3NF for optimum performance in relational and dimensional database environments.
  • Strong background in Data Modeling tools, including Confidential, ER/Studio and Power Designer.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Experience in metadata design, real time BI Architecture including Data Governance for greater ROI.
  • Experience in Dimensional Data Modeling, Star/Snowflake schema, FACT & Dimension tables.
  • Strong Database experience using Oracle, XML, DB2, Confidential, SQL server, Big data and NoSQL.
  • Proficient in using Python, SQL, Hadoop ecosystem for extracting data and building predictive models.
  • Extensive ETL testing experience using Informatica 9.x/8.x, Talend and Pentaho.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export.
  • Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirement gathering and analysis.
  • Strong experience validating models using measures such as AUC, ROC curves, and confusion matrices.
  • Proficient in project implementations using various Software Development Life Cycle (SDLC) methodologies like Waterfall, Agile (SCRUM) and RUP.
  • Experienced in Client-Server application development using Oracle, PL/SQL, SQL PLUS, SQL Developer, TOAD, and SQL LOADER.
  • Expertise in SQL Server Analysis Services (SSAS) to deliver Online Analytical Processing (OLAP) and data mining functionality for business intelligence applications.
  • Extensive experience in SSIS Packages, SSRS reports and SSAS cubes on production server.
  • Experience in designing Enterprise Data Warehouses, Data Marts, Reporting data stores (RDS) and Operational data stores (ODS).
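One bullet above mentions developing MapReduce programs; the pattern can be illustrated in miniature with plain Python. This toy word count only sketches the map/shuffle/reduce stages and is not one of the original Hadoop jobs:

```python
from collections import defaultdict
from itertools import chain

# Toy input standing in for HDFS files.
lines = ["big data on hadoop", "data modeling for big data"]

# Map phase: emit (word, 1) pairs from each input line.
mapped = chain.from_iterable(((w, 1) for w in line.split()) for line in lines)

# Shuffle + reduce phase: group by key and sum the counts.
counts = defaultdict(int)
for word, n in mapped:
    counts[word] += n
```

In Hadoop the shuffle is performed by the framework between the mapper and reducer; here the dictionary grouping plays that role.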


Data Modeling Tools: Confidential r9.6/9.5, ER/Studio 9.7, Sybase Power Designer

Languages: SQL, PL/SQL, T-SQL, ASP, Visual Basic, XML, Python, C, C++, Java, HTML, UNIX shell scripting, Perl.

Big Data Tools: Hadoop, Hive, Spark, Pig, HBase, Sqoop, Flume.

Database: Oracle 11g/12c, MS Access, SQL Server, Sybase, DB2, Teradata 14/15, Hive

BI Tools: Tableau 7.0/8.2, Tableau Server 8.2, Tableau Reader 8.1, SAP Business Objects, Crystal Reports

Operating Systems: Microsoft Windows 8/7 and UNIX.

Applications: Toad for Oracle, Oracle SQL Developer, MS Word, MS Excel, MS Power Point, Confidential, Designer 6i

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Waterfall Model

Project Execution Methodologies: Ralph Kimball and Bill Inmon data warehousing methodologies, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD)


Sr. Data Architect/Data Modeler

Confidential, Mentor OH


  • Heavily involved in the Data Architect role, reviewing business requirements and composing source-to-target data mapping documents.
  • Installed and Configured Open Source Software like Pig, Hive, HBase, Flume and Sqoop and designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Worked with Netezza and Oracle databases and implemented various logical and physical data models for them.
  • Involved in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.
  • Worked on Amazon Redshift, AWS and Azure, architecting a solution to load data, create data models and run BI on it.
  • Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL access on Hadoop data.
  • Developed Data Mapping, Data Governance, and Transformation and cleansing rules for the Master Data Management Architecture.
  • Processed transaction data using Python pandas to identify outliers and inconsistencies.
  • Deployed the model on AWS Lambda, collaborated with the development team to build the business solutions, and stored data from a SQL Server database into Hadoop clusters set up in AWS EMR.
  • Applied Data Governance rules for primary qualifier, Class words and valid abbreviation in table name and Column names.
  • Worked on Tableau for insight reporting and data visualization; developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Performed a POC for a big data solution using Cloudera Hadoop for data loading and data querying.
  • Created OLAP data architecture, analytical data marts, and cubes optimized for reporting, and maintained the database by purging old data.
  • Involved in logical modeling using dimensional modeling techniques such as Star Schema and Snowflake Schema.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Designed ER diagrams (physical and logical, using Confidential) and mapped the data into database objects.
  • Performed data integrity checks, data cleansing, exploratory data analysis, and feature engineering using Python.
  • Created SSIS packages for different data loading operations for many applications.
  • Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy Oracle and SQL Server database systems.
  • Developed LINUX Shell scripts by using NZSQL/NZLOAD utilities to load data from flat files to Netezza database.
  • Worked on the Metadata Repository (MRM) to keep definitions and mapping rules up to date.
  • Used Ab Initio DQE as the data quality solution for enterprise-level data processing and data management systems.
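The pandas-based outlier check on transaction data described above might look roughly like the following sketch; the frame and column names (`txn_id`, `amount`) and the IQR rule are illustrative, not details from the original project:

```python
import pandas as pd

# Hypothetical transaction data; column names are illustrative only.
df = pd.DataFrame({
    "txn_id": [1, 2, 3, 4, 5, 6],
    "amount": [20.0, 22.5, 19.8, 21.1, 500.0, 20.4],
})

# Flag outliers with the interquartile-range (IQR) rule.
q1, q3 = df["amount"].quantile([0.25, 0.75])
iqr = q3 - q1
mask = (df["amount"] < q1 - 1.5 * iqr) | (df["amount"] > q3 + 1.5 * iqr)
outliers = df[mask]  # rows whose amount falls outside the IQR fences
```

The flagged rows would then feed a cleansing or review step rather than being dropped automatically.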

Environment: Erwin 9.6, Informatica, Python, Big Data, LINUX, Confidential, SQL, Oracle, AWS Redshift, AWS S3, AWS EMR, Netezza, Tableau, Hadoop, Hive, OLAP, OLTP, Sqoop, Flume, NZSQL/NZLOAD, MongoDB, Cassandra.

Sr. Data Architect/Data Modeler

Confidential, Dallas TX


  • As an Architect, implemented an MDM hub to provide clean, consistent data for an SOA implementation.
  • Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
  • Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
  • Created and maintained Database Objects (Tables, Views, Indexes, Partitions, Synonyms, Database triggers, Stored Procedures) in the data model.
  • Worked with other Data Architects to design a new Data Mart for the Google Analytics data reports.
  • Acquired image dataset of products from different data sources and aggregated into one dataset on Amazon Redshift.
  • Worked on analyzing Hadoop cluster and different big data analytic tools including Pig, HBase database and Sqoop.
  • Connected to Amazon Redshift through Tableau to extract live data for real time analysis.
  • Created Rich dashboards using Tableau Dashboard and prepared user stories to create compelling dashboards to deliver actionable insights.
  • Created a Hive architecture used for real-time monitoring, with HBase used for reporting.
  • Worked on MapReduce and query optimization for the Hadoop Hive and HBase architecture.
  • Worked with the DBA group to create a best-fit Physical Data Model from the Logical Data Model through forward engineering using Confidential.
  • Generated DDL scripts for database modification, Confidential, macros, views and set tables.
  • Built and maintained scalable data pipelines using the Hadoop ecosystem and other open-source components like Hive and HBase.
  • Performed data management and fulfilled ad-hoc requests according to user specifications, utilizing data management software and tools like Perl, TOAD, MS Access, Excel and SQL.
  • Worked on naming standards for table/column/index/constraint names through Confidential macros and the Master Abbreviations file.
  • Worked in Data Analysis, data profiling and data governance identifying Data Sets, Source Data, Source Meta Data, Data Definitions and Data Formats.
  • Worked on physical design for both SMP and MPP RDBMS, with an understanding of RDBMS scaling features.
  • Researched and deployed new tools, frameworks and patterns to build a sustainable big data platform.
  • Extensively used agile methodology as the Organization Standard to implement the data Models.
  • Involved in performing extensive Back-End testing by writing SQL queries and PL/SQL stored procedures to extract the data from SQL Database.
  • Involved in unit testing and system testing of the OLAP report functionality and validation of the data displayed in the reports.
  • Applied architectural and technology concepts to address scalability, security, reliability, maintainability and sharing of enterprise data.
  • Designed Metadata Repository to store data definitions for entities, attributes & mappings between data warehouse and source system data elements.
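The naming-standard work above (primary qualifiers, class words, approved abbreviations) can be sketched as a simple validator; the upper-snake-case pattern and the class-word list below are hypothetical stand-ins for the real standard enforced through Confidential macros:

```python
import re

# Hypothetical class words that a column name must end with
# (e.g. _ID for identifiers, _CD for codes, _DT for dates).
CLASS_WORDS = ("ID", "CD", "DT", "AMT", "NM", "QTY")

def column_name_ok(name: str) -> bool:
    """Check a column name against the illustrative standard:
    upper-snake-case segments, ending in an approved class word."""
    if not re.fullmatch(r"[A-Z][A-Z0-9]*(_[A-Z0-9]+)*", name):
        return False
    return name.rsplit("_", 1)[-1] in CLASS_WORDS
```

Such a check would typically run against the DDL or the model export before physical implementation.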

Environment: Erwin 9.6, Big Data, OLTP, OLAP, SMP, Confidential R13, Confidential SQL Assistant, Hadoop, Hive, Pig, HBase, DB2, Agile, MS Office, Oracle, SQL Server, Sqoop, Python, SSRS, T-SQL.

Sr. Data Architect/Data Modeler

Confidential, Chicago, IL


  • Worked as a Data Modeler/Architect to generate Data Models using Confidential and developed relational database system.
  • Led architectural design on Big Data and Hadoop projects, providing idea-driven design leadership.
  • Acquired image dataset of products from different data sources and aggregated into one dataset on Amazon Redshift.
  • Involved in several facets of MDM implementations, including data profiling, metadata acquisition and data migration.
  • Designed the Physical Data Model (PDM) using Confidential and Oracle PL/SQL, designed normalization up to 3NF, and performed forward and reverse engineering using Confidential.
  • Built relationships and trust with key stakeholders to support program delivery and adoption of enterprise architecture.
  • Involved in writing T-SQL, working on SSIS, SSRS, SSAS, Data Cleansing, Data Scrubbing and Data Migration.
  • Developed full life-cycle software, including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Involved in Confidential utilities (Confidential, FastLoad, FastExport, MultiLoad, and Confidential) on both Windows and Mainframe platforms.
  • Developed Business Intelligence architecture using Microsoft and Tableau products.
  • Provided guidance and solution concepts for multiple projects focused on data governance and master data management.
  • Performed extensive data profiling and data analysis for detecting and correcting inaccurate data from the databases and track the data quality.
  • Developed Source to Target Matrix with ETL transformation logic for ETL team and Created data masking mappings to mask the sensitive data between production and test environment.
  • Participated with key management resources in the strategic analysis and planning requirements for Data Warehouse/Data Mart reporting and data mining solutions.
  • Managed the meta-data for the Subject Area models for the Data Warehouse environment.
  • Designed and implemented a recommendation system that utilized collaborative filtering techniques to recommend courses for different customers, and deployed it to an AWS EMR cluster.
  • Worked Extensively with DBA and Reporting team for improving the Report Performance with the Use of appropriate indexes and Partitioning.
  • Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL-like access on Hadoop data.
  • Worked with Hadoop eco system covering HDFS, HBase, YARN and Map Reduce.
  • Generated DDL (Data Definition Language) scripts using Confidential and assisted the DBA in physical implementation of the data models.
  • Responsible for technical data governance, enterprise-wide data modeling and database design.
  • Developed Data mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
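One bullet above mentions creating data masking mappings to protect sensitive data between production and test; a minimal deterministic masking rule might be sketched as follows (the email field, token scheme and salt are all hypothetical, not the original mappings):

```python
import hashlib

SALT = "nonprod-2024"  # hypothetical salt; kept out of production configs

def mask_email(email: str) -> str:
    """Deterministically mask the local part of an email address,
    keeping the domain so test data stays realistic. The same input
    always yields the same output, which preserves join keys."""
    local, _, domain = email.partition("@")
    token = hashlib.sha256((SALT + local).encode()).hexdigest()[:8]
    return f"user_{token}@{domain}"
```

Determinism matters here: masked values must still join consistently across tables in the test environment.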

Environment: Confidential 9.5, Tableau, MDM, QlikView, PL/SQL, HDFS, Confidential 13, JSON, Hadoop (HDFS), MapReduce, Pig, Spark, AWS Redshift, AWS EMR, AWS S3, Hive, MongoDB, HBase, SQL, T-SQL.

Sr. Data Modeler/Data Analyst

Confidential, New York, NY


  • Worked as a Data Modeler/Analyst to generate Data Models using Confidential and developed relational database system.
  • Created the logical data model from the conceptual model and converted it into the physical database design using Confidential.
  • Interacted with users for verifying User Requirements, managing Change Control Process, updating existing Documentation.
  • Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW)
  • Worked closely with the Data Architects and DBA team to implement data model changes in the database across all environments.
  • Developed data marts for the base data in Star and Snowflake schemas and was involved in developing the data warehouse for the database.
  • Performed data cleaning and data manipulation activities using NZSQL utility.
  • Generated DDL (Data Definition Language) scripts using Confidential 8 and supported the DBA in Physical Implementation of data Models.
  • Maintained warehouse metadata, naming standards for future application development.
  • Produced PL/SQL statements and stored procedures for extracting as well as writing data.
  • Extensively made use of Triggers, Table Spaces, Pre/Post SQL, Sequences, Materialized Views, Procedures and Packages in Data Models.
  • Performed data analysis and data profiling using SQL queries on various source systems, including Oracle and SQL Server 2008.
  • Used Confidential for reverse engineering to connect to existing databases and the ODS, creating graphical representations in the form of Entity Relationships and eliciting more information.
  • Facilitated meetings with the business and technical team to gather necessary analytical data requirements.
  • Developed database objects such as tables, views and materialized views using SQL.
  • Assisted in designing test plans, test scenarios and test cases for integration, regression and user acceptance testing.
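The star-schema work described above amounts to splitting denormalized rows into dimension tables with surrogate keys and a fact table that references them. A toy pandas sketch (all table and column names hypothetical):

```python
import pandas as pd

# Hypothetical denormalized source rows.
sales = pd.DataFrame({
    "order_id": [101, 102, 103],
    "product": ["widget", "gadget", "widget"],
    "amount": [9.99, 24.50, 9.99],
})

# Build a product dimension with generated surrogate keys.
dim_product = sales[["product"]].drop_duplicates().reset_index(drop=True)
dim_product["product_key"] = dim_product.index + 1

# Build a fact table that references the dimension by key, not by value.
fact_sales = sales.merge(dim_product, on="product")[
    ["order_id", "product_key", "amount"]]
```

In a real warehouse the surrogate keys would come from sequences or identity columns rather than a frame index, but the fact/dimension split is the same.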

Environment: Confidential 9.0, OLAP, OLTP, SSIS, ODS, PL/SQL, Metadata, SQL Server 2008, Oracle 9i

Data Analyst/Data Modeler



  • Involved with all the phases of Software Development Life Cycle (SDLC) methodologies throughout the project life cycle.
  • Analyzed data sources, requirements and business rules to perform logical and physical data modeling.
  • Created a logical design and physical design in ER/Studio.
  • Communicated with users and business analysts to gather requirements. Involved in business process modeling using UML through Rational Rose.
  • Reverse engineered the data models, identified the data elements in the source systems and added new data elements to the existing data models.
  • Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
  • Executed the UNIX shell scripts that invoked SQL loader to load data into tables.
  • Developed Star and Snowflake schemas when designing the Logical Model into the Dimensional Model.
  • Created database objects like tables, views, materialized views, procedures and packages using Oracle tools such as PL/SQL, SQL*Plus and SQL*Loader, and handled exceptions.
  • Involved in database development by creating Oracle PL/SQL Functions, Procedures and Collections.
  • Extensively used ER Studio for developing data model using star schema methodologies.
  • Participated in several JAD (Joint Application Design/Development) sessions to track end to end flow of attributes starting from source screens to all the downstream systems.
  • Involved in data profiling and data cleansing, ensuring the data is accurate when transferred from OLTP to the Data Marts and Data Warehouse.
  • Involved in data extraction, validation, analysis of the data and its storage in data marts.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle 8i and Confidential, to ensure accuracy of the data between the warehouse and source systems.
  • Involved in completing the Data Dictionary, Data Lineage and Data Flow diagrams for metadata.
  • Involved in performance tuning by leveraging the Oracle EXPLAIN utility and SQL tuning.
  • Involved in creating Sessions, worklets and Workflows and scheduling workflows using Workflow Manager.
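The data profiling and warehouse-versus-source validation mentioned above often starts with simple reconciliation checks such as row-count comparisons. A hedged sketch, using in-memory SQLite tables as stand-ins for the real Oracle source and warehouse (table and column names are hypothetical):

```python
import sqlite3

# Toy source system.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (3, "Cho")])

# Toy warehouse, missing one row (e.g. a failed load).
wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE dim_customer (id INTEGER, name TEXT)")
wh.executemany("INSERT INTO dim_customer VALUES (?, ?)",
               [(1, "Ann"), (2, "Bob")])

def row_count(conn, table):
    """Return the row count of a table on the given connection."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

# Reconciliation check: a nonzero difference flags an incomplete load.
diff = row_count(src, "customers") - row_count(wh, "dim_customer")
```

Real profiling goes further (null ratios, domain checks, column checksums), but count reconciliation is the usual first gate.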

Environment: ER Studio, Star Schema, Oracle 9i/8i, Confidential, Oracle SQL Developer, PL/SQL, Business Objects, OLAP, OLTP, Workflow Manager.
