
Sr. Data Modeler/Data Architect Resume


Denver, CO

PROFESSIONAL SUMMARY:

  • Over 8 years of industry experience in IT with a solid understanding of Data Modeling, Data Analysis, Data Architecture, and evaluating data sources, and a strong understanding of Data Warehouse/Data Mart design, BI, OLAP, OLTP, and client/server applications.
  • Experienced in writing and optimizing SQL queries in Oracle, SQL Server, Netezza, and Teradata.
  • Experienced in dimensional and relational data modeling using ER/Studio, Erwin, and Sybase PowerDesigner: Star Schema/Snowflake modeling, fact and dimension tables, and conceptual, logical, and physical data models.
  • Experienced in management and implementation of database models, data flow diagrams, database schemas, DB scripts, DTD schemas, structures, and data standards to support a robust data management infrastructure.
  • Experienced in Data Analysis and Data Profiling using complex SQL on various source systems including Oracle and Teradata.
  • Very good knowledge of and experience with AWS, Redshift, S3, and EMR.
  • Extensive experience in Normalization (1NF, 2NF, 3NF, and BCNF) and De-normalization techniques for improved database performance in OLTP, OLAP, and Data Warehouse/Data Mart environments.
  • Experienced in metadata definition, implementation, and maintenance; identification and implementation of new business rules as data rules; transformation program library maintenance; XML file generation; and data quality.
  • Experienced using MapReduce for big data work on Hadoop and other NoSQL platforms.
  • Experienced in designing standards for using normalized, de-normalized, and dimensional data structures, and defined common design patterns for modeling various types of relationships.
  • Experienced in batch processes, import, export, backup, database monitoring tools, and application support.
  • Experienced in big data analysis and developing data models using Hive, Pig, MapReduce, and SQL, with strong data architecture skills for designing data-centric solutions.
  • Experienced in Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad (MLoad), TPump, FastLoad, and FastExport.
  • Experienced in using databases such as DB2, Teradata (and its utilities), Netezza, Oracle, and SQL Server, as well as SQL Server Integration Services (SSIS).
  • Experienced in loading data from various sources/business systems, including MS Excel, MS Access, and flat files, into SQL Server using SSIS, with features such as data conversion.
  • Experienced in Oracle, Netezza, Teradata, SQL Server, and DB2 database architecture.
  • Expertise in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Extensive experience in development of T-SQL, DTS, OLAP, PL/SQL, Stored Procedures, Triggers, Functions, Packages, performance tuning and optimization for business logic implementation.
  • Good knowledge of Data Marts, Operational Data Store (ODS), OLAP, and Dimensional Data Modeling with the Ralph Kimball methodology (Star Schema and Snowflake modeling for fact and dimension tables) using Analysis Services.
  • Excellent at performing data transfer activities between SAS and various databases and data file formats such as XLS, CSV, DBF, and MDB.
  • Experienced in building dimensional models in ER/Studio and conceptual, logical, and physical data models using Erwin's advanced features.
  • Experienced in development and support of Oracle SQL, PL/SQL, and T-SQL queries.
  • Experienced in testing integration solutions for Data import, export and Migration using EIM (Enterprise Integration Manager)
  • Excellent knowledge of Ralph Kimball's and Bill Inmon's approaches to Data Warehousing.
  • Excellent knowledge in developing Informatica Mappings, Mapplets, Sessions, Workflows, and Worklets for data loads from various sources such as Oracle, flat files, DB2, and SQL Server.
  • Experienced in writing UNIX shell scripts, with hands-on experience scheduling shell scripts using Control-M.
  • Extensive experience in Relational and Dimensional Data modeling for creating Logical and Physical Design of Database and ER Diagrams using multiple data modeling tools like ERWIN, ER Studio.

TECHNICAL SKILLS:

Big Data technologies: MapReduce, HBase 1.2, HDFS, Sqoop 1.4, Spark 2.3, Hadoop 3.0, Hive 2.3, Pig 0.17, Impala 2.10.

Data Modeling Tools: ER/Studio 17, Erwin 9.7, Sybase PowerDesigner.

OLAP Tools: Tableau 10.5, SAP BusinessObjects (BO), SSAS, and Crystal Reports 14.2

Cloud Platform: AWS (EC2, S3, ELB, RDS, Redshift), MS Azure (basic)

Programming Languages: SQL, PL/SQL, UNIX shell Scripting, PERL, AWK, SED

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.

Testing and Defect Tracking Tools: HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center, Requisite), MS Visio & Visual SourceSafe

Operating System: Windows, Unix, Sun Solaris

ETL/Data warehouse Tools: Informatica, SAP Business Objects XIR3.1/XIR2 and Tableau 10.5.

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Waterfall Model.

WORK EXPERIENCE:

Confidential, Denver, CO

Sr. Data Modeler/Data Architect

Responsibilities:

  • Led the design and modeling of tactical architectures for development, delivery, and support of projects.
  • Developed full-lifecycle software, including defining requirements, prototyping, designing, coding, testing, and maintaining software.
  • Used Agile methodology for data warehouse development, managed with Kanbanize.
  • Interacted with business users to analyze business processes and requirements, transformed requirements into conceptual, logical, and physical data models, designed the database, and documented and rolled out deliverables.
  • Responsible for Master Data Management (MDM) and Data Lake design and architecture; the Data Lake is built on Cloudera Hadoop.
  • Involved in normalization and de-normalization of existing tables for faster query retrieval, and designed both 3NF data models for ODS/OLTP systems and dimensional data models using star and snowflake schemas.
  • Used forward engineering to create a physical data model with DDL that best suits the requirements from the Logical Data Model.
  • Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information.
  • Implementation of full lifecycle in Data warehouses and Business Data marts with Star Schemas, Snowflake Schemas, SCD & Dimensional Modeling.
  • Designed and documented Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object Oriented Design) using UML and Visio.
  • Worked with NoSQL databases like HBase in creating HBase tables to load large sets of semi-structured data coming from various sources.
  • Exported data from the HDFS environment into RDBMS using Sqoop for report generation and visualization purposes.
  • Responsible for Dimensional Data Modeling and modeling diagrams using Erwin.
  • Extracted files from Cassandra and MongoDB through Sqoop and placed them in HDFS for processing.
  • Implemented dynamic partitioning and bucketing in Hive as part of performance tuning (a minimal HiveQL sketch follows this list), and used the Oozie framework to automate workflow and coordinator tasks.
  • Developed Pig Latin scripts to replace the existing legacy process on Hadoop, with the resulting data fed to AWS S3.
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Developed SQL and BTEQ (Teradata) queries for extracting data from the production database, and built data structures and reports.
  • Wrote and executed unit, system, integration and UAT scripts in a data warehouse project.
  • Created queries using BI reporting variables, navigational attributes, and filters; developed workbooks and InfoSet queries; and defined reports per reporting requirements.
  • Implemented slowly changing and rapidly changing dimension methodologies; created aggregate fact tables for the creation of ad-hoc reports.
  • Created and maintained surrogate keys on the master tables to handle SCD Type 2 changes effectively (a minimal two-step sketch appears after this section's Environment line).
  • Reverse engineered the data model from database instances and scripts.
  • Implemented the Slowly Changing Dimensions as per the requirement.
  • Ran quality checks using SQL queries and kept all databases in sync with the Erwin model across all environments.
  • Applied naming standards to the data model and followed company standards for project documentation.
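
A minimal HiveQL sketch of the dynamic partitioning and bucketing approach referenced above. The table and column names (web_events, staging_web_events, event_date, user_id) are illustrative assumptions, not taken from the original project.

```sql
-- Enable dynamic partitioning before running dynamic-partition inserts.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- Illustrative target: partitioned by date, bucketed by user_id so that
-- joins and sampling on user_id read fewer, evenly sized files.
CREATE TABLE web_events (
    user_id    BIGINT,
    event_type STRING,
    payload    STRING
)
PARTITIONED BY (event_date STRING)
CLUSTERED BY (user_id) INTO 32 BUCKETS
STORED AS ORC;

-- Dynamic-partition insert: Hive routes each row to its partition based
-- on the event_date value in the last position of the SELECT list.
INSERT OVERWRITE TABLE web_events PARTITION (event_date)
SELECT user_id, event_type, payload, event_date
FROM   staging_web_events;
```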

Environment: Erwin 9.6, Agile, MDM, Kanbanize, SQL, BTEQ, Teradata R14, DBA, ODS, OLTP, OOD, UML, ETL, Hadoop 3.0, Cassandra, MongoDB, Sqoop 1.4, HDFS, Oozie, Pig.
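
A hedged sketch of the surrogate-key handling for SCD Type 2 mentioned above, written as the common two-step expire-and-insert pattern. All object names (customer_dim, stg_customer) and tracked attributes are assumptions; the surrogate key here is derived from the current maximum key, while an identity column or sequence would be the platform-native alternative on Teradata or Oracle.

```sql
-- Step 1: expire the current version of customers whose attributes changed.
UPDATE customer_dim
SET    end_date   = CURRENT_DATE,
       is_current = 'N'
WHERE  is_current = 'Y'
  AND  EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = customer_dim.customer_id
                 AND (s.customer_name <> customer_dim.customer_name
                      OR s.segment    <> customer_dim.segment));

-- Step 2: insert a new current version (new surrogate key) for every
-- customer with no current row: both new and just-expired ones qualify.
INSERT INTO customer_dim
      (customer_sk, customer_id, customer_name, segment,
       start_date, end_date, is_current)
SELECT mx.max_sk + ROW_NUMBER() OVER (ORDER BY s.customer_id),
       s.customer_id, s.customer_name, s.segment,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
CROSS JOIN (SELECT COALESCE(MAX(customer_sk), 0) AS max_sk
            FROM   customer_dim) mx
WHERE  NOT EXISTS (SELECT 1
                   FROM   customer_dim d
                   WHERE  d.customer_id = s.customer_id
                     AND  d.is_current  = 'Y');
```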

Confidential, Chicago, IL

Sr. Data Modeler/Data Architect

Responsibilities:

  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and BigData technologies.
  • Monitored and measured data architecture processes and standards to ensure value is being driven and delivered as expected.
  • Responsible for the data architecture design delivery, data model development, review, approval and Data warehouse implementation.
  • Established data architecture strategy, best practices, standards and roadmaps.
  • Reviewed system architecture, data flow, data warehouse dimensional models, and DDL to identify areas for improvement and reduce loading and reporting time for a meter reading system.
  • Assigned tasks among the development team and monitored and tracked project progress following Agile methodology.
  • Designed both 3NF data models for ODS/OLTP systems and dimensional data models using star and snowflake schemas.
  • Worked with data governance, data quality, data lineage, and data architecture to design various models and processes.
  • Performed data validation against source system data, analyzing existing database source files and tables before ingesting data into the Hadoop Data Lake (a reconciliation query sketch follows this list).
  • Maintained database architecture and metadata that support the Enterprise Data Warehouse (EDW).
  • Involved in designing logical and physical data models for different database applications using Erwin 9.6.
  • Worked as architect and built data marts using hybrid Inmon and Kimball DW methodologies.
  • Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management (MDM) architecture involving OLTP and ODS.
  • Created MDM base objects and landing/staging tables to support the comprehensive MDM data model.
  • Involved in all steps and scope of the project's reference data approach to MDM, creating a data dictionary and source-to-target mapping for the MDM data model.
  • Extensively worked on creating the migration plan to Amazon Web Services (AWS).
  • Conducted numerous POCs (proofs of concept) to efficiently import large data sets into the database from an AWS S3 bucket (a hedged COPY sketch follows this section).
  • Involved in resolving the dependencies that create problems while migrating to the cloud.
  • Responsible for big data initiatives and engagement, including analysis, brainstorming, POCs, and architecture.
  • Imported required tables from RDBMS to HDFS using Sqoop.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce for loading data into HDFS and extracted the data from MySQL into HDFS using Sqoop.
  • Designed and architected AWS cloud solutions for data and analytics workloads such as warehouses, big data, data lakes, real-time streams, and advanced analytics.
  • Created tables, views, sequences, indexes, constraints and generated SQL scripts for implementing physical data model.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Created dispatch scripts to move data from HDFS to Teradata.
  • Responsible for Metadata Management, keeping up to date centralized metadata repositories using Erwin modeling tools.
  • Performed data validation on the flat files that were generated in UNIX environment using UNIX commands as necessary.
  • Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from Teradata database.
  • Collected large amounts of log data using Apache Flume and aggregated it using Pig in HDFS for further analysis.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
  • Generated periodic reports based on the statistical analysis of the data using SQL Server Reporting Services (SSRS).
  • Worked with QA team members to understand test coverage of the functionality, identify missing test cases, and help QA build a strong test suite for the data pipelines.
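
A minimal sketch of the kind of source-vs-target validation described above, comparing per-day row counts and sums before sign-off on the ingested copy. The schema and column names (src.orders, lake.orders, order_amt, load_date) are assumptions for illustration.

```sql
-- Return only load dates where the source and the ingested copy disagree
-- on row count or on the sum of a key measure.
SELECT COALESCE(s.load_date, t.load_date) AS load_date,
       s.row_cnt  AS source_rows,
       t.row_cnt  AS target_rows,
       s.amt_sum  AS source_amt_sum,
       t.amt_sum  AS target_amt_sum
FROM  (SELECT load_date, COUNT(*) AS row_cnt, SUM(order_amt) AS amt_sum
       FROM   src.orders GROUP BY load_date) s
FULL OUTER JOIN
      (SELECT load_date, COUNT(*) AS row_cnt, SUM(order_amt) AS amt_sum
       FROM   lake.orders GROUP BY load_date) t
  ON  s.load_date = t.load_date
WHERE COALESCE(s.row_cnt, -1) <> COALESCE(t.row_cnt, -1)
   OR COALESCE(s.amt_sum, -1) <> COALESCE(t.amt_sum, -1);
```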

Environment: MDM, OLTP, ODS, Hadoop 3.0, Spark 2.3, AWS, Pipeline, SSRS, Erwin 9.7, Teradata R15, HDFS, BTEQ, MLoad, Sqoop 1.4, RDBMS, DDL, DML, POC, HBase 1.2, YARN, MapReduce, Pig 0.17, Unix
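
The resume does not name the target database for the S3 import POCs; assuming an Amazon Redshift target (Redshift appears in the cloud skills list), a minimal hedged sketch of the bulk-load pattern follows. The bucket, table, and IAM role names are placeholders.

```sql
-- Bulk-load a large S3 data set into Redshift in one parallel COPY.
COPY analytics.page_views
FROM 's3://example-bucket/page_views/2018/'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
FORMAT AS CSV
GZIP
TIMEFORMAT 'auto'
COMPUPDATE ON;   -- let Redshift choose column compression on first load
```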

Confidential, Malvern, PA

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Created the physical data model from the logical data model using the Compare and Merge utility in ER/Studio and worked with the naming standards utility.
  • Implemented the NoSQL database HBase and managed the other tools and processes running on YARN.
  • Worked with developers on data Normalization and De-normalization, performance tuning issues, and aided in stored procedures as needed.
  • Utilized SDLC and Agile methodologies such as SCRUM.
  • Developed normalized Logical and Physical database models for designing an OLTP application.
  • Extensively used Star Schema methodologies in building and designing the logical data model into dimensional models.
  • Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools such as PL/SQL and SQL*Loader, and handled exceptions (a PL/SQL sketch follows this section).
  • Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Involved in administrative tasks, including creation of database objects such as database, tables, and views, using SQL, DDL, and DML requests.
  • Worked on Data Analysis, Data profiling, and Data Modeling, data governance identifying Data Sets, Source Data, Source Meta Data, Data Definitions and Data Formats.
  • Loaded multi-format data from various sources such as flat files, Excel, and MS Access, and performed file system operations.
  • Used T-SQL stored procedures to transfer data from OLTP databases to the staging area and finally into data marts.
  • Worked on physical design for both SMP and MPP RDBMS, with an understanding of RDBMS scaling features.
  • Wrote SQL Queries, Dynamic-queries, sub-queries and complex joins for generating Complex Stored Procedures, Triggers, User-defined Functions, Views and Cursors.
  • Wrote simple and advanced SQL queries and scripts to create standard and ad hoc reports for senior managers.
  • Performed ETL SQL optimization, designed the OLTP system environment, and maintained metadata documentation.
  • Involved with data analysis, primarily identifying data sets, source data, source metadata, data definitions, and data formats.
  • Used Teradata for OLTP systems by generating models to support Revenue Management Applications that connect to SAS.
  • Created SSIS Packages for import and export of data between Oracle database and others like MS Excel and Flat Files.
  • Worked in the capacity of ETL Developer (Oracle Data Integrator (ODI) / PL/SQL) to migrate data from different sources into the target Oracle Data Warehouse.
  • Designed and Developed PL/SQL procedures, functions and packages to create Summary tables.
  • Involved in creating tasks to pull and push data from Salesforce to Oracle Staging/Data Mart.
  • Created VBA macros to convert the Excel input files into the correct format and loaded them to SQL Server.
  • Helped BI and ETL developers understand the data model, data flow, and expected output for each model created.

Environment: ER/Studio 8.0, Oracle 11g Application Server, Oracle Developer Suite, PL/SQL, T-SQL, SQL*Plus, SSIS, Teradata 13, OLAP, OLTP, SAS, MS Excel, NoSQL, HBase & YARN.
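
A minimal sketch of a PL/SQL summary-load procedure with exception handling, in the spirit of the work described above. All object names (daily_sales_summary, sales, etl_error_log) are illustrative assumptions.

```sql
CREATE OR REPLACE PROCEDURE load_daily_summary (p_run_date IN DATE) AS
    v_err VARCHAR2(4000);
BEGIN
    DELETE FROM daily_sales_summary
    WHERE  sales_date = p_run_date;              -- make re-runs idempotent

    INSERT INTO daily_sales_summary (sales_date, region, total_amt)
    SELECT TRUNC(sale_ts), region, SUM(amount)
    FROM   sales
    WHERE  sale_ts >= p_run_date
      AND  sale_ts <  p_run_date + 1
    GROUP  BY TRUNC(sale_ts), region;

    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        v_err := SQLERRM;   -- SQLERRM cannot be referenced directly in SQL
        INSERT INTO etl_error_log (proc_name, err_msg, logged_at)
        VALUES ('LOAD_DAILY_SUMMARY', v_err, SYSTIMESTAMP);
        COMMIT;
        RAISE;              -- re-raise so the scheduler sees the failure
END load_daily_summary;
/
```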

Confidential, Mount Laurel, NJ

Sr. Data Modeler

Responsibilities:

  • Identified the business function activities and processes, data attribute and table metadata, and documented detailed design specifications.
  • Conducted JAD sessions with Business Analysts and SMEs to understand, analyze, and document business requirements.
  • Managed development of Logical and Physical Data models by using Erwin.
  • Interacted with Business analysts and Developers to gather the Data Definitions for the Data Models to get the Data Dictionaries in place.
  • Generated DDL (Data Definition Language) scripts using Erwin and assisted the DBA in the physical implementation of data models.
  • Involved in making ER diagrams for the project using Erwin and Visio.
  • Created and administered Erwin repositories for the team of data modelers.
  • Coordinated with the Business Analyst and prepared Logical and Physical Data-models as per the requirement.
  • Designed and developed the DDL scripts, data mapping documents, data dictionary and Meta data of the models and maintained them.
  • Coordinated with the DBA on table normalizations and de-normalizations.
  • Prepared documentation for all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, and changes during the project.
  • Generated user and technical documentation for individual processes.
  • Identified and streamlined complex queries that were causing iterations and affecting database and system performance.
  • Participated in the status meetings and updated the status to the management team.

Environment: Erwin 8.5, Oracle 9i, SQL, Microsoft Access, Oracle, SQL Server 2012, MS PowerPoint, Microsoft Excel, Microsoft Visio.

Confidential, Denver, CO

Data Analyst/Data Modeler

Responsibilities:

  • Worked on the team responsible for analysis of business requirements and design and implementation of the business solution.
  • Developed logical and physical data models for central model consolidation.
  • Worked with DBAs to create a best fit physical data model from the logical data model.
  • Conducted data modeling JAD sessions and communicated data-related standards.
  • Used Erwin r8 for effective model management, sharing, dividing, and reusing model information and designs for productivity improvement.
  • Used Star/Snowflake schemas in the data warehouse architecture.
  • Redefined many attributes and relationships in the reverse-engineered model and cleansed unwanted tables/columns as part of data analysis responsibilities.
  • Developed process methodology for the Reverse Engineering phase of the project.
  • Used reverse engineering to connect to the existing database and create a graphical representation (E-R diagram).
  • Utilized Erwin's reverse engineering and target database schema conversion process.
  • Involved in logical and physical designs and transformed logical models into physical implementations.
  • Created 3NF business area data models with de-normalized physical implementations, and performed data and information requirements analysis using the Erwin tool.
  • Involved in extensive data analysis on Teradata and Oracle systems, querying and writing in SQL and Toad.
  • Used the ETL tool Informatica to populate the database, transforming data from the old database to the new one using Oracle and SQL Server.
  • Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools such as PL/SQL, SQL*Plus, and SQL*Loader, and handled exceptions.
  • Used Informatica Designer, Workflow Manager, and Repository Manager to create source and target definitions, design mappings, create repositories, and establish users, groups, and their privileges.
  • Involved in Data profiling in order to detect and correct inaccurate data and maintain the data quality.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Involved in the creation, maintenance of Data Warehouse and repositories containing Metadata.
  • Developed Star and Snowflake schema-based dimensional models for the data warehouse (a DDL sketch follows this section).
  • Involved in studying the business logic and understanding the physical system and the terms and conditions for the database.
  • Worked closely with the ETL SQL Server Integration Services (SSIS) Developers to explain the Data Transformation.
  • Created reports using SQL Reporting Services (SSRS) for customized and ad-hoc Queries.
  • Created documentation and test cases, worked with users for new module enhancements and testing.
  • Created simple and complex mappings using DataStage to load dimension and fact tables per Star Schema techniques.
  • Designed and Developed Oracle database Tables, Views, Indexes with proper privileges and Maintained and updated the database by deleting and removing old data.
  • Generated ad-hoc reports using Crystal Reports.

Environment: Erwin r8, Informatica 7.0, Windows XP, Oracle 10g, SQL Server 2008, MS Excel, MS Visio, Microsoft Transaction Server, Crystal Reports, SQL*Loader
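
A minimal star schema sketch of the kind of dimensional model referenced above: one fact table with foreign keys to conformed dimensions. All table and column names are illustrative assumptions.

```sql
CREATE TABLE dim_date (
    date_sk      INTEGER      PRIMARY KEY,
    calendar_dt  DATE         NOT NULL,
    month_nbr    SMALLINT     NOT NULL,
    year_nbr     SMALLINT     NOT NULL
);

CREATE TABLE dim_product (
    product_sk   INTEGER      PRIMARY KEY,
    product_id   VARCHAR(20)  NOT NULL,   -- natural key from the source
    product_name VARCHAR(100) NOT NULL,
    category     VARCHAR(50)
);

CREATE TABLE fact_sales (
    date_sk      INTEGER       NOT NULL REFERENCES dim_date (date_sk),
    product_sk   INTEGER       NOT NULL REFERENCES dim_product (product_sk),
    quantity     INTEGER       NOT NULL,
    sales_amt    NUMERIC(12,2) NOT NULL
);

-- Typical ad-hoc rollup against the star:
SELECT d.year_nbr, p.category, SUM(f.sales_amt) AS total_sales
FROM   fact_sales f
JOIN   dim_date    d ON d.date_sk    = f.date_sk
JOIN   dim_product p ON p.product_sk = f.product_sk
GROUP  BY d.year_nbr, p.category;
```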

Confidential, San Jose, CA

Data Analyst

Responsibilities:

  • Maintained channels of organizational communication and acted as the point of contact between teams.
  • Created Teradata FastLoad, BTEQ, and FastExport scripts and UNIX shell scripts to perform fulfillments; created tables and views to meet business needs, and supported production systems with day-to-day activities and performance issues (a minimal BTEQ sketch follows this section).
  • Developed SQL queries for extracting data from the production database and built data structures and reports.
  • Used MS Excel and MS Access for data pulls and ad hoc reports for analysis.
  • Performed performance tuning and optimization of large databases for fast data access, with reports in MS Excel.
  • Generated reports for various departments such as Telemarketing, Mailing, and New Accounts using SQL and MS Access.
  • Developed Data mining processes for large data files.
  • Developed ad-hoc reports using Oracle, SQL, and UNIX.
  • Performed in-depth data analysis and prepared weekly, biweekly, and monthly reports using SQL, SAS, MS Excel, MS Access, and UNIX.
  • Documented business requirements and system functional specifications in the form of use cases.
  • Extensively used Excel and VBA to generate reports.
  • Created and manipulated datasets using SAS, Access, and Excel.
  • Wrote JCL (Job Control Language) programs to communicate with the MVS (ISPF) operating system.
  • Designed and implemented SQL queries for QA testing and report/data validation.
  • Utilized a combination of business knowledge, technical skills and strategic analysis to provide solutions and creative insights to critical business problems.
  • Executed, maintained, and debugged scripts with Teradata BTEQ.
  • Created SQL tables, views, and macros; analyzed various Teradata tables for UPIs and monitored them; executed mainframe JCL scripts.
  • Utilized ODBC for connectivity to SQL databases and MS Excel.
  • Gathered and identified requirements per business needs via meetings, interviews, interface analysis, research, etc.

Environment: Unix, SQL, MS Excel 2007, MS Access 2007, Oracle r8, Informatica, SAS, VBA, MVS, ISPF, JCL, ODBC
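
A minimal BTEQ script sketch in the style described above: log on, run staged SQL with an error check after each request (aborting with a distinct nonzero return code for the scheduler), export a report, and log off. The TDPID, credentials, table names, and file path are placeholders.

```sql
.LOGON tdprod/etl_user,password;

DELETE FROM work_db.daily_stage;
.IF ERRORCODE <> 0 THEN .QUIT 10;

INSERT INTO work_db.daily_stage
SELECT account_id, txn_dt, txn_amt
FROM   prod_db.transactions
WHERE  txn_dt = CURRENT_DATE - 1;    -- previous day's activity
.IF ERRORCODE <> 0 THEN .QUIT 20;

.EXPORT REPORT FILE = /tmp/daily_stage_report.txt
SELECT account_id, txn_dt, txn_amt
FROM   work_db.daily_stage
ORDER  BY account_id;
.EXPORT RESET

.LOGOFF;
.QUIT 0;
```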
