Sr. Data Architect/ Modeler Resume

Chicago, IL

SUMMARY

  • Around 7 years of experience in Data Architecture and Data Modeling, and in the development, implementation and maintenance of databases and software applications.
  • Hands-on experience writing and optimizing SQL queries in Oracle, SQL Server, DB2, Netezza and Teradata.
  • Experience in Data Modeling/Analysis for Enterprise Data Warehouse (OLAP), Master and Reference Data Management (MDM) and OLTP systems, including MicroStrategy Desktop/Developer, Web, Architect, OLAP Services, Administrator and Intelligence Server.
  • Experience in designing Star and Snowflake schemas for Data Warehouse and ODS architectures.
  • Excellent technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.
  • Logical and physical database design (tables, constraints, indexes, etc.) using Erwin, ER Studio, TOAD Data Modeler and SQL Developer Data Modeler.
  • Strong experience in Data Analysis, Data Migration, Transformation, Integration, Data Import, and Data Export through ETL tools.
  • Hands-on experience with Agile and Waterfall software development methodologies.
  • Hands-on experience in architecting and data modeling for the AWS platform, including Amazon Redshift, Oracle RDS, PostgreSQL RDS and Aurora.
  • Extensive knowledge of Bill Inmon (Enterprise Data Warehouse) and Ralph Kimball (Data Marts) methodologies, Database Design Methodologies (Normalization and De-Normalization).
  • Expert in Conceptual, Logical and Physical Data Modeling for various platforms including Oracle, DB2, Teradata, PostgreSQL, SQL Server.
  • Hands-on experience as Procedural DBA using Oracle toolset (PL/SQL, SQL, Performance Tuning).
  • Experience in ETL techniques, analysis and reporting, including working experience with ETL and reporting tools such as Informatica, Ab Initio and Tableau.
  • Experience in integration of Salesforce and SQL Server using SQL Server Integration Services (SSIS).
  • Developed complex database objects such as stored procedures, functions, packages and triggers using SQL and PL/SQL.
  • Developed materialized views for data replication in distributed environments.
  • Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing and analysis of big data.
  • Experienced in Technical consulting and end-to-end delivery with architecture, data modeling, data governance and design - development - implementation of solutions.
  • Knowledge and working experience on big data tools like Hadoop, Azure Data Lake, AWS Redshift.
  • Experience in Business Intelligence (BI) project development and implementation using the MicroStrategy product suite.
  • Expertise on Relational Data modeling (3NF) and Dimensional data modeling.
  • Experience in developing MapReduce Programs using Apache Hadoop for analyzing the big data as per the requirement.
  • Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2 & Teradata.
  • Strong background in data modeling tools including Erwin, IBM Data Architect, PowerDesigner and MS Visio.
  • Experience with relational (3NF) and dimensional data architectures. Experience in leading cross-functional, culturally diverse teams to meet strategic, tactical and operational goals and objectives.
  • Extensive experience in Relational Data Modeling, Logical data model/Physical data models Designs, ER Diagrams, Forward and Reverse Engineering, Publishing ERWIN diagrams, analyzing data sources and creating interface documents.
  • Experience in working with business intelligence and data warehouse software, including SSAS, Pentaho, Cognos, OBIEE, Greenplum Database, Amazon Redshift and Azure Data Warehouse.
  • Excellent experience in developing stored procedures, triggers, functions, packages, inner and outer joins, and views using T-SQL/PL-SQL.
  • Experience in designing error and exception handling procedures to identify, record and report errors.
  • Experience in migration of data from Excel, DB2, Sybase, Flat file, Teradata, Netezza, Oracle to MS SQL Server using BCP and DTS utility and extracting, transforming and loading of data.
  • Experience in using Oracle, SQL*PLUS, and SQL*Loader.
  • Experience in automating and scheduling Informatica jobs using UNIX Korn shell scripts for Informatica sessions.
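The star/snowflake and fact/dimension modeling experience listed above can be sketched with a minimal example. SQLite stands in for the warehouse platforms named in this resume, and all table and column names (`dim_customer`, `fact_sales`, etc.) are illustrative, not taken from any actual engagement.

```python
import sqlite3

# In-memory database stands in for a warehouse platform (Oracle, Teradata, etc.)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes keyed by a surrogate key
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        customer_name TEXT,
        region TEXT
    )""")

# Fact table: measures plus foreign keys pointing at the dimensions
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        sale_date TEXT,
        amount REAL
    )""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "Midwest"), (2, "Globex", "East")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(10, 1, "2018-01-05", 250.0),
                 (11, 1, "2018-01-06", 100.0),
                 (12, 2, "2018-01-06", 75.0)])

# Typical star-schema query: join the fact to a dimension, aggregate a measure
cur.execute("""
    SELECT d.region, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.region
    ORDER BY d.region""")
totals = dict(cur.fetchall())
```

A snowflake variant would further normalize `dim_customer` (e.g. split `region` into its own table); the fact table and the join pattern stay the same.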

TECHNICAL SKILLS

Data Modeling Tools: Rational System Architect, IBM Infosphere Data Architect, Erwin 9.7, E/R Studio 17, Power Designer and Oracle SQL Developer.

Big Data Technology: MapReduce, HBase 1.2, HDFS, Sqoop 1.4, Hadoop 3.0, Hive 2.3, PIG.

Cloud Platforms: Amazon EC2, S3, Elasticsearch, Elastic Load Balancing.

OLAP Tools: Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports.

ETL/Data warehouse Tools: Informatica 9.6/9.1/8.6.1/8.1, SAP BusinessObjects XI R3.1/XI R2, Web Intelligence, Talend, Pentaho.

Database Tools: Microsoft SQL Server 2016, Teradata 16.0, Oracle 12c/11g and MS Access.

Version Tool: VSS, SVN, CVS.

Operating System: Windows, UNIX, Sun Solaris.

Packages: Microsoft Office 2016, Microsoft Project 2016, SAP, Microsoft Visio and SharePoint Portal Server.

Programming Languages: SQL, PL/SQL, HTML5 and XML.

Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, RUP, RAD and JAD.

Databases &amp; Scripting: Oracle 12c, PL/SQL, UNIX, Shell Scripting

PROFESSIONAL EXPERIENCE

Confidential, Chicago, IL

Sr. Data Architect/ Modeler

Responsibilities:

  • Interacted with Business Analysts, SMEs and other Data Architects to understand business needs and functionality for various project solutions.
  • Developed full-life-cycle software, including defining requirements, prototyping, designing, coding, testing and maintaining the software.
  • Developed the long-term data warehouse roadmap and architected, designed and built the data warehouse framework per that roadmap.
  • Heavily involved in the Data Architect/Modeler role, reviewing business requirements and composing source-to-target data mapping documents.
  • Specified the overall Data Architecture for all areas and domains of the enterprise, including Data Acquisition, ODS, MDM, Data Warehouse, Data Provisioning and ETL.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Responsible for Big Data initiatives and engagement, including analysis, brainstorming, POCs and architecture.
  • Working on data profiling and analysis to create test cases for new Architecture evaluation.
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Involved in end to end implementation of Big data design.
  • Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL access to Hadoop data.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from NoSQL and a variety of portfolios.
  • Developed and configured the Informatica MDM Hub to support the Master Data Management (MDM), Business Intelligence (BI) and Data Warehousing platforms to meet business needs.
  • Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Extensively used Erwin r9.7 for Data modeling. Created Staging and Target Models for the Enterprise Data Warehouse.
  • Extensively used agile methodology as the Organization Standard to implement the data Models.
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Worked on several Ab Initio ETL assignments to extract, transform and load data into Teradata and Oracle databases with complex relational, Star and Snowflake schema data models.
  • Worked on SSIS development using a metadata-driven architecture.
  • Identified and streamlined complex queries that were causing excessive iterations and affecting database and system performance.
  • Coordinated with the Business Analyst and prepared Logical and Physical Data-models as per the requirements involving multiple subject areas, domains and naming standards.
  • Created DataStage jobs (ETL processes) to populate the Data Warehouse continually from different source systems such as the ODS and flat files, and scheduled them in DataStage for system integration testing.
  • Developed a data mart for the base data in Star and Snowflake schemas and was involved in developing the data warehouse for the database.
  • Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
  • Worked with reverse engineering Data Model from Database instance and Scripts.
  • Integrated metadata across the enterprise data dictionary, catalogs, data models and Informatica, establishing lineage and traceability.
  • Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
  • Worked on AWS, architecting a solution to load data, create data models and run BI on top of it.
  • Developed Data Mapping, Data Governance, and Transformation and cleansing rules for the Master Data Management Architecture involving OLTP, ODS.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Designed and developed the architecture for a data services ecosystem spanning relational, NoSQL and Big Data technologies.
  • Worked with the ETL team to document the SSIS packages for data extraction to Warehouse environment for reporting purposes.
  • Documented ER Diagrams, Logical and Physical models, business process diagrams and process flow diagrams.
  • Used SSRS to create reports, customized Reports, on-demand reports, ad-hoc reports and involved in analyzing multi-dimensional reports in SSRS.
  • Presented data scenarios via Erwin logical models and Excel mockups to visualize the data better.
  • Responsible for creating conceptual, logical and physical data models of disparate information for report development.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models.
  • Conducted and participated in JAD sessions with the users, modelers, and developers for resolving issues.
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
  • Worked with various Teradata 15 tools and utilities such as Teradata Viewpoint, MultiLoad, ARC, Teradata Administrator, BTEQ and other Teradata utilities.
  • Created a common DataStage framework covering all tables.
  • Created DDL scripts using Erwin and source to target mappings to bring the data from source to the warehouse.
  • Proficient in SQL across a number of dialects, commonly writing MySQL, PostgreSQL, SQL Server and Oracle SQL.
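The source-to-target mapping and staging/target model work described above usually comes down to set-based loads that standardize and type-cast raw feed data on the way into the warehouse. A minimal sketch, with SQLite standing in for the warehouse and all table names (`stg_orders`, `dw_orders`) purely illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table mirrors the raw source feed; target uses warehouse naming and types
cur.execute("CREATE TABLE stg_orders (ord_no TEXT, cust TEXT, amt TEXT)")
cur.execute("""
    CREATE TABLE dw_orders (
        order_id TEXT PRIMARY KEY,
        customer_name TEXT,
        order_amount REAL
    )""")

cur.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)",
                [("A-1", "  acme ", "100.50"), ("A-2", "globex", "20")])

# Source-to-target mapping applied as one set-based INSERT ... SELECT:
# trim and standardize the name, cast the amount from text to a numeric type
cur.execute("""
    INSERT INTO dw_orders (order_id, customer_name, order_amount)
    SELECT ord_no, UPPER(TRIM(cust)), CAST(amt AS REAL)
    FROM stg_orders""")

rows = cur.execute("SELECT * FROM dw_orders ORDER BY order_id").fetchall()
```

In practice the mapping document enumerates one such column-level rule per target column, and the ETL tool (DataStage, SSIS, Informatica) generates or executes the equivalent transformation.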

Environment: Erwin 9.7, HBase 1.2, NoSQL, OLTP, OLAP, Teradata 15, Netezza, SQL Architect, MySQL, Oracle 12c, AWS, Hive 2.3, HDFS, Informatica, Snowflake Schema, Star Schema, SQL, Hadoop, SSIS.

Confidential, Monroe, LA

Data Architect /Modeler

Responsibilities:

  • Responsible for Data Architecture, Data Modeling, Data Integration, Data quality & Metadata management solution design and delivery for Enterprise EDW environment.
  • Worked very closely with the Data Architecture and DBA teams to implement data model changes in the database in all environments.
  • Responsible for delivering and coordinating data-profiling, data-analysis, data-governance, data-models (conceptual, logical, physical), data-mapping, data-lineage and reference data management.
  • Worked on DataStage admin activities such as creating ODBC connections to various data sources, server startup and shutdown, creating environment variables and creating DataStage projects.
  • Participated in all phases of project including Requirement gathering, Architecture, Analysis, Design, Coding, Testing, Documentation and warranty period.
  • Involved in relational and dimensional data modeling to create logical and physical database designs and ER diagrams, with all related entities and relationships based on the rules provided by the business manager, using ER Studio.
  • Worked with normalization and de-normalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon data warehouse methodologies.
  • Designed both 3NF data models for ODS, OLTP systems and dimensional data models using Star and Snow Flake Schemas.
  • Used forward engineering to generate DDL from the Physical Data Model and handed it to the DBA.
  • Integrated Spotfire visualization into client's Salesforce environment.
  • Involved in normalization and de-normalization of existing tables for faster query retrieval.
  • Involved in planning, defining and designing the database using ER Studio based on business requirements, and provided documentation.
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Developed Full life cycle of Data Lake, Data Warehouse with Big data technologies like Spark and Hadoop.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Responsible for all metadata relating to the EDW's overall data architecture, descriptions of data objects, access methods and security requirements.
  • Conducted walkthroughs with the Business Analysts, Development teams and DBA to convey the changes made to the data models.
  • Used Agile Methodology of Data Warehouse development using Kanbanize.
  • Worked with the Business Analyst, QA team in their testing and DBA for requirements gathering, business analysis, testing and project coordination.
  • Worked with DBA group to create Best-Fit Physical Data Model from the Logical Data Model using Forward Engineering.
  • Worked with NoSQL databases like HBase in creating HBase tables to load large sets of semi-structured data coming from various sources.
  • Developed DataStage design concepts, and handled execution, testing and deployment on the client server.
  • Developed Linux shell scripts using the nzsql/nzload utilities to load data from flat files into the Netezza database.
  • Validated the data of reports by writing SQL queries in PL/SQL Developer against ODS.
  • Involved in user training sessions and assisting in UAT (User Acceptance Testing).
  • Worked with Data governance, Data quality, data lineage, Data architect to design various models and processes.
  • Worked on Amazon Redshift and AWS, architecting a solution to load data and create data models.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Involved in data profiling and data cleansing, ensuring the data is accurate and analyzed as it is transferred from OLTP to the data marts and data warehouse.
  • Worked on SQL Server concepts SSIS (SQL Server Integration Services), SSAS (Analysis Services) and SSRS (Reporting Services).
  • Generated DDL (Data Definition Language) scripts using ER Studio and assisted the DBA in the physical implementation of data models.
  • Extensively worked on creating the migration plan to Amazon web services (AWS).
  • Extracted large data sets from Amazon Redshift, AWS and the Elasticsearch engine using SQL queries to create reports.
  • Completed enhancement for MDM (Master data management) and suggested the implementation for hybrid MDM (Master Data Management).
  • Exported data from HDFS environment into RDBMS using Sqoop for report generation and visualization purpose.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct Data Analysis.
  • Performed data analysis, data migration and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Designed and documented Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object Oriented Design) using UML and Visio.
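The data-masking work mentioned above (masking sensitive production data before it reaches test environments) can be sketched with two common rule types: partial redaction of identifiers and deterministic tokenization so masked keys still join consistently. The field names and rules here are illustrative assumptions, not the actual masking mappings from the engagement:

```python
import hashlib

def mask_ssn(ssn: str) -> str:
    # Partial redaction: keep only the last four digits, as is common for test data
    return "***-**-" + ssn[-4:]

def mask_email(email: str) -> str:
    # Deterministic token: the same input always yields the same masked value,
    # so joins across masked tables still line up
    local, _, domain = email.partition("@")
    token = hashlib.sha256(local.encode()).hexdigest()[:8]
    return f"user_{token}@{domain}"

prod_row = {"name": "Jane Doe", "ssn": "123-45-6789", "email": "jane@example.com"}
test_row = {**prod_row,
            "ssn": mask_ssn(prod_row["ssn"]),
            "email": mask_email(prod_row["email"])}
```

Tools like Informatica apply equivalent rules inside masking mappings; the key design point is determinism, so referential integrity survives masking.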

Environment: ER Studio, AWS, OLTP, Teradata r15, Sqoop 1.4, Cassandra 3.11, MongoDB 3.6, HDFS, Linux, shell scripts, NoSQL, SSIS, SSAS, HBase 1.2, MDM.

Confidential, Fairfield, NJ

Sr. Data Modeler /Analyst

Responsibilities:

  • Involved in logical and physical database design and development, normalization and data modeling using Erwin and SQL Server Enterprise Manager.
  • Prepared ETL technical Mapping Documents along with test cases for each Mapping for future developments to maintain Software Development Life Cycle (SDLC).
  • Designed OLTP system environment and maintained documentation of Metadata.
  • Worked on Amazon Redshift and AWS, architecting a solution to load data and create data models.
  • Created a dimensional model for the reporting system by identifying the required dimensions and facts.
  • Used reverse engineering to connect to the existing database and create a graphical representation (E-R diagram).
  • Used the Erwin modeling tool to publish a data dictionary, review the model and dictionary with subject matter experts, and generate data definition language.
  • Coordinated with the DBA in implementing database changes and updated data models with changes implemented in development, QA and production.
  • Created and executed test scripts, cases and scenarios to determine optimal system performance according to specifications.
  • Automated and scheduled recurring reporting processes using UNIX shell scripting and Teradata utilities such as MultiLoad, BTEQ and FastLoad.
  • Participated in all phases including Analysis, Design, Coding, Testing and Documentation.
  • Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
  • Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
  • Created and maintained data model standards, including master data management (MDM), and was involved in extracting data from various sources such as Oracle, SQL Server, Teradata and XML.
  • Worked with medical claim data in the Oracle database for Inpatient/Outpatient data validation, trend and comparative analysis.
  • Used load utilities (FastLoad and MultiLoad) with the mainframe interface to load data into Teradata.
  • Optimized and updated UML Models (Visio) and Relational Data Models for various applications.
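The claim-data validation and trend analysis described above typically reconciles row counts and amount totals between the source (ODS) and the reporting layer. A minimal sketch, with SQLite standing in for Oracle/Teradata and the table names (`ods_claims`, `rpt_claims`) purely illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE ods_claims (claim_id INTEGER, paid REAL)")
cur.execute("CREATE TABLE rpt_claims (claim_id INTEGER, paid REAL)")
cur.executemany("INSERT INTO ods_claims VALUES (?, ?)",
                [(1, 120.0), (2, 80.0), (3, 40.0)])
cur.executemany("INSERT INTO rpt_claims VALUES (?, ?)",
                [(1, 120.0), (2, 80.0), (3, 40.0)])

# Reconcile row counts and paid-amount totals between the ODS and the report layer
src_count, src_total = cur.execute(
    "SELECT COUNT(*), SUM(paid) FROM ods_claims").fetchone()
tgt_count, tgt_total = cur.execute(
    "SELECT COUNT(*), SUM(paid) FROM rpt_claims").fetchone()

counts_match = src_count == tgt_count
totals_match = abs(src_total - tgt_total) < 1e-9
```

A mismatch in either check points at dropped, duplicated, or mis-transformed rows somewhere in the ETL path, which is then traced column by column.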

Environment: Erwin 9.0, AWS, Redshift, Oracle 11g, SQL Server 2010, Teradata 14, XML, OLTP, PL/SQL, Linux, UNIX, MultiLoad, BTEQ, UNIX shell scripting

Confidential

Data Analyst/ Modeler

Responsibilities:

  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
  • Constructed complex SQL queries with sub-queries and inline views as per the functional needs in the Business Requirements Document (BRD).
  • Created SSIS package for daily email subscriptions to alert Tableau subscription failure using the ODBC driver and PostgreSQL database.
  • Designed logical and physical data models, Reverse engineering, Complete compare for Oracle and SQL server objects using ER Studio.
  • Supported business analysis and marketing campaign analytics with data mining, data processing and investigation to answer complex business questions.
  • Designed a STAR schema for sales data involving shared dimensions (Conformed) for other subject areas using ER Studio Data Modeler.
  • Created and maintained Logical Data Model (LDM) for the project. Includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Planned and defined system requirements to Use Case, Use Case Scenario and Use Case Narrative using the UML (Unified Modeling Language) methodologies.
  • Worked with Business Analysts team in requirements gathering and in preparing functional specifications and translating them to technical specifications.
  • Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
  • Developed Data mapping, Data Governance, Transformation and Cleansing rules for the Data Management involving OLTP, ODS and OLAP.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Normalized the database based on the newly developed model to bring the tables into 3NF for the data warehouse.
  • Used SQL tools like Teradata SQL Assistant and TOAD to run SQL queries and validate the data in warehouse.
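The sub-query and inline-view construction mentioned above can be illustrated with a small example: an inline view (a derived table in the FROM clause) aggregates per-region totals, and a scalar sub-query filters to regions above the average. SQLite stands in for MS SQL/Teradata, and the `sales` table is an illustrative assumption:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("East", 50.0), ("East", 25.0), ("West", 100.0)])

# Inline view (derived table in FROM) plus a scalar sub-query in WHERE:
# keep only regions whose total exceeds the average regional total.
cur.execute("""
    SELECT region, total
    FROM (SELECT region, SUM(amount) AS total
          FROM sales GROUP BY region) t          -- inline view
    WHERE total > (SELECT AVG(r.total)           -- scalar sub-query
                   FROM (SELECT SUM(amount) AS total
                         FROM sales GROUP BY region) r)
    ORDER BY region""")
top_regions = cur.fetchall()
```

The same shape, with the inline view replaced by a CTE, is common in T-SQL and Teradata SQL when the BRD calls for "above average" or "top N" style filters.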

Environment: PL/SQL, ER Studio, MS SQL 2008, OLTP, ODS, OLAP, SSIS, Tableau, ODBC, Transact-SQL, TOAD, Teradata SQL Assistant
