Senior Data Architect/Data Modeler Resume


Nashville, TN

SUMMARY

  • Over 10 years of experience working as a Data Architect/Data Modeler and Data Analyst, with emphasis on data mapping and data validation in data warehousing environments.
  • Highly proficient in data modeling for RDBMS using Third Normal Form (3NF) and in multidimensional data modeling (star schema, snowflake schema, facts and dimensions); a dimensional DDL sketch follows this summary.
  • Experienced in developing Entity-Relationship (ER) diagrams and modeling transactional databases and data warehouses using tools like Erwin, ER/Studio and PowerDesigner.
  • Experience with the Hadoop ecosystem for ingestion, storage, querying, processing and analysis of big data.
  • Experience in importing and exporting data using Sqoop between HDFS and relational database management systems (RDBMS).
  • Strong hands-on experience using Teradata utilities such as BTEQ, FastLoad, MultiLoad, FastExport, TPump, Teradata Manager and Visual Explain.
  • Worked on background processes in Oracle architecture, drilling down to the lowest levels of systems design and construction.
  • Experience in developing MapReduce programs using Apache Hadoop to analyze big data as per requirements.
  • Expertise in developing transactional enterprise data models that strictly meet normalization rules, as well as enterprise data warehouses using the Kimball and Inmon data warehouse methodologies.
  • Strong experience in data migration, data cleansing, transformation, integration, data import and data export through the use of multiple ETL tools such as Ab Initio and Informatica PowerCenter.
  • Experience in writing and optimizing SQL queries in Oracle, SQL Server, Netezza, Teradata and big data platforms.
  • Experienced in generating and documenting Metadata while designing OLTP and OLAP systems environment.
  • Excellent experience in troubleshooting test scripts, SQL queries, ETL jobs, data warehouse/data mart/data store models.
  • Expert-level understanding of using different databases in combination for data extraction and loading, joining data extracted from different databases and loading it into a specific database.
  • Experience in analyzing data using the Hadoop ecosystem, including MapReduce, HDFS, Hive, Spark, Spark Streaming, Elasticsearch, Kibana, Kafka, HBase, ZooKeeper, Pig, Sqoop and Flume.
  • Strong database experience with Oracle, XML, DB2, Teradata, SQL Server, big data and NoSQL.
  • Able to provide AWS operations and deployment guidance and best practices throughout the lifecycle of a project.
  • Excellent experience with Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad and FastExport.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Excellent experience in data mining, querying and mining large datasets to discover transition patterns and examine financial data.
  • Good experience in data transformation, data mapping from source to target database schemas, and data cleansing.
  • Experience in working with business intelligence and data warehouse software, including SSAS, Pentaho, Cognos, OBIEE, QlikView, Greenplum Database, Amazon Redshift and Azure SQL Data Warehouse.
  • Expert in implementing projects end to end and in providing the architecture, with emphasis on requirements analysis, design, coding, testing and documentation.
  • Additionally experienced with the HDFS NameNode, where Hadoop stores all file location information and tracks file data across the cluster.
  • Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirement gathering and analysis.
  • Good experience in handling data dictionaries and data warehousing duties.
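
To make the dimensional modeling above concrete, here is a minimal star-schema DDL sketch; all table and column names are hypothetical illustrations, not taken from any engagement:

    -- One fact table joined to conformed dimensions via surrogate keys.
    CREATE TABLE dim_date (
        date_key      INTEGER PRIMARY KEY,    -- surrogate key, e.g. 20180131
        calendar_date DATE NOT NULL,
        fiscal_year   SMALLINT NOT NULL
    );

    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,    -- surrogate key
        customer_id   VARCHAR(20) NOT NULL,   -- natural key from the source system
        customer_name VARCHAR(100),
        effective_dt  DATE,                   -- SCD Type 2 validity window
        expiry_dt     DATE
    );

    CREATE TABLE fact_sales (
        date_key     INTEGER NOT NULL REFERENCES dim_date (date_key),
        customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
        sales_amount DECIMAL(18,2),
        quantity     INTEGER
    );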

TECHNICAL SKILLS

Data Modeling: Erwin 9.7, Toad, ER/Studio 9.7, Star-Schema Modeling, Snowflake-Schema Modeling, Fact and Dimension Tables, Pivot Tables

Big Data Tools: Apache Hadoop 3.0, MapReduce, Sqoop 1.4, Pig, Hive 2.3, NoSQL, Cassandra 3.11, MongoDB 3.6, Spark 2.2, HBase 1.2, and Scala 2.1

Languages: PL/SQL, T-SQL, Unix shell scripting, XML

Database: Oracle 12c/11g, MS SQL Server 2016/2014, DB2, Teradata 14/15, Netezza, Cassandra

Big Data: Hadoop, HDFS 2, Hive, Pig, HBase, Sqoop, Flume, Splunk

Testing Tools: WinRunner, LoadRunner, TestDirector, Mercury Quality Center, Rational ClearQuest

BI Tools: Tableau 7.0/8.2, Pentaho 6, SAP Business Objects, Crystal Reports

ETL/Data Warehouse Tools: Informatica, SAP BusinessObjects XI R3.1/XI R2, Talend, Pentaho

Operating Systems: UNIX, Linux (Red Hat), Windows 8/7

Other Tools: TOAD, BTEQ, MS-Office suite (Word, Excel, Project and Outlook)

PROFESSIONAL EXPERIENCE

Confidential - Nashville, TN

Senior Data Architect/Data Modeler

Responsibilities:

  • Worked as a Sr. Data Architect/Modeler to generate data models using Erwin and developed relational database systems.
  • Worked across the Software Development Life Cycle (SDLC) with good working knowledge of testing, Agile methodology, disciplines, tasks, resources and scheduling.
  • Worked on NoSQL databases including HBase, MongoDB, and Cassandra.
  • Extensively used Agile methodology as the organization standard to implement the data models.
  • Involved in source-to-target (MDM) data mapping sessions with IBM as they mastered the target.
  • Performed reverse engineering of the current application using Erwin, and developed logical and physical data models for central model consolidation.
  • Part of the team conducting logical data analysis and data modeling JAD sessions; communicated data-related standards.
  • Implemented the dimension model (logical and physical data modeling) in the existing architecture using Erwin.
  • Created ER diagrams and data flow diagrams; grouped and created the tables, including lookup tables, and validated the data.
  • Developed the Kimball-style data warehouse model with multiple data marts and conformed dimensions for the proposed central model of the project.
  • Implemented star schema methodologies in modeling and designing the logical data model into dimensional models.
  • Worked on Amazon Redshift and AWS, architecting a solution to load data and create data models.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Involved in normalization/denormalization techniques for optimum performance in relational and dimensional database environments.
  • Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL-like access to Hadoop data; a HiveQL sketch appears at the end of this job entry.
  • Used Erwin for reverse engineering to connect to the existing database and ODS, to create a graphical representation in the form of entity relationships and elicit more information.
  • Responsible for big data initiatives and engagement, including analysis, brainstorming, POCs, and architecture.
  • Worked on data profiling and analysis to create test cases for new Architecture evaluation.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
  • Created, altered and managed databases, tables, views, indexes and constraints with business rules.
  • Assisted in oversight of compliance with the enterprise data standards, data governance and data quality.
  • Generated various reports using SQL Server Reporting Services (SSRS) for business analysts and the management team.
  • Designed the OLTP system environment and maintained its metadata documentation.
  • Coordinated with data architects to design big data and Hadoop projects and to provide idea-driven design.
  • Configured Hadoop ecosystem components to read transaction data from HDFS and Hive.
  • Prepared reports to summarize the daily data quality status and work activities.
  • Performed ad-hoc analyses as needed, with the ability to comprehend and explain the analysis.

Environment: Erwin 9.7, Apache Hadoop 3.0, Hive 2.3, Oracle 12c, SSRS, OLTP, PL/SQL, AWS, Agile, NoSQL, MDM, HDFS
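
A minimal HiveQL sketch of the HDFS-to-Hive load pattern used above; the table, columns and staging path are hypothetical, assumed only for illustration:

    -- Define a Hive table over comma-delimited text files.
    CREATE TABLE IF NOT EXISTS sales_staging (
        txn_id   STRING,
        txn_date STRING,
        amount   DECIMAL(18,2)
    )
    ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    STORED AS TEXTFILE;

    -- LOAD DATA INPATH moves the staged HDFS file into the table's warehouse directory.
    LOAD DATA INPATH '/data/staging/sales/2018-01-31.csv'
    INTO TABLE sales_staging;

    -- SQL-like access to the Hadoop data.
    SELECT txn_date, SUM(amount) AS daily_total
    FROM sales_staging
    GROUP BY txn_date;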

Confidential, Union, NJ

Sr. Data Architect/Data Modeler

Responsibilities:

  • Heavily involved in the Data Architect role, reviewing business requirements and composing source-to-target data mapping documents.
  • Researched, evaluated, architected and deployed new tools, frameworks and patterns to build sustainable big data platforms for clients.
  • Involved in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.
  • Implemented logical and physical relational databases and maintained database objects in the data model using ER/Studio.
  • Worked on Amazon Redshift, AWS and Azure, architecting a solution to load data, create data models and run BI on top.
  • Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle 12c, HBase, MongoDB and SQL Server.
  • Used Ab Initio DQE as the data quality solution for enterprise-level data processing and data management systems.
  • Worked on NoSQL databases including HBase, MongoDB, and Cassandra; implemented a multi-data center, multi-rack Cassandra cluster.
  • Applied data governance rules (primary qualifiers, class words and valid abbreviations in table and column names).
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Oracle into HDFS using Sqoop.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Loaded data into Hive Tables from Hadoop Distributed File System (HDFS) to provide SQL access on Hadoop data
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Worked on the Metadata Repository (MRM) to keep definitions and mapping rules up to the mark.
  • Used MetaStage to maintain the metadata for different data warehouse environments and projects.
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Primarily responsible for Tableau customization of a statistical dashboard to monitor sales effectiveness, and used Tableau for customer marketing data visualization.
  • Involved in making screen designs, use cases and ER diagrams for the project using ER/Studio.
  • Extracted data from IBM Cognos to create automated visualization reports and dashboards on Tableau.
  • Performed data mapping between source and target systems, performed logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data.
  • Used an Agile methodology for data warehouse development, managed in Kanbanize.
  • Developed triggers, stored procedures, functions and packages in PL/SQL using cursor and ref cursor concepts; a ref cursor sketch appears at the end of this job entry.
  • Generated DDL scripts for database modifications, including Teradata macros, views and SET tables.
  • Handled importing data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
  • Performed database performance tuning using the EXPLAIN PLAN and TKPROF utilities, and debugged the code.

Environment: ER/Studio 9.7, Oracle 12c, Hive, Amazon Redshift, AWS, MapReduce, Hadoop, Cassandra, HBase, MongoDB, Pig, Agile, NoSQL, PL/SQL, OLAP, OLTP, SQL, IBM Cognos, Ab Initio, Tableau, Crystal Reports 2008, HDFS.
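
A minimal PL/SQL sketch of the ref cursor pattern noted above; the package, procedure and orders table are hypothetical names used only for illustration:

    -- Package spec declares a ref cursor type and a procedure that returns one.
    CREATE OR REPLACE PACKAGE customer_api AS
        TYPE t_refcur IS REF CURSOR;
        PROCEDURE get_orders(p_customer_id IN NUMBER, p_result OUT t_refcur);
    END customer_api;
    /
    -- Package body opens the cursor over a parameterized query; the caller
    -- (e.g. a report or ETL job) fetches rows from the returned cursor.
    CREATE OR REPLACE PACKAGE BODY customer_api AS
        PROCEDURE get_orders(p_customer_id IN NUMBER, p_result OUT t_refcur) IS
        BEGIN
            OPEN p_result FOR
                SELECT order_id, order_date, total_amount
                FROM orders
                WHERE customer_id = p_customer_id;
        END get_orders;
    END customer_api;
    /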

Confidential, Bellevue, WA

Sr. Data Architect/Modeler

Responsibilities:

  • Developed full-lifecycle software, including defining requirements, prototyping, designing, coding, testing and maintaining the software.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.
  • Extracted data from various source systems like Oracle, SQL Server and flat files as per the requirements.
  • Worked as an architect, designed conceptual, logical and physical models, and built data marts using hybrid Inmon and Kimball DW methodologies.
  • Wrote shell scripts to accumulate the MTD source files; collaborated with architects and managers to review solutions and data strategy.
  • Used a data virtualization tool to connect multiple heterogeneous sources without physically moving the data.
  • Extensively involved in analyzing various data formats using industry-standard tools and effectively communicating the results to business users and SMEs.
  • Worked on data warehousing, ETL, SQL, scripting and big data (MPP + Hadoop).
  • Worked on Normalization and De-Normalization techniques for both OLTP and OLAP systems
  • Created reports from Greenplum (Pivotal) listing the positions/transactions for each customer's monthly invoice across all jurisdictions (CFTC, ESMA, CANADA, ASIC & MAS), and made them available on the portal.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Involved in Netezza Administration Activities like backup/restore, performance tuning, and Security configuration.
  • Created architectural documents, such as the end-to-end data flow for the IBM Information data warehouse system.
  • Identified security loopholes, established data quality assurance and addressed data governance.
  • Designed the Physical Data Model (PDM) using the IBM InfoSphere Data Architect data modeling tool and Oracle PL/SQL.
  • Used pushdown optimization in Informatica to call Greenplum gpload functions.
  • Led the design and modeling of tactical architectures for development, delivery and support of projects.
  • As an architect, implemented an MDM hub to provide clean, consistent data for an SOA implementation.
  • Called Greenplum Business Rules, Data Rules and Transform Rules functions using Informatica Stored Procedure Transformation.
  • Developed and maintained policies, standards and guidelines to ensure that a consistent framework is applied across the company.
  • Involved in all steps and the scope of the project's data approach to MDM, creating a data dictionary and mappings from sources to the target in the MDM data model.
  • Promoted the use of a shared infrastructure, an application roadmap, and documentation of interfaces to improve information flow and reduce costs.
  • Architected and built a metadata repository to describe digital business data, technical data and processes.
  • Designed solutions for multiple large data warehouses with a good understanding of cluster and parallel architecture as well as high-scale or distributed RDBMS, and knowledge of NoSQL platforms.
  • Developed ad-hoc queries, views and functions in Greenplum to make data accessible to business analysts and managers; a Greenplum sketch appears at the end of this job entry.

Environment: Erwin 9.6, PL/SQL, ODS, Hadoop, MS SQL Server 2014, flat files, Oracle 12c, MDM, Information Analyzer, Informatica, IBM InfoSphere
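
A minimal sketch of the Greenplum ad-hoc access pattern above, in PostgreSQL dialect; the view, function and underlying trade_positions table are hypothetical:

    -- Analyst-facing view rolling positions up to customer/month/jurisdiction.
    CREATE VIEW v_monthly_positions AS
    SELECT customer_id,
           date_trunc('month', trade_date) AS invoice_month,
           jurisdiction,                      -- e.g. CFTC, ESMA, ASIC
           COUNT(*)      AS transaction_count,
           SUM(notional) AS total_notional
    FROM trade_positions
    GROUP BY 1, 2, 3;

    -- Parameterized wrapper so analysts do not repeat the filter logic.
    CREATE FUNCTION f_positions_for(p_customer_id INT, p_month DATE)
    RETURNS SETOF v_monthly_positions AS $$
        SELECT * FROM v_monthly_positions
        WHERE customer_id = $1
          AND invoice_month = date_trunc('month', $2);
    $$ LANGUAGE sql;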

Confidential, Boston, MA

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Gathered requirements, performed analysis and wrote the design documents.
  • Developed logical data models and physical database designs and generated database schemas using ER/Studio.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and DB2; a profiling sketch appears at the end of this job entry.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources; created and monitored workflows using the workflow designer and workflow monitor.
  • Performed extensive data validation by writing several complex SQL queries, was involved in back-end testing, and worked on data quality issues; identified and recorded defects with the information required for the development team to reproduce them.
  • Documented all data mapping and transformation processes in the functional design documents based on the business requirements.
  • Generated DDL (Data Definition Language) scripts using ER/Studio and assisted the DBA in the physical implementation of the data models.
  • Developed, managed and validated existing data models, including logical and physical models of the data warehouse and source systems, utilizing a 3NF model.
  • Prepared high-level logical data models using Erwin, and later translated the models into physical models using the Forward Engineering technique.
  • Involved in data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.

Environment: ER/Studio, Microsoft Visio, MS SQL Server 2012, DB2, Oracle 10g/11g, workflow designer
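
A minimal sketch of the SQL profiling queries referenced above; source_customer and its columns are hypothetical stand-ins for a real source table:

    -- Column-level profile: row count, distinct count on the candidate key,
    -- null rate on a required attribute, and the date range of the data.
    SELECT COUNT(*)                                        AS total_rows,
           COUNT(DISTINCT customer_id)                     AS distinct_customers,
           SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END)  AS null_emails,
           MIN(created_date)                               AS earliest_record,
           MAX(created_date)                               AS latest_record
    FROM source_customer;

    -- Duplicate check on the natural key before mapping to the target.
    SELECT customer_id, COUNT(*) AS dup_count
    FROM source_customer
    GROUP BY customer_id
    HAVING COUNT(*) > 1;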

Confidential, Providence, RI

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Involved in analysis of business requirements, design and development of high-level and low-level designs, and unit and integration testing.
  • Created the conceptual model for the data warehouse using the Erwin data modeling tool.
  • Reviewed the conceptual EDW (Enterprise Data Warehouse) data model with business users, app dev and information architects to make sure all the requirements were fully covered.
  • Worked on designing the OLAP model and the dimension model for BI reporting, sourced from SAP transactions.
  • Used Erwin for effective model management: sharing, dividing and reusing model information and designs for productivity improvement.
  • Designed and documented Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object Oriented Design) using UML and Visio.
  • Reverse engineered existing databases to understand the data flows and business flows of the existing systems, and to integrate the new requirements into the future enhanced and integrated system; designed the partitioning strategy in the data model.
  • Used and supported database applications and tools for extraction, transformation and analysis of raw data.
  • Performed data modeling using Erwin to build logical and physical models.
  • Used Erwin for reverse engineering to connect to the existing database and ODS, to create a graphical representation in the form of entity relationships and elicit more information.
  • Gathered reporting and analysis requirements and translated them into reporting structures/data models, including aggregate tables and relational and dimensional (star-schema) marts.
  • Forward engineered the physical data model and generated the DDL script using the Forward Engineering option in Erwin.
  • Analyzed the impact on the enterprise data warehouse and downstream systems.
  • Developed PL/SQL scripts to validate and load data into interface tables; a validation sketch appears at the end of this job entry.

Environment: Erwin 7.x, Microsoft Visio, OOD, SAP, OLAP, ODS, MS SQL Server 2008, DB2, Oracle 10g
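
A minimal PL/SQL sketch of the validate-and-load pattern above; the staging, interface and error tables are hypothetical names:

    -- Rows passing basic checks go to the interface table; failures are
    -- captured in an error table for review.
    DECLARE
        v_loaded   PLS_INTEGER := 0;
        v_rejected PLS_INTEGER := 0;
    BEGIN
        FOR rec IN (SELECT * FROM staging_orders) LOOP
            IF rec.order_id IS NOT NULL AND rec.amount >= 0 THEN
                INSERT INTO interface_orders (order_id, order_date, amount)
                VALUES (rec.order_id, rec.order_date, rec.amount);
                v_loaded := v_loaded + 1;
            ELSE
                INSERT INTO interface_order_errors (order_id, error_text)
                VALUES (rec.order_id, 'failed validation: null key or negative amount');
                v_rejected := v_rejected + 1;
            END IF;
        END LOOP;
        COMMIT;
        DBMS_OUTPUT.PUT_LINE(v_loaded || ' rows loaded, ' || v_rejected || ' rejected');
    END;
    /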

Confidential

Data Modeler/Data Analyst

Responsibilities:

  • Developed, managed and validated existing data models, including logical and physical models of the data warehouse and source systems, utilizing a 3NF model.
  • Transformed project data requirements into project data models using Erwin.
  • Wrote SQL scripts to test the mappings, and developed a traceability matrix of business requirements mapped to test scripts to ensure that any change control in requirements leads to test case updates.
  • Assisted in defining business requirements for the IT team, and created BRD and functional specification documents along with mapping documents to assist the developers in their coding.
  • Interacted with business analysts to get functional requirements.
  • Created databases for OLAP metadata catalog tables using forward engineering of models in Erwin.
  • Involved in designing the data warehouse/data marts using the Ralph Kimball methodology; a dimensional query sketch appears at the end of this job entry.
  • Created the standard abbreviation document for logical, physical and dimensional data models.
  • Worked with report developers on building OLAP cubes for ad-hoc reporting.
  • Created and documented logical, physical and dimensional data models using Erwin.
  • Generated reports from the data models.
  • Reviewed the data model with the functional and technical teams.
  • Worked on master data management, data governance and data integration, improving data quality using IDQ.
  • Exhaustively collected business and technical metadata and maintained naming standards.
  • Created SQL code from the data models and coordinated with DBAs.
  • Generated PL/SQL code and created procedures, functions, views, etc.

Environment: Erwin 7.0, Microsoft Visio, MS SQL Server 2008, Metadata, Oracle 9i/8i, PL/SQL, OLAP, Ralph Kimball
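
A minimal sketch of a Kimball-style dimensional query, reusing the illustrative star schema from the Summary sketch (all names remain hypothetical):

    -- Join the fact table to its conformed dimensions and roll up.
    SELECT d.fiscal_year,
           c.customer_name,
           SUM(f.sales_amount) AS total_sales
    FROM fact_sales f
    JOIN dim_date     d ON d.date_key     = f.date_key
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY d.fiscal_year, c.customer_name
    ORDER BY total_sales DESC;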
