
Senior Data Architect/Data Modeler Resume


Nashville, TN

SUMMARY

  • Over 10 years of experience working as a Data Architect/Data Modeler and Data Analyst, with emphasis on Data Mapping and Data Validation in Data Warehousing environments.
  • Highly proficient in Data Modeling, applying RDBMS concepts using Third Normal Form (3NF) and multidimensional data modeling schemas (Star Schema, Snowflake Schema, facts and dimensions); see the dimensional modeling sketch after this list.
  • Experienced in developing Entity-Relationship (ER) diagrams and modeling transactional databases and Data Warehouses using tools such as Erwin, ER/Studio and PowerDesigner.
  • Experience with the Hadoop Big Data ecosystem for ingestion, storage, querying, processing and analysis of big data.
  • Experience in importing and exporting data using Sqoop between HDFS and Relational Database Systems (RDBMS).
  • Strong hands-on experience using Teradata utilities such as BTEQ, FastLoad, MultiLoad, FastExport, TPump, Teradata Manager and Visual Explain.
  • Worked on background processes in Oracle architecture, drilling down to the lowest levels of systems design and construction.
  • Experience in developing MapReduce programs on Apache Hadoop to analyze big data as required.
  • Expertise in developing transactional enterprise data models that strictly meet normalization rules, as well as Enterprise Data Warehouses using the Kimball and Inmon Data Warehouse methodologies.
  • Strong experience in Data Migration, Data Cleansing, Transformation, Integration, Data Import and Data Export using multiple ETL tools such as Ab Initio and Informatica PowerCenter.
  • Experience in writing SQL queries and optimizing the queries in Oracle, SQL Server, Netezza, Teradata and Big Data.
  • Experienced in generating and documenting Metadata while designing OLTP and OLAP systems environments.
  • Excellent experience in troubleshooting test scripts, SQL queries, ETL jobs, data warehouse/data mart/data store models.
  • Expert-level understanding of using multiple databases in combination for data extraction and loading, joining data extracted from different databases and loading it into a target database.
  • Experience in analyzing data using the Hadoop ecosystem, including MapReduce, HDFS, Hive, Spark, Spark Streaming, Elasticsearch, Kibana, Kafka, HBase, ZooKeeper, Pig, Sqoop and Flume.
  • Strong database experience using Oracle, XML, DB2, Teradata, SQL Server, Big Data and NoSQL.
  • Capabilities to provide AWS operations and deployment guidance and best practices throughout the lifecycle of a project.
  • Excellent experience with Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad and FastExport.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Excellent experience in data mining, querying and mining large datasets to discover transition patterns and examine financial data.
  • Good experience in data transformation, data mapping from source to target database schemas, and data cleansing.
  • Experience working with business intelligence and data warehouse software, including SSAS, Pentaho, Cognos, OBIEE, QlikView, Greenplum Database, Amazon Redshift and Azure SQL Data Warehouse.
  • Expert in implementing projects end to end and in providing architectural guidance, with emphasis on requirements analysis, design, coding, testing and documentation.
  • Experienced with the HDFS NameNode, which stores all file location information and tracks file data across the cluster of machines.
  • Experience in conducting Joint Application Development (JAD) sessions with SMEs, stakeholders and other project team members for requirements gathering and analysis.
  • Good experience in handling Data Dictionaries and data warehousing duties.
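
For illustration, a minimal sketch of the star schema pattern described above; all table and column names are hypothetical:

    -- One fact table keyed to two conformed dimensions (illustrative names).
    CREATE TABLE dim_date (
        date_key      INTEGER PRIMARY KEY,
        calendar_date DATE NOT NULL,
        fiscal_year   INTEGER
    );

    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_id   VARCHAR(20) NOT NULL, -- natural key from the source system
        customer_name VARCHAR(100)
    );

    CREATE TABLE fact_sales (
        date_key      INTEGER NOT NULL REFERENCES dim_date (date_key),
        customer_key  INTEGER NOT NULL REFERENCES dim_customer (customer_key),
        sales_amount  DECIMAL(12,2),
        units_sold    INTEGER
    );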

TECHNICAL SKILLS

Data Modeling: Erwin 9.7, Toad, ER/Studio 9.7, Star-Schema Modeling, Snowflake-Schema Modeling, Fact and Dimension tables, Pivot Tables

Big Data Tools: Apache Hadoop 3.0, MapReduce, Sqoop 1.4, Pig, Hive 2.3, NoSQL, Cassandra 3.11, MongoDB 3.6, Spark 2.2, HBase 1.2, and Scala 2.1

Languages: PL/SQL, T-SQL, Unix Shell scripting, XML

Database: Oracle 12c/11g, MS SQL Server 2016/2014, DB2, Teradata 14/15, Netezza, Cassandra.

Big Data: Hadoop, HDFS 2, Hive, Pig, HBase, Sqoop, Flume, Splunk

Testing Tools: WinRunner, LoadRunner, TestDirector, Mercury Quality Center, Rational ClearQuest

BI Tools: Tableau 7.0/8.2, Pentaho 6, SAP Business Objects, Crystal Reports

ETL/Data warehouse Tools: Informatica, SAP Business Objects XIR3.1/XIR2, Talend, Pentaho

Operating System: UNIX, Windows 8/7, Linux, Red Hat

Other Tools: TOAD, BTEQ, MS-Office suite (Word, Excel, Project and Outlook)

PROFESSIONAL EXPERIENCE

Confidential - Nashville, TN

Senior Data Architect/Data Modeler

Responsibilities:

  • Worked as a Sr. Data Architect/Modeler to generate data models using Erwin and developed relational database systems.
  • Worked within the Software Development Life Cycle (SDLC) with good working knowledge of testing, agile methodology, disciplines, tasks, resources and scheduling.
  • Worked on NoSQL databases including HBase, MongoDB and Cassandra.
  • Extensively used Agile methodology as the Organization Standard to implement the data Models.
  • Involved in source-to-target (MDM) data mapping sessions with IBM as they mastered the target.
  • Performed Reverse Engineering of the current application using Erwin, and developed Logical and Physical data models for Central Model consolidation.
  • Part of the team conducting logical data analysis and data modeling JAD sessions; communicated data-related standards.
  • Implemented dimension model (logical and physical data modeling) in the existing architecture using Erwin.
  • Created E/R diagrams and data flow diagrams, grouped and created the tables, and validated the data for lookup tables.
  • Developed the data warehouse model (Kimball methodology) with multiple data marts and conformed dimensions for the proposed central model of the project.
  • Implemented Star Schema methodologies in modeling and designing the logical data model into Dimensional Models.
  • Worked on Amazon Redshift and AWS and architecting a solution to load data, create data models.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Involved in normalization/denormalization techniques for optimum performance in relational and dimensional database environments.
  • Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL-like access to Hadoop data (see the HiveQL sketch after this list).
  • Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information.
  • Responsible for Big Data initiatives and engagement, including analysis, brainstorming, POCs and architecture.
  • Worked on data profiling and analysis to create test cases for new Architecture evaluation.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
  • Managed, created and altered databases, tables, views, indexes and constraints with business rules.
  • Assisted in oversight of compliance with the Enterprise Data Standards, data governance and data quality.
  • Generated various reports using SQL Server Report Services (SSRS) for business analysts and the management team.
  • Designed OLTP system environment and maintained documentation of Metadata.
  • Coordinated with Data Architects to design Big Data and Hadoop projects and to provide idea-driven design.
  • Configured Hadoop Ecosystems to read data transaction from HDFS and Hive.
  • Prepared reports to summarize the daily data quality status and work activities.
  • Performed ad-hoc analyses as needed, with the ability to clearly present the analysis.
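
For illustration, a minimal HiveQL sketch of the kind of HDFS-to-Hive load described above; the path, table and column names are hypothetical:

    -- Define a delimited Hive table over text data (illustrative schema).
    CREATE TABLE IF NOT EXISTS claims_staging (
        claim_id     STRING,
        member_id    STRING,
        claim_amount DOUBLE
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE;

    -- Move a file already sitting in HDFS into the table's warehouse directory.
    LOAD DATA INPATH '/data/landing/claims/claims_2018.csv'
    INTO TABLE claims_staging;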

Environment: Erwin 9.7, Apache Hadoop 3.0, Hive 2.3, Oracle 12c, SSRS, OLTP, PL/SQL, AWS, Agile, NoSQL, MDM, HDFS

Confidential, Union, NJ

Sr. Data Architect/Data Modeler

Responsibilities:

  • Heavily involved in the Data Architect role, reviewing business requirements and composing source-to-target data mapping documents.
  • Researched, evaluated, architected and deployed new tools, frameworks and patterns to build sustainable Big Data platforms for our clients.
  • Involved in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.
  • Implemented logical and physical relational database and maintained Database Objects in the data model using ER/Studio.
  • Worked on Amazon Redshift, AWS and Azure, architecting a solution to load data, create data models and run BI on it.
  • Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle 12c, HBase, MongoDB and SQL Server.
  • Used Ab Initio DQE as the data quality solution for enterprise-level data processing and data management systems.
  • Worked on NoSQL databases including HBase, MongoDB and Cassandra; implemented a multi-data-center, multi-rack Cassandra cluster.
  • Applied Data Governance rules (primary qualifier, Class words and valid abbreviation in Table name and Column names).
  • Handled importing of data from various data sources, performed transformations using Hive, MapReduce, loaded data into HDFS and extracted the data from Oracle into HDFS using Sqoop.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Loaded data into Hive Tables from Hadoop Distributed File System (HDFS) to provide SQL access on Hadoop data
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Worked on the Metadata Repository (MRM) to keep definitions and mapping rules up to date.
  • Used MetaStage to maintain the metadata for different Data Warehouse environments and projects.
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Primarily responsible for Tableau customization of a statistical dashboard to monitor sales effectiveness, and used Tableau for customer marketing data visualization.
  • Involved in making screen designs, Use Cases and ER diagrams for the project using ER/Studio.
  • Extracted data from IBM Cognos to create automated visualization reports and dashboards on Tableau.
  • Performed Data mapping between source systems to Target systems, logical data modeling, created class diagrams and ER diagrams and used SQL queries to filter data
  • Used the Agile methodology for Data Warehouse development, using Kanbanize.
  • Developed triggers, stored procedures, functions and packages using cursor and ref cursor concepts associated with the project, using PL/SQL.
  • Generated DDL scripts for database modifications, including Teradata macros, views and SET tables.
  • Handled importing data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
  • Performed performance tuning of the database using the EXPLAIN PLAN and TKPROF utilities and debugged the code (see the tuning sketch after this list).
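
For illustration, a minimal Oracle sketch of the EXPLAIN PLAN workflow mentioned above; the table and column names are hypothetical:

    -- Capture the optimizer's plan for a candidate query.
    EXPLAIN PLAN FOR
    SELECT c.customer_name, SUM(o.order_total)
    FROM   customers c
    JOIN   orders o ON o.customer_id = c.customer_id
    GROUP BY c.customer_name;

    -- Display the captured execution plan.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);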

Environment: ER/Studio 9.7, Oracle 12c, Hive, Amazon Redshift, AWS, MapReduce, Hadoop, Cassandra, HBase, MongoDB, Pig, Agile, NoSQL, PL/SQL, OLAP, OLTP, SQL, IBM Cognos, Ab Initio, Tableau, Crystal Reports 2008, HDFS.

Confidential, Bellevue, WA

Sr. Data Architect/Modeler

Responsibilities:

  • Developed full life cycle software, including defining requirements, prototyping, designing, coding, testing and maintenance.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.
  • Extracted data from various source systems such as Oracle, SQL Server and flat files, as per the requirements.
  • Worked as an Architect, designing conceptual, logical and physical models and building data marts using hybrid Inmon and Kimball DW methodologies.
  • Involved in writing shell scripts to accumulate the MTD source files; collaborated with Architects and Managers on review of solutions and data strategy.
  • Used a data virtualization tool to connect multiple heterogeneous sources without the need to physically move the data.
  • Extensively involved in analyzing various data formats using industry-standard tools and effectively communicating the results to business users and SMEs.
  • Worked on data warehousing, ETL, SQL, scripting and big data (MPP + Hadoop).
  • Worked on Normalization and De-Normalization techniques for both OLTP and OLAP systems
  • Created reports from Greenplum (Pivotal) that list the positions/transactions for each customer's monthly invoice across all jurisdictions (CFTC, ESMA, Canada, ASIC and MAS) and made them available on the portal.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Involved in Netezza Administration Activities like backup/restore, performance tuning, and Security configuration.
  • Created architectural documents, such as the end-to-end data flow for the IBM Information data warehouse system.
  • Identified security loopholes, established data quality assurance and addressed data governance.
  • Designed the Physical Data Model (PDM) using the IBM InfoSphere Data Architect data modeling tool and Oracle PL/SQL.
  • Used pushdown optimization in Informatica to call Greenplum GPLoad functions.
  • Led the design and modeling of tactical architectures for development, delivery and support of projects.
  • As an Architect, implemented an MDM hub to provide clean, consistent data for an SOA implementation.
  • Called Greenplum Business Rules, Data Rules and Transform Rules functions using Informatica Stored Procedure Transformation.
  • Developed and maintained policies, standards, and guidelines to ensure that a consistent framework is applied across the company.
  • Involved in all steps and the scope of the project's data approach to MDM, creating a Data Dictionary and mapping from sources to the target in the MDM data model.
  • Promoted the use of a shared infrastructure, application roadmap, and documentation of interfaces to improve information flow and reduce costs.
  • Architected and built a metadata repository to describe Digital business data, technical data and processes.
  • Designed solutions for multiple large data warehouses with a good understanding of cluster and parallel architecture, high-scale and distributed RDBMSs, and NoSQL platforms.
  • Developed ad-hoc queries, views and functions in Greenplum to make data accessible to Business Analysts and Managers (see the Greenplum sketch after this list).
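
For illustration, a minimal Greenplum sketch of the kind of distributed table and reporting view described above; all object names are hypothetical:

    -- A positions table distributed across segments by trade_id.
    CREATE TABLE positions (
        trade_id     BIGINT,
        jurisdiction TEXT,          -- e.g. CFTC, ESMA, ASIC
        notional     NUMERIC(18,2),
        trade_date   DATE
    )
    DISTRIBUTED BY (trade_id);      -- Greenplum-specific distribution clause

    -- A view rolling positions up to one row per jurisdiction per invoice month.
    CREATE VIEW monthly_positions AS
    SELECT jurisdiction,
           date_trunc('month', trade_date) AS invoice_month,
           SUM(notional)                   AS total_notional
    FROM   positions
    GROUP BY jurisdiction, date_trunc('month', trade_date);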

Environment: Erwin 9.6, PL/SQL, ODS, Hadoop, MS SQL Server 2014, flat files, Oracle 12c, MDM, Information Analyzer, Informatica, IBM InfoSphere

Confidential, Boston, MA

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Gathered requirements, analyzed them and wrote the design documents.
  • Developed logical data models and physical database design and generated database schemas using ER Studio.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and DB2.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources; created and monitored workflows using Workflow Designer and Workflow Monitor.
  • Involved in extensive data validation by writing several complex SQL queries (see the validation sketch after this list); involved in back-end testing and worked on data quality issues; identified and recorded defects with the information required for the development team to reproduce them.
  • Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.
  • Generated DDL (Data Definition Language) scripts using ER Studio and assisted the DBA in the physical implementation of data models.
  • Developed, managed and validated existing data models, including logical and physical models of the data warehouse and source systems, utilizing a 3NF model.
  • Prepared high-level logical data models using Erwin and later translated them into physical models using the Forward Engineering technique.
  • Involved in data mapping specifications to create and execute detailed system test plans; the data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
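
For illustration, a minimal sketch of the kind of back-end validation SQL described above, comparing row counts and totals between a source table and its warehouse target; schema and table names are hypothetical:

    -- Compare record counts and summed amounts on both sides of the load.
    SELECT 'source' AS side, COUNT(*) AS row_cnt, SUM(amount) AS total_amt
    FROM   src.transactions
    UNION ALL
    SELECT 'target', COUNT(*), SUM(amount)
    FROM   dw.fact_transactions;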

Environment: ER Studio, Microsoft Visio, MS SQL Server 2012, DB2, Oracle 10g/11g, Workflow Designer

Confidential, Providence, RI

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Involved in analysis of business requirements, design and development of high-level and low-level designs, and unit and integration testing.
  • Created the conceptual model for the data warehouse using Erwin data modeling tool.
  • Reviewed the conceptual EDW (Enterprise Data Warehouse) data model with business users, App Dev and Information Architects to make sure all the requirements were fully covered.
  • Worked on designing the OLAP Model, Dimension model for BI Reporting sourcing from SAP Transactions.
  • Used Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Designed and documented Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object Oriented Design) using UML and Visio.
  • Reverse engineered existing databases to understand the data flow and business flows of existing systems and to integrate new requirements into the future enhanced and integrated system; designed the partitioning strategy in the data model.
  • Used and supported database applications and tools for extraction, transformation and analysis of raw data.
  • Data modeling was performed using the Erwin tool to build logical and physical models.
  • Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information.
  • Gathered reporting and analysis requirements and translated them into reporting structures/data models, including aggregate tables and relational and dimensional (star schema) marts.
  • Forward engineered the physical data model and generated the DDL script using the Forward Engineering option in Erwin.
  • Analyzed the impact on the Enterprise Data Warehouse and downstream systems.
  • Developed PL/SQL scripts to validate and load data into interface tables (see the PL/SQL sketch after this list).
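
For illustration, a minimal PL/SQL sketch of the kind of validate-and-load script described above; all table and column names are hypothetical:

    -- Load only rows passing basic validation rules into the interface table.
    BEGIN
        INSERT INTO interface_orders (order_id, customer_id, order_amount)
        SELECT order_id, customer_id, order_amount
        FROM   staging_orders
        WHERE  order_id IS NOT NULL          -- reject incomplete records
        AND    order_amount >= 0;            -- reject negative amounts
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;                        -- leave the interface table unchanged on failure
            RAISE;
    END;
    /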

Environment: Erwin 7.x, Microsoft Visio, OOD, SAP, OLAP, ODS, MS SQL Server 2008, DB2, Oracle 10g

Confidential

Data Modeler/Data Analyst

Responsibilities:

  • Developed, managed and validated existing data models including logical and physical models of the data warehouse and source systems utilizing a 3NF model
  • Transformed project data requirements into project data models using Erwin.
  • Wrote SQL scripts to test the mappings and developed a Traceability Matrix of Business Requirements mapped to Test Scripts, ensuring any change control in requirements leads to test case updates.
  • Assisted in defining business requirements for the IT team and created BRD and functional specification documents, along with mapping documents, to assist the developers in their coding.
  • Interacted with business analysts to get functional requirements.
  • Created databases for OLAP metadata catalog tables using forward engineering of models in Erwin.
  • Involved in designing data warehouses/data marts using the Ralph Kimball methodology.
  • Created a standard abbreviation document for logical, physical and dimensional data models.
  • Worked with report developers on building OLAP cubes for ad hoc reporting.
  • Created logical, physical and dimensional data models using Erwin and documented them.
  • Generated reports from the data models.
  • Reviewed the data models with the functional and technical teams.
  • Worked on master data management, data governance and data integration for improving the data quality using IDQ.
  • Exhaustively collected business and technical Metadata and maintained naming standards.
  • Created SQL code from the data models and coordinated with DBAs.
  • Generated PL/SQL code and created procedures, functions, views, etc. (see the PL/SQL sketch after this list).
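
For illustration, a minimal sketch of the kind of PL/SQL objects generated from the data model; all object names are hypothetical:

    -- A simple reporting view derived from the model.
    CREATE OR REPLACE VIEW v_active_customers AS
    SELECT customer_id, customer_name, status
    FROM   customers
    WHERE  status = 'ACTIVE';

    -- A lookup function generated alongside the view.
    CREATE OR REPLACE FUNCTION get_customer_name (p_customer_id IN NUMBER)
        RETURN VARCHAR2
    IS
        v_name customers.customer_name%TYPE;
    BEGIN
        SELECT customer_name INTO v_name
        FROM   customers
        WHERE  customer_id = p_customer_id;
        RETURN v_name;
    END;
    /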

Environment: Erwin 7.0, Microsoft Visio, MS SQL Server 2008, Metadata, Oracle 9i/8i, PL/SQL, OLAP, Ralph Kimball
