Sr. Data Architect/data Modeler Resume

New Hyde Park, NY

SUMMARY

  • Over 7 years of experience as a Data Architect/Modeler and Data Analyst with high proficiency in requirements gathering and data modeling, including design and support of various applications in OLTP, Data Warehousing, OLAP, and ETL environments.
  • Experienced in designing data warehouse architectures and models using tools such as Erwin r9.6/r9.5, Sybase PowerDesigner, and ER/Studio.
  • Experience in SQL and good knowledge of PL/SQL programming, including developing stored procedures and triggers; also worked with DataStage, DB2, UNIX, Cognos, MDM, Hadoop, and Pig.
  • Experience in designing star schemas (identification of facts, measures, and dimensions) and snowflake schemas for Data Warehouse and ODS architectures using tools such as Erwin Data Modeler, PowerDesigner, ER/Studio, and Microsoft Visio; a minimal star schema sketch follows this list.
  • Experience in importing and exporting data between relational databases such as MySQL 6.x, Netezza, and Oracle and HDFS/Hive using Sqoop.
  • Experience in metadata design and real-time BI architecture, including data governance for greater ROI.
  • Excellent experience with Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad, and FastExport.
  • Good understanding of AWS, big data concepts, and the Hadoop ecosystem.
  • Knowledge of BI and data warehousing principles, including data modeling, data quality, and extract/transform/load processes as they apply to data preparation for visual implementation.
  • Experienced Data Modeler with conceptual, logical, and physical data modeling skills, data profiling skills, and experience maintaining data quality on Teradata 15/14.
  • Good experience with JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications and queries.
  • Well experienced in normalization for OLTP systems and de-normalization of entities for enterprise data warehouses.
  • Experienced with databases including Oracle, DB2, Teradata, Netezza, SQL Server, Big Data, and NoSQL, as well as XML data.
  • Good knowledge of database creation and maintenance of physical data models with Oracle, Teradata, Netezza, DB2, and SQL Server databases.
  • Familiar with Kimball DW/BI modeling principles and knowledgeable in data warehouse modeling for different kinds of businesses.
  • Experience with the Teradata RDBMS using FastLoad, FastExport, MultiLoad, TPump, Teradata SQL Assistant, and BTEQ utilities.
  • Hands-on experience with the Hadoop ecosystem (MapReduce, HBase, Pig, Hive, Impala, Sqoop).
  • Experienced in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Ab Initio and Informatica PowerCenter.
  • Experienced in data modeling, including data validation/scrubbing and operational assumptions.
  • Very good knowledge of data analysis, data validation, data cleansing, data verification, and identifying data mismatches.
  • Extensive experience using Excel pivot tables to run and analyze result data sets, and in performing UNIX scripting.
  • Extensive experience with ETL and reporting tools such as SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
  • Experience in normalization and de-normalization processes and in logical and physical data modeling techniques.
  • Experience in database performance tuning and data access optimization, writing complex SQL queries and PL/SQL blocks such as stored procedures, functions, triggers, and cursors, and building ETL packages.
  • Experience with SQL Server and T-SQL in constructing temporary tables, table variables, triggers, user-defined functions, views, and stored procedures.
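
As a minimal illustration of the star schema design mentioned in this list, the ANSI SQL sketch below separates a fact table (measures) from its dimensions; all table and column names are hypothetical, not drawn from any client engagement:

    -- Hypothetical star schema sketch (names illustrative, ANSI SQL)
    CREATE TABLE dim_date (
        date_key      INTEGER      NOT NULL PRIMARY KEY, -- surrogate key, e.g. 20240131
        calendar_date DATE         NOT NULL,
        month_name    VARCHAR(10)  NOT NULL,
        year_number   SMALLINT     NOT NULL
    );

    CREATE TABLE dim_product (
        product_key   INTEGER      NOT NULL PRIMARY KEY, -- surrogate key
        product_code  VARCHAR(20)  NOT NULL,             -- natural/business key
        product_name  VARCHAR(100) NOT NULL,
        category      VARCHAR(50)  NOT NULL
    );

    -- Fact table: one row per product per day; measures are additive
    CREATE TABLE fact_sales (
        date_key      INTEGER       NOT NULL REFERENCES dim_date (date_key),
        product_key   INTEGER       NOT NULL REFERENCES dim_product (product_key),
        quantity_sold INTEGER       NOT NULL,            -- measure
        sales_amount  DECIMAL(12,2) NOT NULL,            -- measure
        PRIMARY KEY (date_key, product_key)
    );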

TECHNICAL SKILLS

Data Modeling Tools: Erwin r9.6/r9.5, ER/Studio, Sybase PowerDesigner, and Oracle Designer.

OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9.

ETL Tools: SSIS, Pentaho, Informatica 9.6.

Programming Languages: Java, Base SAS and SAS/SQL, SQL, T-SQL, HTML, JavaScript, CSS, UNIX shell scripting, PL/SQL.

Database Tools: Oracle 12c/11g, Teradata 14/15, Microsoft SQL Server 2014/2016, MS Access, and PostgreSQL.

Testing and Defect Tracking Tools: HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center, Requisite), MS Visio & Visual SourceSafe

Reporting Tools: Business Objects, Crystal Reports

Operating Systems: Microsoft Windows 8/7, and UNIX

Tools & Software: TOAD, MS Office, BTEQ, Teradata SQL Assistant

Big Data: Hadoop, HDFS 2, Hive, Pig, HBase, Sqoop, Flume.

Project Execution Methodologies: Ralph Kimball and Bill Inmon data warehousing methodology, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD)

PROFESSIONAL EXPERIENCE

Confidential - New Hyde Park, NY

Sr. Data Architect/Data Modeler

Responsibilities:

  • Working as an Architect to develop scalable, highly available, fault-tolerant, secure systems for on-premises, hybrid, and cloud-based data systems that meet client business needs.
  • As an Architect, implemented an MDM hub to provide clean, consistent data for an SOA implementation.
  • Implemented Agile Methodology for building Integrated Data Warehouse, involved in multiple sprints for various tracks throughout the project lifecycle.
  • Implemented various Azure platforms such as Azure SQL Database, Azure SQL Data Warehouse, Azure Analysis Services, HDInsight, Azure Data Lake, and Data Factory.
  • Involved in developing the Database Design Document, including conceptual, logical, and physical data models, using Erwin 9.64.
  • Responsible for analysis of massive and highly complex data sets, performing ad-hoc analysis and data manipulation for data integration.
  • Designed and implemented scalable cloud data and analytics architecture solutions for various public and private cloud platforms using Azure.
  • Designed and developed data architecture solutions for big data and data analytics.
  • Evaluated architecture patterns and defined best patterns for data usage, data security, and data compliance.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Oracle into HDFS using Sqoop.
  • Analyzed transaction data and developed analytical insights with statistical models using Azure Machine Learning.
  • Applied Data Governance rules (primary qualifiers, class words, and valid abbreviations in table and column names).
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Developed and presented data flow diagrams, conceptual diagrams, UML diagrams, ER flow diagrams, creating the ETL Source to Target mapping specifications and supporting documentation.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Worked on the Metadata Repository (MRM) to keep definitions and mapping rules up to date.
  • Independently coded new programs and designed tables to load and test the programs effectively for the given POC using Big Data/Hadoop.
  • Involved in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.
  • Used Windows Azure SQL Reporting Services to create reports with tables, charts, and maps.
  • Performed data modeling to differentiate between OLTP and Data Warehouse data models.
  • Developed triggers, stored procedures, functions, and packages using cursor and ref cursor concepts in PL/SQL (a minimal sketch follows this list).
  • Performed dimensional modeling of the EDW following the Kimball methodology, using the Erwin data modeling tool for data marts and data warehouses in star schemas with conformed dimensions.
  • Involved in the hands-on technical delivery of customer projects related to Azure.
  • Supported the Cloud Strategy team in integrating analytical capabilities into an overall cloud architecture and business case development.
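
A minimal Oracle PL/SQL sketch of the stored procedure/ref cursor pattern mentioned above; the procedure and table names are illustrative assumptions, not the project's actual objects:

    -- Hypothetical Oracle PL/SQL sketch; names are illustrative only.
    CREATE OR REPLACE PROCEDURE get_customer_orders (
        p_customer_id IN  NUMBER,
        p_orders      OUT SYS_REFCURSOR
    ) AS
    BEGIN
        -- Open a ref cursor that the caller can fetch from
        OPEN p_orders FOR
            SELECT order_id, order_date, order_total
            FROM   orders
            WHERE  customer_id = p_customer_id
            ORDER  BY order_date DESC;
    END get_customer_orders;
    /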

Environment: Erwin r9.6, Azure, Oracle 12c, OLAP, OLTP, T-SQL, SQL, Linux, MDM, Hadoop, MapReduce, Pig, HBase, PL/SQL.

Confidential - West Point, PA

Data Architect/Data Modeler

Responsibilities:

  • Heavily involved in the Data Architect role, reviewing business requirements and composing source-to-target data mapping documents.
  • Researched, evaluated, architected, and deployed new tools, frameworks, and patterns to build sustainable Big Data platforms for our clients.
  • Created logical/physical data models using Erwin for new requirements and existing databases, maintained database standards, provided architectural guidance for various data design/integration/migration and analyzed data in different systems.
  • Developed and configured the Informatica MDM hub supporting the Master Data Management (MDM), Business Intelligence (BI), and Data Warehousing platforms to meet business needs.
  • Used load utilities (FastLoad & MultiLoad) with the mainframe interface to load the data into Teradata.
  • Performed data reconciliation activities between source and EDW Teradata databases.
  • Demonstrated experience in design and implementation of an enterprise data model, metadata solution and data life cycle management in both RDBMS, Big Data environments.
  • Applied architectural and technology concepts to address scalability, security, reliability, maintainability, and sharing of enterprise data.
  • Performed data profiling and imported/updated table definitions using SQL and PowerCenter Designer.
  • Changed session properties to override setting values, including source/target table names, schema names, source queries, connection strings, update strategy, and log detail tracing level.
  • Involved in logical modeling using dimensional modeling techniques such as star schema and snowflake schema.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
  • Worked on database design, relational integrity constraints, OLAP, OLTP, Cubes and Normalization (3NF) & De-normalization of database.
  • Worked with various Teradata 15 tools and utilities, including Teradata Viewpoint, MultiLoad, ARC, Teradata Administrator, BTEQ, and other Teradata utilities.
  • Supported the legacy ETL process developed on Teradata.
  • Developed Reports using Business Objects as per the Client Requirements.
  • Involved in several facets of MDM implementations including Data Profiling, metadata acquisition and data migration.
  • Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture.
  • Tuned SQL statements and analyzed query performance issues in Teradata (a minimal tuning sketch follows this list).
  • Developed logical and physical model for schemas, created standard data documentation, such as ERD and Flow Diagram.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
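
A minimal Teradata SQL sketch of the query tuning workflow mentioned above (collecting statistics, then inspecting the optimizer's plan with EXPLAIN); table and column names are illustrative assumptions:

    -- Hypothetical Teradata SQL sketch; names are illustrative only.
    -- Collect statistics so the optimizer can choose a better join plan:
    COLLECT STATISTICS ON sales_fact COLUMN (customer_id);
    COLLECT STATISTICS ON sales_fact COLUMN (sale_date);

    -- Inspect the plan (row estimates, join order, redistribution steps):
    EXPLAIN
    SELECT c.customer_name, SUM(s.sale_amount)
    FROM   sales_fact s
    JOIN   customer_dim c ON s.customer_id = c.customer_id
    GROUP  BY c.customer_name;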

Environment: Erwin r9.6, MDM, Hadoop, HBase, HDFS, Sqoop, UNIX, Teradata SQL Assistant, PL/SQL, ETL, OLAP, OLTP, Oracle 12c, and Teradata 15

Confidential - Washington, DC

Sr. Data Modeler /Data Architect

Responsibilities:

  • Accountable for defining database physical structure, functional capabilities, and security, backup, and recovery specifications and for the installation of database systems.
  • Maintain database performance through the resolution of application development and production issues.
  • Presented data scenarios via Erwin 9.5 logical models and Excel mockups to visualize the data better.
  • Loaded data directly from Oracle to Netezza without any intermediate files.
  • Designed the layout for Extraction, Data Pump, and Replication parameters for data replication/data distribution.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Worked on the Cassandra NoSQL database; implemented a multi-data-center, multi-rack Cassandra cluster.
  • Designed and produced client reports using Tableau and SAS.
  • Worked closely with the enterprise architect, product manager, and technical team to develop the data architecture for a new banking product.
  • Developed Master Data Management strategies for storing reference data.
  • Represented conceptual, logical, and physical data models using relational and NoSQL databases.
  • Created logical and physical data models following Cassandra's query-first data model (a minimal CQL sketch follows this list).
  • Installed, configured, and created the Cassandra database.
  • Fine-tuned Cassandra database performance by configuring commit logs, memtables, and cache sizes.
  • Involved in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.
  • Worked on importing and cleansing high-volume data from flat files and SQL Server using PL/SQL.
  • Worked on Data Warehousing concepts such as the Ralph Kimball methodology, the Bill Inmon methodology, OLAP, OLTP, star schema, snowflake schema, fact tables, and dimension tables.
  • Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
  • Worked with ETL developer for design and development of Extract, Transform and Load processes for data integration projects to build data marts.
  • Created and implemented MDM data model for Consumer/Provider for HealthCare MDM product from Variant.
  • Performed data analysis and statistical analysis, and generated reports, listings, and graphs using SAS tools: SAS/Base, SAS/Macros, SAS/Graph, SAS/SQL, SAS/Connect, and SAS/Access.
  • Worked in the capacity of ETL Developer (Oracle Data Integrator (ODI)/PL/SQL) to migrate data from different sources into the target Oracle Data Warehouse.
  • Strong knowledge of data modeling concepts: star schema/snowflake modeling, fact & dimension tables, and logical & physical data modeling.
  • Worked with Informatica utilities: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
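
A minimal CQL sketch of the Cassandra modeling mentioned above: tables are designed around the query, with a partition key to locate rows and clustering columns to order them. Keyspace, table, and column names are illustrative assumptions:

    -- Hypothetical CQL sketch; names are illustrative only.
    -- Replication spans two data centers, matching a multi-DC cluster.
    CREATE KEYSPACE IF NOT EXISTS banking
        WITH replication = {'class': 'NetworkTopologyStrategy', 'dc1': 3, 'dc2': 3};

    -- Partition key groups an account's rows; clustering columns order
    -- them newest-first, so "latest transactions" reads are sequential.
    CREATE TABLE IF NOT EXISTS banking.transactions_by_account (
        account_id uuid,
        txn_time   timestamp,
        txn_id     uuid,
        amount     decimal,
        merchant   text,
        PRIMARY KEY ((account_id), txn_time, txn_id)
    ) WITH CLUSTERING ORDER BY (txn_time DESC, txn_id ASC);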

Environment: Erwin 9.5, PL/SQL, ODS, OLAP, OLTP, SAS, MDM, Tableau, Cassandra, Flat Files, Hadoop, HDFS, Pig, UNIX

Confidential - San Francisco, CA

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Designed ER diagrams (physical and logical, using Erwin), mapped the data into database objects, and produced logical/physical data models.
  • Performed data analysis, data migration, and data profiling using complex SQL on various source systems, including SQL Server and Teradata 13.1.
  • Involved in normalization/de-normalization, normal forms, and database design methodology; expertise in using data modeling tools such as MS Visio and Erwin for logical and physical database design.
  • Involved in data analysis and modeling for the OLAP and OLTP environment.
  • Planned, scheduled, and controlled projects based on plans and requirements outlined by the business.
  • Conducted JAD sessions with SMEs, stakeholders, and other management teams to finalize the User Requirement Documentation.
  • Reviewed Business Requirement Documents and the Technical Specification.
  • Involved in writing test plans and test cases using Mercury Quality Center.
  • Coordinated with Track Leads and Project Manager to setup the pre-validation and validation environment to execute the test scripts.
  • Participated in writing data mapping documents and performing gap analysis on the systems.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extensively used Base SAS, SAS/Macro, SAS/SQL, and Excel to develop code and generate various analytical reports.
  • Used the Model Manager option in Erwin to synchronize data models via the Model Mart approach.
  • Worked with the ETL team to document the transformation rules for data migration from OLTP to Warehouse environment for reporting purposes.
  • Worked on importing and cleansing high-volume data from SQL Server 2008.
  • Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Developed code per the client's requirements using SQL, PL/SQL, and Data Warehousing concepts.
  • Wrote T-SQL statements for data retrieval and was involved in performance tuning of T-SQL queries and stored procedures (a minimal sketch follows this list).
  • Used Performance Monitor and SQL Profiler to optimize queries and enhanced the performance of database servers.
  • Created SSIS packages to export data from text files to the SQL Server database.
  • Created new replications, DTS packages to maintain high availability of the data between the servers.
  • Gathered business requirements by organizing and managing meetings with business stakeholders, application architects, technical architects, and IT analysts on a scheduled basis.
  • Analyzed the business requirements by dividing them into subject areas and understood the data flow within the organization.
  • Created a data mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Created and maintained metadata, including table and column definitions.
  • Involved in a project whose purpose was to migrate the current Optum Rx Data Warehouse from the iSeries database environment to a Netezza appliance.
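
A minimal T-SQL sketch of the temp-table/stored-procedure work mentioned above; object names are illustrative assumptions, not the project's actual schema:

    -- Hypothetical T-SQL sketch; names are illustrative only.
    CREATE PROCEDURE dbo.usp_LoadDailyOrders
        @LoadDate DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Stage the day's rows in a temp table for validation/cleansing
        SELECT OrderID, CustomerID, OrderTotal
        INTO   #StagedOrders
        FROM   dbo.SourceOrders
        WHERE  OrderDate = @LoadDate;

        -- Load only rows that pass a basic quality check
        INSERT INTO dbo.FactOrders (OrderID, CustomerID, OrderTotal, LoadDate)
        SELECT OrderID, CustomerID, OrderTotal, @LoadDate
        FROM   #StagedOrders
        WHERE  OrderTotal >= 0;
    END;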

Environment: Erwin 8.0, SQL Server 2008, OLAP, OLTP, Flat Files, Metadata, Teradata 13, PL/SQL, BTEQ, UNIX, Microsoft SQL Server, TOAD.

Confidential - Englewood Cliffs, NJ

Data Analyst/Data Modeler

Responsibilities:

  • Developed SQL and PL/SQL scripts for migration of data between databases.
  • Worked on the MySQL database, writing simple queries and stored procedures for normalization/de-normalization (a minimal sketch follows this list).
  • Developed logical and physical data models capturing current-state/future-state data elements and data flows using Erwin and star schemas.
  • Generated customized reports using SAS/MACRO facility, PROC REPORT, PROC TABULATE and PROC SQL.
  • Created data flow diagrams and process flow diagrams for various load components like FTP Load, SQL Loader Load, ETL process and various other processes that required transformation.
  • Prepared the strategy to implement ETL code in production and to load both historical and incremental data in production.
  • Analyzed user statement of requirements to develop new reports.
  • Analyzed user requirements, attended change request meetings to document changes and implemented procedures to test changes.
  • Gathered statistics on data objects using ANALYZE and DBMS_STATS.
  • Involved in mapping spreadsheets that provide the Data Warehouse development (ETL) team with source-to-target data mappings, inclusive of logical names, physical names, data types, domain definitions, and corporate metadata definitions.
  • Converted logical models into physical database models to build/generate DDL scripts.
  • Created the mapping document and mappings in the MDM Hub ORS using various cleanse lists and cleanse functions.
  • Coordinated with the DBA in implementing database changes and updating data models with the changes implemented in development, QA, and production.
  • Knowledgeable in merging several flat files into one XML file.
  • Extensively worked on creating, altering, and dropping tables in various development environments as well as production.
  • Extensively worked with logical models, converting them into complete physical models that could be implemented in the database.
  • Worked extensively with developers to analyze the impact of changes on the affected applications, choose the design approach, and implement the same changes in the database with the help of the DBA.
  • Prepared in-depth data analysis reports weekly, biweekly, and monthly using MS Excel, SQL, and UNIX.
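
A minimal MySQL sketch of a stored procedure that de-normalizes rows into a flat reporting table, in the spirit of the bullet above; all object names are illustrative assumptions:

    -- Hypothetical MySQL sketch; names are illustrative only.
    -- De-normalizes customer + address rows into a flat reporting table.
    DELIMITER //
    CREATE PROCEDURE refresh_customer_flat()
    BEGIN
        TRUNCATE TABLE customer_flat;
        INSERT INTO customer_flat (customer_id, customer_name, city, country)
        SELECT c.customer_id, c.customer_name, a.city, a.country
        FROM   customer c
        JOIN   address  a ON a.address_id = c.address_id;
    END //
    DELIMITER ;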

Environment: Erwin, Informatica ETL, PL/SQL, Flat Files, SAS, ETL, Metadata, MySQL, UNIX, MS Excel.
