Sr. Data Architect/ Data Modeler Resume

Charleston, WV

SUMMARY:

  • Around 7 years of IT experience as a Sr. Data Architect/Data Modeler and Data Analyst in the design, development, testing, and maintenance of data warehouse, business intelligence, and operational data systems.
  • Architect and design conceptual, logical and physical models and build data marts using hybrid Inmon and Kimball DW methodologies.
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems (RDBMS) and from RDBMS to HDFS (see the illustrative sketch after this list).
  • Experience as a dedicated Data Warehouse Architect, with expertise in designing and developing data models.
  • Hands-on experience with various data architecture and ETL subsystems and patterns, including Change Data Capture (CDC), Slowly Changing Dimensions (SCD), data cleansing, auditing, and validation.
  • Experience in developing MapReduce programs using Apache Hadoop to analyze big data as per requirements.
  • Knowledge of and working experience with big data tools such as Hadoop, Azure Data Lake, and AWS Redshift.
  • Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirement gathering and analysis.
  • Extensive experience in shell scripting and scripting languages such as Python, Perl, and Ruby.
  • Strong experience architecting high-performance databases using PostgreSQL, PostGIS, MySQL, and Cassandra.
  • Specialization in data modeling, data warehouse design, conceptual architecture, data integration, and business intelligence solutions.
  • Experience in working with business intelligence and data warehouse software, including SSAS, Pentaho, Cognos, OBIEE, QlikView, Greenplum Database, Amazon Redshift, and Azure SQL Data Warehouse.
  • Experienced in Hadoop Ecosystem development using MapReduce, HDFS, Hive, Pig, HBase, Sqoop, Flume, Spark, and Oozie.
  • Excellent understanding of MDM approaches, including creating a data dictionary and using Informatica or other tools to map sources to the target MDM data model.
  • Experience with various Teradata utilities, including FastLoad, MultiLoad, BTEQ, and Teradata SQL Assistant.
  • Experience in integrating various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, and NoSQL databases.
  • Experienced in dimensional and relational data modeling using Star and Snowflake schemas, OLTP/OLAP systems, fact and dimension tables, and conceptual, logical, and physical data modeling using Erwin.
  • Experienced in SQL and PL/SQL packages, functions, stored procedures, triggers, and materialized views to implement business logic in Oracle databases.
  • Good knowledge of Data Marts, Operational Data Store (ODS), Dimensional Data Modeling.
  • Strong background in data modeling tools, including Erwin, ER/Studio, and PowerDesigner.
  • Extensively used Erwin for reverse engineering, forward engineering, subject areas, domains, naming standards documents, etc.
  • Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS).
  • Excellent understanding of and working experience with industry-standard methodologies such as the System Development Life Cycle (SDLC), Rational Unified Process (RUP), Agile, and Waterfall.
  • Expertise in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.
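
For illustration, a minimal sketch of the Sqoop transfer pattern referenced above. The JDBC URL, credentials, table names, and HDFS paths are hypothetical placeholders, not an actual environment:

    #!/bin/sh
    # Hypothetical Sqoop sketch: import an RDBMS table into HDFS,
    # then export processed results back to the RDBMS.
    # All connection details and object names are placeholders.

    # Import an Oracle table into HDFS as tab-delimited text, 4 mappers
    sqoop import \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username etl_user -P \
      --table SALES.ORDERS \
      --target-dir /data/raw/orders \
      --num-mappers 4 \
      --fields-terminated-by '\t'

    # Export a processed HDFS directory back into an RDBMS summary table
    sqoop export \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username etl_user -P \
      --table SALES.ORDER_SUMMARY \
      --export-dir /data/curated/order_summary \
      --input-fields-terminated-by '\t'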

TECHNICAL SKILLS:

Analysis and Modeling Tools: Erwin 9.6/9.5, Sybase Power Designer, Oracle Designer, ER/Studio 9.7

Database Tools: Microsoft SQL Server 2014/2012, Teradata 15/14, Oracle 12c/11g, MS Access, PostgreSQL, Netezza.

Cloud: AWS, EC2, S3, Azure.

OLAP Tools: Tableau, SAP Business Objects, SSAS, and Crystal Reports 9.

ETL Tools: SSIS, Pentaho, Informatica PowerCenter 9.6, etc.

Web technologies: HTML, DHTML, XML, JavaScript

Reporting Tools: Business Objects, Crystal Reports

Operating Systems: Microsoft Windows 7/8/XP / Vista, Linux and UNIX

Tools & Software: TOAD, MS Office, BTEQ, SQL Assistant

Big Data: Hadoop, HDFS 2, Hive, Pig, HBase, Sqoop, Flume.

Other Tools: TOAD, SQL*Plus, SQL*Loader, MS Project, MS Visio, and MS Office; have also worked with C++, UNIX, PL/SQL, etc.

PROFESSIONAL EXPERIENCE:

Confidential, Charleston, WV

Sr. Data Architect/ Data Modeler

Responsibilities:

  • As an Architect, implemented an MDM hub to provide clean, consistent data for a SOA implementation.
  • Heavily involved in the Data Architect role, reviewing business requirements and composing source-to-target data mapping documents.
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata (see the illustrative sketch after this list).
  • Defined and deployed monitoring, metrics, and logging systems on AWS.
  • Responsible for technical data governance, enterprise-wide data modeling, and database design.
  • Implemented data warehouse designs to collect and extract/transform/load legacy data into the core SAP system.
  • Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using ER Studio.
  • Incorporated business requirements in quality conceptual, logical data models using ER Studio and created physical data models using forward engineering techniques to generate DDL scripts.
  • Designed normalized and star schema data architectures using ER Studio and forward-engineered these structures into Teradata.
  • Responsible for Big data initiatives and engagement including analysis, brainstorming, POC, and architecture.
  • Demonstrated expertise utilizing ETL tools, including SQL Server Integration Services (SSIS), Data Transformation Services (DTS), and DataStage, along with ETL package design, and RDBMS platforms such as SQL Server, Oracle, and DB2.
  • Reviewed and patched Netezza and Oracle environments, including DB2, OS, and server firmware.
  • Selecting the appropriate AWS service based on data, compute, database, or security requirements.
  • Used Flume extensively in gathering and moving log data files from Application Servers to a central location in Hadoop Distributed File System (HDFS) for data science.
  • Worked on Amazon Redshift and AWS, architecting a solution to load data, create data models, and run BI on it.
  • Used Teradata Administrator and Teradata Manager tools to monitor and control the system.
  • Developed and configured an Informatica MDM hub supporting the Master Data Management (MDM), Business Intelligence (BI), and Data Warehousing platforms to meet business needs.
  • Worked with the Hadoop ecosystem, covering HDFS, HBase, YARN, and MapReduce.
  • Worked with different database types such as DynamoDB and Redshift.
  • Developed, implemented, and maintained conceptual, logical, and physical data models using ER/Studio 9, including forward- and reverse-engineered databases.
  • Involved in Agile project management environment.
  • Developed PL/SQL scripts to validate and load data into interface tables.
  • Involved in maintaining data integrity between Oracle and SQL databases.
  • Translate business and data requirements into Logical data models in support of Enterprise Data Models, ODS, OLAP, OLTP, Operational Data Structures and Analytical systems.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Worked extensively on ER Studio for multiple Operations across Hartford in both OLAP and OLTP applications.
  • Involved in OLAP modeling based on dimensions and facts for efficient loads of data, using multi-dimensional models such as Star and Snowflake schemas for various levels of reports.
  • Performed data mapping and data design (data modeling) to integrate data across multiple databases into the EDW.
  • Perform administrative tasks, including creation of database objects such as database, tables, and views, using SQL DCL, DDL, and DML requests.
  • Participated in several project activities including Data Architect design, ETL design, QA support, Code review.
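
For illustration, a minimal BTEQ wrapper of the kind referenced above for submitting SQL and moving data in Teradata. The TDPID, credentials, and table names are hypothetical placeholders:

    #!/bin/sh
    # Hypothetical BTEQ sketch: export a report and import a small
    # delimited file in one logon session. All names are placeholders.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,secret;

    .EXPORT REPORT FILE = /tmp/policy_counts.rpt;
    SELECT policy_type, COUNT(*) AS policy_cnt
    FROM   edw.policy
    GROUP  BY policy_type;
    .EXPORT RESET;

    .IMPORT VARTEXT ',' FILE = /tmp/new_policies.csv;
    .REPEAT *
    USING (policy_id VARCHAR(18), policy_type VARCHAR(10))
    INSERT INTO edw.policy_stg (policy_id, policy_type)
    VALUES (:policy_id, :policy_type);

    .LOGOFF;
    .QUIT;
    EOF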

Environment: ER/Studio 9.7, AWS, Teradata 15, Star Schema, Snowflake Schema, OLAP, OLTP, Oracle 12c, ODS, Business Objects, MDM, Hadoop, Spark, Cassandra, SAP.

Confidential, Peoria, IL

Sr. Data Modeler

Responsibilities:

  • Developing full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Independently coded new programs and designed tables to load and test the programs effectively for the given POCs using Big Data/Hadoop.
  • Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
  • Involved in the review of analytics Azure SQL mart designs for the MCS and Premier features.
  • Established and developed new database maintenance scripts to automate Netezza database management and monitoring (see the illustrative sketch after this list).
  • Utilized the Informatica toolset (Informatica Data Explorer and Informatica Data Quality) to analyze legacy data for data profiling.
  • Performed database health checks and tuned the databases using Teradata Manager.
  • Experience using MapReduce and big data tooling on Hadoop and other NoSQL platforms.
  • Developed, managed and validated existing data models including logical and physical models of the data warehouse and source systems utilizing a 3NF model.
  • Used Star Schema and Snowflake Schema methodologies in building and designing the Logical Data Model into Dimensional Models.
  • Implemented logical and physical relational database and maintained Database Objects in the data model using Erwin 9.5.
  • Worked the full life cycle of data lake and data warehouse builds with big data technologies such as Spark and Hadoop.
  • Involved in debugging and tuning PL/SQL code, query tuning, and optimization for Oracle and DB2 databases.
  • Worked on a business intelligence solution using Redshift and Tableau.
  • Developed data mapping, data governance, and transformation and cleansing rules involving OLTP and ODS.
  • Designed and deployed scalable, highly available, and fault tolerant systems on MS Azure.
  • Developed various QlikView data models by extracting and using data from various source files, DB2, Excel, flat files, and big data.
  • Involved in developing the Database Design Document, including conceptual, logical, and physical data models, using Erwin 9.5.
  • Provided AWS-based cloud support, technical expertise, and guidance on Azure integration and proposals/RFIs/RFPs to the sales team and executive contract management.
  • Responsible for data modeling and managing the data warehouse environments which include mentoring junior DBAs.
  • Developed and implemented strategy for the extraction of data from the different sources and loading into appropriate data-mart composition.
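
A minimal sketch of the kind of Netezza maintenance automation mentioned above. The database and table names are placeholders for the real maintenance inventory:

    #!/bin/sh
    # Hypothetical nzsql maintenance sketch: groom tables to reclaim
    # space from deleted/updated rows, then refresh optimizer stats.
    DB=edw_prod
    for TBL in order_fact customer_dim product_dim; do
        nzsql -d "$DB" -c "GROOM TABLE $TBL RECORDS ALL;"
        nzsql -d "$DB" -c "GENERATE STATISTICS ON $TBL;"
    done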

Environment: Erwin r9.5, Oracle 11g, DB2, Flat files, QlikView, Star Schema, Snowflake Schema, T-SQL, MySQL, Hadoop, Hive, MapReduce, MDM, PL/SQL, OLAP, OLTP, ODS, Azure

Confidential, Hunt Valley, MD

Sr. Data Modeler/Analyst

Responsibilities:

  • Interacted with Business Analysts, SMEs, and other Data Architects to understand business needs and functionality for various project solutions.
  • Translated high-level business requirements into solution and infrastructure architectures, involving enterprise architects to align with the strategic view.
  • Defined the architecture and various phases for the implementation of the transactional and data warehouse systems.
  • Worked the full life cycle of data lake and data warehouse builds with big data technologies such as Spark and Hadoop.
  • Architected and delivered various complex OLAP databases/cubes, scorecards, dashboards and reports.
  • Assigned tasks among development team, monitored and tracked progress of project following Agile methodology.
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Worked on generating and documenting metadata while designing OLTP and OLAP systems environments.
  • Involved in relational and dimensional data modeling, creating logical and physical designs of the database and ER diagrams using data modeling tools such as Erwin.
  • Worked with data governance, data quality, data lineage, and data architecture teams to design various models and processes.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and DB2 (see the illustrative sketch after this list).
  • Created logical and physical designs in Erwin and used Erwin to develop data models using Star Schema methodologies.
  • Controlled data cube development based on requirement prioritization and consolidation.
  • Conducted learning sessions for business users, analysts, and the QA team to provide insight into the data model and provided documentation.
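
A minimal sketch of the kind of profiling SQL referenced above, run through SQL*Plus. The connection string, table, and column names are hypothetical:

    #!/bin/sh
    # Hypothetical profiling sketch: row count, null rate, and
    # cardinality for one column of a source table. Names are
    # placeholders, not a real source system.
    sqlplus -s analyst/secret@ORCL <<'EOF'
    SET PAGESIZE 100 LINESIZE 200
    SELECT COUNT(*)                    AS row_cnt,
           COUNT(cust_email)           AS non_null_cnt,
           COUNT(DISTINCT cust_email)  AS distinct_cnt,
           ROUND(100 * (COUNT(*) - COUNT(cust_email)) / COUNT(*), 2)
                                       AS null_pct
    FROM   src_customer;
    EXIT;
    EOF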

Environment: Erwin 9.x, DB2, Hadoop, Spark, Star Schema, Snowflake Schema, ETL, ODS, Agile, PL/SQL, OLTP, Oracle 11g/12c, SQL Server, BTEQ, MS Access, Excel.

Confidential, New York, NY

Data Modeler/ Data Analyst

Responsibilities:

  • Worked as a Data Modeler/Analyst to generate data models using Erwin and developed relational database systems.
  • Designed and developed Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object oriented Design) using UML and Visio.
  • Created DDL scripts using Erwin and source to target mappings to bring the data from source to the warehouse.
  • Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Worked with Architecture team to get the metadata approved for the new data elements that are added for this project.
  • Designed a Star schema for the detailed data marts and plan data marts involving shared (conformed) dimensions (see the illustrative sketch after this list).
  • Generated DDL (Data Definition Language) scripts using Erwin and supported the DBA in Physical Implementation of data Models.
  • Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information.
  • Experience in Data Extraction/Transformation/Loading (ETL), Data Conversion and Data Migration by using SQL Server 2008 Integration Services (SSIS) and PL/SQL Scripts.
  • Worked on Oracle PL/SQL and shell scripts, packages, scheduling, data import/export, data conversions, and data cleansing.
  • Developed database objects such as tables, views, and materialized views using SQL.
  • Involved in analysis of business requirements, design and development of high-level and low-level designs, and unit and integration testing.
  • Implemented data governance functionally and practically, and was also responsible for designing common data governance frameworks.
  • Created Data flow diagrams for current system. Developed a data topology based on the data storage, various replications, and movements of data.
  • Created and maintained data dictionaries, and maintained parameters to provide faster query performance using SQL.
  • Performed Data Analysis and data profiling using complex SQL on various sources systems including Oracle 10g/11g and Teradata.
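
For illustration, a minimal DDL sketch of a star schema whose date dimension is conformed (shared) between the detail and plan marts, as described above. Table and column names are illustrative, not the Erwin-generated output:

    #!/bin/sh
    # Hypothetical star-schema DDL applied through SQL*Plus.
    # Both fact tables reference the same conformed date dimension.
    sqlplus -s dw_owner/secret@ORCL <<'EOF'
    CREATE TABLE dim_date (
        date_key      NUMBER(8)  NOT NULL,  -- YYYYMMDD surrogate key
        calendar_date DATE       NOT NULL,
        fiscal_month  VARCHAR2(7),
        CONSTRAINT pk_dim_date PRIMARY KEY (date_key)
    );

    CREATE TABLE fact_sales_detail (
        date_key    NUMBER(8)    NOT NULL,
        product_key NUMBER(10)   NOT NULL,
        sales_amt   NUMBER(12,2),
        CONSTRAINT fk_detail_date FOREIGN KEY (date_key)
            REFERENCES dim_date (date_key)
    );

    -- The plan mart conforms to the same date dimension
    CREATE TABLE fact_sales_plan (
        date_key NUMBER(8)       NOT NULL,
        plan_amt NUMBER(12,2),
        CONSTRAINT fk_plan_date FOREIGN KEY (date_key)
            REFERENCES dim_date (date_key)
    );
    EXIT;
    EOF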

Environment: Erwin, SSIS, SSRS, Teradata, PL/SQL, Oracle 9i, SQL Server 2008, OLAP, OLTP, Star Schema, Snowflake Schema, ODS.

Confidential

Data Analyst

Responsibilities:

  • Gathered business requirements by organizing and managing meetings with business stakeholders, application architects, technical architects, and IT analysts on a scheduled basis.
  • Communicated with users and business analysts to gather requirements. Involved in business process modeling using UML through Rational Rose.
  • Reverse-engineered the data models, identified the data elements in the source systems, and added new data elements to the existing data models.
  • Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
  • Executed UNIX shell scripts that invoked SQL*Loader to load data into tables (see the illustrative sketch after this list).
  • Participated in several JAD (Joint Application Design/Development) sessions to track end to end flow of attributes starting from source screens to all the downstream systems.
  • Involved in data profiling and data cleansing, making sure the data was accurate and analyzed as it transferred from OLTP systems to the data marts and data warehouse.
  • Involved in data extraction, validation, and analysis of the data and its storage in data marts.
  • Involved in performance tuning by leveraging the Oracle EXPLAIN PLAN utility and SQL tuning.
  • Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
  • Worked extensively with the Erwin Data Modeler tool to design data models.
  • Consolidated existing SSRS reports into MicroStrategy reports without losing functionality.
  • Worked with the DBA group to create a best-fit physical data model from the logical data model through forward engineering in Erwin.
  • Responsible for metadata management, keeping centralized metadata repositories up to date using Erwin modeling tools.
  • Documented logical, physical, relational and dimensional data models. Designed the Data Marts in dimensional data modeling using star and snowflake schemas.
  • Worked with Data Architects and Data Modelers extensively sharing knowledge and ideas in designing Data Models.
  • Worked between the application team and the data team to gather business requirements and build the database objects accordingly.
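
A minimal sketch of the SQL*Loader pattern referenced above: a shell script that writes a control file and invokes sqlldr. File names, the staging table, and credentials are placeholders:

    #!/bin/sh
    # Hypothetical SQL*Loader sketch: load a delimited extract into
    # a staging table, capturing rejected rows in a bad file.
    cat > /tmp/customer.ctl <<'EOF'
    LOAD DATA
    INFILE '/tmp/customer_extract.csv'
    APPEND INTO TABLE stg_customer
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (cust_id, cust_name, cust_city)
    EOF

    sqlldr userid=stg_user/secret@ORCL \
           control=/tmp/customer.ctl \
           log=/tmp/customer.log \
           bad=/tmp/customer.bad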

Environment: Erwin 7.0, Metadata, SSRS, OLAP, OLTP, Oracle 9i/8i, Oracle SQL Developer, PL/SQL, Business Objects, MS Office suite.
