
Data Modeler Resume

SUMMARY

  • Over 10 years of Information Technology (IT) experience as a Sr. Data Modeler/Data Architect and Data Analyst in the design, development, testing and maintenance of data warehouse, business intelligence and operational data systems
  • Excellent understanding of the MDM approach to creating a data dictionary and of using Informatica or other tools to map data from sources to the target MDM data model
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems (RDBMS) and from RDBMS to HDFS
  • Knowledge and working experience on big data tools like Hadoop, AWS Redshift
  • Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS)
  • Excellent understanding of and working experience with industry-standard methodologies across the System Development Life Cycle (SDLC), including Rational Unified Process (RUP), Agile and Waterfall
  • Experience in Logical/Physical Data Modeling, Relational and Dimensional Modeling for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems (ODS, ADS and Data Marts)
  • Experience in developing Entity-Relationship diagrams and modeling transactional databases and data warehouses using tools like Erwin, ER/Studio and PowerDesigner
  • Efficient in building enterprise data warehouses using the Kimball and Inmon methodologies
  • Experience in designing Star and Snowflake schemas for data warehouses using tools like Erwin Data Modeler, PowerDesigner and Embarcadero ER/Studio (a forward-engineered DDL sketch follows this summary)
  • Good knowledge of SQL queries and of creating database objects like stored procedures, triggers, packages and functions using SQL and PL/SQL to implement business rules
  • Having good knowledge of Hive, Sqoop, MR, Storm, Pig, HBase, Flume, and Spark
  • Involved in various data modeling tasks that included forward engineering, reverse engineering, complete compare, creating DDL scripts, creating subject areas etc
  • Good knowledge of Big Data technologies: Hadoop, AWS Cloud, Amazon Redshift, AWS EC2, AWS EC3, AWS Lambda, MongoDB and Python
  • Good knowledge of Big Data frameworks and standards such as Hadoop, Hive, Spark, Pig and the Apache big data ecosystem
  • Proficient in UML Modeling like Use Case Diagrams, Activity Diagrams, and Sequence Diagrams with Rational Rose and MS Visio
  • Excellent understanding of Tableau workbooks and background tasks for targeted interactions validation
  • Experience in NoSQL databases and tools like Apache HBase to handle massive data tables containing billions of rows and millions of columns
  • Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain and Queryman), Teradata parallel support and UNIX shell scripting
  • Well versed in conducting Gap analysis, Joint Application Design (JAD) sessions, User Acceptance Testing (UAT), cost-benefit analysis and ROI analysis
  • Good understanding of Normalization (1NF, 2NF, 3NF and BCNF) techniques for OLTP environments and Denormalization techniques for improved database performance in OLAP environments
  • Experience in designing Enterprise Data Warehouses, Reporting Data Stores (RDS) and Operational Data Stores (ODS)
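
A minimal sketch of what a forward-engineered star schema from the tools above might look like, assuming a simple sales subject area; all table and column names are hypothetical and shown only to illustrate the fact/dimension pattern:

    -- Hypothetical Oracle-style star schema fragment (illustrative names only)
    CREATE TABLE dim_customer (
        customer_key   NUMBER        PRIMARY KEY,   -- surrogate key
        customer_id    VARCHAR2(20)  NOT NULL,      -- natural/business key
        customer_name  VARCHAR2(100),
        region         VARCHAR2(50)
    );

    CREATE TABLE dim_date (
        date_key       NUMBER        PRIMARY KEY,   -- e.g. 20160131
        calendar_date  DATE          NOT NULL,
        fiscal_quarter VARCHAR2(6)
    );

    CREATE TABLE fact_sales (
        date_key       NUMBER NOT NULL REFERENCES dim_date (date_key),
        customer_key   NUMBER NOT NULL REFERENCES dim_customer (customer_key),
        sales_amount   NUMBER(12,2),
        units_sold     NUMBER,
        CONSTRAINT pk_fact_sales PRIMARY KEY (date_key, customer_key)
    );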

TECHNICAL SKILLS

Data Modeling Tools: Erwin 9.7, IBM InfoSphere Data Architect, ER/Studio 17, PowerDesigner and Oracle SQL Developer

Big Data Technologies: MapReduce, HBase, HDFS, Sqoop, Hadoop, Hive, Pig

Cloud Platforms: Amazon EC2, EC3, Elasticsearch, Elastic Load Balancing

Operating System: Windows 8/10, UNIX, Sun Solaris.

Database Tools: Microsoft SQL Server 16.0, Teradata 16.0, Oracle 12c and MS Access

BI Tools: Tableau 10, Tableau server, Tableau Reader, Crystal Reports

Packages: Microsoft Office 2016, Microsoft Project 2016, SAP, Microsoft Visio and SharePoint Portal Server

Version Tool: VSS, SVN, CVS.

Programming Languages: Oracle PL/SQL, UNIX Shell Scripting

Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, Rational Unified Process (RUP), Rapid Application Development (RAD), and Joint Application Development (JAD)

PROFESSIONAL EXPERIENCE

Confidential

Data Modeler

Responsibilities:

  • Responsible for data architecture design delivery, data model development, content creation, review and approval; used Agile methodology for data warehouse development
  • Worked with the Big Data Hadoop ecosystem on ingestion, storage, querying, processing and analysis of big data alongside conventional RDBMS data
  • Designed the Common Information Model (CIM) using the IBM InfoSphere Data Architect data modeling tool
  • Developed and automated multiple departmental Reports using Tableau and MS Excel
  • Responsible for all metadata relating to the EDW's overall data architecture, descriptions of data objects, access methods and security requirements
  • Involved in relational and dimensional data modeling for creating the logical and physical design of the database and ER diagrams using data modeling tools like Erwin
  • Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using ERWIN
  • Designed both OLTP and ODS databases for high performance using ERWIN modeling tool
  • Worked on Normalization and De-Normalization techniques for both OLTP and OLAP systems
  • Involved in OLAP modeling based on dimensions and facts for efficient data loads, using multi-dimensional models such as Star and Snowflake schemas across reporting levels
  • Established uniform Master Data Dictionary and Mapping rules for metadata, data mapping and lineage
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS and extracted data from Oracle into HDFS using Sqoop (see the HiveQL sketch after this list)
  • Worked on delivery of Data & Analytics applications involving structured and unstructured data on Hadoop-based platforms on AWS EMR
  • Designed and implemented Oracle PL/SQL store procedures, functions and packages for data manipulation and validation
  • Involved in all steps and the scope of the project's reference data approach to MDM; created the data dictionary and data mapping from sources to the target in the MDM data model
  • Participated in the creation of Business Objects Universes using complex and advanced database features
  • Developed data mapping, data governance, transformation and cleansing rules for the Master Data Management architecture involving OLTP and ODS systems
  • Worked on building the Aptitude Operational Data Store (ODS) model in an Oracle Exadata database
  • Set up environments to be used for testing and defined the range of functionality to be tested as per technical specifications
  • Reviewed Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables
  • Created complex SQL queries using views, indexes, triggers, roles, stored procedures and user-defined functions; worked with different methods of logging in SSIS
  • Automated SSIS packages for production deployment with XML configurations
  • Developed historical and incremental SSIS packages using the SCD Type 2 concept for the Star Schema (see the SCD Type 2 sketch below)
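
A minimal HiveQL sketch of the staging-to-EDW load pattern described in the Hive/Sqoop and partitioned-table bullets above; the database, table and column names (stg.orders_raw, edw.orders) are hypothetical, and the timestamp format is assumed to be ISO (yyyy-MM-dd HH:mm:ss):

    -- Raw files landed by Sqoop are exposed as an external staging table
    CREATE EXTERNAL TABLE IF NOT EXISTS stg.orders_raw (
        order_id     BIGINT,
        customer_id  BIGINT,
        order_amount DECIMAL(12,2),
        order_ts     STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/staging/orders';

    -- Refined, partitioned warehouse table
    CREATE TABLE IF NOT EXISTS edw.orders (
        order_id     BIGINT,
        customer_id  BIGINT,
        order_amount DECIMAL(12,2),
        order_ts     TIMESTAMP
    )
    PARTITIONED BY (order_date STRING)
    STORED AS ORC;

    -- Dynamic-partition insert from staging into the partitioned target
    SET hive.exec.dynamic.partition=true;
    SET hive.exec.dynamic.partition.mode=nonstrict;
    INSERT OVERWRITE TABLE edw.orders PARTITION (order_date)
    SELECT order_id,
           customer_id,
           order_amount,
           CAST(order_ts AS TIMESTAMP),
           substr(order_ts, 1, 10) AS order_date   -- partition key derived from the timestamp
    FROM stg.orders_raw;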
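
The SCD Type 2 logic in the bullet above lived inside SSIS packages; as an illustration only, the core expire-and-insert pattern such a package applies can be sketched in T-SQL as follows (table and column names are hypothetical):

    -- Step 1: close out current dimension rows whose tracked attributes changed in the latest load
    UPDATE d
    SET    d.effective_end_date = GETDATE(),
           d.is_current         = 0
    FROM   dbo.dim_customer AS d
    JOIN   dbo.stg_customer AS s
           ON s.customer_id = d.customer_id
    WHERE  d.is_current = 1
      AND (s.customer_name <> d.customer_name OR s.region <> d.region);

    -- Step 2: insert a new current version for changed or brand-new customers
    INSERT INTO dbo.dim_customer
           (customer_id, customer_name, region,
            effective_start_date, effective_end_date, is_current)
    SELECT s.customer_id, s.customer_name, s.region,
           GETDATE(), NULL, 1
    FROM   dbo.stg_customer AS s
    LEFT JOIN dbo.dim_customer AS d
           ON d.customer_id = s.customer_id AND d.is_current = 1
    WHERE  d.customer_id IS NULL;   -- no current row exists (new customer, or its old row was just expired)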

Environment: Erwin 9.7, Oracle PL/SQL, SSIS, ODS, OLTP, Hadoop 3.2, HDFS, Oracle 12c, Sqoop 1.4, AWS, Agile, SQL, ETL & MDM.

Confidential

Data Architect

Responsibilities:

  • Analyzed the Data Warehouse project database requirements from the users in terms of the dimensions they want to measure and the facts for which the dimensions need to be analyzed
  • Implemented Agile Methodology for building Integrated Data Warehouse, involved in multiple sprints for various tracks throughout the project lifecycle
  • Participated in change and code reviews to understand the testing needs of the change components
  • Designed the Physical Data Model (PDM) using the IBM InfoSphere Data Architect data modeling tool and Oracle PL/SQL
  • Used AWS Infrastructure and features of AWS like S3, EC2, RDS, ELB to host the portal
  • Wrote Java MapReduce programs on AWS EMR to convert semi-structured and unstructured data into structured data and to incorporate all the business transformations
  • Worked with business analysts (BA) to gather requirements and mapping specifications that were put together in IBM InfoSphere FastTrack
  • Developed logical data models and physical database design and generated database schemas using Erwin 9.5
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and MS SQL Server (see the profiling sketch after this list)
  • Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements
  • Prepared High Level Logical Data Models using Erwin, and later translated the model into physical model using the Forward Engineering technique
  • Designed data marts using Ralph Kimball and Bill Inmon dimensional data modeling techniques
  • Generated DDL (Data Definition Language) scripts using Erwin and assisted the DBA in the physical implementation of data models
  • Translated business requirements into working logical and physical data models for OLTP & OLAP systems
  • Generated SQL scripts and implemented the relevant databases with related properties such as keys, constraints, indexes and sequences
  • Used Reverse Engineering to connect to existing database and developed process methodology for the Reverse Engineering phase of the project
  • Developed the batch program in PL/SQL for OLTP processing and used UNIX shell scripts scheduled via crontab
  • Performed extensive data profiling and data analysis for detecting and correcting inaccurate data from the databases and track the data quality
  • Provided guidance and solution concepts for multiple projects focused on data governance and master data management
  • Created DDL scripts using Erwin and source to target mappings to bring the data from source to the warehouse
  • Designed and developed SAS macros, applications and other utilities to expedite SAS Programming activities
  • Involved in writing T-SQL working on SSIS, SSRS, SSAS, Data Cleansing, Data Scrubbing and Data Migration
  • Analyzed and gathered requirements from business users and management, and used the business requirements document to prioritize their needs
  • Responsible for backing up the data, writing stored procedures and writing ad-hoc queries for data mining
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose
  • Created and monitored workflows using Workflow Designer and Workflow Monitor
  • Involved in extensive data validation by writing several complex SQL queries; performed back-end testing and worked on data quality issues
  • Used SSRS for generating Reports from Databases and Generated Sub-Reports, Drill down reports, Drill through reports and parameterized reports using SSRS
  • Developed PL/SQL scripts to validate and load data into interface tables and Involved in maintaining data integrity between Oracle and SQL databases
  • Worked heavily on SQL query optimization, tuning and reviewing query performance metrics
  • Performed data mapping and data design (data modeling) to integrate data across multiple databases into the EDW
  • Collaborated with the Relationship Management and Operations teams to develop and present KPIs to top-tier clients
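
A minimal sketch of the kind of SQL profiling run against a candidate source table before modeling it, as referenced in the data profiling bullet above; the schema, table and column names are hypothetical:

    -- Row volume, null rate and cardinality for a candidate key column
    SELECT COUNT(*)                      AS total_rows,
           COUNT(customer_id)            AS non_null_rows,
           COUNT(*) - COUNT(customer_id) AS null_rows,
           COUNT(DISTINCT customer_id)   AS distinct_values
    FROM   src.customer_master;

    -- Duplicate natural keys to resolve before declaring a unique constraint in the model
    SELECT customer_id, COUNT(*) AS occurrences
    FROM   src.customer_master
    GROUP  BY customer_id
    HAVING COUNT(*) > 1;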

Environment: Erwin 9.5, Oracle 12c, MS SQL Server 2016, SSRS, OLAP, OLTP, MS Excel, Flat Files, PL/SQL, SQL, IBM Cognos, Tableau 10.

Confidential - Chicago, IL

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Involved in Business and Data analysis during requirements gathering. Assisted in creating fact and dimension table implementation in Star Schema model based on requirements
  • Performed segmentation to extract data and create lists to support direct marketing mailings and marketing campaigns
  • Defined Data requirements and elements used in XML transactions. Reviewed and recommended database modifications
  • Analyzed and rectified data in source systems and Financial Data Warehouse databases
  • Generated and reviewed reports to analyze data using different Excel formats; documented requirements for numerous ad-hoc reporting efforts
  • Troubleshot, resolved and escalated data-related issues and validated data to improve data quality
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes
  • Involved in Regression, UAT and Integration testing
  • Participated in testing of procedures and Data, utilizing PL/SQL, to ensure integrity and quality of Data in Data warehouse
  • Produced metrics reporting, data mining and trend analysis in a helpdesk environment using Access
  • Gathered data from the Help Desk ticketing system and wrote ad-hoc reports, charts and graphs for analysis
  • Compiled Data analysis, sampling, frequencies and stats using SAS
  • Used SQL Server and T-SQL to construct tables and apply normalization and de-normalization techniques on database tables
  • Identify and report on various computer problems within the company to upper management
  • Reported on emerging trends to identify changes or trouble within the systems, using Access and Crystal Reports
  • Performed User Acceptance Testing (UAT) to ensure that proper functionality is implemented
  • Guided, trained and supported teammates in testing processes, procedures, analysis and quality control of data, utilizing past experience and training in Oracle, SQL, UNIX and relational databases
  • Maintained Excel workbooks, such as development of pivot tables, exporting Data from external SQL databases, producing reports and updating spreadsheet information
  • Modified user profiles, which included changing users' cost center locations and changing users' authority to grant monetary amounts to certain departments; these monetary amounts were part of the overall budget granted per department
  • Extracted Data from DB2, COBOL Files and converted to Analytic SAS Datasets
  • Deleted users from cost centers, removed users' authority to grant certain monetary amounts to certain departments, and deleted certain cost centers and profit centers from the database
  • Created a report, using the SAP reporting feature, that showed which users had not scanned journal voucher documents into the system
  • Created Excel pivot tables showing the users who had not scanned journal voucher documents; users could locate documents by double-clicking their name within the pivot table
  • Loaded new or modified data into the back-end Oracle database
  • Optimized and tuned several complex SQL queries for better performance and efficiency
  • Created various PL/SQL stored procedures for dropping and recreating indexes on target tables (see the sketch after this list); worked on issues with migration from development to testing
  • Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling of data
  • Validated cube and query Data from the reporting system back to the source system. Tested analytical reports using Analysis Studio
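
A minimal PL/SQL sketch of the drop-and-recreate-index pattern mentioned in the stored procedure bullet above; the procedure name and the choice to touch only non-unique indexes are illustrative assumptions, and the matching recreate step would re-run the saved CREATE INDEX DDL after the load:

    CREATE OR REPLACE PROCEDURE drop_nonunique_indexes (p_table_name IN VARCHAR2) AS
    BEGIN
        -- Drop every non-unique index on the target table before a bulk load;
        -- unique indexes backing constraints are left in place.
        FOR ix IN (SELECT index_name
                   FROM   user_indexes
                   WHERE  table_name = UPPER(p_table_name)
                   AND    uniqueness = 'NONUNIQUE') LOOP
            EXECUTE IMMEDIATE 'DROP INDEX ' || ix.index_name;
        END LOOP;
    END drop_nonunique_indexes;
    /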

Environment: SAS/BASE, SAS/Access, SAS/Connect, Informatica PowerCenter (PowerCenter Designer, Workflow Manager, Workflow Monitor), SQL*Loader, Cognos, Oracle, SQL Server 2008, Erwin, Windows 2000, TOAD

Confidential, Charlotte, NC

Data Analyst/ Data Modeler

Responsibilities:

  • Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality
  • Part of the team conducting logical data analysis and data modeling JAD sessions; communicated data-related standards
  • Performed Reverse Engineering of the current application using Erwin, and developed Logical and Physical data models for Central Model consolidation
  • Translated logical data models into physical database models, generated DDLs for DBAs
  • Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
  • Involved in extensive data validation by writing several complex SQL queries; performed back-end testing and worked on data quality issues
  • Collected, analyzed and interpreted complex data for reporting and/or performance trend analysis
  • Wrote and executed unit, system, integration and UAT scripts in data warehouse projects
  • Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex DW using Informatica
  • Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access
  • Involved in writing T-SQL working on SSIS, SSRS, SSAS, Data Cleansing, Data Scrubbing and Data Migration
  • Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of Data Analysis responsibilities
  • Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin
  • Wrote complex SQL queries for validating the data against different kinds of reports generated by Business Objects XIR2 (see the reconciliation sketch after this list)
  • Worked on importing and cleansing high-volume data from various sources like Teradata, Oracle and flat files
  • Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements
  • Created SQL tables with referential integrity, constraints and developed queries using SQL, SQL*PLUS and PL/SQL
  • Performed gap analysis of the current state versus the desired state and documented requirements to control the gaps identified
  • Developed the batch program in PL/SQL for OLTP processing and used UNIX shell scripts scheduled via crontab
  • Identified and recorded defects with the information required for issues to be reproduced by the development team
  • Worked on the reporting requirements and was involved in generating reports for the data model using Crystal Reports
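
A minimal sketch of the kind of source-to-target reconciliation query used for the report validation described above; schema, table and column names are hypothetical:

    -- Compare row counts and a summed measure between the source and the warehouse target
    SELECT 'SOURCE' AS side, COUNT(*) AS row_count, SUM(txn_amount) AS total_amount
    FROM   src_schema.transactions
    WHERE  txn_date BETWEEN DATE '2013-01-01' AND DATE '2013-01-31'
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(txn_amount)
    FROM   edw_schema.fact_transactions
    WHERE  txn_date BETWEEN DATE '2013-01-01' AND DATE '2013-01-31';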

Environment: Erwin 8.5, PL/SQL, Business Objects XIR2, Informatica 8.6, Oracle 9i, Teradata R13, Teradata SQL Assistant 12.0, Flat Files
