Sr. Data Modeler Resume
Baltimore, MD
SUMMARY
- Over 9 years of professional experience in Data Modeling and Data Analysis.
- Proficient in Software Development Life Cycle (SDLC), Project Management methodologies, and Microsoft SQL Server database management.
- Strong knowledge of Amazon Web Services: AWS Redshift, AWS S3, and AWS EMR.
- Good experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
- Hands-on experience in Normalization and De-Normalization techniques for optimum performance in relational and dimensional database environments.
- Sound knowledge of Data Analysis, Data Validation, Data Cleansing, Data Verification, and identifying data mismatches.
- Excellent knowledge of Ralph Kimball's and Bill Inmon's approaches to Data Warehousing.
- Extensive experience using data modeling tools such as Erwin and ER/Studio, and platforms such as Teradata and MDM.
- Familiar with installation, configuration, patching, and upgrading of Tableau across environments.
- Proficient in writing DDL, DML commands using SQL developer and Toad.
- Experience in performance tuning on Oracle databases by leveraging explain plans and tuning SQL queries.
- Excellent SQL programming skills and developed Stored Procedures, Triggers, Functions, Packages using SQL, PL/SQL.
- Experience in data transformation and data mapping from source to target database schemas, as well as data cleansing.
- Good experience on Relational Data modeling (3NF) and Dimensional data modeling.
- Expert in building Enterprise Data Warehouse from Scratch using both Kimball and Inmon Approach.
- Experience in data analysis to track data quality and for detecting/correcting inaccurate data from the databases.
- Extensive experience on usage of ETL & Reporting tools like SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS).
- Proficient in data mart design using dimensional data modeling identifying Facts and Dimensions, Star Schema and Snowflake Schema.
- Experience designing security at both the schema level and the accessibility level in conjunction with the DBAs.
- Experience in Designing and implementing data structures and commonly used data Business Intelligence tools for data analysis.
- Efficient in implementing normalization to 3NF and de-normalization techniques for optimum performance in relational and dimensional database environments.
- Hands on experience with modeling using Erwin in both forward and reverse engineering processes.
- Experience in Requirement gathering, System analysis, handling business and technical issues & communicating with both business and technical users.
- Strong background in data processing, data analysis with hands on experience in MS Excel, MS Access, UNIX and Windows Servers.
- Experience with DBA tasks involving database creation, performance tuning, creation of indexes, creating and modifying table spaces for optimization purposes.
- Excellent analytical skills with exceptional ability to master new technologies efficiently.
TECHNICAL SKILLS
Data Modeling Tools: Erwin Data Modeler 9.7, Erwin Model Manager, ER Studio v17, and Power Designer.
Programming Languages: SQL, PL/SQL, HTML5, C++, Java, XML and VBA.
Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.
Cloud Management: Amazon Web Services (AWS), Redshift
OLAP Tools: Tableau 10.5, SAP BO, SSAS, Business Objects, and Crystal Reports
Cloud Platform: AWS, MS Azure, Google Cloud, CloudStack/OpenStack
Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.
Operating System: Windows, Unix, Sun Solaris
Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model.
AWS tools: EC2, S3 Bucket, AMI, RDS, Amazon Redshift.
PROFESSIONAL EXPERIENCE
Confidential - Baltimore, MD
Sr. Data Modeler
Responsibilities:
- As a Data Modeler responsible for data warehousing, data modeling, data governance, data architecture standards, methodologies, guidelines and techniques.
- Partnered with various business stakeholders and technology leaders to gather requirements and convert them into scalable technical and system requirement documents.
- Designed rule engine to handle complicated data conversion requirements when syncing data among multiple POS systems and the centralized ERP system.
- Designed Data lake, Master data, Security, data hub and data warehouse/data marts layers.
- Created logical and physical models according to the requirements and physicalized them.
- Effectively articulated reasoning for data model design decisions and strategically incorporated team member feedback to produce the highest-quality data models.
- Worked with project and application teams to ensure that they understand and fully comply with data quality standards, architectural guidelines, and designs.
- Performed Reverse engineering of the source systems using Oracle Data modeler.
- Involved in capturing Data Lineage, Table and Column Data Definitions, Valid Values and others necessary information in the data models.
- Identified Facts & Dimensions Tables and established the Grain of Fact for Dimensional Models.
- Generated the DDL of the target data model and attached it to the JIRA ticket for deployment across environments.
- Tuned DB queries/processes and improved performance.
- Reverse engineered crystal reports (Command Performance), SSRS reports to identify logic/business rules for the Driver's Performance Metrics, Customer Order Performance, Order management and daily sales.
- Created a data mart based on multiple POS systems for Power BI dashboards/reports.
- Worked on data loads using Azure Data Factory with the external table approach.
- Involved in creating Pipelines and Datasets to load the data onto data warehouse.
- Worked closely with ETL SSIS developers to explain complex transformation logic.
- Created ETL jobs and custom transfer components to move data from transaction systems to a centralized area (Azure SQL Data Warehouse) to meet deadlines.
- Extensively used SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate, Conditional Split, SQL Task, Script Task, and Send Mail Task.
- Developed SSIS packages to load data from various source systems to Data Warehouse.
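A star-schema data mart like the ones described above can be sketched in miniature. The following is an illustrative example only: the table and column names are hypothetical, not taken from the actual POS/ERP mart, and sqlite3 stands in for the warehouse engine.

```python
import sqlite3

# Hypothetical star schema: one dimension, one fact table at sale grain.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per store.
cur.execute("""
CREATE TABLE dim_store (
    store_key  INTEGER PRIMARY KEY,
    store_name TEXT NOT NULL
)""")

# Fact table referencing the dimension by surrogate key.
cur.execute("""
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    store_key   INTEGER NOT NULL REFERENCES dim_store(store_key),
    sale_amount REAL NOT NULL
)""")

cur.execute("INSERT INTO dim_store VALUES (1, 'Downtown')")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 19.5), (2, 1, 5.25)])

# A typical dashboard query: total sales per store.
cur.execute("""
SELECT s.store_name, SUM(f.sale_amount)
FROM fact_sales f JOIN dim_store s ON f.store_key = s.store_key
GROUP BY s.store_name""")
print(cur.fetchall())  # [('Downtown', 24.75)]
```

The fact table holds only keys and measures; all descriptive attributes live in the dimension, which is what keeps dashboard queries to simple join-and-aggregate patterns.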
Environment: Oracle Data Modeler, Visio, Microsoft Outlook, Adobe PDF, DQ Analyzer, SQL Server, Azure Data Factory, Power BI, Microsoft Teams, Microsoft Visual Studio.
Confidential - Washington, DC
Sr. Data Modeler/Data Analyst
Responsibilities:
- As a Data Modeler/Analyst involved in the entire life cycle of the project starting from requirements gathering to end of system integration.
- Reviewed business requirements and composed source-to-target data mapping documents.
- Participated in design discussions and assured functional specifications are delivered in all phases of SDLC in an Agile Environment.
- Worked on NoSQL databases including Cassandra.
- Implemented multi-data center and multi-rack Cassandra cluster.
- Coordinated with Data Architects on AWS provisioning EC2 Infrastructure and deploying applications in Elastic load balancing.
- Defined facts and dimensions and designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in Erwin.
- Developed Conceptual Model and Logical Model using Erwin based on requirements analysis.
- Created various Physical Data Models based on discussions with DBAs and ETL developers.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using Erwin.
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
- Used Normalization and Denormalization techniques for effective performance in OLTP systems.
- Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
- Created Project Plan documents, Software Requirement Documents, Environment Configuration and UML diagrams.
- Translated business concepts into XML vocabularies by designing XML Schemas with UML.
- Worked on Amazon Redshift and AWS, architecting a solution to load data and create data models.
- Interacted with users and business analysts to gather requirements.
- Worked on designing a Star schema for the detailed and planning data marts involving conformed dimensions.
- Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
- Developed and maintained data Dictionary to create Metadata Reports for technical and business purpose.
- Completed enhancement for MDM (Master data management) and suggested the implementation for hybrid MDM (Master Data Management)
- Worked on designing, implementing and deploying into production an Enterprise data warehouse.
- Developed Data Mapping, Data Governance and transformation rules for the Master Data Management Architecture involving OLTP, ODS.
- Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
- Conducted numerous POCs to efficiently import large data sets into the database from AWS.
- Wrote DDL and DML statements for creating, altering tables and converting characters into numeric values.
- Worked with MDM systems team with respect to technical aspects and generating reports.
- Created PL/SQL procedures in order to aid business functionalities like bidding and allocation of inventory to the shippers.
- Developed SQL processes using SSIS with various Control Flow and Data Flow tasks and Stored Procedures for the validation process.
- Performed Data profiling, data mining and identified the risks involved with data integration to avoid time delays in the project.
- Worked on Performance Tuning of the database which includes indexes, optimizing SQL Statements.
- Involved in capturing data lineage, table and column data definitions, valid values and others necessary information in the data model.
- Worked on metadata management as part of the data governance team.
- Created or modified the T-SQL queries as per the business requirements.
- Produced customized and ad-hoc reports using SQL Server Reporting Services (SSRS).
- Involved in user training sessions and assisting in UAT (User Acceptance Testing).
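The SQL-based data validation and source-to-target checks described above can be sketched as follows. This is a hedged illustration: the customer tables and names are invented, and sqlite3 stands in for the actual source and target systems.

```python
import sqlite3

# Hypothetical source and target tables for a validation check.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_customers (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE tgt_customers (id INTEGER, name TEXT)")
cur.executemany("INSERT INTO src_customers VALUES (?, ?)",
                [(1, "Ana"), (2, "Bo"), (3, "Cy")])
cur.executemany("INSERT INTO tgt_customers VALUES (?, ?)",
                [(1, "Ana"), (2, "Bo")])

# EXCEPT surfaces source rows that never reached the target,
# i.e. a data mismatch the load process dropped.
cur.execute("""
SELECT id, name FROM src_customers
EXCEPT
SELECT id, name FROM tgt_customers""")
missing = cur.fetchall()
print(missing)  # [(3, 'Cy')]
```

Running the same query in the opposite direction would catch rows present in the target but absent from the source, completing a basic reconciliation pass.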
Environment: Erwin 9.7, SQL Server 2017, SDLC, AWS, EC2, PL/SQL, XML, SSIS, SSRS, OLTP, ETL, Cassandra, NoSQL, EDW.
Confidential - Philadelphia, PA
Sr. Data Modeler/Data Analyst
Responsibilities:
- Worked as a Data Modeler/Analyst to generate data models using E/R Studio and developed relational database systems.
- Interacted with Business Analysts, SMEs, and other Data Architects to understand business needs.
- Created Logical & Physical Data Models for Relational (OLTP) systems and Star schemas with Fact and Dimension tables using E/R Studio.
- Performed GAP analysis to analyze the difference between the system capabilities and business requirements.
- Used Agile Method for daily scrum to discuss the project related information.
- Responsible for Big Data initiatives and engagement, including analysis, brainstorming, POCs, and architecture.
- Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
- Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
- Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
- Created and maintained data model standards, including master data management (MDM) and Involved in extracting the data from various sources.
- Designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in E/R Studio.
- Worked on Master data Management (MDM) Hub and interacted with multiple stakeholders.
- Proficient in developing Entity-Relationship diagrams, Star/Snow Flake Schema Designs, and expert in modeling Transactional Databases and Data Warehouse.
- Worked on normalization techniques, normalized the data into 3rd Normal Form (3NF).
- Worked with Normalization and De-normalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon approaches, and implemented Slowly Changing Dimensions.
- Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
- Used reverse engineering to connect to existing database and create graphical representation (E-R diagram).
- Performance tuning and stress-testing of NoSQL database environments in order to ensure acceptable database performance in production mode.
- Developed automated data pipelines from various external data sources (web pages, APIs, etc.) to the internal data warehouse (SQL Server), then exported to reporting tools.
- Connected to AWS Redshift through Tableau to extract live data for real time analysis.
- Performed data analysis and data profiling using complex SQL on various source systems, including Oracle.
- Monitored data quality and maintained data integrity to ensure effective functioning of the department.
- Managed database design and implemented a comprehensive Star-Schema with shared dimensions.
- Analyzed the data consuming the most resources and made changes in the back-end code using PL/SQL stored procedures and triggers.
- Developed and maintained stored procedures, implemented changes to database design including tables and views.
- Conducted Design reviews with the business analysts and the developers to create a proof of concept for the reports.
- Performed detailed data analysis to analyze the duration of claim processes
- Created the cubes with Star Schemas using facts and dimensions through SQL Server Analysis Services (SSAS).
- Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
- Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
- Developed, and scheduled variety of reports like cross-tab, parameterized, drill through and sub reports with SSRS.
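The Slowly Changing Dimension work mentioned above (Type 2, where history is preserved by versioning rows) can be sketched minimally. The schema, customer, and dates below are hypothetical, not from the actual engagement, and sqlite3 stands in for the warehouse.

```python
import sqlite3

# Hypothetical Type 2 SCD customer dimension with validity dates.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
CREATE TABLE dim_customer (
    surrogate_key INTEGER PRIMARY KEY AUTOINCREMENT,
    customer_id   INTEGER,
    city          TEXT,
    valid_from    TEXT,
    valid_to      TEXT,
    is_current    INTEGER
)""")
cur.execute("INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
            "VALUES (101, 'Philadelphia', '2018-01-01', '9999-12-31', 1)")

def scd2_update(customer_id, new_city, change_date):
    """Expire the current row, then insert a new version (SCD Type 2)."""
    cur.execute("UPDATE dim_customer SET valid_to = ?, is_current = 0 "
                "WHERE customer_id = ? AND is_current = 1",
                (change_date, customer_id))
    cur.execute("INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
                "VALUES (?, ?, ?, '9999-12-31', 1)",
                (customer_id, new_city, change_date))

scd2_update(101, 'Baltimore', '2019-06-15')
cur.execute("SELECT city, is_current FROM dim_customer "
            "WHERE customer_id = 101 ORDER BY surrogate_key")
print(cur.fetchall())  # [('Philadelphia', 0), ('Baltimore', 1)]
```

Because old rows are expired rather than overwritten, facts loaded before the change keep pointing at the historically correct dimension version.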
Environment: E/R Studio v16, OLTP, Agile, MDM, SQL Server 2017, NoSQL, AWS, Oracle 12c, PL/SQL
Confidential, Juno Beach, FL
Data Analyst/Data Modeler
Responsibilities:
- Worked as a Sr. Data Analyst/Data Modeler to generate Data Models using E/R Studio and subsequent deployment to Enterprise Data Warehouse.
- Developed the required data warehouse model using Star schema for the generalized model.
- Designed the data marts in dimensional data modeling using star and snowflake schemas.
- Designed 3rd normal form target data model and mapped to logical model.
- Identified the Facts & Dimensions Tables and established the Grain of Fact for Dimensional Models.
- Used forward engineering approach for designing and creating databases for OLAP model.
- Conducted data modeling JAD sessions and communicated data related standards.
- Generated DDL statements from ER/Studio for the creation of new objects such as tables, views, indexes, packages, and stored procedures.
- Developed rule sets for data cleansing and actively participated in data cleansing and anomaly resolution of the legacy application.
- Worked on Oracle PL/SQL and Shell Scripts, Packages, Scheduling, Data Import/Export, Data Conversions and Data Cleansing
- Involved in extensive Data validation using SQL queries and back-end testing.
- Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
- Performed Data Analysis and extensive Data validation by writing several complex SQL queries.
- Created complex queries to automate the data profiling process needed to define the structure of the pre-staging and staging areas.
- Involved in development and implementation of SSIS and SSAS applications.
- Conducted data mining and data modeling in coordination with finance manager.
- Performed Gap Analysis on existing data models and helped in controlling the gaps identified.
- Performed source data analysis, data discovery, data profiling and data mapping.
- Used advanced T-SQL features to design and tune queries interfacing with the database.
- Applied conditional formatting in SSRS to highlight key areas in the report data.
- Developed and maintained data dictionary to create metadata reports for technical and business purpose.
- Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
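The column-level data profiling described above typically means computing null and distinct counts per column to define staging structures. A minimal sketch, assuming an invented staging table (the names and data are illustrative only, with sqlite3 standing in for the database):

```python
import sqlite3

# Hypothetical staging table with one nullable column.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, ship_state TEXT)")
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                [(1, "FL"), (2, None), (3, "FL"), (4, "GA")])

# Profile each column: COUNT(col) skips NULLs, so
# nulls = total rows minus non-null rows.
profile = {}
for col in ("order_id", "ship_state"):
    cur.execute(f"SELECT COUNT(*), COUNT({col}), COUNT(DISTINCT {col}) "
                f"FROM stg_orders")
    rows, non_null, distinct = cur.fetchone()
    profile[col] = {"nulls": rows - non_null, "distinct": distinct}

print(profile)
# {'order_id': {'nulls': 0, 'distinct': 4},
#  'ship_state': {'nulls': 1, 'distinct': 2}}
```

Null and distinct counts like these drive decisions such as which columns can carry NOT NULL constraints and which are candidate keys in the staging model.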
Environment: E/R Studio, Oracle, SQL, PL/SQL, SSIS, SSRS, T-SQL, OLAP, OLTP, UAT, Tableau.
Confidential
Data Analyst
Responsibilities:
- Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
- Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting as well as writing data.
- Optimized the existing procedures and SQL statements for better performance, using EXPLAIN PLAN, hints, and SQL TRACE to tune SQL queries.
- Developed interfaces able to connect to multiple databases such as SQL Server and Oracle.
- Assisted Kronos project team in SQL Server Reporting Services installation.
- Developed SQL Server database to replace existing Access databases.
- Attended and participated in information and requirements gathering sessions
- Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
- Designed and created web applications to receive query string input from customers and facilitate entering the data into SQL Server databases.
- Performed thorough data analysis for the purpose of overhauling the database using SQL Server.
- Designed and implemented business intelligence to support sales and operations functions to increase customer satisfaction.
- Converted physical database models from logical models, to build/generate DDL scripts.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Extensively used ETL to load data from DB2, Oracle databases.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
- Wrote and executed unit, system, integration and UAT scripts in a data warehouse projects.
- Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
- Worked with Star Schema, DB2, and IMS DB.
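The extract-transform-load pattern underlying the EDW work above can be sketched in a few lines. This is a hedged illustration, not the actual Informatica mappings: the legacy table, the transformation rules, and the target schema are all invented, with sqlite3 standing in for DB2/Oracle and the warehouse.

```python
import sqlite3

# Hypothetical legacy source with text-typed balances.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE legacy_accounts (acct_no TEXT, balance TEXT)")
src.executemany("INSERT INTO legacy_accounts VALUES (?, ?)",
                [("a-001", "100.50"), ("a-002", "250.00")])

# Hypothetical warehouse target with proper numeric typing.
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE dw_accounts (acct_no TEXT, balance REAL)")

# Transform: normalize account numbers to upper case and
# cast the text balance to a numeric value before loading.
rows = [(acct.upper(), float(bal))
        for acct, bal in src.execute("SELECT acct_no, balance FROM legacy_accounts")]
tgt.executemany("INSERT INTO dw_accounts VALUES (?, ?)", rows)

print(list(tgt.execute("SELECT * FROM dw_accounts")))
# [('A-001', 100.5), ('A-002', 250.0)]
```

Keeping the transformation step as an explicit mapping (source column to cleansing rule to target column) is what the source-to-target mapping documents mentioned earlier formalize.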
Environment: Oracle, PL/SQL, DB2, Erwin 7.0, UNIX, Teradata SQL Assistant, Informatica, OLTP, OLAP, Data Marts, DQ Analyzer.