Sr. Data Modeler/Data Analyst Resume
Arlington, VA
SUMMARY:
- Overall 8+ years of experience as a Sr. Data Modeler/Data Analyst with high proficiency in requirements gathering and data modeling.
- Proficient in Software Development Life Cycle (SDLC), Project Management methodologies, and Microsoft SQL Server database management.
- Very good knowledge of Amazon Web Services: AWS Redshift, AWS S3, and AWS EMR.
- Good experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
- Hands-on experience in normalization and de-normalization techniques for optimum performance in relational and dimensional database environments.
- Sound knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
- Excellent knowledge of Ralph Kimball's and Bill Inmon's approaches to data warehousing.
- Extensive experience using ER modeling tools such as Erwin and ER/Studio, along with Teradata and MDM platforms.
- Familiar with installation, configuration, patching, and upgrading of the Tableau tool across environments.
- Proficient in writing DDL and DML commands using SQL Developer and Toad.
- Experience in performance tuning on Oracle databases by leveraging explain plans and tuning SQL queries.
- Excellent SQL programming skills and developed Stored Procedures, Triggers, Functions, Packages using SQL, PL/SQL.
- Experience in data transformation and data mapping from source to target database schemas, as well as data cleansing.
- Good experience on Relational Data modeling (3NF) and Dimensional data modeling.
- Expert in building an Enterprise Data Warehouse from scratch using both the Kimball and Inmon approaches.
- Experience in data analysis to track data quality and for detecting/correcting inaccurate data from the databases.
- Extensive experience using ETL and reporting tools such as SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
- Proficient in data mart design using dimensional data modeling identifying Facts and Dimensions, Star Schema and Snowflake Schema.
- Experience designing security at both the schema level and the accessibility level in conjunction with DBAs.
- Experience in Designing and implementing data structures and commonly used data Business Intelligence tools for data analysis.
- Efficient in implementing normalization to 3NF and de-normalization techniques for optimum performance in relational and dimensional database environments.
- Hands on experience with modeling using Erwin in both forward and reverse engineering processes.
- Experience in requirements gathering, system analysis, handling business and technical issues, and communicating with both business and technical users.
- Strong background in data processing, data analysis with hands on experience in MS Excel, MS Access, UNIX and Windows Servers.
- Experience with DBA tasks involving database creation, performance tuning, creation of indexes, and creating and modifying tablespaces for optimization purposes.
- Excellent analytical skills with exceptional ability to master new technologies efficiently.
TECHNICAL SKILLS:
Data Modeling Tools: Erwin Data Modeler 9.7, Erwin Model Manager, ER Studio v17, and Power Designer.
Programming Languages: SQL, PL/SQL, HTML5, C++, XML and VBA.
BI & Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS Excel, SAS BI Platform.
Cloud Management: Amazon Web Services (AWS), Redshift
OLAP Tools: Tableau 10.5, SAP BO, SSAS, Business Objects, and Crystal Reports
Cloud Platform: AWS, MS Azure, Google Cloud, Cloud Stack/Open Stack
Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.
Operating System: Windows, Unix, Sun Solaris
Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model.
AWS tools: EC2, S3 Bucket, AMI, RDS, Amazon Redshift.
PROFESSIONAL EXPERIENCE:
Confidential - Arlington, VA
Sr. Data Modeler/Data Analyst
Responsibilities:
- Worked as a Sr. Data Modeler/Analyst to generate data models using Erwin, with subsequent deployment to the Enterprise Data Warehouse.
- Participated in design discussions and ensured functional specifications were delivered in all phases of the SDLC in an Agile environment.
- Defined facts and dimensions and designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in Erwin.
- Identified and documented data sources and transformation rules required to populate and maintain data Warehouse content.
- Assisted in designing logical models (relationships, cardinality, attributes, candidate keys) per business requirements using Erwin Data Modeler.
- Used forward engineering to create a physical data model with DDL that best suits the requirements from the Logical Data Model.
- Wrote DDL and DML statements for creating and altering tables and converting character data into numeric values.
- Developed an MDM integration plan and hub architecture for customers, products, and vendors, and designed an MDM solution for three domains.
- Extensively used Star and Snowflake Schema methodologies.
- Used Normalization (1NF, 2NF & 3NF) and De-normalization techniques for effective performance in OLTP and OLAP systems.
- Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
- Worked on Performance Tuning of the database which includes indexes, optimizing SQL Statements.
- Worked on data loads using Azure Data Factory with an external-table approach.
- Involved in Installing, Configuring Hadoop Eco-System, Cloudera Manager using CDH3, CDH4 Distributions.
- Involved in normalization/de-normalization techniques for optimum performance in relational and dimensional database environments.
- Involved in development and implementation of SSIS and SSAS application solutions for various business units across the organization.
- Designed and implemented a Data Lake to consolidate data from multiple sources, using Hadoop stack technologies like SQOOP, HIVE/HQL.
- Designed ER diagrams (physical and logical, using Erwin), mapped the data into database objects, and identified the facts and dimensions from the business requirements.
- Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
- Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
- Involved in extracting, cleansing, transforming, integrating and loading data into different Data Marts using Data Stage Designer.
- Worked with data compliance teams, Data governance team to maintain data models, Metadata, Data Dictionaries.
- Performed Data Analysis, Data Migration and data profiling using complex SQL on various sources systems.
- Generated ad-hoc and management-specific reports using SSRS and Excel.
- Created SQL queries to generate ad-hoc reports for the business.
- Used Windows Azure SQL Reporting Services to create reports with tables, charts, and maps.
Environment: Erwin 9.7, Agile, Ralph Kimball, MDM, 3NF, OLAP, OLTP, Azure, Hadoop 3.0, Hive 2.3, SSRS, SSIS
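The normalization work described in this role can be sketched as a toy example with `sqlite3`: repeated attributes in a denormalized staging table are split out so that every non-key attribute depends only on its own table's key, which is the essence of 3NF. The table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
-- Denormalized staging table: the customer's city repeats on every order row.
CREATE TABLE stg_orders (
    order_id INTEGER, customer_id INTEGER, customer_city TEXT, amount REAL
);
INSERT INTO stg_orders VALUES
    (1, 100, 'Arlington', 50.0),
    (2, 100, 'Arlington', 75.0),
    (3, 200, 'Lowell',    20.0);

-- 3NF targets: the city now lives only on the customer row.
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, city TEXT);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    amount      REAL
);
INSERT INTO customers SELECT DISTINCT customer_id, customer_city FROM stg_orders;
INSERT INTO orders SELECT order_id, customer_id, amount FROM stg_orders;
""")

customer_rows = cur.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
order_rows = cur.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(customer_rows, order_rows)  # → 2 3  (each city stored once per customer)
```

De-normalization for a dimensional model runs the same transformation in reverse, accepting the repeated attributes to save joins at query time.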
Confidential - Lowell, AR
Sr. Data Modeler/Data Analyst
Responsibilities:
- Heavily involved in a Data Modeler role, reviewing business requirements and composing source-to-target data mapping documents.
- Extensively used Agile methodology as the organization standard to implement data models.
- Designed and developed logical and physical data models that utilize concepts such as Star Schema, Snowflake Schema and Slowly Changing Dimensions
- Worked on master data (entities and attributes) and captured how data is interpreted by users in various parts of the organization.
- Involved in Master data analysis, design, Interfaces analysis, Data Analysis, Data Quality, Data Architecture tasks.
- Performed Data Analysis and Data Validation by writing complex SQL queries using Teradata SQL Assistant.
- Worked on Amazon Redshift and AWS to architect a solution to load data and create data models.
- Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
- Worked with MDM systems team with respect to technical aspects and generating reports.
- Responsible for data governance rules and standards to maintain the consistency of the business element names in the different data layers.
- Connected to AWS Redshift through Tableau to extract live data for real time analysis.
- Created data masking mappings to mask the sensitive data between production and test environment.
- Designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in ER/Studio.
- Developed SQL Queries to fetch complex data from different tables in remote databases using joins, database links and bulk collects.
- Performed tuning and stress-testing of NoSQL database environments to ensure acceptable database performance in production mode.
- Created complex SQL queries and scripts to extract and aggregate data to validate the accuracy of the data.
- Used advanced T-SQL features to design and tune queries interfacing with the database.
- Developed and maintained data dictionary to create metadata reports for technical and business purpose.
- Involved in data transfers, creating tables from various source tables and coding PL/SQL stored procedures and packages.
- Developed a data mart for the base data in star and snowflake schemas and was involved in developing the data warehouse for the database.
- Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from NoSQL and a variety of portfolios.
- Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
- Designed 3NF data models for ODS and OLTP systems as well as dimensional data models.
- Developed and configured the Informatica MDM hub, which supports the Master Data Management (MDM), Business Intelligence (BI), and Data Warehousing platforms, to meet business needs.
- Architected a solution on Amazon Redshift and AWS to load data, create data models, and run BI on them.
- Documented ER Diagrams, Logical and Physical models, business process diagrams and process flow diagrams.
- Involved in extensive data validation by writing several complex SQL queries and Involved in back-end testing and worked with data quality issues.
- Created Tableau dashboard reports, working with developers, creating business intelligence reports and visualizations.
Environment: Teradata R15, Agile, AWS, Amazon Redshift, MDM, Tableau, Ralph Kimball, ER/Studio, NoSQL, T-SQL, PL/SQL, SQL, MS Excel
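The data validation queries mentioned in this role typically reconcile staging totals against warehouse totals. Here is a minimal sketch of that pattern, with `sqlite3` standing in for Teradata/Redshift and hypothetical table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
-- Hypothetical staging and warehouse tables for the same sales feed.
CREATE TABLE stg_sales (region TEXT, amount REAL);
CREATE TABLE dw_sales  (region TEXT, amount REAL);
INSERT INTO stg_sales VALUES ('EAST', 10), ('EAST', 5), ('WEST', 7);
INSERT INTO dw_sales  VALUES ('EAST', 15), ('WEST', 6);  -- WEST is short by 1
""")

# Aggregate each side in a subquery, then join and keep only the mismatches.
mismatches = cur.execute("""
    SELECT s.region, s.total AS stg_total, w.total AS dw_total
    FROM (SELECT region, SUM(amount) AS total FROM stg_sales GROUP BY region) s
    JOIN (SELECT region, SUM(amount) AS total FROM dw_sales  GROUP BY region) w
      ON s.region = w.region
    WHERE s.total <> w.total
""").fetchall()
print(mismatches)  # → [('WEST', 7.0, 6.0)]
```

An empty result set means the load reconciled; any row returned points directly at the sourcing issue to investigate.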
Confidential, Tampa, FL
Data Modeler
Responsibilities:
- Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
- Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting as well as writing data.
- Optimized existing procedures and SQL statements for better performance using EXPLAIN PLAN, hints, and SQL TRACE to tune SQL queries.
- Developed interfaces able to connect to multiple databases such as SQL Server and Oracle.
- Assisted Kronos project team in SQL Server Reporting Services installation.
- Developed SQL Server database to replace existing Access databases.
- Attended and participated in information- and requirements-gathering sessions.
- Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
- Designed and created web applications to receive query string input from customers and facilitate entering the data into SQL Server databases.
- Performed thorough data analysis for the purpose of overhauling the database using SQL Server.
- Designed and implemented business intelligence to support sales and operations functions to increase customer satisfaction.
- Converted physical database models from logical models, to build/generate DDL scripts.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
- Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
- Extensively used ETL methodology to support data extraction, transformation, and loading processing in a complex EDW.
- Implemented a snowflake schema to minimize redundancy in the database.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using Power Designer.
Environment: PL/SQL, SQL, OLAP, 3NF, OLTP, Power Designer 14.0
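Explain-plan-driven tuning like the EXPLAIN PLAN work above can be sketched with SQLite's `EXPLAIN QUERY PLAN`, which plays the same role as Oracle's EXPLAIN PLAN: inspect the plan, add an index on the filtered column, and confirm the plan switches from a full scan to an index search. The `payments` table is hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")

query = "SELECT * FROM payments WHERE account = 'A1'"

# Without an index, the plan is a full table scan.
before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][3]
print(before)  # e.g. 'SCAN payments' (wording varies by SQLite version)

# Index the filtered column; the plan switches to an index search.
cur.execute("CREATE INDEX idx_payments_account ON payments(account)")
after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][3]
print(after)   # e.g. 'SEARCH payments USING INDEX idx_payments_account (account=?)'
```

Reading the plan before and after each change, rather than guessing, is the core of the tuning loop regardless of the database engine.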
Confidential - SFO, CA
Data Analyst
Responsibilities:
- Worked with business requirements analysts & SMEs to identify and understand requirements.
- Conducted data analysis to evaluate data sources to determine the best source for business information.
- Developed complex SQL queries and performed execution validation for remediation and analysis.
- Designed Data Quality Framework to perform schema validation and data profiling.
- Used advanced MS Excel features to create pivot tables and pivot reports.
- Performed numerous data extraction requests using SQL scripts to prepare ad hoc reports.
- Involved in writing extensive SQL queries for back-end testing of the Oracle database.
- Involved in preparing several Use Cases, Business Process Flows, Sequence Diagrams, using MS Visio.
- Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
- Wrote complex SQL queries to retrieve data from different tables utilizing Joins & Sub-queries.
- Responsible for data validation across the risk management systems.
- Responsible for extracting, compiling, and tracking data for reports on a weekly and monthly basis.
- Used MS Excel to maintain the research database and generated spreadsheets, pivot tables and dashboards.
- Involved in creating database objects like tables, procedures, triggers and functions to maintain data efficiently.
- Performed data validations using SQL queries by extracting data and running queries on a WAMP server.
- Created SQL tables with referential integrity and developed queries using SQL, and PL/SQL.
- Used SAS procedures such as MEANS and FREQ, along with other statistical calculations, for data validation.
- Performed Data Analysis, Data Validation and Data mapping by writing complex SQL queries.
- Performed Data Analysis to make sure that the data is loading correctly into relational database.
- Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
- Wrote PL/SQL statements, stored procedures, and triggers for extracting as well as writing data.
- Involved in designing and developing SQL server objects such as Tables, Views, Indexes (Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL.
Environment: SQL, MS Excel, Oracle, MS Visio, PL/SQL, SAS, T-SQL
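The pivot-table reporting mentioned in this role has a direct SQL analogue, conditional aggregation, which was a common way to produce the same cross-tab shape directly from the database. A minimal sketch with a hypothetical `tickets` table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE tickets (month TEXT, status TEXT);
INSERT INTO tickets VALUES
    ('Jan', 'open'), ('Jan', 'closed'), ('Jan', 'closed'),
    ('Feb', 'open');
""")

# Conditional aggregation pivots the status values into columns,
# the same shape an Excel pivot table would produce.
pivot = cur.execute("""
    SELECT month,
           SUM(CASE WHEN status = 'open'   THEN 1 ELSE 0 END) AS open_cnt,
           SUM(CASE WHEN status = 'closed' THEN 1 ELSE 0 END) AS closed_cnt
    FROM tickets
    GROUP BY month
    ORDER BY month
""").fetchall()
print(pivot)  # → [('Feb', 1, 0), ('Jan', 1, 2)]
```

Each `CASE` expression contributes one output column per category, so adding a new status is just one more line in the SELECT list.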