Sr. Data Modeler/Data Analyst Resume
Shelton, CT
SUMMARY:
- Overall 8+ years of professional experience in Data Modeling and Data Analysis; proficient in gathering business requirements and handling requirements management.
- Strong working experience with Agile, Scrum, Kanban and Waterfall methodologies.
- Excellent experience in creating cloud-based solutions and architectures using Amazon Web Services (Amazon EC2, Amazon S3, Amazon RDS) and Microsoft Azure.
- Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
- Experience in Data Extraction/Transformation/Loading (ETL), Data Migration using Microsoft SQL Server Integration Services (SSIS).
- Solid hands-on experience with administration of data model repositories and documentation in metadata portals using tools such as Erwin, ER Studio and PowerDesigner.
- Extensive experience in development of T-SQL and Oracle PL/SQL scripts, stored procedures and triggers for business logic implementation.
- Proficient in Normalization/De-normalization techniques in relational/dimensional database environments and have done normalizations up to 3NF.
- Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS) tools.
- Experience in backend programming including schema and table design, stored procedures, Triggers, Views, and Indexes.
- Experience in generating DDL (Data Definition Language) Scripts and creating Indexing strategies.
- Expertise in creating conceptual, logical and physical data models for OLTP and OLAP systems using dimensional models.
- Efficient in all phases of the development lifecycle, including Data Cleansing, Data Conversion, Performance Tuning and System Testing.
- Experience in working with RDBMS like Oracle, Microsoft SQL Server and Teradata.
- Proficient in data mart design, creation of cubes, identifying facts & dimensions, star & snowflake schemas and canonical models.
- Experience in modeling OLAP systems using the Kimball and Bill Inmon data warehousing methodologies.
- Conduct data analysis, data mapping and transformation, and data modeling using data-warehouse concepts.
- Good knowledge of SQL queries and of creating database objects such as stored procedures, triggers, packages and functions using SQL and PL/SQL to implement business logic.
- Experience in data transformation, data mapping from source to target database schemas, and data cleansing procedures.
- Excellent knowledge of creating reports in SAP BusinessObjects, including web reports for multiple data providers.
- Experience in data analysis to track data quality and to detect and correct inaccurate data in databases.
- Excellent MS Excel skills with proficiency in VLOOKUPs and Pivot Tables and an understanding of VBA macros.
- Strengths include excellent communication and interpersonal skills, good analytical skills and a logical approach to problem solving.
TECHNICAL SKILLS:
Data Modeling Tools: Erwin Data Modeler 9.7, Erwin Model Manager, ER Studio v17, and PowerDesigner.
Programming Languages: SQL, PL/SQL, HTML5, C++, Java, XML and VBA.
Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.
Cloud Management: Amazon Web Services (AWS), Azure
OLAP Tools: Tableau 10.5, SAP BO, SSAS, Business Objects, and Crystal Reports
Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.
Operating System: Windows, Unix, Sun Solaris, LINUX
Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model.
WORK EXPERIENCE:
Confidential - Shelton, CT
Sr. Data Modeler/Data Analyst
Responsibilities:
- Worked as a Data Modeler/Analyst to generate data models using Erwin and deploy them to the Enterprise Data Warehouse.
- Participated in design discussions and ensured functional specifications were delivered in all phases of the SDLC in an Agile environment.
- Used Windows Azure SQL Reporting Services to create reports with tables, charts and maps.
- Worked with Data Analytics, Data Reporting, Graphs, Scales, PivotTables and OLAP reporting.
- Involved in installing and configuring the Hadoop ecosystem and Cloudera Manager using CDH3 and CDH4 distributions.
- Involved in normalization and de-normalization of existing tables for faster query retrieval; designed 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
- Created and maintained data models, including master data management (MDM) models.
- Developed a Conceptual Model and Logical Model using Erwin based on requirements analysis.
- Implemented normalization techniques and built the tables as per the requirements given by the business users.
- Participated in migrating the existing traditional SQL application to Azure Data Lake (ADL) in the Microsoft cloud.
- Worked on performance tuning of the database, including creating indexes and optimizing SQL statements.
- Wrote PL/SQL statements, stored procedures and triggers in DB2 for extracting as well as writing data.
- Worked on data loads using Azure Data Factory with the external table approach.
- Developed the required data warehouse model using Star schema for the generalized model.
- Designed SSIS packages to import data from multiple sources to control upstream and downstream of data into SQL Azure database.
- Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
- Prepared complex T-SQL queries, views and stored procedures to load data into staging area.
- Managed database design and implemented a comprehensive snowflake schema with shared dimensions.
- Developed and presented Business Intelligence reports and product demos to the team using SSRS (SQL Server Reporting Services).
- Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
- Involved in Data profiling in order to detect and correct inaccurate data and maintain the data quality.
- Prepared process flow/activity diagrams for the existing system using MS Visio and re-engineered the design based on business requirements.
- Developed Python, Shell/Perl and PowerShell scripts for automation, and performed component unit testing using the Azure Emulator.
- Developed various QlikView data models by extracting and using data from various source files (Excel, flat files).
- Generated multiple ad-hoc Python tools and scripts to facilitate map generation and data manipulation.
Environment: Erwin 9.7, Agile, Azure, OLAP, 3NF, OLTP, MDM, PL/SQL, T-SQL, SSRS, Hadoop 3.0, MS Visio 2018, Python 3.7
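To illustrate the star-schema dimensional modeling described in this role, here is a minimal sketch (table and column names are hypothetical, not taken from the actual project), using sqlite3 as a stand-in database engine:

```python
import sqlite3

# Minimal star schema: one fact table surrounded by dimension tables.
# All names here are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,
    full_date TEXT,
    year      INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name        TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_date VALUES (20180101, '2018-01-01', 2018);
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales VALUES (20180101, 1, 99.50), (20180101, 1, 0.50);
""")

# Typical OLAP-style rollup: join the fact to its dimensions and aggregate.
row = conn.execute("""
    SELECT d.year, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.name
""").fetchone()
print(row)  # (2018, 'Widget', 100.0)
```

The grain of the fact table (one row per sale line) determines what the dimensions can slice; that is the core design decision in this kind of model.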
Confidential
Data Modeler/Data Analyst
Responsibilities:
- Worked in a Sr. Data Modeler/Data Analyst role to review business requirements and compose source-to-target data mapping documents.
- Extensively used Agile methodology as the organization standard to implement the data models.
- Analyzed the business requirements of the project by studying the Business Requirement Specification document.
- Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
- Connected to AWS Redshift through Tableau to extract live data for real time analysis.
- Generated reports using SQL Server Reporting Services from OLTP and OLAP data sources.
- Involved in the complete SSIS life cycle: creating SSIS packages, then building, deploying and executing the packages in all environments.
- Developed the data mart for the base data using star and snowflake schemas, and was involved in developing the data warehouse for the database.
- Worked with MDM systems team with respect to technical aspects and generating reports.
- Assisted in the oversight for compliance to the Enterprise Data Standards, data governance and data quality.
- Generated comprehensive analytical reports by running SQL queries against current databases to conduct Data Analysis.
- Developed and maintained data dictionary to create metadata reports for technical and business purpose.
- Worked on S3 buckets on AWS to store Cloud Formation Templates and worked on AWS to create EC2 instances.
- Designed visualization dashboards using Tableau Desktop and published dashboards on Tableau Server.
- Involved in developing the data warehouse for the database using the Ralph Kimball's Dimensional Data Mart modeling methodology.
- Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
- Created SSIS package to load data from Flat files, Excel and Access to SQL server using connection manager.
- Developed all the required stored procedures, user defined functions and triggers using T-SQL.
- Created the cubes with Star Schemas using facts and dimensions through SQL Server Analysis Services (SSAS).
- Updated Python scripts to match data with our database stored in AWS Cloud.
- Used reverse engineering to create Graphical Representation (E-R diagram) and to connect to existing database.
- Involved in data mapping document from source to target and the data quality assessments for the source data.
- Produced various types of reports using SQL Server Reporting Services (SSRS).
- Worked on Data Mining and data validation to ensure the accuracy of the data between the warehouse and source systems.
- Performed GAP analysis to analyze the difference between the system capabilities and business requirements.
Environment: Agile, PL/SQL, AWS, Tableau 9.3, OLTP, OLAP, SSIS, MDM, SQL, T-SQL, SSRS, Python 3.6
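The flat-file-to-staging loads described above (SSIS packages loading flat files and Excel into SQL Server) can be sketched as follows; the file layout, table and column names are illustrative assumptions, and sqlite3 stands in for SQL Server:

```python
import csv
import io
import sqlite3

# Hypothetical flat file, inlined for a self-contained sketch.
flat_file = io.StringIO("customer_id,city\n101,Austin\n102,Denver\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_customer (customer_id INTEGER, city TEXT)")

# Parse the flat file and bulk-insert it into the staging table,
# mirroring what an SSIS flat-file source + OLE DB destination does.
reader = csv.DictReader(flat_file)
rows = [(int(r["customer_id"]), r["city"]) for r in reader]
conn.executemany("INSERT INTO stg_customer VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM stg_customer").fetchone()[0]
print(count)  # 2
```

Landing raw data in a staging table first, before any transformation, is what makes the later data-quality checks and source-to-target mapping auditable.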
Confidential - Eden Prairie, MN
Sr. Data Modeler
Responsibilities:
- Worked as a Data Modeler on a data warehouse application based on an Oracle database.
- Extensively used Agile methodology as the organization standard to implement the data models.
- Worked on AWS Redshift and RDS to implement models and data on both platforms.
- Participated in requirement-gathering sessions and JAD sessions with users and Subject Matter Experts.
- Developed the Conceptual and Logical Data Models and transformed them into schemas using E/R Studio.
- Supported the DBA in the physical implementation of the tables in both Oracle and DB2 databases.
- Performed Reverse engineering of the source systems using Oracle Data modeler.
- Identified Facts & Dimensions Tables and established the Grain of Fact for Dimensional Models.
- Resolved multiple Data Governance issues to support data consistency at the enterprise level.
- Effectively utilized TOAD to run SQL / PL-SQL / T-SQL statements on the database.
- Involved in the process design documentation of the Data Warehouse Dimensional Upgrades.
- Generated the DDL of the target data model and attached it to the Jira to be deployed in different Environments.
- Conducted user interviews, gathered requirements, and analyzed the requirements using Rational Rose, RequisitePro and RUP.
- Normalized the database based on the new model developed, to put it into 3NF for the data warehouse.
- Involved in cleaning large datasets using Python; created named sets and calculated members and designed scopes in SSIS.
- Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
- Developed different kind of reports such as Drill down, Drill through, Sub Reports, Charts, Matrix reports, Parameterized reports and Linked reports using SSRS.
- Designed a STAR schema for sales data involving shared dimensions (Conformed) for other subject areas using E/R Studio.
- Translated business concepts into XML vocabularies by designing XML Schemas with UML.
- Implemented a snowflake schema to minimize redundancy in the database.
- Performed forward engineering using E/R Studio to generate the DDL of the target data model.
- Generated SQL scripts for retrieving the data from Teradata and SQL Server in Linux environment.
- Extensively worked on data profiling and scanning to de-duplicate records in the staging area before data gets processed.
Environment: Oracle 12c, AWS, Agile, E/R Studio v14, SQL, PL-SQL, T-SQL, Python 3.4, SSIS, OLTP, SSRS, XML, Teradata R13
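The referential-integrity enforcement and forward-engineered DDL mentioned in this role can be sketched like this (table names are hypothetical; sqlite3 stands in for Oracle/DB2):

```python
import sqlite3

# DDL with a foreign-key constraint, as a forward-engineering tool
# would generate it from the physical data model.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this opt-in
conn.executescript("""
CREATE TABLE department (
    dept_id INTEGER PRIMARY KEY,
    name    TEXT NOT NULL
);
CREATE TABLE employee (
    emp_id  INTEGER PRIMARY KEY,
    dept_id INTEGER NOT NULL REFERENCES department(dept_id)
);
INSERT INTO department VALUES (10, 'Modeling');
""")

# A child row pointing at a missing parent is rejected by the engine,
# which is exactly what "enforced referential integrity" buys you.
try:
    conn.execute("INSERT INTO employee VALUES (1, 99)")
    violated = False
except sqlite3.IntegrityError:
    violated = True
print(violated)  # True
```

Declaring the constraint in the DDL pushes consistency checks into the database itself instead of relying on every loading process to get it right.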
Confidential - Des Plaines, IL
Data Analyst/ Data Modeler
Responsibilities:
- Modified the existing data models for the data hub and info center and generated DDL commands for deployment using Sybase PowerDesigner.
- Actively participated in JAD sessions involving the discussion of various reporting needs.
- Designed ETL data flows that extract, transform, and load data while optimizing SSIS performance.
- Designed and developed OLTP and OLAP models for the reporting requirements using PowerDesigner.
- Performed Data Profiling to identify data issues upfront, provided SQL prototypes to confirm the business logic provided prior to the development.
- Developed and deployed quality T-SQL codes, stored procedures, views, functions, triggers.
- Involved in the creation, maintenance of Data Warehouse and repositories containing Metadata.
- Expanded the Physical Data Model (PDM) for the OLTP application using PowerDesigner.
- Worked with the Data Vault methodology and developed normalized logical and physical database models.
- Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
- Developed scripts that automated DDL and DML statements used in creations of databases, tables, constraints, and updates.
- Widely used normalization methods and performed normalization up to 3NF.
- Reviewed Stored Procedures for reports and wrote test queries against the source system (SQL Server-SSRS) to match the results with the actual report against the Data mart (Oracle).
- Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
- Involved with metadata & Data Dictionary Management, Data Profiling and Data Mapping.
- Conducted User Acceptance Testing, and gathered and documented User Manuals and Business Rules.
- Extracted data from Oracle and uploaded it to Teradata tables using the Teradata utilities FastLoad and MultiLoad.
- Developed and maintained data solutions that utilize SQL, Microsoft SQL Server Reporting Services and Excel.
- Developed and maintained Data Dictionary for transactional systems and Data warehouse.
- Developed a star schema for the proposed central model and normalized the star schema into a snowflake schema.
- Analyzed the Business information requirements and examined the OLAP source systems to identify the measures, dimensions and facts required for the reports.
Environment: Sybase PowerDesigner 16, JAD, SSIS, OLTP, OLAP, T-SQL, PL/SQL, Teradata r, Oracle 11g, MS Excel 2014
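The star-to-snowflake normalization described in this role can be sketched as splitting a flat dimension so repeated attributes live in their own table (all names are made up for illustration; sqlite3 stands in for the warehouse database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Star version: the category text is repeated on every product row.
CREATE TABLE dim_product_star (
    product_key   INTEGER PRIMARY KEY,
    name          TEXT,
    category_name TEXT
);
INSERT INTO dim_product_star VALUES
    (1, 'Bolt', 'Hardware'), (2, 'Nut', 'Hardware'), (3, 'Tape', 'Supplies');

-- Snowflake version: each category is stored once and referenced by key.
CREATE TABLE dim_category (
    category_key  INTEGER PRIMARY KEY,
    category_name TEXT
);
CREATE TABLE dim_product_snow (
    product_key  INTEGER PRIMARY KEY,
    name         TEXT,
    category_key INTEGER REFERENCES dim_category(category_key)
);
INSERT INTO dim_category (category_name)
    SELECT DISTINCT category_name FROM dim_product_star ORDER BY category_name;
INSERT INTO dim_product_snow
    SELECT s.product_key, s.name, c.category_key
    FROM dim_product_star s
    JOIN dim_category c USING (category_name);
""")

n_categories = conn.execute("SELECT COUNT(*) FROM dim_category").fetchone()[0]
print(n_categories)  # 2
```

The trade-off is the classic one: the snowflake removes the redundant category text, at the cost of an extra join at query time.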
Confidential
Data Analyst
Responsibilities:
- Worked with Data Analysts on requirements gathering, business analysis and project coordination.
- Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
- Generated and documented Metadata while designing OLTP and OLAP system environment.
- Worked on the migration of data from subsystem database to system database using PL/SQL.
- Performed Data Profiling to assess the risk involved in integrating data for new applications, including the challenges of joins and to track data quality.
- Analyzed and maintained SAS programs and macros to generate SAS datasets.
- Supported business analysis and marketing campaign analytics with data mining, data processing, and investigation to answer complex business questions.
- Developed and maintained data solutions that utilize SQL, Microsoft SQL Server Reporting Services.
- Conducted design walk through sessions with Business Intelligence team to ensure that reporting requirements are met for the business.
- Wrote SQL stored procedures and views, and coordinated and performed in-depth testing of new and existing systems.
- Worked closely with the ETL SSIS developers to explain complex data transformation logic, and created and deployed reports using SSRS.
- Performed data discovery, profiling, and rules development to establish data dictionaries, lineage and supporting processes.
- Worked primarily on SQL Server, creating stored procedures, functions, triggers, indexes and views using T-SQL.
- Performed analysis and presented results using SQL, SSIS, Excel, and Visual Basic scripts.
- Identified source databases and created the dimensional tables and checked for data quality using complex SQL queries.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Developed Unix shell programs and scripts to maximize productivity and resolve issues.
- Created data visualization reports in Tableau for claims data and generated dashboards for the project, delivering insights graphically.
- Created pivot tables, graphs, charts, macros in MS Excel and built Tableau dashboards.
Environment: OLTP, OLAP, PL/SQL, SAS, SQL, SSIS, SSRS, Unix, Tableau 6.1, MS Excel 2012
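The data-profiling and data-quality work in this role can be sketched as per-column null and distinct counts, a common first pass when assessing source data (the `claims` table and its columns are hypothetical; sqlite3 stands in for the source database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id INTEGER, state TEXT)")
conn.executemany("INSERT INTO claims VALUES (?, ?)",
                 [(1, 'CT'), (2, None), (3, 'CT'), (4, 'IL')])

# Profile each column: how many nulls, how many distinct values.
# High null counts or unexpectedly low cardinality flag suspect data.
profile = {}
for col in ("claim_id", "state"):
    nulls, distinct = conn.execute(
        f"SELECT SUM({col} IS NULL), COUNT(DISTINCT {col}) FROM claims"
    ).fetchone()
    profile[col] = {"nulls": nulls, "distinct": distinct}

print(profile)
# {'claim_id': {'nulls': 0, 'distinct': 4}, 'state': {'nulls': 1, 'distinct': 2}}
```

Profiles like this feed the data dictionary and lineage work: they document what actually arrives from each source, not just what the schema promises.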