Sr. Data Modeler/ Analyst Resume
Columbus, GA
SUMMARY:
- Sr. Data Modeler/Analyst with 7+ years of IT experience in data modeling, design, and data analysis.
- Proficient in data mart design, creation of cubes, identifying facts and dimensions, star and snowflake schemas, and canonical models.
- Solid hands-on experience administering data model repositories and documenting metadata in portals such as Erwin, ER/Studio, and PowerDesigner.
- Strong working experience with Agile, Scrum, Kanban, and Waterfall methodologies.
- Capture, validate and publish metadata in accordance with enterprise data governance policies and MDM taxonomies.
- Responsible for data governance rules and standards to maintain the consistency of the business element names in the different data layers.
- Extensive experience in development of T-SQL and Oracle PL/SQL scripts, stored procedures, and triggers for business logic implementation.
- Proficient in normalization/de-normalization techniques in relational and dimensional database environments; have normalized schemas up to 3NF.
- Expertise in SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS).
- Experience modeling both OLTP and OLAP systems in Kimball and Inmon data warehousing environments.
- Strong understanding of the principles of Data warehousing, Fact Tables, Dimension Tables, star and snowflake schema modeling.
- Experience in backend programming including schema and table design, stored procedures, Triggers, Views, and Indexes.
- Conducted data analysis, mapping and transformation, and data modeling, applying data warehouse concepts.
- Experience in designing logical, physical, and conceptual data models to build the data warehouse.
- Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
- Software Development Life Cycle (SDLC) experience, including requirements, specification analysis/design, and testing.
- Excellent experience in creating cloud-based solutions and architecture using Amazon Web Services (Amazon EC2, Amazon S3, Amazon RDS) and Microsoft Azure.
- Excellent knowledge of creating reports in Tableau and SAP Business Objects, and web reports for multiple data providers.
- Excellent experience in writing and executing unit, system, integration, and UAT scripts in data warehouse projects.
- Excellent experience in writing SQL queries to validate data movement between different layers in data warehouse environment.
- Proficient in designing Data Mart and Data Warehouse using Star and Snowflake Schemas.
- Extensive experience in using ER modeling tools such as Erwin, ER/Studio, and PowerDesigner, as well as Teradata and MDM.
- Experience in designing Star schema, Snowflake schema for Data Warehouse, ODS architecture.
- Hands-on experience writing and optimizing SQL queries in Oracle, SQL Server, DB2, and Netezza.
- Experience in working with RDBMS like Oracle, Microsoft SQL Server and Teradata.
- Experience in data analysis to track data quality and for detecting/correcting inaccurate data from the databases.
- Good knowledge of SQL queries and of creating database objects such as stored procedures, triggers, packages, and functions using SQL and PL/SQL for implementing business logic.
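As an illustrative sketch of the cross-layer validation work summarized above, the following uses Python's sqlite3 as a stand-in database; the staging and warehouse table names (`stg_orders`, `dw_orders`) and columns are hypothetical, not from any actual engagement.

```python
import sqlite3

# Minimal sketch of layer-to-layer validation in a warehouse load.
# Table and column names are hypothetical examples.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE dw_orders  (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])
cur.executemany("INSERT INTO dw_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])

# Row-count reconciliation between the staging and warehouse layers.
stg_count = cur.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
dw_count = cur.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]

# Rows present in staging but missing from the warehouse (should be none).
missing = cur.execute("""
    SELECT s.order_id FROM stg_orders s
    LEFT JOIN dw_orders d ON s.order_id = d.order_id
    WHERE d.order_id IS NULL
""").fetchall()

print(stg_count == dw_count and missing == [])  # expect True
```

The same pattern (count reconciliation plus an anti-join for dropped rows) applies unchanged on Oracle, SQL Server, or Teradata.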
TECHNICAL SKILLS:
Cloud Management: Amazon Web Services (AWS), Amazon Redshift
Data Modeling Tools: ER/Studio V17, Erwin 9.7, SAP PowerDesigner 16.6.
OLAP Tools: Tableau, SAP Business Objects, SSAS, SSIS, and Crystal Reports 9
Programming Languages: SQL, PL/SQL, UNIX shell Scripting, PERL, AWK, SED
Databases: Oracle 12c, Teradata R15, MS SQL Server 2017, DB2.
Testing and Defect Tracking Tools: HP/Mercury Quality Center, WinRunner, MS Visio 2016 & Visual SourceSafe
Operating Systems: Windows 10/8, macOS, UNIX, Sun Solaris
ETL/Data warehouse Tools: Informatica 9.5/9.1, SAP Business Objects XIR3.1/XIR2, Talend, Tableau
PROFESSIONAL EXPERIENCE:
Sr. Data Modeler/ Analyst
Confidential, Columbus, GA
Responsibilities:
- Gather and translate business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
- Participated in a team conducting logical data analysis and data modeling JAD sessions, and communicated data-related standards.
- Worked on NoSQL databases including Cassandra; implemented a multi-data-center, multi-rack Cassandra cluster.
- Coordinated with data architects on provisioning AWS EC2 infrastructure and deploying applications behind Elastic Load Balancing.
- Performed Reverse Engineering of the current application using Erwin, and developed Logical and Physical data models for Central Model consolidation.
- Translated logical data models into physical database models and generated DDLs for DBAs.
- Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
- Performed extensive data validation by writing complex SQL queries; involved in back-end testing and resolving data quality issues.
- Collected, analyzed, and interpreted complex data for reporting and/or performance trend analysis.
- Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
- Extensively used ETL methodology to support data extraction, transformation, and loading processes in a complex data warehouse using Informatica.
- Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access.
- Wrote T-SQL and worked on SSIS, SSRS, SSAS, data cleansing, data scrubbing, and data migration.
- Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of Data Analysis responsibilities.
- Designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in Erwin.
- Wrote complex SQL queries to validate data against the different kinds of reports generated by Business Objects XIR2.
- Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle, and flat files.
- Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements.
- Created SQL tables with referential integrity, constraints and developed queries using SQL, SQL*PLUS and PL/SQL.
- Performed gap analysis of the current state versus the desired state and documented requirements to control the gaps identified.
- Developed batch programs in PL/SQL for OLTP processing and used UNIX shell scripts scheduled via crontab.
- Identified and recorded defects with the information required for issues to be reproduced by the development team.
- Worked on the reporting requirements and involved in generating the reports for the Data Model using Tableau.
Environment: Erwin 9.7, PL/SQL, Business Objects XIR2, Informatica 9.5, Oracle 11g, Teradata R13, Teradata SQL Assistant 12.0, Flat Files
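The data profiling step mentioned in the responsibilities above can be sketched as a per-column pass over a table; this is a minimal illustration in Python with sqlite3, and the `customers` table and its columns are hypothetical.

```python
import sqlite3

# Illustrative data-profiling pass: per-column null counts and distinct
# counts, as typically gathered before modeling. Names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (cust_id INTEGER, state TEXT)")
cur.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "GA"), (2, "NY"), (3, None), (4, "GA")])

profile = {}
for col in ("cust_id", "state"):
    nulls = cur.execute(
        f"SELECT COUNT(*) FROM customers WHERE {col} IS NULL").fetchone()[0]
    distinct = cur.execute(
        f"SELECT COUNT(DISTINCT {col}) FROM customers").fetchone()[0]
    profile[col] = {"nulls": nulls, "distinct": distinct}

print(profile)
```

Profiles like this surface candidate keys (high distinct counts, no nulls) and columns needing data quality rules (unexpected nulls) before the model is finalized.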
Sr. Data Modeler/ Data Analyst
Confidential, New York, NY
Responsibilities:
- Understood the specifications and analyzed data according to client requirements.
- Conducted requirement gathering sessions with report owners/SMEs to understand the current report generation process flow and to identify the risk data elements.
- Ran queries to retrieve data, uploaded data using MS SQL, downloaded data using Oracle, and verified that the data had not been tampered with.
- Coordinated with various developers, project managers, and analysts to prepare application designs and documented all processes to ensure effective evaluation of all applications.
- Developed the logical and physical models for the identified elements.
- Involved with implementing Kimball methodology to design data marts.
- Mapping data to the authoritative source field and table name as part of the data migration analysis activity
- Designed and developed OLTP and OLAP models for the reporting requirements using Erwin and PowerDesigner.
- Worked with data stewards to follow data governance principles.
- Generated DDLs from the developed physical model and executed them on the target database.
- Capturing data elements - definitions and characteristics for information requirements.
- Extracted key data from SOR systems and built dimensional data models for data warehouse, data marts, and balanced scorecards for key managers.
- Used the Informatica PowerCenter tool for data integration, processing data from staging tables into the relational database.
- Explained tuning requirement changes for SQL/PL-SQL packages and provided stored procedures to the design and development team.
- Worked with metadata before loading the models into Hadoop.
- Used Tableau for maintaining reporting requirements and their respective dashboards.
- Prepared the testing strategy document and documented test plans and test cases.
- Troubleshot production problems after implementation and created project/user documents.
Environment: SAP PowerDesigner 16.6, Teradata SQL Assistant, MS SQL Server 2012, PostgreSQL, Teradata, Oracle 11g, SQL Developer, Hadoop, Tableau.
Data Modeler/ Data Analyst
Confidential, Albany, NY
Responsibilities:
- Heavily involved in the Data Modeler/Analyst role, reviewing business requirements and composing source-to-target data mapping documents.
- Gathered all analysis report prototypes from business analysts belonging to different business units.
- Interacted with Business Analysts to gather the user requirements and participated in data modeling JAD sessions.
- Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
- Extensively used Agile methodology as the Organization Standard to implement the data Models.
- Actively participated in JAD sessions involving the discussion of various reporting needs.
- Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
- Interacted with Business Analyst, SMEs to understand Business needs and functionality for various project solutions.
- Planned and defined system requirements to Use Case, Use Case Scenario and Use Case Narrative using the UML (Unified Modeling Language) methodologies.
- Reverse engineered the existing data marts and identified the data elements (in the source systems), dimensions, facts, and measures required for reports.
- Conducted design discussions and meetings to arrive at the appropriate data warehouse design at the lowest level of grain for each of the dimensions involved.
- Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
- Developed the data warehouse model (Kimball's) with multiple data marts with conformed dimensions for the proposed central model of the Project.
- Designed a STAR schema for sales data involving shared dimensions (Conformed) for other subject areas using Erwin Data Modeler.
- Created and maintained Logical Data Model (LDM) for the project. Includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Validated and updated the appropriate LDMs to reflect process mappings, screen designs, use cases, the business object model, and the system object model as they evolved and changed.
- Conducted design reviews with the business analysts and content developers to create a proof of concept for the reports.
- Ensured the feasibility of the logical and physical design models.
- Worked on snowflaking the dimensions to remove redundancy.
- Wrote PL/SQL statements, stored procedures, and triggers for extracting as well as writing data.
- Worked extensively on Data Migration by using SSIS.
- Developed rule sets for data cleansing and actively participated in data cleansing and anomaly resolution of the legacy application.
- Defined facts and dimensions and designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in Erwin.
- Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
- Developed Data mapping, Transformation and Cleansing rules for the Data Management involving OLTP, ODS and OLAP.
- Responsible for data governance rules and standards to maintain the consistency of the business element names in the different data layers.
- Created data masking mappings to mask the sensitive data between production and test environment.
- Normalized the database based on the newly developed model to bring the data warehouse tables into 3NF.
- Created SSIS package for daily email subscriptions to alert Tableau subscription failure using the ODBC driver and PostgreSQL database.
- Designed logical and physical data models, Reverse engineering, Complete compare for Oracle and SQL server objects using E/R Studio.
Environment: PL/SQL, Erwin 9.5.2, MS SQL 2016, OLTP, ODS, OLAP, SSIS, Tableau, ODBC, Transact-SQL, TOAD, Teradata r15, SQL Assistant
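A star schema with a conformed dimension, as designed in this role, can be sketched as follows; the tables (`fact_sales`, `dim_date`, `dim_product`) and values are hypothetical, with sqlite3 standing in for the warehouse database.

```python
import sqlite3

# Hypothetical star schema: a sales fact joined to conformed dimensions
# (a date dimension shared across subject areas). Names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_date (
        date_key   INTEGER PRIMARY KEY,   -- surrogate key
        full_date  TEXT,
        year       INTEGER
    );
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT
    );
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        sales_amt   REAL
    );
""")
cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
cur.execute("INSERT INTO fact_sales VALUES (20240101, 1, 99.95)")

# Typical star-join: aggregate the fact at a chosen grain via dimensions.
row = cur.execute("""
    SELECT d.year, p.product_name, SUM(f.sales_amt)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.product_name
""").fetchone()
print(row)  # (2024, 'Widget', 99.95)
```

Because `dim_date` is conformed, the same dimension table can serve facts in other subject areas, keeping report results consistent across data marts.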
Data Modeler/Data Analyst
Confidential, Atlanta, GA
Responsibilities:
- Participated in requirement gathering session with business users and sponsors to understand and document the business requirements as well as the goals of the project.
- Identified and analyzed various facts from the source system and business requirements to be used for the data warehouse (Kimball approach).
- Checked for all the modeling standards including naming standards, entity relationships on model and for comments and history in the model.
- Analyzed the Logical Data Model (LDM) and Physical Data Model from OLTP systems, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Worked with data architects and IT architects to understand the movement of data and its storage.
- Designed star and snowflake schemas for reporting as per the BRD.
- Analyzed and defined primary keys, surrogate keys, dimensions, and facts according to business requirements.
- Developed Logical data model using ER/studio and created physical data models using reverse engineering.
- Maintained and enhanced the data model with changes and furnished it with definitions, notes, reference values, and checklists.
- Updated and deleted column definitions per the requirements document, in accordance with referential integrity rules.
- Created technical mapping documents for the development team.
- Generated data definition language (DDL) scripts using the data modeling tool and manually for the creation of physical objects like tables, indexes, constraints, views and materialized views.
- Created DDL scripts using ER/Studio and coordinated with DBAs to apply the data model changes.
- Provided gap analysis to the DBAs for data model changes and maintained a traceability matrix for every change.
- Created and updated documents on project and team SharePoint sites for collaboration on documentation and sharing development updates with management and business groups.
- Worked on enhancing and monitoring SQL Queries using TOAD and SQL Developer.
- Co-ordinated with QA team to test and validate the reporting system and its data.
- Suggested effective implementation of the applications, which are being developed.
Environment: ER/Studio 9.5, Oracle 11g, Windows 7, Informatica IPC 9.5.1, Toad 12.6, ALM 11.5, Microsoft Excel.
Data Modeler/ Data Base Analyst
Confidential
Responsibilities:
- Involved in Data Modeling (Logical and Physical Design of Databases), Normalization and building Referential Integrity Constraints
- Developed physical data models and created DDL scripts to build the database schema and database objects.
- Created user requirement documents based on functional specification.
- Created new tables and wrote stored procedures, triggers, views, and functions.
- Created SSIS Packages by using transformations like Derived Column, Sort, Lookup, Conditional Split, Merge Join, Union and Execute SQL Task to load into database.
- Created SQL scripts for tuning.
- Extracted data from flat files and Excel, transformed it per the logic, and loaded it into the data warehouse.
- Created packages to schedule the jobs for batch processing.
- Involved in performance tuning to optimize SQL queries.
- Created and maintained indexes for various fast and efficient reporting processes.
- Understood the DLD specifications document and worked with the design team to develop the mappings using SQL Server Integration Services (SSIS).
Environment: MS SQL Server, Erwin 9.X/8.X, Visual Studio, T-SQL, Enterprise Manager, Query Analyzer, Windows, MS Office.
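The SSIS transformations named above (Derived Column, Lookup, Conditional Split) follow a simple row-routing pattern; this Python sketch is only an analogue of that data flow, with illustrative rows, lookup values, and a made-up tax rule, not actual SSIS code.

```python
# Python analogue of SSIS Derived Column, Lookup, and Conditional Split
# transformations. All data, names, and rules here are illustrative.
source_rows = [
    {"order_id": 1, "cust_code": "A1", "amount": 120.0},
    {"order_id": 2, "cust_code": "B2", "amount": 45.0},
    {"order_id": 3, "cust_code": "ZZ", "amount": 80.0},  # no lookup match
]
customer_lookup = {"A1": "Acme", "B2": "Bravo"}  # reference data

matched, unmatched = [], []
for row in source_rows:
    # Derived Column: add a computed tax column (hypothetical 7% rule).
    row["tax"] = round(row["amount"] * 0.07, 2)
    # Lookup: resolve the customer name from the reference set.
    name = customer_lookup.get(row["cust_code"])
    # Conditional Split: route rows with no match to an error output.
    if name is None:
        unmatched.append(row)
    else:
        row["customer"] = name
        matched.append(row)

print(len(matched), len(unmatched))  # 2 1
```

In an actual SSIS package, `matched` and `unmatched` correspond to the Lookup's match and no-match outputs, with the no-match path typically landing in an error or reprocessing table.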
Data Analyst
Confidential
Responsibilities:
- Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
- Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
- Setting up of environments to be used for testing and the range of functionalities to be tested as per technical specifications.
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
- Wrote and executed SQL queries to verify that data had been moved from the transactional system to the DSS, data warehouse, and data mart reporting systems in accordance with requirements.
- Troubleshot test scripts, SQL queries, ETL jobs, and data warehouse models.
- Applied strong knowledge of data warehouse concepts and dimensional data modeling using the Ralph Kimball methodology.
- Responsible for various data mapping activities from source systems to Teradata.
- Created the test environment for Staging area, loading the Staging area with data from multiple sources.
- Used CA Erwin Data Modeler (Erwin) for data modeling (data requirements analysis, database design, etc.) of custom-developed information systems, including data marts.
- Responsible for analyzing various heterogeneous data sources such as flat files, ASCII data, EBCDIC data, and relational data (Oracle, DB2 UDB, MS SQL Server).
- Delivered files in various formats (e.g., Excel files, tab-delimited text, comma-separated text, pipe-delimited text).
- Executed campaign based on customer requirements.
- Followed company code standardization rule
- Involved in Teradata SQL development, unit testing, and performance tuning, ensuring testing issues were resolved using defect reports.
- Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into various databases.
- Created UNIX scripts for file transfer and file manipulation.
- Ensured the onsite-to-offshore transition, QA processes, and closure of problems and issues.
- Tested the database for field size validation and check constraints, tested stored procedures, and cross-verified field sizes defined within the application against metadata.
Environment: Informatica 8.1, PL/SQL, SQL, Data Flux, Oracle 9i, Quality Center.
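The field-size validation against metadata described in this role can be sketched as comparing actual values to declared column lengths; the metadata dictionary and sample rows below are hypothetical stand-ins for a real metadata repository.

```python
# Sketch of field-size validation: check that actual column values fit
# the maximum lengths declared in metadata. All names/data are examples.
metadata = {"cust_name": 10, "state_cd": 2}  # column -> declared max length
rows = [
    {"cust_name": "Alice", "state_cd": "NY"},
    {"cust_name": "Bartholomew Q", "state_cd": "GA"},  # name exceeds 10
]

# Collect (row index, column) pairs where a value exceeds its declared size.
violations = [
    (i, col)
    for i, row in enumerate(rows)
    for col, max_len in metadata.items()
    if len(row[col]) > max_len
]
print(violations)  # [(1, 'cust_name')]
```

In practice the declared lengths would be read from the database catalog or a metadata table rather than hard-coded, and violations would be written to a defect report.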