Sr. Data Modeler Resume
Moline, IL
SYNOPSIS:
- Experienced Data Modeler / Data Analyst with a solid understanding of Business Requirements Gathering, Data Warehousing, Data Modeling, Evaluating Data Sources, Translating Requirements into Specifications, and Application Design.
- Deep knowledge of math, probability, statistics, and data mining techniques.
- Experience in collecting, organizing, interpreting, and disseminating various types of statistical figures.
- Excellent understanding of business operations and Analytics tools.
- Good understanding of RDBMS and strong knowledge of databases such as SQL Server, Oracle, SAP HANA, DB2, and Teradata.
- Good experience with Normalization (1NF, 2NF, and 3NF) and De-normalization techniques for improved database performance in OLTP, OLAP, Data Warehouse, and Data Mart environments.
- Extensive experience in conducting Market Research, Feasibility Studies, Data Analysis, Data Mapping, Data Profiling, SWOT Analysis, Cost-Benefit Analysis, Gap Analysis, Risk Identification, Risk Assessment, Risk Analysis, and Risk Management to identify problems and make business decisions for improved customer value.
- Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Integration, Master Data Management (MDM), Metadata Management Services, and Configuration Management.
- Implemented Data Profiling for the source system tables.
- Experience in RDBMS with Oracle PL/SQL, SQL, Stored Procedures, Functions, Packages, and Triggers; worked with terabyte-scale databases.
- Tuned and troubleshot functional and performance problems with the SQL used for loading the databases.
- Detailed knowledge of Python, including the development of extension types using the Python/C interface, and scientific computing libraries (including NumPy and SciPy).
- Experience in Performance Tuning (sources, mappings, targets, and sessions), SQL Tuning, Tablespace Partitioning, and creation of indexes for faster database access and better query performance.
- Hands-on Azure-Power BI implementation and migration services.
- Experience working on Supply chain data.
- Diverse experience in IT Services, Retail and Healthcare.
- Knowledge of the Informatica PowerCenter (Extract, Transform, Load) tool and experience with Business Intelligence (BI) tools such as Tableau Server, Power BI, and Business Objects.
- Ability to manage multiple project tasks with changing priorities and tight deadlines.
- Excellent communication and analytical skills, with the ability to work well as part of a team.
TECHNICAL SKILLS:
Domain Knowledge: Retail, IT Services, Healthcare
Data Modeling Tools: Erwin r9.8/9.6/9.2/7.3, ER/Studio 16/15, SQL Data Modeler, Oracle Designer, MS Visio, and SAP PowerDesigner
OLAP Tools: Microsoft Analysis Services (SSAS), Business Objects, and Crystal Reports 9
Database Tools: Microsoft SQL Server 2000/2005/2008/2012/2014, MySQL, Oracle 11g/10g/9i/8i, DB2, MS Access 2000, Teradata V2R6.1, Hive
ETL Tools: Informatica 9/8, Data Junction, Ab Initio, DataStage, SSIS
Programming Languages: SQL, PL/SQL, T-SQL, Base SAS, HTML, CSS, XML, C, C++, UNIX, Python
Packages: Microsoft Office Suite, Microsoft Project 2010, SAP, Microsoft Visio, Collibra
Reporting Tools: Tableau, Business Objects 6.5, MS Azure-Power BI
PROFESSIONAL EXPERIENCE:
Confidential, Moline, IL
Sr. Data Modeler
Responsibilities:
- Participated in JAD sessions with business users and sponsors to understand and document the business requirements.
- Conducted meetings with Business Analysts to analyze the business requirements and identified the logical names per the Integrated Framework (IFW) standards.
- Translated the business requirements into workable functional and non-functional requirements at a detailed production level using workflow diagrams.
- Developed a Conceptual Model using Erwin based on requirements analysis.
- Developed Normalized Logical and Physical Data models to design the OLTP system.
- Created Dimensional models by identifying required Dimensions and Facts using Erwin.
- Extensively used the Star Schema methodology in designing the Dimensional models.
- Used Forward Engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model.
- Designed the model for Supply Chain data and analyzed indexes and keys.
- Integrated MDM data hubs into the Supply Chain for effective usage of Master Data.
- Conducted design reviews with Business Analysts, Database Administrators, and Content Developers to validate the models.
- Performed Performance Tuning, Query Optimization, Client/Server Connectivity, and Database Consistency Checks.
- Coordinated with source system owners on day-to-day ETL progress monitoring and on Data Warehouse target schema (Star Schema) design and maintenance.
- Used SSIS to create ETL packages that extract data from various sources such as Access databases, Excel spreadsheets, and flat files into SQL Server for further analysis and reporting, using multiple transformations.
- Reverse-engineered the existing data models and updated them.
- Worked with application developers and DBAs to discuss De-normalization, partitioning, and indexing schemes for the physical model.
- Developed Data Mapping, Data Governance, Transformation and Cleansing rules.
- Used Collibra for Data Governance and Metadata Management.
- Assisted Business Subject Matter Experts and data stewards to review and scrub their data in a timely manner, and performed data audits as necessary.
- Used Erwin's Model Mart for effective model management: sharing, dividing, and reusing old model information and designs for productivity improvement.
- Created and maintained the Data Model Repository per company standards.
- Created summary tables based on the reporting requirements.
- Used Excel sheets, flat files, and CSV files to generate ad-hoc Tableau reports.
- Created dashboards and reports as needed using Tableau Desktop and Tableau Server.
- Analyzed various generated logs and predicted/forecast the next occurrence of events using various Python libraries.
- Developed extension types using the Python/C interface, and the scientific computing libraries NumPy and SciPy.
- Provided Azure-Power BI implementation and migration services.
- Responsible for Master Data Management: identifying each instance of a business element and representing these instances using a standardized data model.
- Worked with the Application Development team to implement appropriate data strategies.
Environment: Erwin 9.8, SQL Server 2014, Informatica PowerExchange, MS Azure, Power BI, Tableau, Collibra, Toad 10, Oracle 11g, Hive, Azure ADO, PL/SQL, UNIX, Python, Workflow Manager, Workflow Monitor, MS Visio
Confidential, Northbrook, IL
Data Modeler
Responsibilities:
- Analyzed business requirements and functional specifications by interacting with Business Analysts and Subject Matter Experts to guide the building of an efficient model.
- Identified the key facts and dimensions necessary to support the business requirements for dimensional modeling.
- Developed Logical and Physical data models that capture the data elements and data flows using Erwin (Star Schema).
- Coordinated with the DBA on database builds and table normalization and de-normalization.
- Participated in mapping logical data models to physical data models and coordinated business requirements for data.
- Prepared analytical and status reports and updated the project plan as required.
- Responsible for Master Data Management: identifying each instance of a business element and representing these instances using a standardized data model.
- Ensured ease of access to the history of reference data changes by implementing Slowly Changing Dimensions Type 2, Type 3, and Type 4 for reference data.
- Developed Data Mapping, Data Governance, and Transformation rules, and performed reverse engineering of Physical data models from databases and SQL scripts.
- Developed standard naming conventions and coding practices to maintain consistency of data models.
- Participated in data architecture sessions and promoted integrated systems and controlled data redundancy.
- Worked proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
- Tested the ETL process both before and after data validation. Tested the messages published by the ETL tool and the data loaded into various databases.
- Imported/exported large sets of data from Oracle Database to Tableau.
- Prepared dashboards using calculations and parameters in Tableau.
- Performed data profiling on unstructured data collected from various sources which were used for data analysis.
Environment: Erwin, Informatica PowerCenter 8.1.1, Toad, Oracle 11g, SQL, PL/SQL, Tableau, Teradata V2R5 (MultiLoad, FastLoad, TPump, BTEQ), UNIX, Windows NT, MS SQL Server, Jira, Workflow Manager.
Confidential, Alpharetta, GA
Data Modeler/SQL Developer
Responsibilities:
- Coordinated and held technical peer reviews with Subject Matter Experts and stakeholder communities to discuss and perform requirements and impact analysis to understand the data elements in their domain.
- Worked closely with the App Dev team and the DBA to create appropriate Conceptual, Logical, and Physical Data Models.
- Participated in the de-normalization of the data model, ensuring that business rules impacted during de-normalization were implemented via an alternate mechanism (e.g., a trigger or application code).
- Worked wif peers, clients and management to establish and maintain consistent data element definitions across computing environments.
- Generated and tested DDL scripts in Erwin.
- Provided scripts for load strategies, backup and recovery procedures, and performance tuning to ensure the system was available 24/7.
- Wrote SQL and DTS packages to deal wif weekly data extracts to a data repository.
- Worked on converting the DTS packages to SSIS in SQL Server 2005.
- Assisted Customer Support Representatives with ad hoc queries and data analysis, as well as with generation of reports for executive staff.
- Collaborated on the source-to-target data mapping document and the data quality assessments for the source data.
Environment: Erwin, Oracle, SQL Navigator, PL/SQL, Pro DB2, Business Objects 6.5, SQL Server, Workflow Monitor, Sybase, Rational Requisite, Windows NT, Crystal Reports.
Confidential
Data Modeler/Analyst
Responsibilities:
- Prepared and documented detailed specifications of the programs used to pull data, including the creation and use of documents to facilitate the request and preserve the integrity of the data provided to the internal customer.
- Worked with data investigation, discovery, and mapping tools to scan every single data record from many sources.
- Created and reviewed the conceptual model for the EDW (Enterprise Data Warehouse) with business users.
- Implemented the standard naming conventions for the fact and dimension entities and attributes of the logical and physical models.
- Used Erwin's Model Mart for effective model management: sharing, dividing, and reusing model information and designs for productivity improvement.
- Participated in UAT sessions to educate the business users about the reports, dashboards, and the BI system.
- Worked wif teh test team to provide insights into teh data scenarios and test cases.
- Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
- Identified the most appropriate data sources based on an understanding of corporate data, thus providing a higher level of consistency in reports used by various levels of management.
- Performed complex data analysis in support of ad-hoc and standing customer requests.
- Wrote MySQL queries from scratch and created views on MySQL for reporting.
Environment: Erwin, MS Excel, Oracle 9i, MySQL, MS Visio, TOAD for Oracle, MySQL Workbench