Sr. Data Modeler Resume
Columbus, OH
SUMMARY
- Over 9 years of extensive experience in Data Analysis, Data Modeling, Data Architecture, Database Administration, and the Development, Testing and Deployment of business applications and analytics in the clinical, financial and healthcare sectors. Highly proficient in Data Modeling, covering RDBMS concepts, Logical and Physical Data Modeling up to Third Normal Form (3NF), and Multidimensional Data Modeling schemas (Star schema, Snowflake modeling, facts and dimensions).
- Complete knowledge of data warehouse methodologies (Ralph Kimball, Bill Inmon), ODS, EDW and metadata repositories.
- Solid experience with Tableau, AWS, Redshift and Pentaho Kettle.
- Good experience producing Functional Design Documents and Business Requirement Documents with Cognos.
- Good experience providing project-related support in adherence to D&B’s data architecture strategy.
- Good experience developing data governance models and standard operating procedures to manage over 10,000 data elements tracked by dashboards.
- Good experience in Data Warehouse, Business Intelligence and Data Science / Big Data Analytics technologies.
- Excellent SQL programming skills; developed Stored Procedures, Triggers, Functions and Packages using PL/SQL, with performance tuning and query optimization techniques in transactional and data warehouse environments.
- Expert in Data Analysis, Design, Development, Implementation and Testing using data conversions, Extraction, Transformation and Loading (ETL), SQL Server, Oracle and other relational and non-relational databases.
- Consolidate and audit metadata from disparate tools and sources, including business intelligence (BI), extract, transform, and load (ETL), relational databases, modeling tools, and third-party metadata into a single repository.
- Expert-level understanding of using different databases in combination for data extraction and loading, joining data extracted from different databases and loading it into a specific database.
- Excellent understanding and working experience of industry-standard methodologies such as the System Development Life Cycle (SDLC), Rational Unified Process (RUP), Agile and Waterfall.
- Expertise in data cleansing for analysis, performing data quality testing for gaps, and liaising with data origination teams.
- Consistently delivered results in various stages of Software Development Life Cycle (SDLC)
- Worked with various data sources such as Oracle, SQL Server, Teradata and Netezza
- Extensive experience with OLTP/OLAP System and E-R modeling, developing Database Schemas like STAR schema and Snowflake schema used in relational, dimensional and multidimensional modeling.
- Extensively experienced in using tools such as SQL*Plus, TOAD, SQL Developer and SQL*Loader.
- Experienced in Data Modeling using design tools such as Erwin, Oracle SQL Developer, SQL Server Management Studio, MySQL, SQL*Plus and Toad; worked extensively on Master Data Management (MDM) and the applications used for MDM
- Good expertise in Microsoft Analytics Platform System (APS)
- Expertise in developing a Corporate Data Model framework and participated in building the CDM
- Experience in the field of Data and Business analysis, ETL Development, and Project Management.
- Experience in Analysis, Design, Development, Implementation, Testing and Support of Data Warehousing and Data Integration Solutions using Ab Initio, PL/SQL, Oracle and UNIX.
- Experienced in designing the Conceptual, Logical and Physical data modeling using Erwin, IBM Unified Data Model for Healthcare
- Valuable knowledge of OBIEE 11g, with hands-on experience in OBIEE 10g integrated with BI Publisher; Extraction, Load and Transformation (ELT) using Oracle Data Integrator (ODI) 11g, Informatica 9.x/8.x/7.x, SQL and PL/SQL.
- Good experience setting up and maintaining servers for development and online web sites with Sybase.
- A good knowledge of unstructured data stores
- Knowledge of Performance Optimization of RDBMS systems
- Knowledge of MongoDB and CouchDB
- Worked on PK, FK relationships across the entities and across subject areas
- Expertise in writing complex SQL queries and optimizing the queries in Oracle, SQL Server, Teradata & Netezza
- Extensive experience in data warehouse complete life cycle using SQL Server SSIS, SSRS, and SSAS.
- Extensively worked in creating and integrating MicroStrategy Reports and Objects (Attributes, Filters, Facts, Prompts, Templates).
- Experienced working with data modeling tools like Erwin, PowerDesigner and ERStudio.
- Experienced in designing Star schemas and Snowflake schemas for Data Warehouse and ODS architectures.
- Experienced in Requirement gathering, System analysis, handling business and technical issues & communicating with both business and technical users.
- Expertise in writing SQL queries, dynamic queries, sub-queries and complex joins for generating complex Stored Procedures, Triggers, User-Defined Functions, Views and Cursors.
- Deployed and scheduled Reports using SSRS to generate all daily, weekly, monthly and quarterly Reports including current status.
- Designed and deployed reports with DrillDown, DrillThrough and Drop down menu option and Parameterized and Linked reports.
- Experienced in using MapReduce and big data tooling on Hadoop and other NoSQL platforms
- Excellent experience with big data technologies (e.g., Hadoop, BigQuery, Cassandra)
- Experienced in understanding Stored Procedures, Stored Functions, Database Triggers, and Packages using PL/SQL.
- Extensive experience in advanced SQL Queries and PL/SQL stored procedures.
- Excellent grasp of Data Warehousing concepts, including Metadata and Data Marts.
- Experienced in Business Intelligence (SSIS, SSRS), Data Warehousing and Dashboards.
- Developed and delivered dynamic reporting solutions using SQL Server 2008 Reporting Services (SSRS)
- Experienced in creating OLAP cubes, identifying dimensions, attributes, hierarchies and calculating measures and dimension members.
- Extensive working experience in Normalization and De-normalization techniques for both OLTP and OLAP systems, creating database objects such as tables, constraints (Primary Key, Foreign Key, Unique, Default) and indexes.
- Strong analytical, logical, communication and problem-solving skills, with the ability to quickly adapt to new technologies through self-learning.
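For illustration, the star-schema modeling summarized above can be sketched minimally as follows. This is a hedged sketch, not code from any project described here; sqlite3 stands in for the actual warehouse platform, and all table and column names are hypothetical:

```python
import sqlite3

# Minimal star schema: one fact table joined to two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    sales_amount REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales VALUES (20240101, 1, 250.0), (20240101, 1, 100.0);
""")

# A typical dimensional query: aggregate the fact, sliced by dimensions.
cur.execute("""
    SELECT p.product_name, d.year, SUM(f.sales_amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY p.product_name, d.year
""")
row = cur.fetchone()
print(row)  # -> ('Widget', 2024, 350.0)
```

The same fact/dimension split underlies the snowflake variant, where dimensions are further normalized into sub-dimensions.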
TECHNICAL SKILLS
Programming Languages: SQL, PL/SQL, UNIX shell Scripting, PERL, AWK, SED
Data Modeling: Erwin r9.5/7.x/6.x/5.x, ER/Studio 9.7/9.0/8.0/7.x
Databases: Oracle 11g/10g/9i/8i/7.x, Teradata, DB2 UDB 8.x/7.x, DB2 z/OS 9.x/8.2, SQL Server 2008/2005/2000, MySQL, MS Access, flat files, XML files.
Programming Skills: SQL, PL/SQL, Shell Scripting
Operating Systems: Win 95/NT/98/2000/XP, LINUX, Sun Solaris 2.X/5/7/8, IBM AIX 5.3/4.2, HP-UX, MS-DOS
Scheduling Tools: Autosys, Maestro (Tivoli)
Other Tools: Teradata SQL Assistant, Toad 9.7.2/7.4/7.3, DB Visualizer 6.0, Microsoft Office, Microsoft Visio, Microsoft Excel, Microsoft Project
PROFESSIONAL EXPERIENCE
Confidential, Columbus, OH
Sr. Data Modeler
Responsibilities:
- Participated in requirement gathering session with business users and sponsors to understand and document the business requirements.
- Responsible for technical data governance, enterprise wide data modeling and database design
- Designed, loaded and managed a Redshift cluster
- Involved in Database Setup, Design of Amazon Redshift Database
- Designed and implemented a training record database in an MPP database
- Worked on creating data sources in Cognos Connection
- Created the model in Framework Manager and published the package to Cognos Connection
- Created and maintained Logical and Physical models for the data mart and created partitions and indexes for the tables in the data mart.
- Provided expertise in cloud technologies to engineering and product management and security teams.
- Worked with Import Manager to import data into MDM
- Designed solution using SAP MDM
- Performed Unit tests and Built MDM repository with MD
- Worked on Microsoft Analytics Platform System (APS) for query operations and data-loading performance
- Worked on loading from different data sources like Oracle and Flat files into a common analytical data model using Ab-Initio.
- Worked on analysis of CDM functional and non-functional categorized data elements for data profiling and mapping from source to target data environment
- Worked on an architecture consisting of an Amazon Redshift Data Warehouse
- Worked on Installing, maintaining and tuning Netezza Database
- Worked on importing data using Sqoop to load data from RDBMS to HDFS on a regular basis.
- Worked on Configuration of Sybase.
- Identified the PK, FK relationships across the entities
- Worked on transferring the data using (ETL) SSIS packages.
- Created ETL packages using SSIS to move data from various heterogeneous data sources
- Involved with all the phases of Software Development Life Cycle (SDLC) methodologies throughout the project life cycle.
- Acted as a primary interface between both technical and business stakeholders and the Enterprise Architecture Office (EAO) for matters related to data
- Produced and maintained an Enterprise Data Strategy, incorporating corporate data, analytics, linked data among others
- Worked directly with key stakeholders to capture and maintain existing elements of the enterprise architecture (AS IS)
- Evaluated proposed initiatives to ensure they meet business and technology goals and adhere to governance standards
- Assessed architectural impacts to all aspects of architecture, including security
- Analyzed server and database performance, identified problems, and implemented solutions; reviewed and tuned complex SQL statements.
- Drove big data, BI & analytic innovation, developed/maintained technology platform strategies & roadmaps
- Understood the creation of big data, NoSQL and cloud platform service offerings
- Understood business needs and emerging technologies, ensuring that solution roadmaps were aligned
- Maintained in-depth technical knowledge of data, BI & Analytics trends, practical solution frameworks
- Used Erwin to create Conceptual, Logical and Physical data models.
- Used forward engineering to create a Physical Data Model with DDL that best suited the requirements from the Logical Data Model.
- Used Normalization (1NF, 2NF and 3NF) and De-normalization techniques for effective performance in OLTP and OLAP systems.
- Worked with Teradata Database Administrators to implement staging, integration and data mart DDLs in line with the warehouse modeling standards in the database.
- Designed Star and Snowflake data models for the Enterprise Data Warehouse using Erwin
- Developed, enhanced and maintained Snowflake and Star schemas within data warehouse and data mart conceptual and logical data models; also analyzed and designed XML Schema Definition (XSD) data objects in XML Spy to standardize MDM.
- Supported the DBA in the physical implementation of the tables in both Oracle and DB2 databases.
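The forward-engineering step mentioned above (deriving physical DDL from a logical model) can be sketched roughly as below. This is a hypothetical illustration only; in practice the DDL was generated by the modeling tool (Erwin), and every table and column name here is invented:

```python
# A toy "logical model": table name -> ordered mapping of columns to types.
logical_model = {
    "customer": {
        "customer_id": "INTEGER PRIMARY KEY",
        "customer_name": "VARCHAR(100) NOT NULL",
    }
}

def to_ddl(model):
    """Forward-engineer CREATE TABLE statements from the logical model."""
    stmts = []
    for table, columns in model.items():
        cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns.items())
        stmts.append(f"CREATE TABLE {table} (\n  {cols}\n);")
    return "\n".join(stmts)

print(to_ddl(logical_model))
```

The real tools add physical details (tablespaces, indexes, partitioning) on top of this basic name/type translation.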
Environment: Oracle 12c, Teradata 14.10, Informatica, Erwin 9.6
Confidential, Tampa, FL
Sr. Data Architect/ Analyst
Responsibilities:
- Involved in data analysis and creating data mapping documents to capture source to target transformation rules.
- Worked on an architecture consisting of an Amazon Redshift Data Warehouse and performed queries in the Redshift environment
- Created materialized views to improve report performance and pointed the reports at these new materialized views to pull data with Cognos
- Worked on Analyzing the report design specs with Cognos
- Worked with data governance, data security and data quality rules
- Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
- Worked with SME's and other stakeholders to determine the requirements to identify Entities and Attributes to build Conceptual, Logical and Physical data Models.
- Worked on standard SQL queries to access and join Hadoop data with relational data using Microsoft Analytics Platform System
- Developed cloud reference architectures models, governance policies, security models, and best practices
- Coded Functions and Procedures to implement transformation logic in Amazon Redshift.
- Facilitated Data load and syndication with MDM
- Conducted MDM unit tests and code reviews with Informatica IDD
- Worked on data migration from RDBMS to Cassandra.
- Worked on creating a Corporate Data Model (CDM) for products and services in the Teradata environment based on data from the PDM; worked on the Metadata Repository (MRM) to keep definitions and mapping rules up to date
- Mentored and guided teams in various capacities involving Ab Initio
- Implemented Hybrid Columnar Compression for Exadata database servers
- Worked on business Intelligence solution using Redshift DB, and Tableau
- Analyzed the existing system and prepared
- Developed efficient SSIS packages for processing fact and dimension tables
- Created efficient SSIS packages to load data from CSV files, password-protected Excel files and PGP-encrypted files stored on SFTP and FTP servers.
- Worked with business users to gather requirements and create data flow, process flows and functional specification documents.
- Used Erwin and Visio to create 3NF and dimensional data models and published to the business users and ETL / BI teams.
- Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
- Worked on creating role-playing dimensions, factless fact tables, and snowflake and star schemas.
- Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment. Developed working documents to support findings and assign specific tasks.
- Prepared Test cases based on Technical Specification document.
- Involved in fixing invalid mappings, testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions, Batches and the Target Data.
- Involved in the validation of the OLAP Unit testing and System Testing of the OLAP Report Functionality and data displayed in the reports.
- Responsible for testing all new and existing ETL data warehouse components.
- All mappings and workflows that succeeded in the Testing environment were moved from Development to Production.
- Performed Verification, Validation, and Transformations on the Input data (Text files, XML files) before loading into target database.
- Interacted with functional analysts to understand requirements and write high level test scripts.
- Reviewed ERD dimensional model diagrams to understand the relationships and cardinalities to help in preparation of integrity test cases.
- Written test cases for data extraction, data transformation, and reporting.
- Responsible for Testing Schemas, Joins, Data types and column values among source systems, Staging and Data mart.
- Analyzed the objectives and scope of each stage of testing process from the Test plan.
- Interacted with business analysts to gather the requirements for business and performance testing.
- Responsible for performing the data validation, process flow, dependency, Functionality Testing and User Acceptance Testing.
- Extensively used Quality Center to prepare test cases, execution of test cases and bug tracking.
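The source-to-target validation described in the testing bullets above often reduces to a reconciliation query: compare counts and control totals between source and target after a load. A minimal sketch, with sqlite3 standing in for the actual databases and all object names hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Toy source and target tables that the ETL is supposed to keep in sync.
cur.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL);
CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5);
""")

def reconcile(cur, src, tgt):
    """Compare row counts and a summed control total between source and target."""
    src_count, src_sum = cur.execute(f"SELECT COUNT(*), SUM(amount) FROM {src}").fetchone()
    tgt_count, tgt_sum = cur.execute(f"SELECT COUNT(*), SUM(amount) FROM {tgt}").fetchone()
    return src_count == tgt_count and src_sum == tgt_sum

ok = reconcile(cur, "src_orders", "tgt_orders")
print(ok)  # -> True
```

Real test suites extend this with column-level checksums, data-type checks and referential-integrity probes, as the bullets above describe.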
Environment: Erwin 4.5/4.0, Informatica Power Center 8.1/9.1, Power Connect/Power Exchange, Oracle 11g, Mainframes, DB2, MS SQL Server 2008 R2, SQL, PL/SQL, XML, Windows NT 4.0, Sun Solaris Unix 2.6, Unix Shell Scripting.
Confidential, Minneapolis, MN
Sr. Data Architect / Data Modeler
Responsibilities:
- Designed & Created Test Cases based on the Business requirements (Also referred Source to Target Detailed mapping document & Transformation rules document)
- Involved in copying data from MySQL and Oracle OLTP systems to an Amazon Redshift database using S3 buckets and Redshift templates
- Worked on data warehousing, ETL, SQL, scripting and big data (MPP + Hadoop).
- Involved in extensive data validation using SQL queries and back-end testing
- Used SQL for Querying the database in UNIX environment
- Created Dashboards in Cognos 8 BI to communicate complex information quickly.
- Developed separate test cases for ETL process (Inbound & Outbound) and reporting
- Involved with Design and Development team to implement the requirements.
- Developed and Performed execution of Test Scripts manually to verify the expected results
- Designed and developed ETL processes using Informatica ETL tool for dimension and fact file creation
- Performed data analysis, primarily identifying data sets, source data, source metadata, data definitions and data formats.
- Created logical and physical data models and metadata to support the requirements; analyzed requirements to develop design concepts and technical approaches, verifying the business requirements against manual reports.
- Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Netezza
- Developed strategies for data warehouse implementations and data acquisitions; provided technical and strategic advice and guidance to senior managers and technical resources in the creation and implementation of data architecture and data modeling.
- Defined corporate metadata definitions for the enterprise-supported databases, including operational source systems, data stores and data marts; developed logical and physical data models and documented source data from multiple sources: internal systems, external source systems and third-party data.
- Analyzed the needs and requirements of existing and proposed IT systems to support data-related requirements.
- Involved in migrating the data model from one database to a Teradata database and prepared a Teradata staging model.
- Designed star schemas and bridge tables to control slowly changing dimensions. Applied the normal forms on the OLTP database.
- Tuned and optimized Teradata queries; created the ETL data mapping documents between source systems and the target data warehouse.
- Involved in dimensional modeling of the data model using ER/Studio Data Architect; designed conceptual, logical and physical data models between source systems and the target data warehouse to support the business process.
- Developed ER and Dimensional Models using Power Designer advanced features. Created physical and logicaldatamodels using Power Designer.
- Involved in the process design documentation of the Data Warehouse dimensional upgrades.
- Consolidated and generated database standards and naming conventions to enhance Enterprise Data Architecture processes.
- Successfully created and managed a conversion testing effort which included a data quality review, two system test cycles, and user acceptance testing.
- Tested the database to check field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
- This involved coordination with DBAs and developers to determine the correct requirements for the application changes, preparation of the logical and physical models in accordance with the Enterprise Architecture, and generation of the data definition language (DDL) to create the database.
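The slowly-changing-dimension handling mentioned above (star schemas with bridge tables controlling SCDs) typically follows the Type 2 pattern: expire the current dimension row and insert a new version. A minimal, hypothetical sketch using sqlite3 (names invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER,
    city TEXT,
    valid_from TEXT,
    valid_to TEXT,          -- NULL marks the current version
    is_current INTEGER
);
INSERT INTO dim_customer VALUES (42, 'Boston', '2020-01-01', NULL, 1);
""")

def scd2_update(cur, customer_id, new_city, change_date):
    """Type 2 SCD: expire the current row, then insert the new version."""
    cur.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1",
        (change_date, customer_id),
    )
    cur.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
        (customer_id, new_city, change_date),
    )

scd2_update(cur, 42, 'Tampa', '2024-06-01')
rows = cur.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from"
).fetchall()
print(rows)  # -> [('Boston', 0), ('Tampa', 1)]
```

The history rows preserved this way are what let fact tables join to the dimension version that was valid at load time.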
Environment: Windows XP, Informatica Power Center 6.1/7.1, QTP 9.2, Test Director 7.x, Load Runner 7.0, Oracle 10g, UNIX AIX 5.2, PERL, Shell Scripting
Confidential, San Antonio, TX
Sr. Data Analyst/ Sr. Data Modeler
Responsibilities:
- Worked on data analysis, data profiling, data modeling, data mapping and testing for a given task
- Worked in the Teradata environment based on data from the PDM; conceived, designed, developed and implemented this model from scratch
- Worked with the DBA to create the physical model and tables; scheduled multiple brainstorming sessions with DBAs and the production support team to discuss views, partitioning and indexing schemes case by case for the facts and dimensions.
- Established and maintained comprehensive data model documentation, including detailed descriptions of business entities, attributes and data relationships.
- Developed mapping spreadsheets for the ETL team with source-to-target data mappings, physical naming standards, data types, volumetrics, domain definitions, transformation rules and corporate metadata definitions.
- This involved analysis of a variety of source system data, coordination with subject matter experts, development of standardized business names and definitions, construction of a non-relational data model using the Erwin v9.2 modeling tool, publishing of a data dictionary, review of the model and dictionary with subject matter experts, and generation of data definition language.
- Helped the BI and ETL developers, the project manager and end users understand the data model, data flow and the expected output for each model created
- Gained comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing and data manipulation
- Created, documented and maintained logical and physical database models in compliance with enterprise standards, and maintained corporate metadata definitions for enterprise data stores within a metadata repository
- Managed full SDLC processes involving requirements management, workflow analysis, source data analysis, data mapping, metadata management, data quality, testing strategy and maintenance of the model.
- Extensively used Erwin as the main tool for modeling along with Visio
- Used SQL on a wide scale for analysis, performance tuning and testing
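The source-to-target mapping spreadsheets described above pair each source field with a target field and a transformation rule. As a hedged sketch of how such a spec is applied (field names and rules are purely illustrative):

```python
# Toy mapping spec: (source_field, target_field, transformation rule).
mapping = [
    ("cust_nm", "customer_name", str.strip),
    ("cust_st", "state_code",    str.upper),
]

def apply_mapping(source_row, mapping):
    """Produce a target row by applying each rule to its source field."""
    return {tgt: rule(source_row[src]) for src, tgt, rule in mapping}

row = apply_mapping({"cust_nm": "  Acme Corp ", "cust_st": "oh"}, mapping)
print(row)  # -> {'customer_name': 'Acme Corp', 'state_code': 'OH'}
```

In practice the spreadsheet also carries data types, volumetrics and domain definitions, which the ETL tool enforces at load time.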
Environment: PDM, SAP, JD Edwards, Teradata 13.10, Microsoft SQL Server 2012, SQL Manager, SAP Logon, Erwin 8.0, Visio, Informatica, Business Objects XI, Teradata SQL Assistant
Confidential, Boston, MA
Data Analyst / Data Modeler
Responsibilities:
- Developed working documents to support findings and assign specific tasks.
- Excellent exposure to Data Profiling with reference to Data Warehouse and BI development.
- Analyzed web-based applications for the digital marketing of the products in the browser.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
- Wrote and executed unit, system, integration and UAT scripts in data warehouse projects.
- Worked on front-end Java applications for data analysis, providing results to business users.
- Extensively used ETL methodology to support data extraction, transformation and loading in a complex EDW using Informatica.
- Performed data reconciliation between integrated systems.
- Produced metrics reporting, data mining and trend analysis in a helpdesk environment using Access
- Wrote complex SQL queries to validate the data against different kinds of reports generated by Business Objects XI R2.
- Extensively used MS Access to pull data from various databases and integrate the data.
- Worked on SAS and IDQ for data analysis.
- Assisted in oversight of compliance with the Enterprise Data Standards
- Worked on importing and cleansing data from various sources such as Teradata, Oracle, flat files and SQL Server 2005 with high-volume data
- Worked with Excel Pivot tables.
- Created and monitored workflows using Workflow Designer and Workflow Monitor.
- Performed data management projects and fulfilled ad-hoc requests according to user specifications, utilizing data management software and tools such as Perl, Toad, MS Access, Excel and SQL
- Wrote SQL scripts to test the mappings and developed a Traceability Matrix of business requirements mapped to test scripts, ensuring that any change control in requirements leads to a test case update.
- Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
- Developed regression test scripts for the application; involved in metrics gathering, analysis and reporting to the concerned team, and tested the testing programs
- Performed analysis on mainframe data to generate reports for business users.
- Identified and recorded defects with the information required for issues to be reproduced by the development team.
Environment: Quality Center 9.2, MS Excel 2007, PL/SQL, Java, Business Objects XIR2, ETL Tools Informatica 8.6/9.1/9.5, SSIS, Oracle 11G, Teradata R13, Teradata SQL Assistant
Confidential, Woonsocket, RI
Data Analyst
Responsibilities:
- Involved in reviewing business requirements and analyzing data sources from Excel, Oracle and SQL Server for the design, development, testing and production rollover of reporting and analysis projects.
- Documented and published test results; troubleshot and escalated issues
- Prepared various test documents for the ETL process in Quality Center.
- Involved in test scheduling and milestones, along with their dependencies
- Performed functionality testing of email notifications for ETL job failures, aborts and data issue problems.
- Identified, assessed and communicated potential risks associated with the testing scope, product quality and schedule
- Created and executed test cases for ETL jobs to upload master data to the repository.
- Responsible for understanding, and training others on, the enhancements or new features developed
- Conducted load testing and provided input into capacity planning efforts.
- Provided support to client with assessing how many virtual user licenses would be needed for performance testing, specifically load testing using Load Runner.
- Created and executed test scripts, cases and scenarios to determine optimal system performance according to specifications.
- Modified the automated scripts from time to time to accommodate the changes/upgrades in the application interface.
- Tested the database to check field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
- Participated in meetings, reviews, and user group discussions as well as communicating with stakeholders and business groups.
Environment: Windows XP, Informatica Power Center 6.1/7.1, QTP 9.2, Test Director 7.x, Load Runner 7.0, Oracle 10g, UNIX AIX 5.2, PERL, Shell Scripting