Sr. Data Modeler/Data Analyst Resume
Brea, CA
SUMMARY:
- 8+ years of IT experience as a Data Analyst/Data Modeler with a solid understanding of data warehouse/data mart development, SQL, and analysis of Online Transactional Processing (OLTP), data warehouse (OLAP) and Business Intelligence (BI) applications.
- Experienced with all major databases (Oracle, SQL Server 2008/2012/2016, Teradata) in large data warehouse (OLAP) environments.
- Solid understanding of Data Modeling, Data Collection, Data Cleansing, Data Warehouse/Data Mart Design, ETL, BI, OLAP, and Client/Server applications.
- Experience driving cross-functional analytics projects from beginning to end: question formation, data model design, exploratory data analysis (EDA), validation, analysis, machine learning, visualization, and presentation.
- Strong experience in ER & Dimensional Data Modeling, delivering normalized ER and STAR/SNOWFLAKE schemas using Erwin r7.2, ER Studio 10.0, EA Studio 1.5.1, Sybase PowerDesigner 12.1, SQL Server Enterprise Manager and Oracle Designer.
- ETL Design - INFORMATICA PowerCenter 8.x/7.x for Extraction, Transformation & Loading; ETL Analysis - Source to Target Mapping
- Expertise in SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS) and SQL Server Integration Services.
- Experienced in the Finance, Mortgage, Online Supply-Chain, Materials Management, Health Care, Supplemental Insurance, and Property & Casualty Insurance domains.
- Experienced with databases including Oracle, XML, DB2, Teradata, Netezza, SQL Server, Big Data and NoSQL.
- Experienced with Data Quality Management, Metadata Management, Master Data Management, Process Modeling, Data Dictionary, Data Stewardship, Data Profiling, Data Quality, and Data Model Standards.
- Extensive experience in providing technical expertise for the team on Informatica Powercenter, Informatica MDM (Siperian), IDD (Informatica Data Director), IDQ (Informatica Data Quality), Address Doctor and Identity Matching.
- Experienced in working with Data Resource Management team (DRM), Enterprise Metadata Repository team (EMR), Corporate Data Dictionary team (CDD), Integrated Governance Board (IGB) for data quality, data Integration of enterprise data assets.
- Excellent knowledge of Bill Inmon and Ralph Kimball methodologies for designing database architecture, as well as logical application design and physical implementation for database applications.
- Excellent knowledge of complete data warehouse life cycle, testing methodologies, OLAP and OLTP. Excellent knowledge of SDLC phases (Inception, Elaboration, Construction and Transition).
- Proven skills in designing and maintaining the Detailed Design Document (DDD), Business Requirement Document (BRD), Data Requirement Document (DRD), Data Flow Diagram (DFD), Data Management Plan, Data Dictionary, Metadata Model, Logical and Physical Data Models, Full DDL, Alter DDL, and Insert statements for all applications.
- Acquired knowledge of Big Data technologies, mainly the Hadoop, Spark, Hive, Pig and Tableau frameworks.
- Worked with Amazon Web Services (AWS) for a multitude of applications, focusing on high availability, fault tolerance and auto-scaling.
- Published workbooks and data sources to Tableau Server for review and to enable users to slice and dice the data for deeper insights. Created action filters, parameters and calculated sets for preparing dashboards and worksheets. Executed and tested required queries and reports before publishing.
- Worked daily with product managers, project managers, business users, application development team members, the CM team, DBA teams and the Data Governance team to analyze requirements and design and develop technical solutions.
- Excellent communication, interpersonal, intuitive, analytical and leadership skills; a quick starter with the ability to master and apply new concepts.
TECHNICAL SKILLS:
Framework and methodologies: Data Modeling (ER & Dimensional), Database Design, Requirement Analysis, ETL Design, Object Oriented Design, Development, Testing, Data Mapping, Metadata Management, Master Data Management, Data Profiling, Deployment, Documentation, Project Management, Semantic Layer.
Modeling Tools: Erwin 8.x/9.x, ERStudio 9.x, Power Designer
Big Data Techs: Hadoop, Hive, HDFS, MapReduce, Pig, Kafka.
Reporting Tools: Cognos 8/10, Oracle Reports, Tableau, Business Objects, IDQ.
Other Tools: SQL Navigator, TOAD, T-SQL, Informatica PowerCenter, Denodo, Ab Initio, Teradata SQL Assistant
Databases: Oracle 7/8i/9i/10g/11g, SQL Server 2000/2005/2008/2012/2016, Sybase, Teradata, DB2.
Languages: C, C++, VB, Java, PHP, Python, R, PL/SQL and SQL, SAS.
Operating Systems: Windows NT/2000/XP/Vista, UNIX and Linux.
PROFESSIONAL EXPERIENCE:
Confidential, Brea, CA
Sr. Data Modeler/Data Analyst
Responsibilities:
- Involved in requirement gathering and data analysis; interacted with business users to understand reporting requirements and analyze BI needs for the user community.
- Involved in logical and physical designs and transforming logical models into physical implementations.
- Normalized the data up to Third Normal Form (3NF).
- Created Entity/Relationship Diagrams, grouped and created the tables, validated the data, identified PKs for lookup tables.
- Involved in dimensional modeling (Star Schema methodology), building and designing the logical data model into dimensional models.
- Documented source-to-target mappings for both data integration and web services.
- Worked with the MDM team on various business operations across the organization.
- Worked on different data models for Claims, Members and Providers for different claim types, for different Health Partner Incentive programs.
- Formulated the ER diagrams of Property and Casualty data and data flow diagrams in order to interact with database developers.
- Utilized Erwin’s forward/reverse engineering tools and target database schema conversion process.
- Designed the data marts in dimensional data modeling using star and snowflake schemas.
- Redefined attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of data analysis responsibilities.
- Collaborated with ETL/Informatica teams to source data and performed data analysis to identify gaps.
- Developed ETL routines using SSIS packages, to plan an effective package development process, and design the control flow within the packages.
- Presented dashboards to business users and cross-functional teams, defined KPIs (Key Performance Indicators), and identified data sources.
- Examined Key Performance Indicators (KPIs) for Property and Casualty such as Loss Ratio and General Operating Expense Ratio.
- Implemented and updated data governance procedures, standards, training documents, presentations, and detailed Data Governance plans.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Designed and developed an IDQ solution to support MDM projects.
- Defined validation rules in the MDM system by analyzing Excel-based part master data and input from business users.
- Created and maintained logical, dimensional data models for different Claim types.
- Designed data flows that extract, transform, and load data, optimizing SSIS performance.
- Worked with slowly changing dimensions (SCDs), implementing custom SCD transformations.
- Involved in loading data from source tables to Operational Data Store (ODS) tables using transformation and cleansing logic.
- Created the conceptual model for the data warehouse with emphasis on insurance (life and health), mutual funds and annuity using Erwin data modeling tool.
- Cleansed, extracted and analyzed business data on daily basis and prepared ad-hoc analytical reports using Excel and T-SQL
- Participated in cross functional internal and external training programs covering leadership, ethics & professionalism, retirement benefits, property & casualty insurance, taxation, financial reporting, client management and international actuarial developments
- Performed annual and quarterly Vendor Management due diligence reviews and risk assessments on potential new relationships.
- Involved in data modeling to define table structures (Domain and Reference Masters) in the MDM system.
- Built and published customized interactive reports and dashboards and scheduled reports using Tableau Server.
- Worked on all data management activities on the project data sources, data migration.
- Worked on creating DDL, DML scripts for the data models.
- Worked on stored procedures for processing business logic in the database.
- Performed query tuning and index maintenance to improve performance.
- Created support documentation and worked closely with production support and testing team.
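The custom slowly changing dimension work above can be illustrated with a minimal Type 2 SCD sketch; the table, columns, and data here are hypothetical, not from the actual project:

```python
import sqlite3

# Hypothetical customer dimension with Type 2 history tracking.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city TEXT,
        effective_date TEXT,
        end_date TEXT,          -- NULL means current row
        is_current INTEGER
    )
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Brea', '2015-01-01', NULL, 1)")

def apply_scd2(conn, customer_id, new_city, load_date):
    """Expire the current row and insert a new version if the attribute changed."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,))
    row = cur.fetchone()
    if row and row[0] != new_city:
        conn.execute(
            "UPDATE dim_customer SET end_date=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (load_date, customer_id))
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, load_date))

apply_scd2(conn, 1, "Austin", "2016-06-01")
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY effective_date").fetchall()
# rows -> [('Brea', 0), ('Austin', 1)]
```

In a production SSIS package the same expire-and-insert logic would typically run through the SCD transformation or a MERGE statement rather than row-by-row Python.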
Confidential, Austin, TX
Data Modeler/Data Analyst
Responsibilities:
- Designed, developed and delivered data solutions covering architecture design, prototyping of concepts through proof of concept, development of standards, design and development of test plans, code and module design, development, testing and debugging, following efficient design techniques so that the delivered solution meets and exceeds the intent of the design.
- Completed a study of the in-house requirements for the data warehouse; analyzed the DW project database requirements from users in terms of the dimensions they want to measure and the facts against which the dimensions need to be analyzed.
- Performed Logical & Physical Data Modeling - ER Modeling & Dimensional Modeling.
- Normalizing the tables/relationships to arrive at effective Relational Schemas.
- Identifying the facts & dimensions; grain of fact, aggregate tables for Dimensional Models.
- Dimensional Data Modeling to deliver Multi-Dimensional STAR schemas.
- Worked with data compliance and Data Governance teams to maintain data models, metadata and data dictionaries, and to define source fields and their definitions.
- Participated in biweekly technical huddle meetings with the development and DBA teams; participated in weekly data analyst meetings and submitted weekly data governance status.
- Rapid Audit of the requirements and existing systems; Requirements & Business Process Analysis.
- Gathered products information, collected shopping history data and conducted analysis for sales planning.
- Perform other routine daily and monthly responsibilities including reconciliations, forecasting, and other ad hoc projects.
- Produced and presented quarterly attribution analysis of price-testing reserve P&L to senior management representing Trading, Valuations, and Market Risk groups, substantiating market, methodology, and portfolio drivers as necessary.
- Imported data from multiple data sources into Excel workbooks, created relationships between heterogeneous data, created calculated columns and measures using VLOOKUP and formulas, and built PivotTables and PivotCharts using PowerPivot.
- Generated periodic reports based on statistical analysis of data from various time frames and divisions using SQL Server Reporting Services (SSRS).
- Designed DataStage ETL jobs to extract data from heterogeneous source systems, transform it, and load it into the data marts.
- Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
- Involved in data profiling of source systems as part of the technical feasibility study for the business requirements; also collected data demographics from the source systems.
- Wrote numerous metadata-driven macros used across multiple applications and saved them in a permanent SAS catalog for reuse.
- Design & implementation of data marts; DBA coordination; DDL & DML generation and usage.
- Designed the Data Warehouse and MDM hub Conceptual, Logical and Physical data models
- Analyzed the business data and defined the Match and Merge rules and Trust Scores for MDM Hub.
- Converted models from CA ERwin to ER/Studio 8.2 (ERStudio); developed numerous macros.
- Used SAS/ACCESS and SAS/SQL to extract data from different RDBMSs such as Teradata, Oracle and SQL Server and merged them efficiently.
- Implemented business modeling, data modeling, and object modeling using MS Visio to develop the business architecture for the application. Created reports using features such as VLOOKUP, Goal Seek and pivot tables.
- Worked in the Tableau environment to create weekly, monthly and daily dashboards using Tableau Desktop and publish them to the server.
- Used Python to extract weekly bed availability information from XML files using underscore JS.
- Automated various data extraction, transformation, and loading tasks with Python.
- Created DataStage server jobs to load data from sequential files, flat files and MS Access.
- Architecting Work Flows, Activity Hierarchy & Process Flows; Design EDW / CDW components.
- Metadata Repository, Data Dictionary; Documentation using Interface Diagrams, Flow Charts.
- Assist developers in Performance Tuning & discussing Design Patterns/high level application design.
- Created UML diagrams including context, Business Rules Flow, and Class Diagrams.
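The XML-based bed availability extraction mentioned above could be sketched in Python as follows; the feed layout, element names, and figures here are invented for illustration, not the project's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical weekly feed layout; the real schema was project-specific.
xml_feed = """
<hospitals week="2014-W10">
  <hospital name="General">
    <beds available="12" total="40"/>
  </hospital>
  <hospital name="Mercy">
    <beds available="5" total="25"/>
  </hospital>
</hospitals>
"""

def bed_availability(xml_text):
    """Parse the feed and return {hospital name: available beds}."""
    root = ET.fromstring(xml_text)
    return {
        h.get("name"): int(h.find("beds").get("available"))
        for h in root.findall("hospital")
    }

availability = bed_availability(xml_feed)
# availability -> {'General': 12, 'Mercy': 5}
```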
Confidential, Seattle, WA
Data Analyst
Responsibilities:
- Conducted JAD sessions, wrote meeting minutes and documented the requirements.
- Collected requirements from business users and analyzed based on the requirements.
- Designed and built Data marts by using Star Schema.
- Involved in designing Context Flow Diagrams, Structure Charts and ER diagrams.
- Extensive system study, design, development and testing were carried out in the Oracle environment to meet the customer requirements.
- Served as a member of a development team providing business data requirements analysis services, producing logical and physical data models using ERwin 4.0 and PowerDesigner.
- Created tables, views, procedures and SQL scripts.
- Utilized organizational standards for data naming, structuring and documentation.
- Responsible for defining database schemas to support business data entities and transaction processing requirements.
- Analysis of Trends, Forecasting, and KPI's that directly/indirectly affect overall consumer awareness and product management
- Used Denodo to implement BI reporting solutions across various databases and Excel spreadsheets.
- Performed data queries to generate reports and used advanced Excel functions to generate spreadsheets and pivot tables; created dashboards in PowerPivot.
- Fine-tuned these solutions per requirements and optimized the performance of complex views in Denodo.
- Worked with the team in creating and monitoring all components of Denodo (Administration, VDP, Scheduler, Custom Views, and Caching) to make sure the service fit the needs.
- Facilitated business requirements gathering sessions with the client and oversaw the creation of Use Cases/Activity Diagrams and Functional Specifications using UML/Visio.
- Ensured the business metadata definitions of all data attributes and entities in a given data model were documented to meet standards.
- Ensured the first-cut physical data model was generated with business definitions of the fields (columns) and records (tables).
- Integrated high-level business rules (constraints, triggers and indexes) with the code.
- Worked closely with the ETL process development team; created SSIS packages to perform ETL and automated them using VBScript and Windows Scheduler.
- Maintained current documentation for all primary and backup responsibilities.
- Worked as part of a team of Data Management professionals supporting a Portfolio of development projects both regional and global in scope.
- Conducted design reviews, validated data models and was involved in MDM (Master Data Management).
- Implemented metadata standards, data governance and data stewardship, master data management (MDM), ETL, ODS, data warehouse, data marts, reporting, dashboards, analytics, segmentation, and predictive modeling.
- Applied organizational best practices to enable application project teams to produce data structures that deliver accurate, timely, and consistent data fit for its intended purposes.
- Conducted peer reviews of completed data models and plans to ensure quality and integrity from data capture through usage and archiving.
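The Denodo work above centers on virtual views that join heterogeneous sources without moving the data. A rough analogue of that join-through-a-view pattern, using SQLite in place of Denodo's VDP layer and entirely hypothetical tables and figures:

```python
import sqlite3

# Two "sources" loaded side by side; in Denodo they would stay in place
# and be combined through a virtual (VDP) view instead.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE crm_customers (id INTEGER, name TEXT);
    CREATE TABLE billing (customer_id INTEGER, amount REAL);
    INSERT INTO crm_customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO billing VALUES (1, 250.0), (1, 100.0), (2, 75.0);
    CREATE VIEW v_customer_revenue AS
        SELECT c.name, SUM(b.amount) AS revenue
        FROM crm_customers c JOIN billing b ON b.customer_id = c.id
        GROUP BY c.name;
""")
revenue = dict(conn.execute("SELECT * FROM v_customer_revenue").fetchall())
# revenue -> {'Acme': 350.0, 'Globex': 75.0}
```

Reporting tools then query the view as if it were a single table, which is the essence of the data virtualization approach.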
Confidential, Kansas City, KS
Data Modeler/ Data Analyst
Responsibilities:
- Studied the requirements specifications and use cases and analyzed the data needs of the business users.
- Converted the Logical data models to Physical data models to generate DDL.
- Extensively worked with Teradata database as part of the enterprise warehouse development and the data mart, staging and prestaging environment build outs.
- Worked on numerous activities like monitoring Teradata platform using Viewpoint, remediating excessive statistics, tuning production ETL queries, resolving spool space issues, tuning high impact analytical queries, applying compression for space reclamation, Data Mover Development and Support etc.
- Worked on enhancements to the existing Teradata processes running on the Enterprise Data Warehouse.
- Performed gap analysis with Teradata MDM and Drive (SQL) data models to get a clear understanding of requirements.
- Updated the Naming and version control standard documents and implemented version controlling using the Model Manager.
- Migrated several models from ERwin 4.1/7.1 to ERwin 7.2 and updated the naming standards.
- Created complex mappings and mapplets using Lookup, Expression, Aggregator, Sequence Generator, Union, Normalizer, and Router transformations.
- Handled functional and practical implementation of data governance; responsible for designing common data governance frameworks.
- Involved in writing the PL/SQL validation scripts to identify the data inconsistencies in the sources.
- Worked with Business Analysts to design weekly reports using Cognos.
- Created Design documents, Source Target Mappings and Sign-off documents.
- Involved in implementing the Land process of loading the customer/product data set into Informatica MDM from various source systems.
- Performed match/merge and ran match rules to check the effectiveness of the MDM process on data.
- Worked on the enterprise metadata repositories, updating the XML, and was involved in Master Data Management (MDM).
- Configured the MDM system, including the data model, hierarchy, business rules, workflow rules, process model, and UI model.
- Worked with the data virtualization tool Denodo and with Informatica PowerCenter to analyze workflows and document the existing Cognos data models.
- Used Denodo to create custom views by joining tables from multiple data sources and pushed these reports to Tableau.
- Built the Transformation Rules engine for use of all the designers across the project.
- Documented the designs to help personnel understand the process and incorporate changes as and when necessary.
- Responsible for detailed verification, validation and review of the design specifications.
- Conducted review walk through of data models involving SME, developers, testers and analysts.
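The match/merge step above decides which source records represent the same real-world party. A toy match rule in that spirit, using Python's standard difflib; the fields, threshold, and records are illustrative, not the actual Informatica MDM configuration:

```python
from difflib import SequenceMatcher

# Candidate party records from two hypothetical source systems.
records = [
    {"id": 1, "name": "John Smith",  "zip": "64101"},
    {"id": 2, "name": "Jon Smith",   "zip": "64101"},
    {"id": 3, "name": "Alice Brown", "zip": "72032"},
]

def is_match(a, b, threshold=0.85):
    """Fuzzy name match, restricted to records sharing a ZIP code."""
    if a["zip"] != b["zip"]:
        return False
    similarity = SequenceMatcher(
        None, a["name"].lower(), b["name"].lower()).ratio()
    return similarity >= threshold

# Pairwise comparison over all candidate pairs.
matches = [(a["id"], b["id"])
           for i, a in enumerate(records)
           for b in records[i + 1:]
           if is_match(a, b)]
# matches -> [(1, 2)]
```

Real MDM hubs layer trust scores and survivorship rules on top of the match step to decide which source's values win in the merged golden record.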
Confidential, Conway, AR
Data Analyst/ Data Modeler
Responsibilities:
- Data analysis and logical and physical data modeling for OLAP systems.
- ER Modeling - Developing Entity Relationship Diagrams (ERD).
- Normalizing the tables/relationships to arrive at effective Relational Schemas.
- Identifying the facts & dimensions; grain of fact, aggregate tables for Dimensional Models.
- Developing Snowflake Schemas by normalizing the dimension tables as appropriate.
- Implementation of Business Rules in the Database using Constraints & Triggers.
- Dimensional Data Modeling to deliver Multi-Dimensional STAR schemas.
- Requirements & Business Process Analysis; Rapid Audit of the requirements and existing systems.
- Design & Implementation of Data Mart; DBA coordination; DDL & DML Generation & usage.
- Metadata & Data Dictionary Management; Data Profiling; Data Mapping.
- Normalization techniques to arrive at effective Relational Schemas.
- Performed data quality checks in Talend Open Studio.
- Conducted design reviews, validated data models and was involved in MDM (Master Data Management).
- Analyzed, verified, and modified UNIX and Python scripts to improve data quality and performance.
- Used Python scripts to generate data, load it to HDFS and create the tables in Hive.
- Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
- Applied Data Governance rules (primary qualifier, Class words and valid abbreviation in Table name and Column names).
- Worked with statisticians in analyzing credit risk and modeling on the Basel parameters (PD, LGD, EAD).
- Developed data collection systems and other strategies to optimize statistical efficiency and data quality.
- Involved in capturing data lineage, table and column data definitions, valid values and others necessary information in the data models.
- Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle 12c, Netezza, flat files and SQL Server.
- Performed data massaging and analysis using R; automated logic in the R environment, such as creating contracts and pricing at the header level and adding SKUs to the original data.
- Created SAS PROC SQL code to transform the original datasets into target datasets for complex requirements using an ETL process.
- Built machine learning based predictive models to drive business decisions and predict future market behavior.
- Used a machine learning (random forest) algorithm to determine optimal predictors for identifying broker-dealers.
- ETL Design & Implementation - Data Extraction, Transformation & Loading (using Oracle Warehousing Builder, SQL & PL/SQL).
- Performance Tuning (Database Tuning, SQL Tuning, Application/ETL Tuning).
- Maintained and documented all CREATE and ALTER SQL statements for all releases.
- Designing Data Flows & System Interfaces.
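The Basel parameters mentioned above combine into expected loss as EL = PD × LGD × EAD. A minimal worked illustration; the portfolio figures are made up:

```python
# Expected Loss under the Basel framework: EL = PD * LGD * EAD, where
# PD = Probability of Default, LGD = Loss Given Default (both fractions),
# and EAD = Exposure At Default (a currency amount).
def expected_loss(pd_, lgd, ead):
    """Expected loss for a single exposure."""
    return pd_ * lgd * ead

# Illustrative exposures, not real data: (PD, LGD, EAD).
portfolio = [
    (0.02, 0.45, 100_000),   # 2% PD, 45% LGD, $100k exposure -> EL ~ $900
    (0.01, 0.60, 250_000),   # 1% PD, 60% LGD, $250k exposure -> EL ~ $1,500
]
total_el = sum(expected_loss(*e) for e in portfolio)
# total_el is approximately 2400.0
```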
Confidential, St. Louis, MO
Data Analyst
Responsibilities:
- Worked closely with business units to develop a Business Requirement Document (BRD). Involved in all phases of the SDLC and acted as the main liaison between business managers and the IT division.
- Led and conducted JAD sessions, interviews, focus groups and surveys for requirements gathering; performed feasibility and impact analyses and system design.
- Involved in Data Modeling using ERwin (Logical and Physical Design of Databases), Normalization and building Referential Integrity Constraints
- Built Fast Load and Fast Export scripts to load data into Teradata and extract data from Teradata.
- Used the Talend ETL tool to move data from the data warehouse to data marts; also used it to load data from files to staging tables.
- Produced actuarial deliverables pertaining to Medicaid for review by the Centers for Medicare and Medicaid Services.
- Assisted actuaries with query development and Excel spreadsheet design to reduce calculation time and manual work.
- Exploited the power of Teradata to solve complex business problems through data analysis on large data sets.
- Used advanced Teradata OLAP functions such as CSUM, MAVG, MSUM and MDIFF.
- Familiar with using SET, MULTISET, derived, volatile and global temporary tables in Teradata for larger ad-hoc SQL requests.
- Participate in documenting the data governance framework including processes for governing the identification, collection, and use of data to assure accuracy and validity.
- Development of physical data models and created DDL scripts to create database schema and database objects.
- Created user requirement documents based on functional specification.
- Created new tables and wrote stored procedures, triggers, views and functions.
- Created SSIS Packages by using transformations like Derived Column, Sort, Lookup, Conditional Split, Merge Join, Union and Execute SQL Task to load into database.
- Created SQL scripts for tuning.
- Extracted data from flat files and Excel, transformed it per the logic, and loaded it into the data warehouse.
- Created packages to schedule the jobs for batch processing.
- Involved in performance tuning to optimize SQL queries.
- Created and maintained indexes for fast and efficient reporting processes.
- Understanding the DLD specifications document and working with design to develop the mappings using SQL Server Integration Services (SSIS).
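Teradata's CSUM and MAVG OLAP functions, cited above, compute running totals and moving averages over an ordered series. Their semantics can be sketched in plain Python (the sales figures are illustrative):

```python
# Plain-Python equivalents of Teradata's CSUM (cumulative sum) and
# MAVG (moving average) OLAP functions, for a single ordered series.
def csum(values):
    """Running total over the series, like CSUM(x, sort_key)."""
    total, out = 0, []
    for v in values:
        total += v
        out.append(total)
    return out

def mavg(values, width):
    """Average of the current row and the (width - 1) preceding rows;
    like MAVG, early rows average only what is available so far."""
    out = []
    for i in range(len(values)):
        window = values[max(0, i - width + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

sales = [10, 20, 30, 40]
running = csum(sales)      # [10, 30, 60, 100]
smooth = mavg(sales, 2)    # [10.0, 15.0, 25.0, 35.0]
```

In Teradata itself these would be written directly in the SELECT list, e.g. `CSUM(sales, sale_date)`, with the sort expression controlling row order.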
Confidential, Hyderabad, AP
Data Analyst
Responsibilities:
- Involved in gathering user requirements along with the Business Analyst.
- Participated in creating the logical model of an online processing system for a large financial institution using Erwin.
- Worked with DBAs to generate physical model.
- Worked with Bill Inmon methodologies for modeling.
- Created tables, views, procedures and SQL scripts and mapping documents.
- Worked on slowly changing dimensions (SCDs) and hierarchical dimensions.
- Worked on the conversion of data stored in flat files into Oracle tables.
- Designed and Developed SQL procedures, functions and packages to create Summary tables.
- Generated ad-hoc reports using Crystal Reports 9.
- Developed database backup and restore policy.
- Expertise in Creating Report Models for building Ad-hoc Reports Using SSRS.
- Expertise in Generating Reports using SSRS and Excel Spreadsheet.
- Worked with different types of sources, such as DB tables, flat files and Excel, and destinations such as DB tables.
- Expertise in Creating Various Parameterized, Cascaded, Linked, Drill-through and Drill-down Reports.
- Hands on Experience in creating ETL Packages using SQL Server 2005 Integration Services (SSIS).
- Good Understanding in Database and Data Warehousing Concepts.
- Also carried out database administration duties, with extensive data import and export experience with web platforms such as online shopping.