Sr. Data Modeler/Data Analyst Resume
Springfield, IL
SUMMARY:
- 7+ years of experience as a Data Modeler and Data Analyst with high proficiency in requirements gathering and data modeling, including design and support of various applications in OLTP, Data Warehousing, OLAP and ETL environments.
- Hands-on experience in importing, cleaning, transforming, and validating data and drawing conclusions from the data for decision-making purposes.
- Worked on various databases including Oracle, SQL Server, Teradata and DB2.
- Expertise in writing SQL queries, dynamic queries, sub-queries and complex joins, and in creating complex Stored Procedures, Triggers, User-defined Functions, Views and Cursors (illustrated at the end of this summary).
- Ability to use custom SQL for complex data pulls.
- Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Informatica PowerCenter.
- Experience in testing, data validation and writing SQL and PL/SQL statements - Stored Procedures, Functions, Triggers and packages.
- Expertise in OLTP/OLAP System Study, Analysis and E-R modeling, developing Database Schemas like Star schema and Snowflake schema used in relational, dimensional and multidimensional modeling.
- Worked on Data Model design, Data Extraction, Transformations and Loading, Mappings & Workflows, Customized Analytics Reports.
- Highly skilled in Tableau Desktop for data visualization, reporting and analysis: crosstabs, scatter plots, geographic maps, pie charts and bar charts, page trails and density charts.
- Experience with data warehousing techniques like Slowly Changing Dimensions, Surrogate Keys, Snowflaking, etc. Worked with Star Schemas, Data Models, E-R diagrams and Physical Data Models.
- Experience in Performance Tuning and Debugging of mappings and sessions. Strong in optimizing the Mappings by creating/using Re-usable transformations, Mapplets and PL/SQL stored procedures.
- Experience in Data Warehousing, Data Architecture & Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter 9.x/8.6.0/8.1.1/7.x/6.x/5.x.
- Profound knowledge of Data Modeling including Star Schema and Snowflake dimensional data modeling and 3NF normalized data modeling.
- Worked on Tableau and created ad hoc reports and dashboards.
- Involved in Design of the Enterprise Data Visualization Architecture. Defined best practices for Tableau report development.
- Expert knowledge of UNIX shell scripting and understanding of Perl & Korn shell scripting.
- Created checklists for coding, testing and release to ensure a smooth, error-free project flow.
- Experience in integration and extraction of data from various sources like DB2, SQL Server, Oracle, Sybase, Teradata, MS Access and flat files into a staging area.
- Experience in using ETL methodologies for supporting Data Extraction, Data Migration, Data Transformation and loading using Informatica PowerCenter 9.6.1/9.1/8.6.1 and IDQ.
- Strong SQL Skills related to information retrieval and analysis. Exposure to Client Interaction, User requirement Analysis and Support.
- Extensive experience in creating data scope and requirements. Strong expertise in understanding various data sources.
- Experience working with data analysis tools for data lineage, metadata, and data profiling.
- Proficient in working with senior executives and multiple stakeholders to understand key data issues and identifying their resolution.
- Extensive experience in developing Unit, Integration and UAT test plans and cases, as well as generating and executing SQL test scripts and test results.
- Experienced in managing, monitoring and investigating cases such as trouble tickets from customers and fixing the issues.
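By way of illustration, a minimal sketch of the kind of complex SQL described above, using a subquery and a window function; the orders table and its columns are hypothetical:

    -- Flag orders more than 3x the customer's average order total.
    SELECT *
    FROM (
        SELECT customer_id,
               order_id,
               order_total,
               AVG(order_total) OVER (PARTITION BY customer_id) AS avg_total
        FROM   orders
    ) t
    WHERE  order_total > 3 * avg_total;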
TECHNICAL SKILLS:
Data analysis: Requirements Gathering, JAD sessions, Process/Production Model analysis, Data Normalization, Cleansing, Profiling, System Design, Data architecture internal Standards development, Metadata and Reports, Source and Target system Analysis, Project Management.
Languages: SQL, T-SQL, PL/SQL, Basic C & UNIX shell.
Database Systems: SQL Server, Oracle, Teradata, DB2, Hive (Hadoop), NoSQL
MS Office Suite: MS Word, MS PowerPoint, MS Excel, MS Access
Operating Systems: Linux, Windows, Mac OS.
ETL and Reporting Environment: SQL Server, SSIS, SSRS, Informatica, DataStage, Tableau
Data Modeling Tools: Power Designer 16.5, Erwin, ER/Studio
Additional QA Skills: Business and Software Process and Procedure Definition, Quality Models, data standards, Measures and Metrics, Project Reviews, Audits and Assessments; assistance with production issues.
WORK EXPERIENCE:
Sr. Data Modeler/Data Analyst
Confidential, Springfield, IL
Responsibilities:
- Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
- Created Logical & Physical Data Models for the relational (OLTP) system and a Star Schema of Fact and Dimension tables using Erwin.
- Performed GAP analysis to analyze the difference between the system capabilities and business requirements.
- Developed SSIS packages and integrated them into Azure Data Factory V2.
- Followed Agile methodology with daily scrums to discuss project-related information.
- Prepared ETL technical mapping documents along with test cases for each mapping, supporting future development and the Software Development Life Cycle (SDLC).
- Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
- Created sampling specification, performed quality assurance, created multiple datasets to communicate with Medicare Advantage plans and analyzed Medicare plans response.
- Created and maintained data model standards, including master data management (MDM) and Involved in extracting the data from various sources like Oracle, SQL, Teradata, and XML.
- Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
- Worked with medical claim data in the Oracle database for Inpatient/Outpatient data validation, trend and comparative analysis.
- Designed and developed data warehouse ETL jobs using Azure Data Factory and Azure Databricks.
- Worked on normalization techniques and normalized the data into 3rd Normal Form (3NF).
- Worked with normalization and de-normalization concepts and design methodologies like the Ralph Kimball and Bill Inmon approaches, and implemented Slowly Changing Dimensions (sketched after this list).
- Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
- Responsible for all BI team Azure-based solutions, along with developing the implementation strategy.
- Used reverse engineering to connect to existing databases and create graphical representations (E-R diagrams).
- Performance tuning and stress-testing of NoSQL database environments to ensure acceptable database performance in production mode.
- Developed Data Mapping, Data Governance, and Transformation and cleansing rules for the Master Data Management Architecture involving OLTP, ODS.
- Assisted in defining business requirements and created BRD (Business Requirements Document) and functional specifications documents.
- Managed database design and implemented a comprehensive Star-Schema with shared dimensions.
- Identified the data consuming the most resources and made changes to the back-end code using PL/SQL stored procedures and triggers.
- Developed and maintained stored procedures, implemented changes to database design including tables and views and Documented Source to Target mappings as per the business rules.
- Conducted Design reviews with the business analysts and the developers to create a proof of concept for the reports.
- Performed detailed data analysis of claim-process durations and created cubes with Star Schemas using facts and dimensions through SQL Server Analysis Services (SSAS).
- Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
- Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
- Developed and scheduled a variety of reports such as cross-tab, parameterized, drill-through and sub-reports with SSRS.
- Directed and oversaw data quality tests, including providing input to quality assurance team members.
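A minimal sketch of the Type 2 Slowly Changing Dimension pattern referenced above; the dim_member/stg_member tables and columns are hypothetical, and the surrogate key is assumed to be an identity column:

    -- Expire the current dimension row when a tracked attribute has changed.
    UPDATE dim_member
    SET    end_date   = CURRENT_DATE,
           is_current = 'N'
    WHERE  is_current = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   stg_member s
                   WHERE  s.member_id = dim_member.member_id
                     AND  s.plan_code <> dim_member.plan_code);

    -- Insert new versions (and brand-new members) with an open-ended range.
    INSERT INTO dim_member (member_id, plan_code, start_date, end_date, is_current)
    SELECT s.member_id, s.plan_code, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_member s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_member d
                       WHERE  d.member_id = s.member_id
                         AND  d.is_current = 'Y');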
Environment: Erwin v9.7, Agile, NoSQL, Big Data, Azure, Redshift, MDM, Oracle 12c, SQL, Teradata r15, XML, Python 3.6, PL/SQL, Tableau, SSRS.
Sr. Data Modeler/Data Analyst
Confidential, San Francisco, CA
Responsibilities:
- Heavily involved in the Data Modeler/Analyst role to review business requirements and compose source-to-target data mapping documents.
- Gathered analysis report prototypes from business analysts across different business units.
- Interacted with Business Analysts to gather the user requirements and participated in data modeling JAD sessions.
- Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
- Extensively used Agile methodology as the Organization Standard to implement the data Models.
- Actively participated in JAD sessions involving the discussion of various reporting needs.
- Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
- Interacted with Business Analyst, SMEs to understand Business needs and functionality for various project solutions.
- Planned and defined system requirements to Use Case, Use Case Scenario and Use Case Narrative using the UML (Unified Modeling Language) methodologies.
- Reverse-engineered the existing data marts and identified the Data Elements (in the source systems), Dimensions, Facts and Measures required for reports.
- Conducted design discussions and meetings to define the appropriate Data Warehouse at the lowest level of grain for each of the Dimensions involved.
- Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
- Developed the data warehouse model (Kimball's) with multiple data marts with conformed dimensions for the proposed central model of the Project.
- Designed a STAR schema for sales data involving shared dimensions (Conformed) for other subject areas using E/R Studio Data Modeler.
- Created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Validated and updated the appropriate LDMs as process mappings, screen designs, use cases, the business object model, and the system object model evolved and changed.
- Conducted design reviews with the business analysts and content developers to create a proof of concept for the reports.
- Ensured the feasibility of the logical and physical design models.
- Worked on snowflaking the dimensions to remove redundancy.
- Wrote PL/SQL statements, stored procedures and triggers for extracting as well as writing data.
- Worked extensively on Data Migration by using SSIS.
- Developed rule sets for data cleansing and actively participated in data cleansing and anomaly resolution of the legacy application.
- Defined facts, dimensions and designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using E/R Studio.
- Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
- Developed Data mapping, Transformation and Cleansing rules for the Data Management involving OLTP, ODS and OLAP.
- Responsible for data governance rules and standards to maintain the consistency of the business element names in the different data layers.
- Created data masking mappings to mask sensitive data between the production and test environments (sketched after this list).
- Normalized the database based on the newly developed model to bring the data warehouse into 3NF.
- Created an SSIS package for daily email subscriptions to alert on Tableau subscription failures using the ODBC driver and the PostgreSQL database.
- Designed logical and physical data models, Reverse engineering, Complete compare for Oracle and SQL server objects using E/R Studio.
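A hedged sketch of the kind of masking rule applied between production and test; the dbo.customers table and its columns are hypothetical, written in T-SQL per this role's MS SQL 2016 environment:

    -- Preserve format while hiding the real values on the way to test.
    SELECT customer_id,
           'XXX-XX-' + RIGHT(ssn, 4)          AS ssn_masked,
           LEFT(email, 1) + '***@example.com' AS email_masked
    FROM   dbo.customers;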
Environment: PL/SQL, E/R Studio v17, MS SQL 2016, OLTP, ODS, OLAP, SSIS, Tableau, ODBC, Transact-SQL, TOAD, Teradata r15, SQL Assistant
Data Modeler/Data Analyst
Confidential, San Diego, CA
Responsibilities:
- Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
- Part of a team conducting logical data analysis and data modeling JAD sessions; communicated data-related standards.
- Worked on NoSQL databases including Cassandra; implemented a multi-data-center, multi-rack Cassandra cluster.
- Coordinated with Data Architects on AWS provisioning EC2 Infrastructure and deploying applications in Elastic load balancing.
- Performed Reverse Engineering of the current application using Erwin, and developed Logical and Physical data models for Central Model consolidation.
- Translated logical data models into physical database models and generated DDL for DBAs.
- Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
- Involved in extensive data validation by writing several complex SQL queries, and in back-end testing to work through data quality issues (see the reconciliation sketch after this list).
- Collected, analyzed and interpreted complex data for reporting and/or performance trend analysis.
- Wrote and executed unit, system, integration and UAT scripts in data warehouse projects.
- Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex DW using Informatica.
- Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access.
- Involved in writing T-SQL and working on SSIS, SSRS, SSAS, Data Cleansing, Data Scrubbing and Data Migration.
- Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of Data Analysis responsibilities.
- Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
- Wrote complex SQL queries for validating the data against different kinds of reports generated by Business Objects XI R2.
- Worked on importing and cleansing of high-volume data from various sources like Teradata, Oracle and flat files.
- Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements.
- Created SQL tables with referential integrity, constraints and developed queries using SQL, SQL*PLUS and PL/SQL.
- Performed GAP analysis of current state to desired state and document requirements to control the gaps identified.
- Developed the batch program in PL/SQL for OLTP processing and used UNIX shell scripts scheduled via crontab.
- Identified and recorded defects with the information required for the development team to reproduce each issue.
- Worked on the reporting requirements and was involved in generating reports for the data model using Crystal Reports.
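A minimal sketch of the source-to-target validation queries referenced in this role; the schemas and claims tables are hypothetical, using Oracle's MINUS to surface rows that failed to load:

    -- Rows present in source but missing or altered in target.
    SELECT claim_id, claim_amount
    FROM   src_stage.claims
    MINUS
    SELECT claim_id, claim_amount
    FROM   dw.claims;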
Environment: Erwin 9.0, PL/SQL, Business Objects XIR2, Informatica 8.6, Oracle 11g, Teradata R13, Teradata SQL Assistant 12.0, PL/SQL, Flat Files
Data Analyst/Data Modeler
Confidential
Responsibilities:
- Involved in Data mapping specifications to create and execute detailed system test plans.
- The data mapping specifies what data will be extracted from an internal data warehouse, transformed, and sent to an external entity.
- Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
- Setting up of environments to be used for testing and the range of functionalities to be tested as per technical specifications.
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Wrote and executed SQL queries to verify that data had been moved from the transactional system to the DSS, data warehouse, and data mart reporting systems in accordance with requirements.
- Troubleshoot test scripts, SQL queries, ETL jobs, data warehouse/ data mart/ data store models.
- Responsible for different Data mapping activities from Source systems to Teradata.
- Created the test environment for Staging area, loading the Staging area with data from multiple sources.
- Used CA Erwin Data Modeler (Erwin) for data modeling (data requirements analysis, database design, etc.) of custom-developed information systems, including databases of transactional systems and data marts.
- Responsible for analyzing various heterogeneous data sources such as flat files, ASCII data, EBCDIC data, and relational data (Oracle, DB2 UDB, MS SQL Server).
- Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).
- Involved in Teradata SQL development, unit testing and performance tuning, and ensured testing issues were resolved using defect reports.
- Tested the ETL process both before and after the data validation step; tested the messages published by the ETL tool and the data loaded into various databases.
- Created UNIX scripts for file transfer and file manipulation.
- Supported the client in assessing how many virtual user licenses would be needed for performance testing.
- Ensured the onsite-to-offshore transition, QA processes, and closure of problems and issues.
- Tested the database for field-size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata (illustrated below).
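A small sketch of the field-size check described above; the stg_members table is hypothetical and assumes the metadata documents zip_code as CHAR(5):

    -- Surface values that exceed the field size defined in the metadata.
    SELECT member_id,
           zip_code,
           LENGTH(zip_code) AS actual_len
    FROM   stg_members
    WHERE  LENGTH(zip_code) > 5;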
Environment: Erwin, Informatica 8.1, Data Flux, Oracle 9i, Quality Center 8.2, SQL, TOAD, PL/SQL, Flat Files, Teradata
Data Analyst
Confidential
Responsibilities:
- Performed data analysis and data profiling using complex SQL on various source systems.
- Developed SAS macros for data cleaning, reporting, and to support routine processing.
- Created SQL scripts to find data quality issues and to identify keys, data anomalies, and data validation issues (see the profiling sketch after this list).
- Actively involved in T-SQL programming, implementing stored procedures, functions, cursors and views for different tasks.
- Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirement.
- Used MS Visio for business flow diagrams and defined the workflow.
- Performed Data analysis for the existing Data warehouse and changed the internal schema for performance.
- Wrote SQL stored procedures and views, and coordinated and performed in-depth testing of new and existing systems.
- Developed business reports by writing complex SQL queries using views and volatile tables.
- Extensively used SAS procedures like MEANS and FREQ and other statistical calculations for data validation.
- Performed Data Analysis and extensive Data validation by writing several complex SQL queries.
- Involved in design and development of standard and ad-hoc reporting using SQL/SSRS.
- Identified source databases and created the dimensional tables and checked for data quality using complex SQL queries.
- Responsible for data lineage, maintaining data dictionary, naming standards and data quality.
- Performed Data Manipulation using MS Excel Pivot Sheets and produced various charts for creating the mock reports.
- Used SQL Server and MS Excel on a daily basis to manipulate the data for business intelligence reporting needs.
- Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
- Extracted data from different sources like Oracle and text files using SAS/Access, SAS SQL procedures and created SAS datasets.
- Developed SQL Joins, SQL queries, tuned SQL, views, test tables, scripts in development environment.
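A minimal profiling sketch of the kind of SQL described above for flagging null and duplicate keys; the src_accounts table is hypothetical:

    -- Null rate and duplicate count on a candidate key.
    SELECT COUNT(*)                              AS total_rows,
           COUNT(*) - COUNT(account_id)          AS null_keys,
           COUNT(*) - COUNT(DISTINCT account_id) AS duplicate_keys
    FROM   src_accounts;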
Environment: SQL, SAS macros, T-SQL, MS Visio 2010, MS Excel 2010, SQL Server 2010