Senior Data Modeler/ Data Analyst Resume
Chicago, IL
CAREER SUMMARY:
- Over 7 years of IT experience as a Senior Data Analyst, Data Modeler, Business Analyst, and UAT Analyst in domains including healthcare, insurance, telecommunications, consumer banking, and retail. Proven ability working with enterprise-level large data warehouses, RDBMS systems, and ETL and modeling tools/techniques.
- Proficient with data modeling tools such as Erwin and ER/Studio.
- Skilled in systems analysis, ER and dimensional modeling, database design, and implementing RDBMS-specific features.
- Experience modeling both OLTP and OLAP systems in Kimball and Inmon data warehousing environments.
- Strong command of normalization (1NF, 2NF, 3NF) and denormalization techniques for optimal performance in OLTP, OLAP, data warehouse, and data mart environments.
- Well - versed in designing Star and Snowflake Database schemas pertaining to relational and dimensional data modeling.
- Expertise in extracting, transforming, and loading (ETL) data from spreadsheets, database tables, and other sources using Microsoft Data Transformation Services (DTS) and Informatica.
- Good working knowledge of metadata management in consolidating metadata from disparate tools and sources, including data warehouse, ETL, relational database, and third-party metadata, into a single repository to provide information on data usage and end-to-end change impact analysis.
- Experience writing complex SQL queries to perform end-to-end ETL validations and support ad-hoc business requests; skilled in developing stored procedures, triggers, functions, and packages using SQL/PL/SQL.
- Strong knowledge and understanding of Data warehousing concepts: Star Schema, Snowflake Schema, Fact and Dimension Tables. Good knowledge of Dimensional Modeling and Normalization approaches.
- Proficient with multiple RDBMS platforms, including their architectures: Oracle, Teradata, and SQL Server.
- Explored NoSQL databases such as MongoDB and DynamoDB (AWS), as well as Talend.
- Extensive experience working with enterprise-level data warehouses, writing advanced SQL queries with OLAP functionality, including the OVER clause (PARTITION BY and ORDER BY) and OLAP functions, to perform ad-hoc analysis and reporting.
- Experience with the OBIEE Administration Tool covering all three layers: the Physical Layer, BMM Layer, and Presentation Layer.
- Experienced with Teradata utilities (BTEQ, FastLoad, MultiLoad, FastExport, and TPump) and in writing scripts for each.
- Proficient in data quality management techniques such as data profiling, data cleansing, data integrity checks, data mining, reference data management, and data security.
- Experience with Tableau in the analysis and creation of dashboards and user stories.
- Built reports into interactive dashboards in Tableau Desktop that were presented to Business Users, Program Managers, and End Users.
- Developed T-SQL scripts to create database objects and perform DML and DDL tasks; wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
- Strong knowledge of Software Development Life Cycle (SDLC) including Waterfall and Agile models.
- Knowledge and working experience with big data tools such as Hadoop, Azure Data Lake, and AWS Redshift.
- Extracted data from Amazon Redshift and Elasticsearch using SQL queries to create reports.
- Excellent understanding of MDM approaches, including creating a data dictionary and using Informatica or other tools to map sources to the target MDM data model. Experience working with multiple cross-functional groups and conducting brainstorming and root-cause-analysis sessions.
- Strong computing skills including Microsoft Office suite: Excel, PowerPoint, Access, Word, Microsoft Outlook.
- Extensive experience with shell, Python, and Perl scripting.
- Provided improvement recommendations for MDM tools and worked closely with the company's master data services.
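To illustrate the OVER-clause style of OLAP query mentioned above, here is a minimal, self-contained sketch. All table and column names are hypothetical, and it is demonstrated on SQLite (3.25+ supports window functions) purely for portability; the same ANSI syntax runs on Oracle, Teradata, and SQL Server.

```python
import sqlite3

# Illustrative sketch: running total and rank per partition using
# OVER (PARTITION BY ... ORDER BY). Data and names are made up.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, month INTEGER, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?,?,?)", [
    ("East", 1, 100.0), ("East", 2, 150.0),
    ("West", 1, 80.0), ("West", 2, 120.0),
])
rows = cur.execute("""
    SELECT region, month, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS amt_rank
    FROM sales
    ORDER BY region, month
""").fetchall()
```

Because the aggregate is windowed rather than grouped, each detail row is preserved while the running total accumulates within its region.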
TECHNICAL SKILLS:
Analysis/Modeling Tools: UML, JAD, RUP, Waterfall, Agile
Defect Tracking and Testing Tools: Quality Center, JIRA, HP ALM, Service Now
Operating Systems/Platforms: Windows 7/XP/2000/2003/NT, Unix, Linux, Mainframes
Office Applications and other tools: MS Office Suite, Visio, SharePoint, MS Project
Databases: Teradata 13, MS SQL Server 2008 R2, Oracle 11g
DB Tools: SQL Plus, SQL Developer, Teradata SQL Assistant, Toad for Oracle and DB2, MongoDB, SSMS, CA Erwin DM, Rational Rose, ER/Studio, MS Visio, SAP PowerDesigner, Embarcadero, Hadoop, Podium
Programming languages: C, C++, Java, SQL, T-SQL, PL/SQL, SAS, Perl, Shell Scripting, JavaScript, Python
Reporting and ETL Tools: Crystal Reports XI, SSRS, Business Objects 5.x/6.x, Cognos 7.0/6.0, Tableau, OBIEE, Informatica PowerCenter 7.1/6.2, Ab Initio, DataStage, Tivoli Workload Scheduler, Control-M
Web Technologies: HTML, DHTML, XML, CSS, JavaScript, AngularJS
PROFESSIONAL EXPERIENCE:
Confidential, Chicago, IL
Senior Data Modeler/ Data Analyst
Technologies: Erwin, Informatica, Oracle 11g, DB2, Hadoop
Responsibilities:
- Gathered and analyzed business and technical requirements; created Functional Specification Documents.
- Attended program design, coding, and test walkthrough meetings to provide input on adherence to technical standards and customer requirements.
- Conducted Functional Requirement Specification artifact meetings with project teams and analyzed the risks involved.
- Responsible for data mapping, metadata maintenance and enhancing existing Logical and Physical data models.
- Conducted sessions with architects, business SMEs, and development teams to understand Change Requests (CRs) and their effects on the system.
- Determined data rules and conducted Logical and Physical design reviews with business analysts, development team and DBAs.
- Worked on large datasets stored in MongoDB.
- Maintained and updated Metadata repository based on the change requests
- Analyzed existing Logical and Physical data models and altered them using Erwin to support enhancements.
- Conducted walkthroughs with the DBA to convey the changes made to the data models.
- Implemented SQL Scripts to modify the data to resolve assigned defects.
- Worked with the ETL team to document the transformation rules for data migration from OLTP to Warehouse environment for reporting purposes.
- Developed source to target mapping documents to support ETL design.
- Responsible for full data loads from production to AWS Redshift staging environment.
- Created data models for AWS Redshift, Hive and Hbase from dimensional data models.
- Performed data validation and developed reports from SQL, Power BI, MS Access, OBIEE, and Oracle warehouses using advanced Excel functions (VLOOKUP, HLOOKUP, INDEX, macros, etc.), along with structural and user documentation.
- Experience with big data technologies such as Hadoop and its supported stores, including Hive, MongoDB, and other NoSQL databases.
- Created tables, views, sequences, indexes, and constraints, and generated SQL scripts for physical implementation.
- Reverse-engineered database schemas into ERwin physical data models for reusability, then forward-engineered them to Teradata using ERwin.
- Performed ad-hoc analyses, ETL data transformations, and automations using T-SQL scripts from start to finish, including defining business objectives, analysis approach, data mining, and analysis.
- Fine-tuned highly complex T-SQL stored procedures for maximum efficiency and performance.
- Resolved the data inconsistencies between the source systems and the target system using the Mapping Documents.
- Connected to various data sources via Tableau to validate and build dashboards, created storylines, and presented findings.
- Defined best practices for Tableau report development; provided hands-on development assistance to users creating and modifying worksheets and data visualization dashboards.
- Involved in validating Tableau results by checking the numbers against the data in the database.
- Worked on migrating SQL Server databases, data warehouses, and reporting to AWS using Redshift, DynamoDB, Attunity, and Tableau.
- Designed the data sourcing, data staging and ETL process. Mapped the data between the source and targets.
- Reviewed user test cases, defects and validated test results during user acceptance testing (UAT).
- Prepared/Reviewed weekly testing status and reports. Presented program-level testing information at meetings as requested.
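The source-to-target mapping work described above can be sketched as data-driven code: the mapping document becomes a spec that both documents and drives the transformation. All field names and rules below are hypothetical, shown only to illustrate the pattern.

```python
# Hypothetical source-to-target mapping expressed as data, so the same
# spec can drive the ETL step and serve as documentation.
MAPPING = [
    {"source": "cust_nm", "target": "customer_name", "rule": str.strip},
    {"source": "cust_st", "target": "state_code",    "rule": str.upper},
    {"source": "ord_amt", "target": "order_amount",  "rule": float},
]

def apply_mapping(source_row: dict) -> dict:
    """Transform one source row into the target layout per the mapping spec."""
    return {m["target"]: m["rule"](source_row[m["source"]]) for m in MAPPING}

target = apply_mapping({"cust_nm": " Acme ", "cust_st": "il", "ord_amt": "19.99"})
```

Keeping the transformation rules in one declarative structure makes ETL validation straightforward: the same spec can be replayed against source and target to resolve inconsistencies.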
Confidential, Lansing, MI
Senior Data Modeler/ Data Analyst
Technologies: Erwin 9x, Oracle 11g, Python, Hadoop
Responsibilities:
- Conducted meetings with business users to gather data warehouse requirements.
- Conducted team meetings and Joint Application Design (JAD) sessions.
- Gathered requirements and created use case diagrams as part of requirements analysis.
- Developed a Conceptual model using Erwin based on requirements analysis.
- Developed normalized Logical and Physical database models to design OLTP system for education finance applications.
- Created dimensional model for the reporting system by identifying required dimensions and facts using ERwin r8.0
- Modeled a database to store customer demographic and other data.
- Produced functional decomposition diagrams and defined logical data model.
- Used forward engineering to create a Physical Data Model with DDL, based on the requirements from the Logical Data Model.
- Implemented Referential Integrity using primary key and foreign key relationships.
- Worked with the Business Analyst, QA team in their testing and DBA for requirements gathering, business analysis, testing and project coordination.
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy Oracle and SQL Server database systems
- Translated business concepts into XML vocabularies by designing XML Schemas with UML
- Exhaustively collected business and technical Metadata and maintained naming standards
- Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information.
- Created physical data models using forward engineering
- Identified and tracked the slowly changing dimensions and determined the hierarchies in dimensions.
- Worked with ETL teams and used Informatica Designer, Workflow Manager and Repository Manager to create source and target definition, design mappings, create repositories and establish users, groups and their privileges.
- Used Erwin's Model Mart for effective model management (sharing, dividing, and reusing model information and designs) to improve productivity; involved in data mapping.
- Consulted with client management and staff to identify and document business needs and objectives, current operational procedures for creating the logical data model.
- Created and implemented MDM data model for Consumer/Provider for HealthCare MDM product
- Involved in several facets of MDM implementations, including data profiling, metadata acquisition, and data migration.
- Designed extract, load and transform (ELT) components and process flow using Python
- Created parameters, customized calculations, conditions, and filters for various analytical reports and dashboards in Tableau; worked on configuration, database design, application modules, and data retrieval through related SQL/T-SQL code.
- Building, publishing customized interactive reports and dashboards, report scheduling using Tableau server.
- Wrote SQL queries in Teradata and validated the results in OBIEE reports.
- Participated in performance management and tuning for stored procedures, tables and database servers
- Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT)
- Included migration of existing applications and development of new applications using AWS cloud services.
- Responsible for integrating work tasks with relevant teams for a smooth transition from testing to implementation.
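The Python-based ELT flow mentioned above can be sketched as follows: extract raw rows, land them untransformed in a staging table, then transform inside the database with SQL. Shown on SQLite for portability; all table and column names are illustrative, not from the actual engagement.

```python
import sqlite3

# Minimal ELT sketch: load raw data first, transform in-database second.
raw_feed = [("acme", "2021-01-05", "100"), ("globex", "2021-01-06", "250")]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Load: land the raw extract as-is in a staging table (all text).
cur.execute("CREATE TABLE stg_sales (vendor TEXT, sale_date TEXT, amount TEXT)")
cur.executemany("INSERT INTO stg_sales VALUES (?,?,?)", raw_feed)

# Transform: cast and cleanse with SQL inside the target platform.
cur.execute("""CREATE TABLE fct_sales AS
    SELECT UPPER(vendor) AS vendor, sale_date, CAST(amount AS REAL) AS amount
    FROM stg_sales""")
result = cur.execute("SELECT vendor, amount FROM fct_sales ORDER BY vendor").fetchall()
```

Deferring the transform into the database is what distinguishes ELT from ETL: the staging layer preserves the raw feed for auditing while set-based SQL does the heavy lifting.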
Confidential, Columbus, Ohio
Data Modeler/ Data Analyst
Technologies: Informatica Power Center 9.1, Oracle 11g, Teradata 13 & SAS
Responsibilities:
- Interacted with users and business analysts to gather requirements.
- Understood the existing data model and documented suspect design issues affecting system performance.
- Initiated and conducted JAD sessions inviting various teams to finalize the required data fields and their formats.
- Developed Logical and Physical Data models by using ERWin r7.0
- Created logical data model from the conceptual model and its conversion into the physical database design.
- Extensively used star schema methodologies in building and designing the logical data model into dimensional models.
- Applied second and third normal form (2NF, 3NF) to normalize the existing ER data model of the OLTP system.
- Used denormalization techniques in the DSS to create summary tables and improve complex join operations.
- Worked with the project team to perform gap analysis, design, development, testing, implementation and production support tasks.
- Understanding and articulating the scope, risks, issues, project dependencies, deliverables and milestones and preparing presentations for senior management understanding and attention towards high risks/issues.
- Extensively involved in requirements gathering and data gathering to support developers in handling the design specification.
- Work independently on data modeling/architecture and data aggregation design.
- Extracted data from existing data source and performed Ad-Hoc queries by using SQL, MS Access and UNIX.
- Migrated data from flat files to the Teradata database using Teradata utilities (FastLoad, MultiLoad, TPump) and SAS PROC and DATA steps.
- Designed and developed various business reports by using advanced techniques with SAS (PROC Report, PROC Tabulate, PROC SQL), Teradata SQL and OLAP functions (csum, msum, mavg, mdiff, cube and rollup), and MS Excel.
- Created volatile and global temporary tables to load large volumes of data into Teradata database
- Designed/Developed scripts to move data from the staging tables to the target tables.
- Performed UAT on the Data Marts and validated the models against the business requirements.
- Prepared the ETL mapping specifications to extract data from the existing source systems to the Data marts.
- Assisted ETL developers with the design and development of ETL workflows for extracting, cleansing, transforming, integrating, and loading data into different Data Marts.
- Used TOAD to view Database Objects, view and develop Database Scripts.
- Maintained data consistency and database integrity and attended team meetings to identify requirements for data loading and reporting.
- Created and granted access to limited views for BAs across departments to share useful data without compromising customer data confidentiality.
- Documented data sources and transformation rules required populating and maintaining data warehouse content.
- Created and managed project templates, use case project templates, requirement types and traceability relationships in Requisite Pro and managed the concurrence of the parties involved with respect to the evaluation criterion.
- Maintained Tableau dashboards and made changes or redesigns per new business rules and changes in requirements.
- Modified data sources and wrote complex custom SQL in Tableau to shape data into the required form or layout for visualization purposes.
- Performed Master Data Management for the Customer and Location masters; defined conceptual, logical, and physical architecture, data sync patterns, access patterns, and governance framework; implemented using IBM MDM 11.5.
- Conducted the Data Analysis and identified the Data quality issues using Data profiling methodologies.
- Prepared Data Cleansing methodologies by relating them to DQ issues.
- Created T-SQL stored procedures to support data retrieval for a legacy application.
- Developed strategies with Quality Assurance group to implement Test Cases in Quality Center for stress testing and UAT (User Acceptance Testing).
- Wrote Test Plan and created Test cases and scenarios for the various Reports and Dashboards in accordance with the business requirements. Was responsible for writing and updating test plan documents for every version update to the Data Mart.
- Prepared/Reviewed weekly testing status and reports. Presented program-level testing information at meetings as requested.
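Teradata's legacy CSUM and MAVG ordered analytical functions map onto ANSI window aggregates. A minimal sketch of the equivalent ANSI form, run on SQLite 3.25+ here only so it is self-contained; the sample numbers are illustrative:

```python
import sqlite3

# Cumulative sum (CSUM analogue) and a 2-row moving average (MAVG
# analogue) expressed as ANSI window functions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (month INTEGER, amount REAL)")
cur.executemany("INSERT INTO t VALUES (?,?)",
                [(1, 10.0), (2, 20.0), (3, 30.0), (4, 40.0)])
rows = cur.execute("""
    SELECT month,
           SUM(amount) OVER (ORDER BY month) AS csum,             -- cumulative sum
           AVG(amount) OVER (ORDER BY month
               ROWS BETWEEN 1 PRECEDING AND CURRENT ROW) AS mavg  -- moving average
    FROM t ORDER BY month
""").fetchall()
```

The explicit ROWS frame is what bounds the moving average to the current and prior row, whereas the unframed ORDER BY default accumulates from the start of the partition.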
Confidential
Data Modeler/Data Analyst
Technologies: SQL, Erwin r7.3, Informatica, Visio, Oracle
Responsibilities:
- Created documentation and test cases, worked with users for new module enhancements and testing.
- Participated in different phases of projects, from business walkthroughs for requirements gathering and business analysis through design, coding, testing, and project coordination.
- Worked with the Business analyst for requirements gathering, business process analysis and project coordination.
- Extensively used Agile methodology as the organizational standard to implement the data models.
- Resolved the revolving issues by conducting and participating in JAD sessions with the users, modelers, and developers.
- Conducted design discussions and meetings to come out with the appropriate data mart.
- Experience in data transformation and data mapping from source to target database schemas, as well as data cleansing.
- Developed the Star Schema for the proposed warehouse model to meet the requirements.
- Identified the objects and relationships between the objects to develop a logical model using ERWin and later translated the model into physical model.
- Defined and processed the facts and dimensions.
- Resolved the data type inconsistencies between the source systems and the target system using the Mapping Documents.
- Created documentation, worked with users for new module implementation.
- Involved in assigning indexes.
- Created tables, views, sequences, triggers, tablespaces, constraints and generated DDL scripts for physical implementation.
- Created domains that support the target database's data types.
- Created Erwin template to support the organization standards and Database standards.
- Determined data rules and conducted Logical and Physical design reviews with business analysts, developers and DBAs.
- Created secure views to mask the data.
- Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted table/columns as part of data analysis responsibility.
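The star schema and fact/dimension work above follows a standard shape: a central fact table keyed to surrounding dimensions, queried via dimension joins and rollups. A minimal sketch with hypothetical names, on SQLite only so it runs standalone:

```python
import sqlite3

# Tiny star schema: one fact table, two dimensions, and the typical
# dimension-join rollup query. All names and figures are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fct_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL);
INSERT INTO dim_date VALUES (1, 2020), (2, 2021);
INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
INSERT INTO fct_sales VALUES (1, 1, 50.0), (2, 1, 70.0), (2, 2, 30.0);
""")
rollup = cur.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fct_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
""").fetchall()
```

Because every fact row carries only surrogate keys and measures, the dimensions can be redefined or extended without touching the fact table.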
Confidential
Business Data Analyst
Technologies: SQL Server 2008 R2, Teradata
Responsibilities:
- Involved in gathering, analyzing, and documenting business and technical requirements. Analyzed project requirements to identify gaps and provided technical and functional solutions.
- Worked closely with BAs to gather and provide specific and crucial information through data mining using advanced SQL queries, OLAP functions etc.
- Discovered data lineage and read application documentation to understand workflows and complexity. Mined data from numerous source systems.
- Collected SQL queries and re-wrote them as necessary for pattern recognition and generation for events.
- Conducted workshops for user acceptance testing, training, and support. Created pivot tables in Microsoft Excel for further analysis.
- Developed user test cases and validated test results during user acceptance testing (UAT).
- Created detailed documentation to explain the results achieved after performing validation.
- Performed data cleansing by analyzing and eliminating duplicate and inaccurate data.
- Proficient working on Linux/UNIX platforms, with good knowledge of SQL databases.
- Exceptional troubleshooting and problem-solving skills, including TWS end-to-end incident and problem management, with responsibility for root-cause analysis and resolution.
- Part of the AT&T U2L (Unix to Linux) migration project; successfully completed the migration of both applications and was a key resource in configuring the new TWS masters and DB2 server.
- Handled day-to-day troubleshooting and maintenance requests.
- Implementing and supporting a highly available and scalable TWS domain manager on LINUX in a TWS End-to-End configuration supporting Fault Tolerant Agents (FTA) on Windows and Linux and z-Centric Agents across Windows and Linux platforms.
- Developed shell and Perl scripts to automate routine tasks.
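The cleansing step above, eliminating duplicate and inaccurate records, typically normalizes key fields before deduplicating. A small illustrative pass in Python (field names and sample rows are hypothetical):

```python
# Simple cleansing pass: normalize key fields, then drop records that
# are duplicates after normalization, keeping the first occurrence.
def cleanse(records):
    seen, clean = set(), []
    for rec in records:
        key = (rec["name"].strip().lower(), rec["email"].strip().lower())
        if key in seen:
            continue              # duplicate after normalization: drop it
        seen.add(key)
        clean.append({"name": rec["name"].strip(), "email": key[1]})
    return clean

clean = cleanse([
    {"name": "Ann Lee ", "email": "ANN@x.com"},
    {"name": "ann lee", "email": "ann@x.com "},   # duplicate of the first
    {"name": "Bo Chan", "email": "bo@x.com"},
])
```

Normalizing before comparing is the essential point: without the strip/lower step, the first two rows would survive as distinct records.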