
Data Modeler/Analyst Resume


Troy, MI

PROFESSIONAL SUMMARY:

  • 8 years of IT experience in the field of Data/Business analysis, ETL Development, Data Modeling, and Project Management.
  • Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration, Metadata Management Services and Configuration Management.
  • Extensive experience in Relational and Dimensional Modeling; Conceptual, Logical, and Physical modeling for Online Transaction Processing and Online Analytical Processing (OLTP and OLAP) systems; Star and Snowflake schemas; ER diagrams (IDEF1X and IE notation); granularity, cardinality, and database reengineering using Erwin and Embarcadero.
  • Ability to collaborate with peers in both business and technical areas to deliver optimal business process solutions in line with corporate priorities.
  • Knowledge of Business Intelligence tools such as Business Objects, Tableau, MicroStrategy, Cognos, and OBIEE.
  • Strong experience in interacting with stakeholders/customers, gathering requirements through interviews, workshops, and existing system documentation or procedures, defining business processes, identifying and analyzing risks using appropriate templates and analysis tools.
  • Experience in various phases of Software Development life cycle (Analysis, Requirements gathering, Designing) with expertise in documenting various requirement specifications, functional specifications, Test Plans, Source to Target mappings, SQL Joins.
  • Experience in conducting Joint Application Development (JAD) sessions for requirements gathering, analysis, design and Rapid Application Development (RAD) sessions to converge early toward a design acceptable to the customer and feasible for the developers and to limit a project’s exposure to the forces of change.
  • Good experience in data transformation, data mapping from source to target database schemas, and data cleansing.
  • Hands-on experience in Normalization and De-normalization techniques for optimum performance in relational and dimensional database environments.
  • Experience in coding SQL/PL-SQL using Procedures, Triggers, and Packages.
  • Good understanding of Relational Database Design, Data Warehouse/OLAP concepts and methodologies.
  • Implemented optimization techniques for better performance on both the ETL and database sides.
  • Excellent Communication, interpersonal, analytical skills and strong ability to perform in a team as well as individually.
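The SQL/PL-SQL trigger work mentioned above might look like the following minimal sketch; SQLite (via Python) stands in for Oracle PL/SQL so it runs standalone, and the customer and audit tables are invented for illustration:

```python
import sqlite3

# Hedged sketch of the SQL trigger work mentioned above: an audit trigger that
# records column changes. SQLite stands in for Oracle PL/SQL so the sketch runs
# standalone, and the customer/audit tables are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
CREATE TABLE customer_audit (customer_id INTEGER, old_city TEXT, new_city TEXT);

-- Fires on every city change and writes a before/after row to the audit table.
CREATE TRIGGER trg_customer_city_audit
AFTER UPDATE OF city ON customer
BEGIN
    INSERT INTO customer_audit VALUES (OLD.id, OLD.city, NEW.city);
END;
""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme', 'Troy')")
conn.execute("UPDATE customer SET city = 'Detroit' WHERE id = 1")
audit = conn.execute("SELECT * FROM customer_audit").fetchall()
print(audit)  # [(1, 'Troy', 'Detroit')]
```

In production PL/SQL the same idea uses `CREATE OR REPLACE TRIGGER` with `:OLD`/`:NEW` bind rows; the SQLite form above is only a portable illustration.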

TECHNICAL SKILLS:

DATA MODELING: Erwin 9.2/9.64, Power Designer, Microsoft Visio, and ER Studio.

DATABASES: Oracle 11g/10g/9i, MS SQL Server 2012/2008/2005, MS Access, Teradata

ETL TOOLS: Informatica PowerCenter, Metadata Manager, Informatica Analyzer, Ascential DataStage

OPERATING SYSTEMS: Windows, UNIX, Sun Solaris, AIX, HP

PROGRAMMING LANGUAGES: SQL, PL/SQL, UNIX Shell Scripting, XML

REPORTING TOOLS: Tableau, Cognos, Domo, Business Objects, Microstrategy, Crystal Reports, OBIEE.

PROFESSIONAL EXPERIENCE:

Data Modeler/Analyst

Confidential, Troy, MI

Responsibilities:

  • Performed forward engineering in the Erwin tool to generate schemas and implement them in the database.
  • Developed a Conceptual model using Erwin based on business requirements.
  • Performed reverse engineering on SQL databases to obtain physical data models.
  • Followed an Agile sprint methodology for product development and contributed to project management to deliver sprints on deadline.
  • Acted as a strong Data Analyst, analyzing data at a low level in conversion projects and providing mapping documents between legacy, production, and user-interface systems.
  • Reviewed Entities and relationships in the engineered model and cleansed unwanted tables/ columns as part of data analysis responsibilities.
  • Developed the data warehouse dimensional models & schema for the proposed central model for the project.
  • Responsible for defining the naming standards for the data warehouse.
  • Worked with Business Analysts and Solution Designers to conduct design reviews and validate the developed models.
  • Worked on Snowflake and Star schemas to build the dimensions and facts and reduce redundancy.
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Worked on the data mapping process from source to target systems by creating Source-to-Target mapping documents used for loading data into the data warehouse; these documents were referenced by all cross-functional teams.
  • Reviewed Data Mapping documents and provided guidance to Business System Analysts.
  • Tested the target table structures to validate them against the source data.
  • Created table structures in Stage and data store areas to get the source data into the new platform.
  • Worked with the test team to provide insights into the data scenarios and test cases.
  • Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
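The data profiling and validation step described above can be sketched roughly as follows; the staging table, columns, and sample rows are invented, and SQLite stands in for the actual warehouse platform:

```python
import sqlite3

# Hedged sketch of the data-profiling step: per-column null rates and distinct
# counts over a staging table. The table, columns, and sample rows are invented;
# SQLite stands in for the actual warehouse platform.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stage_member (member_id INTEGER, ssn TEXT, dob TEXT)")
conn.executemany("INSERT INTO stage_member VALUES (?, ?, ?)", [
    (1, '123-45-6789', '1980-01-01'),
    (2, None,          '1975-06-30'),
    (3, '123-45-6789', None),
])

profile = {}
total = conn.execute("SELECT COUNT(*) FROM stage_member").fetchone()[0]
for col in ("member_id", "ssn", "dob"):
    # COUNT(col) skips NULLs, so COUNT(*) - COUNT(col) is the null count.
    nulls, distinct = conn.execute(
        f"SELECT COUNT(*) - COUNT({col}), COUNT(DISTINCT {col}) FROM stage_member"
    ).fetchone()
    profile[col] = {"null_pct": round(100 * nulls / total, 1), "distinct": distinct}
print(profile)
```

Columns with unexpected null rates or too few distinct values (like the duplicated SSN above) are exactly what this kind of profiling flags for follow-up with the source system.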

Environment: CA Erwin Data Modeler, Microsoft SSMS, SQL Azure, Teradata, Oracle SQL Developer, Cognos, Microsoft Office, Azure DevOps.

Data Modeler/Analyst

Confidential, Atlanta GA

Responsibilities:

  • Data Modeler/Analyst in Data Guardians Team, responsible for building Logical and Physical model, and enhancements for MAC Project.
  • Integrated the Customer Account & Subscription Data and metadata-related data sources into a single big data platform to support analytical capabilities.
  • Ensured sensitive data was tagged in all data sources to adhere to data privacy policies.
  • Handled the entire source-to-target mapping lineage for all schemas in the data lake and enhanced the logical models.
  • Captured and processed vast amounts of multi-structured data into the data lake to tag it for GDPR Confidential at the OLTP level.
  • Supported production feeds of all sources established to the data lake and captured data-operations data flow from source to product.
  • Worked as a Data Modeler to generate Data Models using Erwin and developed relational database system.
  • Developed, managed, and validated existing data models, including logical and physical models of the data warehouse and source systems, utilizing a 3NF model.
  • Used DDL scripts in TOAD to get the schema structure and logical model to implement the model with enhancements in Erwin.
  • Ensured all required information was obtained from business stakeholders and followed standard naming conventions and abbreviations to create the CDM, LDM, and PDM.
  • Established and maintained comprehensive data model documentation, including detailed descriptions of business entities, attributes, and data relationships, as well as the definition of business rules governing the integrity, archiving, and audit requirements of the data.
  • Used documented procedures to create data dictionaries capturing business and technical metadata and Data Definition Language (DDL) scripts.
  • Maintained the naming standard files across the enterprise for defining the entities and attributes for all data models.
  • Worked on enhancing current models using Excel macros and Erwin increasing the model standards.
  • Merged duplicate records and ensured that the information was associated with company records.
  • Standardized company names and addresses and ensured that necessary data fields were populated.
  • Proactively reviewed the database to identify inconsistencies in the data, conducting research using internal and external sources to determine whether information was accurate.
  • Coordinated activities and workflow with other Data Stewards to ensure data changes were made effectively and efficiently.
  • Reviewed the database to identify and recommend adjustments and enhancements, including external systems and types of data that could add value to the system.
  • Extracted data from the database and provided data analysis using SQL to business users based on their requirements.
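The duplicate-merge and standardization work above might look like the following sketch; the suffix table, the merge rule (first non-empty value wins), and the sample records are all invented for illustration:

```python
# Hedged sketch of the duplicate-merge and standardization step above; the
# suffix table, merge rule (first non-empty value wins), and sample records
# are invented for illustration.
SUFFIXES = {"inc": "Inc.", "llc": "LLC", "corp": "Corp."}

def standardize(name: str) -> str:
    """Title-case each word and canonicalize a trailing legal suffix."""
    parts = [p.title() for p in name.strip().split()]
    last = parts[-1].lower().strip(".,") if parts else ""
    if last in SUFFIXES:
        parts[-1] = SUFFIXES[last]
    return " ".join(parts)

def merge(records: list) -> list:
    """Group records by standardized name; keep the first non-empty value per field."""
    merged = {}
    for rec in records:
        clean = standardize(rec["name"])
        target = merged.setdefault(clean.lower(), {"name": clean})
        for field, value in rec.items():
            if field != "name" and value and not target.get(field):
                target[field] = value
    return list(merged.values())

rows = [
    {"name": "acme corp", "address": ""},
    {"name": "Acme Corp.", "address": "100 Main St"},
]
print(merge(rows))  # [{'name': 'Acme Corp.', 'address': '100 Main St'}]
```

Real stewardship tools layer fuzzier matching (address parsing, edit distance) on top, but the standardize-then-group shape is the same.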

Environment: CA Erwin Data Modeler, Toad Data Point, SQL Developer, HeidiSQL/MariaDB, Hive, Tortoise SVN, MicroStrategy, Microsoft Office, Global ID Inventory Tool.

Data Modeler/Analyst

Confidential, Mississippi

Responsibilities:

  • Worked with the IV&V team, the DOM project PM and BA, MMIS-eligibility production support and other DOM management staff, and the Xerox team.
  • Assessed the degree to which records conform to the approved business rules.
  • Provided analysis of data as requested by the business for business efficiencies and productivity.
  • Formulated queries to extract, manipulate, format and present data stored in relational databases, while maintaining strict confidentiality and security of the patient health information contained in the data sources.
  • Used SQL tools to run SQL queries and validate the data loaded into the target tables.
  • Helped with the migration and conversion of data from an Oracle 10g database into an Oracle 11g database, preparing mapping documents and developing partial SQL scripts as required.
  • Developed advanced SQL queries to extract, manipulate, and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data was extracted.
  • Developed and provided regular statistical reports as requested based on user requirements/needs.
  • Performed data analysis and extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
  • Identified and recorded defects with the information required for issues to be reproduced by the development team.
  • Analyzed and tested converted data to help ensure data quality.
  • Assisted DOM with data cleanup and data quality issues by recommending specific production cleanup actions.
  • Validated data from the legacy MEDS system into the Modernized Medicaid Eligibility System.
  • Validated the extraction, transformation, and loading of data into the databases for completeness and integrity.
  • Conducted testing to ensure the accuracy of data compared to the source.
  • Worked with the requirements analyst and ETL developers to review unit test cases via test-driven development.
  • Evaluated and analyzed test results to identify areas of regression testing and system vulnerability.
  • Provided the necessary documentation and reporting to track and record the status of testing, monitoring, maintaining, and documenting test results.
  • Reported and managed defects using defect management tools.
  • Guided junior DOM testers by assisting with complex testing scenarios and issues.
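Legacy-to-target validation queries of the kind described above often take a set-difference form; a hedged sketch follows, with invented table and column names (meds_legacy, mmes_target) and SQLite's EXCEPT standing in for Oracle's MINUS:

```python
import sqlite3

# Hedged sketch of the legacy-vs-target validation queries: a set difference
# (EXCEPT, i.e. Oracle MINUS) finds legacy rows that are missing or altered in
# the target. Table and column names are invented; SQLite stands in for Oracle.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE meds_legacy (member_id INTEGER, eligibility_code TEXT);
CREATE TABLE mmes_target (member_id INTEGER, eligibility_code TEXT);
INSERT INTO meds_legacy VALUES (1, 'A'), (2, 'B'), (3, 'C');
INSERT INTO mmes_target VALUES (1, 'A'), (2, 'X');
""")
missing = conn.execute("""
    SELECT member_id, eligibility_code FROM meds_legacy
    EXCEPT
    SELECT member_id, eligibility_code FROM mmes_target
    ORDER BY member_id
""").fetchall()
print(missing)  # [(2, 'B'), (3, 'C')]
```

Running the same query in the opposite direction catches rows the load introduced that have no legacy counterpart, so the pair together checks completeness and integrity.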

Environment: MS Excel, MS Access, Oracle 11g, Toad, IBM Rational CQ, Windows XP, SQL, PL/SQL.

Senior Data Modeler/Analyst

Confidential, Honolulu, HI

Responsibilities:

  • Worked with users to identify the most appropriate source of record required to define the asset data for financing.
  • Performed data profiling in the source systems that are required for financing.
  • Validated and updated the appropriate LDMs to reflect process mappings, screen designs, use cases, the business object model, and the system object model as they evolved and changed.
  • Developed logical data models and physical database designs and generated database schemas using Erwin 8.
  • Documented the complete process flow to describe program development, logic, testing, implementation, application integration, and coding.
  • Collaborated with ETL developers in creating the Data Functional Design documents.
  • Ensured that models conform to established best practices (including normalization rules) and accommodate change in a cost-effective and timely manner.
  • Involved in migration projects to migrate data from Oracle-based data warehouses to DB2.
  • Responsible for Generating Weekly ad-hoc reports.
  • Planned, coordinated, and monitored project performance levels and activities to ensure project completion on time.
  • Designed logical and physical data models for multiple OLTP applications.
  • Understood business and system requirements to architect data and database solutions through logical and physical data modeling to meet project requirements.
  • Utilized tools to efficiently build data models and maintain metadata and data model repository.
  • Performed data auditing, validation and cleansing as well as metadata mapping and migrations. Documented, developed and implemented data retention and purging policies for centralized, full lifecycle data management
  • Developed, managed, and validated existing data models, including logical and physical models of the data warehouse and source systems, utilizing a 3NF model.
  • Wrote SQL scripts to test the mappings and developed a Traceability Matrix of Business Requirements mapped to Test Scripts to ensure any change control in requirements led to test case updates.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys), and physical database (capacity planning, object creation, and aggregation strategies) for Oracle and Teradata per business requirements using Erwin.
  • Worked on building the data model using Erwin as per the requirements, discussion and approval of the model from the BA.
  • Generated DDL from Erwin and made it available to the DBA for deployment in DEV.
  • Developed Naming standards for data warehouse and Data Mart tables.
  • Interacted with computer systems end-users and project business sponsors to determine, document, and obtain signoff on business requirements.
  • Extensively worked on SQL in analyzing the database and querying the database as per the business scenarios.
  • Coordinated with business users to provide an appropriate, effective, and efficient design for new reporting needs based on existing functionality.
  • Worked in Agile environment.
  • Remained knowledgeable in all areas of business operations in order to identify systems needs and requirements.
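Forward engineering of the kind Erwin performs, turning a model into DDL for the DBA, can be illustrated with a small sketch; the MODEL structure and generate_ddl helper below are invented stand-ins, not Erwin's actual repository format:

```python
# Hedged sketch of Erwin-style forward engineering: generating DDL from a
# logical model. The MODEL structure and generate_ddl helper are invented
# stand-ins, not Erwin's actual repository format.
MODEL = {
    "DIM_CUSTOMER": {
        "columns": [("CUSTOMER_KEY", "INTEGER"), ("NAME", "VARCHAR(100)")],
        "pk": "CUSTOMER_KEY",
    },
}

def generate_ddl(model: dict) -> str:
    """Emit one CREATE TABLE statement per entity in the model."""
    statements = []
    for table, spec in model.items():
        cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in spec["columns"])
        statements.append(
            f"CREATE TABLE {table} (\n  {cols},\n  PRIMARY KEY ({spec['pk']})\n);"
        )
    return "\n\n".join(statements)

print(generate_ddl(MODEL))
```

The value of tool-driven generation is that naming standards and key definitions live in one model, so the emitted DDL stays consistent across DEV and later environments.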

Environment: MS Excel, MS Access, DB2, Teradata, Oracle 10g, Tableau, UNIX, Windows XP, SQL, PL/SQL, Power Designer, Informatica, Linux, Erwin.

Senior Data Modeler/Analyst

Confidential, Hartford, CT

Responsibilities:

  • Met with customers to determine user requirements and business goals.
  • Blended technical and business knowledge with communication skills to bridge the gap between internal business and technical objectives and serve as an IT liaison with the business user constituents.
  • Worked closely with the claims processing team to obtain patterns in filing of fraudulent insurance claims.
  • Involved in understanding Logical and Physical Data model using Erwin Tool.
  • Performed data profiling of the source to determine target column data types.
  • Enhanced data model in Erwin to enhance the Claim Foundation tables.
  • Involved in data mapping specifications to create and execute detailed system test plans; the data mapping specified what data would be extracted from an internal data warehouse, transformed, and sent to an external entity.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assigned specific tasks.
  • Worked on claims data and extracted data from various sources such as flat files, Oracle and Mainframes.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Used SQL tools to run SQL queries and validate the data loaded into the target tables.
  • Validated report data by writing SQL queries in PL/SQL Developer against the ODS.
  • Developed advanced SQL queries to extract, manipulate, and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data was extracted.
  • Performed data analysis and extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
  • Identified and recorded defects with the information required for issues to be reproduced by the development team.
  • Wrote Teradata Macros and used various Teradata analytic functions.
  • Worked extensively in documenting the Source to Target Mapping documents with data transformation logic.
  • Interacted with SMEs to analyze the data extracts from Legacy Systems (Mainframes and COBOL files) and determine the element source, format, and integrity within the system.
  • Transformed requirements into data structures that could be used to efficiently store, manipulate, and retrieve information.
  • Worked with internal architects, assisting in the development of current- and target-state enterprise data architectures.
  • Worked in Agile environment in a scrum team.
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
  • Ensured that models conform to established best practices (including normalization rules) and accommodate change in a cost-effective and timely manner.
  • Observed patterns in fraudulent insurance claims using text mining in R.
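Source-to-target mapping documents with transformation logic, as described above, can be applied mechanically; here is a minimal sketch in which the field names, the date format, and the rules themselves are invented examples:

```python
# Hedged sketch of applying a source-to-target mapping with transformation
# rules, as in the S2T documents above; field names, the date format, and
# the rules themselves are invented examples.
MAPPING = [
    # (source_field, target_field, transform)
    ("clm_amt", "CLAIM_AMOUNT", lambda v: round(float(v), 2)),
    ("clm_dt",  "CLAIM_DATE",   lambda v: f"{v[4:6]}/{v[6:8]}/{v[0:4]}"),  # YYYYMMDD -> MM/DD/YYYY
]

def transform(source_row: dict) -> dict:
    """Apply each mapping rule to produce a target-shaped row."""
    return {target: rule(source_row[source]) for source, target, rule in MAPPING}

print(transform({"clm_amt": "1234.5", "clm_dt": "20230115"}))
# {'CLAIM_AMOUNT': 1234.5, 'CLAIM_DATE': '01/15/2023'}
```

Keeping the rules in a data structure like MAPPING mirrors how an S2T document drives the ETL: testers can validate each rule in isolation against sample legacy records.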

Environment: Oracle 10g, SQL Server 2008, Teradata, Microsoft Excel, MicroStrategy, MS Visio, MS Project, Informatica, SAS.

Data Modeler/Analyst

Confidential, Auburn Hills, MI

Responsibilities:

  • Worked with users to identify the most appropriate source of record required to define the asset data for financing.
  • Documented the complete process flow to describe program development, logic, testing, and implementation, application integration, coding.
  • Involved in defining the trumping rules applied by the Master Data Repository.
  • Defined the list codes and code conversions between the source systems and MDR.
  • Worked with internal architects, assisting in the development of current- and target-state enterprise data architectures.
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
  • Involved in defining the source to target data mappings, business rules, and business and data definitions.
  • Documented, clarified, and communicated requests for change requests with the requestor and coordinated with the development and testing team.
  • Created the DDL scripts using ER Studio and source-to-target mappings (S2T, for ETL) to bring the data from JDE to the warehouse.
  • Created the dimensional logical model with approximately 10 facts, 30 dimensions with 500 attributes using ER Studio.
  • Extensive experience in relational data modeling, creating logical and physical database designs and ER diagrams using multiple data modeling tools such as Erwin and ER Studio.
  • Interpreted logical and physical data models for Business users to determine common data definitions and established referential integrity of the system using Erwin.
  • Designed different types of Star schemas using Erwin, with various dimensions such as time, services, and customers, and fact tables.
  • Worked with end users to collect business data quality rules and worked with the development team to establish technical data quality rules.
  • Involved in configuration management in the process of creating and maintaining an up-to-date record of all the components of the development efforts in coding and designing schemas.
  • Developed the financing reporting requirements by analyzing the existing business objects reports.
  • Interacted with computer systems end-users and project business sponsors to determine, document, and obtain signoff on business requirements.
  • Responsible for maintaining the Enterprise Metadata Library with any changes or updates.
  • Documented data quality and traceability documents for each source interface.
  • Coordinated with business users to provide an appropriate, effective, and efficient design for new reporting needs based on existing functionality.

Environment: SQL/Server, Oracle 9i, MS-Office, Teradata, Informatica, ER Studio, XML, Business Objects, OBIEE

Data Modeler/Analyst

Confidential

Responsibilities:

  • Performed data profiling in the source systems that are required for Dual Medicare Medicaid Data mart.
  • Documented the complete process flow to describe program development, logic, testing, implementation, application integration, and coding.
  • Defined the list codes and code conversions between the source systems and MDR.
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
  • Responsible for defining the functional requirement documents for each source to target interface.
  • Documented, clarified, and communicated change requests with the requestor and coordinated with the development and testing teams.
  • Created logical and physical data models using best practices to ensure high data quality and reduced redundancy; defined, documented, and articulated design goals and standards.
  • Took ownership of data models, both logical and physical, inclusive of version control.
  • Optimized and updated logical and physical data models to support new and existing projects.
  • Performed reverse engineering of physical data models from databases and SQL scripts.
  • Provided technical documentation to the ETL developers for developing the data loading techniques, and design details to the Reporting team.
  • Documented data quality and traceability documents for each source interface.
  • Designed and implemented data integration modules for Extract/Transform/Load (ETL) functions.
  • Involved in data warehouse and Data mart design.
  • Gained experience with various ETL and data warehousing tools and concepts.
  • Worked with Mainframe enterprise-billing systems and was involved in defining the business/transformation rules applied to Dual Medicare Medicaid data.
  • Used data analysis techniques to validate business rules and identify low-quality or missing data in the existing Enterprise Data Warehouse (EDW).
  • Assessed the impact of low-quality and/or missing data on the performance of data warehouse clients.

Environment: SQL/Server, Oracle 10g/11g, MS-Office, Teradata, Enterprise Architect, Informatica, ER Studio, XML, OBIEE.
