
Sr. Data Modeler Resume


San Ramon, CA

SUMMARY

  • Over 10 years of professional IT experience, including 8+ years in data modeling and data analysis. Proficient in gathering business requirements and handling requirements management.
  • Experienced in designing Conceptual, Logical and Physical data models using Erwin, Power Designer and ER/Studio data modeling tools.
  • Skilled in system analysis, ER and dimensional modeling, database design and implementing RDBMS-specific features.
  • Experience modeling both OLTP and OLAP systems in Kimball and Inmon data warehousing environments.
  • Strong command of Normalization (1NF, 2NF and 3NF) and Denormalization techniques for effective, optimal performance in OLTP, OLAP, Data Warehouse and Data Mart environments.
  • Experienced in creating high-level integrated models and developing with data virtualization tools per requirements.
  • Well - versed in designing Star and Snowflake Database schemas pertaining to relational and dimensional data modeling.
  • Experienced in designing the data mart and creation of cubes.
  • Expertise in Extract, Transform and Load (ETL) of data from spreadsheets, database tables and other sources using Microsoft Data Transformation Services (DTS) and Informatica.
  • Hands-on experience with Hadoop tools such as Hive, HBase, Sqoop and Pig.
  • Hands-on experience writing BTEQ scripts and MLOAD jobs in Teradata.
  • Experience creating InfoCubes and ODS objects with SAP BW/BI and using different T-Codes.
  • Good working knowledge of Meta-data management in consolidating metadata from disparate tools and sources including Data warehouse, ETL, Relational Databases and third-party metadata into a single repository to get information on data usage and end-to-end change impact analysis.
  • Experienced in writing complex SQL queries to perform end-to-end ETL validations and support ad-hoc business requests. Also proficient in developing Stored Procedures, Triggers, Functions and Packages using SQL/PL/SQL.
  • Efficient in analyzing and documenting business and functional requirements along with Use Case Modeling and UML.
  • Experience in data transformation, data mapping from source to target database schemas, data cleansing procedures.
  • Well versed in conducting Gap analysis, Joint Application Design (JAD) session, User Acceptance Testing (UAT), Cost benefit analysis and ROI analysis.
  • Experience using financial models and the Bloomberg application to capture the sale or divestiture of assets in the banking domain.
  • Experience in understanding syndicated data (Oracle, Nielsen, etc.) and business tools including Tableau.
  • Comfortable in work environments consisting of business analysts, production/support teams, subject matter experts, database administrators and database developers.
  • Excellent problem-solving and analytical skills with an exceptional ability to learn and master new technologies efficiently.
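The end-to-end ETL validations mentioned above typically reconcile the target against its source. A minimal sketch of that kind of check (table and column names are illustrative, using an in-memory SQLite database):

```python
# Illustrative ETL-validation sketch: reconcile row counts and amount
# totals between a hypothetical staging table and its warehouse target.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 100.0), (2, 250.0), (3, 75.0);
    INSERT INTO dw_orders  VALUES (1, 100.0), (2, 250.0), (3, 75.0);
""")

# After a load, source and target counts and sums should both net to zero.
cur.execute("""
    SELECT (SELECT COUNT(*)    FROM stg_orders) - (SELECT COUNT(*)    FROM dw_orders),
           (SELECT SUM(amount) FROM stg_orders) - (SELECT SUM(amount) FROM dw_orders)
""")
count_diff, sum_diff = cur.fetchone()
print(count_diff, sum_diff)  # 0 0.0 means source and target agree
```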

TECHNICAL SKILLS

Modeling Tools: ER/Studio, ERwin, Power Designer, MS Visio and Oracle Data Modeler.

Reporting Tools: Cognos 8/10, OBIEE, Tableau, Business Objects, Denodo

Databases: Oracle 7/8i/9i/10g/11g, MS SQL Server 2000/2005/2008/2012, MS Access, Teradata, DB2.

Operating Systems/Platforms: Windows NT, 2000 & XP, Vista, AIX, UNIX, Linux and Sun Solaris.

Other Tools: SQL Navigator, TOAD, PVCS, JIRA, Aqua Data Studio, Informatica PowerCenter and SAP BI/BW.

Languages: C, C++, VB, Java, .NET, PL/SQL and SQL.

PROFESSIONAL EXPERIENCE

Confidential, San Ramon, CA

Sr. Data Modeler

Responsibilities:

  • Interacted with Business Analysts to gather user requirements and participated in data modeling sessions.
  • Worked with the DevOps team to make any necessary changes, such as modeling fixes and enhancements, as part of the requirements.
  • Performed data analysis and profiling of source data to better understand the sources.
  • Created 3NF business-area data models with de-normalized physical implementations; performed data and information requirements analysis.
  • Created E/R diagrams and data flow diagrams, grouped and created the tables, validated the data, and identified PK/FK columns for lookup tables.
  • Using 3NF models, handled metadata collection and validation, standards compliance verification, ODS schema maintenance and database-to-database data type mapping.
  • Worked on the organization's non-transactional MDM data entities, with the objective of providing processes for collecting, aggregating, matching, consolidating, quality-assuring, persisting and distributing data.
  • Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables and columns as part of Data Analysis responsibilities.
  • Developed the data warehouse model (Kimball methodology) with multiple data marts and conformed dimensions for the project's proposed central model.
  • Implemented Star Schema methodologies in modeling and designing the logical data model into Dimensional Models.
  • Conducted design sessions with Business Analysts and ETL developers to come up with a design that satisfies the organization’s requirements.
  • Worked with Database Administrators (DBAs) to finalize physical properties of the tables, such as the partition key, based on volumetrics.
  • Analyzed the data from the sources, designed the data models and then generated scripts to create necessary tables and corresponding records for DBAs using Informatica.
  • Developed number of Data correction scripts using PL/SQL to handle any manual adjustments/corrections in System.
  • Member of Data model and Design Review group that oversees data model changes and system design across Enterprise Data Warehouse.
  • Interacted with subject matter experts and IT personnel on the data virtualization software (Denodo), including performance tuning, troubleshooting, information gathering and configuration decisions. Designed and developed high-quality integration solutions consistent with given requirements; involved in creating base views, interface views and derived views.
  • Collaborated with other application development teams to design, develop and deploy solutions leveraging the data virtualization tool. Supported the definition of, and adherence to, data virtualization standards, conventions and principles.
  • Planned and coordinated with DBAs the installation of upgrades, enhancements, scheduling and changes to the data virtualization software. Defined the testing and quality assurance for all upgrades, enhancements and changes to the software.
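The Kimball-style warehouse model above centers on fact tables keyed to conformed dimensions that data marts can share. A minimal sketch of that star-schema shape (table and column names are invented for illustration):

```python
# Hedged star-schema sketch: one fact table joined to conformed
# dimensions via surrogate keys, the basic Kimball building block.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key     INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
        full_date    TEXT,
        fiscal_month TEXT
    );
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,   -- surrogate key
        customer_id  TEXT,                  -- natural/business key
        segment      TEXT
    );
    CREATE TABLE fact_sales (
        date_key     INTEGER REFERENCES dim_date(date_key),
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        sales_amount REAL
    );
""")
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['dim_customer', 'dim_date', 'fact_sales']
```

Because the dimensions are conformed, another mart's fact table (say, returns) could reference the same `dim_date` and `dim_customer` keys and roll up consistently.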

Environment: Oracle 12c, Teradata 12.0, Teradata SQL Assistant, Informatica 9.6.1, Denodo Platform 6.0, Toad for Oracle 11.5 Expert, Erwin Data Modeler 9.5, Hadoop, MS Visio, OBIEE, Python, Copybooks, Jira, Java.

Confidential, Bellevue, WA

Sr. Data Modeler

Responsibilities:

  • Worked as a Data Architect as part of IDW (Integrated Data Warehouse) project under Agile DevOps methodology
  • Worked with SMEs and data stewards
  • Working on different subject areas like Party, Account, Line of Service, Finance and Product
  • Working with Business Analyst to gather business attributes as part of the report and creating LDM based on the business attributes using Power Designer 16.
  • Review LDM with Lead Data Architects and then review with entire team.
  • Creating PDM and Source to Target Mapping documents and review with Lead Data Architects
  • Generating DDLs using Power Designer and executing them on Teradata and Hadoop
  • Data Profiling of source using Oracle and Hue (Hadoop / Hive UI) and supporting developers by testing the target on Teradata and Hadoop.
  • Validating scripts transformations and joins in PIG.
  • Actively participate in backlog grooming session with scrum master, product owner and solution owner
  • Created a standard document for the client to list out steps on “How to create DDL for Teradata and Hadoop using Power Designer”
  • Using Agile Central (Rally) to enter tasks and associated hours for each task
  • Using GitHub and Accurev to actively manage and promote DDLs
  • Working with different source systems like RPX, Samson, Chub, Ericsson, SAP ECC etc.
  • Designed and developed high-quality integration solutions consistent with given requirements using Denodo.

Environment: Power Designer 16, Hadoop, Hive, HDFS, Oracle SQL Developer, Microsoft SQL Server 2008, Teradata SQL Assistant 14.0, AccuRev, Control-M, SAP BW, SAP ECC, Denodo, Business Objects.

Confidential, Phoenix, AZ

Data Modeler

Responsibilities:

  • Conducted one-on-one sessions with business users to gather data warehouse requirements
  • Analyzed database requirements in detail with the project stakeholders by conducting Joint Requirements Development sessions
  • Created Logical and Physical models by using ERwin based on requirements analysis.
  • Developed normalized Logical and Physical database models and a de-normalized design for HR applications
  • Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model
  • Worked with Database Administrators, Business Analysts and Content Developers to conduct design reviews and validate the developed models.
  • Identified, formulated and documented detailed business rules and Use Cases based on requirements analysis.
  • Facilitated development, testing and maintenance of quality guidelines and procedures along with necessary documentation.
  • Responsible for defining the naming standards for data warehouse.
  • Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy Oracle and SQL Server database systems
  • Translated business concepts into XML vocabularies by designing XML Schemas with UML
  • Exhaustively collected business and technical metadata and maintained naming standards
  • Used IDA for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information
  • Worked with ETL teams and used Informatica Designer, Workflow Manager and Repository Manager to create source and target definition, design mappings, create repositories and establish users, groups and their privileges.
  • Participated in data anomaly resolution, data cleansing and development of cleansing rules for ongoing data synchronization.
  • Used Rational Team Concert (RTC) 3.0.1 for effective model management of sharing, dividing and reusing model information and design for productivity improvement
  • Conducted impact analysis for changing business requirements as per user needs
  • Studied in-house requirements for the Data warehouse to be developed.
  • Participated in performance management and tuning for stored procedures, tables and database servers.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT)
  • Developed database designs in Hadoop; hands-on experience writing queries in Hive and validating transformation logic in Pig.
  • Integrated the work tasks with relevant teams for smooth transition from testing to implementation phase.
  • Developed necessary support documentation for end users, operations and testing teams.
  • Worked with UltiPro to assist with testing, troubleshooting and support; involved in pulling HRIS payroll information from PeopleSoft into UltiPro.
  • Day-to-day responsibilities included configuration (HR/Benefits/Payroll tables), data conversion testing/validation, issue resolution and integration testing.
  • Gained understanding of workflow/process, reporting, testing, and training/change management.

Environment: Erwin Data Modeler 9.x, Informatica Power Center 9.1/8.6.1, Windows XP/NT/2000, SQL Server 2008/2012, SQL, Oracle, Hadoop, Cognos 10, Framework Manager, Report Studio, Query Studio, Tableau, UltiPro, MS Excel, MS Visio, Rational Rose, Requisite Pro, SSIS, SSRS

Confidential, Livermore, CA

Data Modeler

Responsibilities:

  • Worked with business users to understand the business requirements.
  • Reverse Engineer Guidewire data model.
  • Enhance/customize Guidewire database and maintain Guidewire database xml based changes in change log.
  • Designed the ODS with core tables and now working on enhancing this model for additional master data.
  • Worked with ACORD XSD structure and Property and Casualty Insurance data model.
  • Used schema slicer to slice schemas (XSD) needed from ACORD and will be the first step in building the canonical XSD / ACORD Compliance.
  • Created End to End database and service data mapping.
  • Created reference data architecture.
  • Created Logical Data Models and Physical Data Models using ERStudio Data Modeler.
  • Created process flow diagrams by using MS Visio and maintained design document.
  • Extensively used the Erwin design tool and Erwin Model Manager to create and maintain versions of the Inland Marine data model. Created integrity rules and defaults.
  • Created documentation and test cases, working with users for new module enhancements and testing.
  • Used Toad for Data Analyst extensively for data analysis and for writing stored procedures and SQL prototypes as needed.

Environment: SQL Server 2005/2008, DB2, OBIEE, Oracle 10g/11i, Toad for Oracle 9.1 Expert, ER/Studio Data Architect 8.5.3, MS Visio, Java, XML, XMLSpy, Windows XP, QC Explorer.

Confidential

Data Modeler / Data Analyst

Responsibilities:

  • Gathered and translated business requirements into detailed, production-level technical specifications detailing new features and enhancements to existing business functionality.
  • Developed the UML class diagrams for the proposed system using Erwin Software Architect.
  • Performed data modeling using data modeling tool Embarcadero Data Architect.
  • Created the data models for OLTP and Analytical systems.
  • Performed in depth data analysis on Oracle and Teradata systems.
  • Extensively worked with the Teradata SQL Assistant and Toad.
  • Involved in data profiling to integrate the data from different sources.
  • Extensively performed the gap analysis and impact analysis.
  • Interacted with SMEs and developers to gather database requirements; actively participated in functional requirement gathering, system requirement gathering and application design
  • Created the data mapping document from source to target and the data quality assessments for the source data.
  • Worked with DBAs on support needs and provide guidance on architectural issues.
  • Assisted development teams with loading and extracting data (ETL), etc.
  • Assisted QA for developing test plans and test cases for Unit Testing, System Testing and Enterprise testing.

Environment: Oracle Database 10g, Teradata 12.0, Windows XP, TOAD 9.6, Power Designer, Erwin 8.x, Portal 1.5.2, Teradata SQL Assistant 13.0, JIRA.

Confidential

Data Analyst

Responsibilities:

  • Gathered and translated business requirements into detailed, production-level technical specifications detailing new features and enhancements to existing business functionality.
  • Developed project plan with project manager’s assistance for the first two phases of the project.
  • Developed system flow and data flow diagrams for the proposed system.
  • Designed conceptual and logical data models.
  • Identified objects and relationships and how they all fit together as logical entities; these were then translated into a physical design using the ERwin tool.
  • Normalized the tables up to 3NF.
  • Involved in the critical design review of the finalized database model.
  • Developed test plans and test cases for QA Unit Testing, System Testing and Enterprise testing.
  • Involved in designing and implementing the security for the databases.
  • Helped in migration and conversion of data from the Sybase database into Oracle 10g database, preparing mapping documents and developing partial SQL scripts as required.
  • Created stored procedures, functions, database triggers and packages per business needs for developing ad-hoc and robust reports. Incorporated Dynamic SQL to generate WHERE clauses dynamically based on the lexical parameters passed.
  • Responsible for developing and testing conversion programs for importing data from text files into the Oracle database utilizing Perl shell scripts and SQL*Loader.
  • Involved in the daily maintenance of the database that involved monitoring the daily run of the scripts as well as troubleshooting in the event of any errors in the entire process.
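The text-file conversion loads above follow the SQL*Loader shape: parse a delimited file against a column spec, then bulk-insert. A sketch of that pattern in Python with an in-memory database (the actual work used Perl and SQL*Loader; the data here is invented):

```python
# Illustrative sketch of a delimited-text load, the same shape as a
# SQL*Loader control-file load: parse, convert types, bulk insert.
import csv
import io
import sqlite3

raw = "emp_id|name|salary\n101|Alice|90000\n102|Bob|85000\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (emp_id INTEGER, name TEXT, salary REAL)")

reader = csv.DictReader(io.StringIO(raw), delimiter="|")
rows = [(int(r["emp_id"]), r["name"], float(r["salary"])) for r in reader]
conn.executemany("INSERT INTO emp VALUES (?, ?, ?)", rows)
conn.commit()

loaded = conn.execute("SELECT COUNT(*) FROM emp").fetchone()[0]
print(loaded)  # 2
```

A production load would also route rejected rows to a bad file and log counts, mirroring SQL*Loader's bad/discard/log files.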

Environment: Oracle Database 10g, SYBASE, Linux, Windows XP, Erwin, TOAD, SQL*PLUS, SQL*LOADER

Confidential

BI Developer

Responsibilities:

  • Part of the team responsible for the analysis, design and implementation of the business solutions.
  • Gathered and translated business requirements into detailed, production-level technical specifications, new features and enhancements to existing technical business functionality.
  • Created Logical and Physical Data Models using Erwin.
  • Prepared business case for the data mart and then developed and deployed it.
  • Experienced in creating InfoAreas, InfoCubes and ODS objects from different source systems
  • Used Star schema methodology in building and designing the logical data model into dimensional models.
  • Mapped the data between Source and Targets.
  • Used MS Visio for creating process flow diagrams.

Environment: Oracle Database 8i, SQL, SAP ECC, SAP BW, BEx, MS Visio, ERwin, TOAD.
