
Data Architect/Data Modeler Resume


Livonia, MI

SUMMARY:

  • Over 10 years of experience in Information Technology as a Data Analyst and Data Modeler with advanced analytical and problem-solving skills.
  • Extensive experience with OLTP/OLAP systems and E-R modeling, developing database schemas such as Star and Snowflake schemas used in relational, dimensional and multidimensional modeling.
  • Experience in understanding business needs and gathering user requirements by conducting and participating in JAD sessions, interacting with end users and training them on the applications developed for the organization.
  • Expert knowledge of the SDLC (Software Development Life Cycle); involved in all phases of projects.
  • Experience in design, development and implementation of enterprise-wide architecture for structured and unstructured data, providing architecture and consulting services to Business Intelligence initiatives and data-driven business applications.
  • Experience in dimensional modeling using Star and Snowflake schema methodologies for data warehouse and integration projects.
  • Expertise in using T-SQL to develop complex stored procedures, triggers, tables, views, user functions, user profiles, relational database models and data integrity, as well as SQL joins and query writing.
  • Experienced in creating MOLAP warehouses on SQL Server 2005/2008 Analysis Services; created multiple cubes with dimensions and facts.
  • Expertise in writing SQL queries, dynamic queries, sub-queries and complex joins for generating complex stored procedures, triggers, user-defined functions, views and cursors.
  • Worked on Data Warehouse, Data & ETL Architecture, the Oracle database platform, Erwin Data Modeler, Informatica PowerCenter and SSIS, architecting and implementing end-to-end solutions for DW/BI engagements.
  • Solid experience working with and creating 3rd Normal Form (ODS) and dimensional (OLAP) models.
  • Good exposure to working in an offshore/onsite model, with the ability to understand and/or create functional requirements working with clients; good experience in requirement analysis and generating test artifacts from requirements documents.
  • An excellent team player and a technically strong person with the ability to work with business users, project managers, team leads, architects and peers, thus maintaining a healthy environment in the project.
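As an illustration of the star-schema modeling listed above, a minimal sketch follows; the table names (`dim_date`, `dim_product`, `fact_sales`), columns and sample data are hypothetical, not taken from any of the engagements on this resume, and SQLite stands in for the actual platforms.

```python
import sqlite3

# Minimal star-schema sketch: one fact table joined to two dimension
# tables. All names and values here are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,
    full_date  TEXT,
    month_name TEXT
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      REAL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 'January')")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
conn.execute("INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97)")

# A typical star-join: facts filtered and grouped by dimension attributes.
row = conn.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key    = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    WHERE d.month_name = 'January'
    GROUP BY p.category
""").fetchone()
print(row)  # ('Hardware', 29.97)
```

A snowflake variant would further normalize the dimensions (e.g. splitting `category` out of `dim_product` into its own table) at the cost of extra joins.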

TECHNICAL SKILLS:

Data Warehousing: Informatica 9.6/9.5/9.1/8.6 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Ab Initio, Talend, Pentaho

Reporting Tools: Business Objects 6.5, XI R2, Tableau, TIBCO Spotfire

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin

RDBMS: Netezza TwinFin, Teradata R14/R13/R12, Oracle 11g/10g/9i/8i/7.x

Programming: SQL, PL/SQL, UNIX Shell Scripting, VB Script

Environment: Windows (95, 98, 2000, NT, XP), UNIX

Databases: Oracle 11g/10g/9i/8i, SQL Server 2000, 2005

Web Technologies: HTML, DHTML, XML, XSD, SOAP, WSDL, Web services

Other Tools: TOAD, MS-Office suite (Word, Excel, Project and Outlook), BTEQ, Teradata V2R6/R12/R13/R14 SQL Assistant, Aginity

PROFESSIONAL EXPERIENCE:

Confidential, Livonia, MI

Data Architect/Data Modeler

Roles & Responsibilities:

  • Performed complex data analysis in support of ad-hoc and standing customer requests.
  • Delivered data solutions in report/presentation format according to customer specifications and timelines.
  • Used a reverse engineering approach to redefine entities, relationships and attributes in the data model per new specifications in Erwin, after analyzing the database systems currently in use.
  • Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
  • Created the test environment for the staging area, loading the staging area with data from multiple sources.
  • Created data models for AWS Redshift, Hive and HBase from dimensional data models.
  • Worked with big data technologies such as the ELK stack (Elasticsearch, Logstash, Kibana).
  • Set up best practices around metadata to ensure an integrated definition of data for enterprise information, and to ensure accuracy, validity, reusability and consistent definitions for common reference data. Brought the business and IT worlds to common ground by establishing metadata management as a core foundation spanning the technical and business glossaries.
  • Advocated data quality best practices and standards, process flow, architectural roadmap, thresholds and tools that promote and facilitate a seamless process, and acted as a change agent with end-to-end solutions.
  • Analyzed large volumes of data to find trends, facts, figures and data metrics governing data criticality and baseline analysis. Interacted with data owners, presented findings and translated data reports into documents understandable to the business community and stakeholders. Developed in-depth analysis to get a complete 360-degree view of the data.
  • Handled performance requirements for databases in OLTP and OLAP models.
  • Reverse engineered from a Toad database into Erwin and generated SQL scripts through forward engineering in Erwin.
  • Defined and processed the facts and dimensions. Designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in ER Studio.
  • Documented logical, physical, relational and dimensional data models. Designed the data marts in dimensional data modeling using star and snowflake schemas.
  • Prepared documentation for all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules and glossary, which evolve and change during the project.
  • Coordinated with DBAs on database builds and table normalization and de-normalization.
  • Created, documented and maintained logical and physical database models.
  • Identified the entities and the relationships between them to develop the conceptual model using Erwin.
  • Developed the logical model from the conceptual model.
  • Worked with DBAs to create a best-fit physical data model from the logical data model.
  • Identified source systems, their connectivity, and related tables and fields, and ensured data suitability for mapping.
  • Worked with Teradata and/or Oracle in a multi-terabyte environment.
  • Designed and developed logical and physical data models using data warehouse methodologies, including star and star-joined schemas, conformed-dimension data architecture and early/late binding techniques, as well as designing and developing ETL applications using Informatica.
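The forward-engineering step mentioned above (Erwin generating SQL scripts from a model) can be sketched as follows; the toy model structure, entity names and `forward_engineer` helper are invented for illustration and do not reflect Erwin's actual internals.

```python
# Toy sketch of "forward engineering": turning a logical model
# description into physical DDL, loosely analogous to what a modeling
# tool's forward-engineer step emits. Everything here is hypothetical.
logical_model = {
    "customer": {
        "customer_id": "INTEGER PRIMARY KEY",
        "customer_name": "VARCHAR(100) NOT NULL",
    },
    "orders": {
        "order_id": "INTEGER PRIMARY KEY",
        "customer_id": "INTEGER REFERENCES customer(customer_id)",
        "order_total": "DECIMAL(12,2)",
    },
}

def forward_engineer(model):
    """Emit one CREATE TABLE statement per entity in the model."""
    statements = []
    for table, columns in model.items():
        cols = ",\n    ".join(f"{name} {ddl}" for name, ddl in columns.items())
        statements.append(f"CREATE TABLE {table} (\n    {cols}\n);")
    return "\n\n".join(statements)

ddl = forward_engineer(logical_model)
print(ddl)
```

Reverse engineering is the inverse direction: reading an existing catalog's tables and keys back into such a model description.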

Confidential, NYC, NY

Data Architect/Data Modeler

Roles & Responsibilities:

  • Involved in data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Responsible for the Enterprise Data Governance and Quality Services layer, with the objective of discovering, planning, defining, monitoring, collecting, synthesizing and distributing data governance and quality metrics across the enterprise.
  • Defined a Data Quality Architecture that provides seamless integration of data quality metrics between multiple architectural touch points that share the workload, ensuring scalability in terms of data volumes, frequency of data streams and complexity of processing.
  • Accurately captured data quality metrics in compliance with data governance rules and policies and immediately made them available to the network of business domains and lines of business.
  • Balanced short-term versus long-term actions and strategic versus tactical requirements while continuing to move toward the strategic vision; participated in the road map to achieve the vision.
  • Identified and interacted with stakeholders, subject matter experts (SMEs) and data owners to establish decision rights and clarify accountability.
  • Defined the Data Governance framework and stewardship to define, approve, track and enforce conformance and to communicate data strategies, policies, standards, architecture, procedures and metrics.
  • Responsible for different data mapping activities from source systems.
  • Involved in data model reviews with the internal data architect, business analysts and business users, explaining the data model to make sure it is in line with business requirements.
  • Involved with data profiling activities for new sources before creating new subject areas in the warehouse.
  • Extensively worked on Data Governance, i.e. metadata management, master data management, data quality and data security.
  • Updated existing models to integrate new functionality into an existing application.
  • Conducted one-on-one sessions with business users to gather data warehouse requirements.
  • Developed normalized logical and physical database models to design the OLTP system.
  • Created the dimensional model for the reporting system by identifying required dimensions and facts using Power Designer.
  • Created DDL scripts for implementing data modeling changes. Created Power Designer reports in HTML and RTF format depending on the requirement, published the data model in the model mart, created naming convention files and coordinated with DBAs to apply the data model changes.
  • Designed different types of star schemas, such as detailed data marts, plan data marts and monthly summary data marts, using ER Studio, with various dimensions like Time, Services and Customers and various fact tables.
  • Developed and maintained the data dictionary to create metadata reports for technical and business purposes.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues.
  • Used forward engineering to generate DDL from the physical data model and handed it to the DBA.
  • Developed the dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Designed star schemas for the detailed data marts and plan data marts consisting of conformed dimensions.
  • Defined data management policies, including enterprise naming and definition standards, specification of business rules, data quality analysis, standardized calculations and summarization definitions, and retention criteria.
  • Performed logical and physical data modeling (including reverse engineering) using the Erwin data modeling tool.
  • Involved with the design and development of conceptual, logical and physical data models using AllFusion Data Modeler (Erwin).

Confidential, Cincinnati, OH

Data Analyst/Data Modeler

Roles & Responsibilities:

  • Worked with database administrators, business analysts and content developers to conduct design reviews and validate the developed models.
  • Responsible for defining the naming standards for the data warehouse.
  • Strong documentation skills and knowledge sharing within the team; conducted data modeling review sessions for different user groups and participated in sessions to assess requirement feasibility.
  • Extensive experience in PL/SQL programming: stored procedures, functions, packages and triggers.
  • Reworked the existing model to create new logical and physical models that formed the basis for the new application.
  • Used Power Designer for reverse engineering to connect to the existing database and ODS, create a graphical representation in the form of entity relationships and elicit more information.
  • Performed in-depth data analysis and prepared weekly, biweekly and monthly reports using SQL, MS Excel, MS Access and UNIX.
  • Documented the complete process flow, describing program development, logic, testing, implementation, application integration and coding.
  • Documented various data quality mapping documents and audit and security compliance adherence.
  • Excellent data analytical, user interaction and presentation skills.
  • Good understanding of advanced statistical modeling and logical modeling using SAS.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications using data management software programs and tools such as Perl, Toad, MS Access, Excel and SQL.
  • Participated in development of core application and data warehouse data models. Worked with application developers to map process models to data models.
  • Performed logical and physical OLAP/OLTP schema design.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle.
  • Worked on SQL queries against a dimensional data warehouse as well as a relational data warehouse.
  • Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements mapped to test scripts, to ensure that any change control in requirements leads to a test case update.
Confidential, Green Bay, WI

Data Modeler / Data Analyst

Roles & Responsibilities:

  • Worked on Informatica TDM for data masking and Informatica DVO for data validation between different systems.
  • Worked on the Trillium data quality tool for monitoring production systems for data anomalies and resolving issues.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Wrote and executed unit, system, integration and UAT scripts in data warehouse projects.
  • Troubleshot test scripts, SQL queries, ETL jobs and data warehouse/data mart/data store models.
  • Determined data rules and conducted logical and physical design reviews with business analysts, developers and DBAs.
  • Experienced in data transformation and data mapping from source to target database schemas, as well as data cleansing.
  • Created various physical data models based on discussions with DBAs and ETL developers.
  • Worked on the data mapping process from source system to target system.
  • Created the dimensional model for the reporting system by identifying required facts and dimensions using Erwin.
  • Extensively used Star and Snowflake schema methodologies.
  • Developed and maintained the data dictionary to create metadata reports for technical and business purposes.
  • Worked on performance tuning of the database, including indexes and optimizing SQL statements.
  • Performed data modeling using TOAD Data Modeler: identified objects and relationships and how they fit together as logical entities, then translated these into a physical design using the tool's forward engineering.
  • Performed data cleaning and data manipulation activities using the NZSQL utility.
  • Excellent knowledge and experience in technical design and documentation.
  • Designed and developed Oracle PL/SQL procedures and UNIX shell scripts for data import/export and data conversions.
  • Wrote complex SQL queries to validate the data against different kinds of reports generated by Business Objects XI R2.
  • Prepared in-depth data analysis reports weekly, biweekly and monthly using MS Excel, SQL and UNIX.

Confidential

Data Modeler/Data Analyst

Roles & Responsibilities:

  • Eliminated errors in Erwin models through the implementation of Model Mart (a companion tool to Erwin that controls the versioning of models).
  • Used Erwin's Model Mart for effective model management: sharing, dividing and reusing model information and designs for productivity improvement.
  • Created the dimensional logical model, with approximately 10 facts and 30 dimensions comprising 500 attributes, using ER Studio.
  • Implemented the slowly changing dimension scheme (Type II) for most of the dimensions.
  • Implemented standard naming conventions for the fact and dimension entities and attributes of the logical and physical models.
  • Reviewed the logical model with business users, the ETL team, DBAs and the testing team to provide information about the data model and business requirements.
  • Worked at the conceptual/logical/physical data model level using Erwin, according to requirements.
  • Performed data mining using very complex SQL queries and discovered patterns.
  • Used SQL for querying the database in a UNIX environment.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues.
  • Performed data analysis, data migration and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Designed the semantic layer data model. Conducted performance optimization for the BI infrastructure.
  • Identified and recorded defects with the information required for issues to be reproduced by the development team.
  • Refined the logical design so that it could be translated into a specific data model.
  • Involved in data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
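The Type II slowly changing dimension scheme mentioned above keeps history by end-dating the current row and inserting a new version rather than updating in place. A minimal sketch follows; the `dim_customer` table, its column names (`effective_from`, `effective_to`, `is_current`) and the `apply_scd2` helper are common conventions chosen for illustration, not taken from the project.

```python
import sqlite3

# Sketch of a Type II slowly changing dimension in SQLite: a changed
# attribute closes the current row and opens a new versioned row.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
    customer_sk    INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    customer_id    INTEGER,                            -- business key
    city           TEXT,
    effective_from TEXT,
    effective_to   TEXT,
    is_current     INTEGER
)""")

def apply_scd2(conn, customer_id, city, as_of):
    """Apply one incoming record under Type II rules."""
    cur = conn.execute(
        "SELECT customer_sk, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)
    ).fetchone()
    if cur and cur[1] == city:
        return  # attribute unchanged: nothing to do
    if cur:
        # End-date the current version instead of overwriting it.
        conn.execute(
            "UPDATE dim_customer SET effective_to = ?, is_current = 0 "
            "WHERE customer_sk = ?", (as_of, cur[0])
        )
    conn.execute(
        "INSERT INTO dim_customer (customer_id, city, effective_from, "
        "effective_to, is_current) VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, as_of),
    )

apply_scd2(conn, 42, "Detroit", "2023-01-01")
apply_scd2(conn, 42, "Livonia", "2024-06-01")  # move creates a new version
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY customer_sk"
).fetchall()
print(rows)  # [('Detroit', 0), ('Livonia', 1)]
```

Fact rows loaded between the two effective dates would join to the Detroit version via its surrogate key, which is what preserves history in reporting.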
