Sr. Data Modeler/Data Architect Resume
Madison, WI
SUMMARY
- 9+ years of working experience as a Data Modeler and Data Analyst with high proficiency in requirements gathering and data modeling, including design and support of various applications in Online Transaction Processing (OLTP), Online Analytical Processing (OLAP), Data Warehousing and ETL environments.
- Experienced in developing conceptual and logical models and physical database designs for OLTP and OLAP systems using ER/Studio, Erwin and Sybase PowerDesigner.
- Expertise in Erwin, Star and Snowflake schemas, fact and dimension tables, physical and logical data modeling, and Oracle Designer, PowerDesigner and ER/Studio.
- Excellence in delivering quality conceptual, logical and physical data models for multiple projects involving new and existing enterprise applications and data warehouses.
- Experienced in identifying entities, attributes, metrics and relationships; also assigning keys and optimizing the model.
- Expert in data analysis, design, development, implementation and testing using data conversions, Extraction, Transformation and Loading (ETL), and Oracle, SQL Server, Greenplum and other relational and non-relational databases.
- Experienced in conducting JAD sessions to understand and gather business requirements, create data mapping documents, use cases, workflows and PowerPoint presentations, and write functional specifications and queries.
- Expert in developing transactional enterprise data models that strictly meet normalization rules, as well as Enterprise Data Warehouses using Kimball and Inmon Data Warehouse methodologies.
- Excellent SQL programming skills; developed stored procedures, triggers, functions and packages using SQL/PL-SQL and pgAdmin.
- Skilled in performance tuning and query optimization techniques in transactional and data warehouse environments.
- Strongly capable of handling VLRDBs (Very Large Relational Databases) of about 5 TB, with expert-level working knowledge of the architecture involved.
- Expertise in Normalization/Denormalization techniques for optimum performance in relational and dimensional database environments.
- Highly proficient in data modeling, retaining concepts of RDBMS, logical and physical data modeling up to Third Normal Form (3NF), and multidimensional data modeling schemas (star schema, snowflake modeling, facts and dimensions).
- Complete knowledge of data warehouse methodologies (Ralph Kimball, Inmon), ODS, EDW and metadata repositories.
- Managed full SDLC processes involving requirements management, workflow analysis, source data analysis, data mapping, metadata management, data quality, testing strategy and maintenance of the model.
- Consolidated and audited metadata from disparate tools and sources, including business intelligence (BI), extract, transform, and load (ETL), relational databases, modeling tools, and third-party metadata, into a single repository.
- Strong experience in the development and design of ETL methodology, designing various source-to-target mappings to support data transformations and processing in a corporate-wide ETL solution using Informatica PowerMart/PowerCenter and SSIS.
- Expert-level understanding of using different databases in combination for data extraction and loading, joining data extracted from different databases and loading it into a specific database.
- Extensive experience with Normalization (1NF, 2NF, 3NF and BCNF) and Denormalization techniques for improved database performance in OLTP and Data Warehouse/Data Mart environments.
- Created and modified reports and dashboards using BI analytics tools.
- Well versed in systems analysis, relational and dimensional modeling, database design and implementing RDBMS-specific features.
- Extensive experience analyzing and documenting business requirements and system functional specifications, including use cases; facilitated and participated in Joint Application Development (JAD) sessions.
- Excellent analytical and communication skills with a clear understanding of business process flows and the SDLC.
TECHNICAL SKILLS
Data Modeling: ER/Studio 16.0/9.7/9.6/9.5/9.1, Erwin 4.5/7.0/7.2 r7/r8/r9.64, XMLSpy, OLAP Cubes, PowerDesigner 15.0, Oracle SQL Developer Data Modeler 4.1/3.0
Databases: Oracle 11g/10g/9i/8i/7.3, MS SQL Server 2014/2012/2008/7.0/6.5, IBM DB2 9.8/9.7, MS Access 2007/2010/2013, SAP, Greenplum, Teradata TD12/TD8 with M-Load.
Other tools: Ultra Edit, TOAD, Quest Central for DB2, SQL Developer, Informatica Power Center, SSIS, OBIEE, Microsoft Visual Studio 2008/2010.
Environment: UNIX, Windows 8/7
Others: Java, XML, HTML, DHTML, C++, VBScript, CSS, SQL, PL/SQL, JSON.
Functional Knowledge: Insurance, Banking, and Telecom
PROFESSIONAL EXPERIENCE
Confidential, Madison, WI
Sr. Data Modeler/Data Architect
Responsibilities:
- Gathered business requirements by organizing and managing meetings with business stakeholders, development teams and analysts on a scheduled basis.
- Involved in the design and development of the data warehouse environment; liaised with business users and technical teams, gathered requirement specification documents, and presented and identified data sources, targets and report generation needs.
- Worked as OLTP Data Architect & Data Modeler to develop the logical and physical Entity-Relationship data model for the Claims system (Claims & Adjustments) with entities and attributes, normalizing them up to Third Normal Form using ER/Studio.
- Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
- Used different ER modeling techniques such as parent-child, associative, subtype/supertype, union and key-value pair concepts to develop the data model.
- Identified and analyzed various facts from the source systems and business requirements to be used in the data warehouse (Kimball approach).
- Translated business requirements into detailed system specifications and developed use cases and business process flow diagrams employing the Unified Modeling Language (UML).
- Checked the model against all modeling standards, including naming standards, entity relationships, and comments and history in the model.
- Developed the conceptual and logical data models, generated the DDL statements, and worked with the database team to create the tables, views and keys in the database.
- Generated data requirements, including primary keys, foreign keys, referential integrity, etc., for the migration of existing data extracts, data structures and data processes from the current un-architected data environment to a fully architected data environment including a Data Warehouse and Operational Data Stores.
- Created physical data models with the details necessary for a complete physical data model, including the appropriate specifications for keys, constraints, indexes and other physical model attributes.
- Developed and maintained the data dictionaries, Naming Conventions, Standards, and Class words Standards Document as per the Data Architecture Team standards.
- Extensively worked with the Embarcadero ER/Studio Repository to check out and check in models and sub-models, used the Compare and Merge utility, and reverse-engineered data models from the database.
- Followed enterprise standards for both design and quality management of the project data models.
- Extensively managed various sub-models organized by subject area, allowing logical organization of the model and security.
- Implemented a four-step design process: analyzing the business process, identifying the grain, and using star schema design to build fact and dimension tables.
- Involved in a full data warehouse lifecycle implementation, upgrading the existing legacy data warehouse to an enterprise data warehouse using Kimball's Four Fixes approach: conforming non-conformed dimensions, creating surrogate keys, delivering the atomic details and reducing redundancies, as well as designing from scratch.
- Involved in defining project data requirements (including data definitions) for custom solutions; developed the project physical data models using the data modeling tools Embarcadero ER/Studio and XMLSpy.
- Conducted Data Model Peer Review meetings to get approval from cross-functional teams.
- Created valuable reports to enhance decision-making and operational capabilities.
- Involved in projects using both Waterfall and Scrum methodologies, meeting the deadlines every sprint.
- Divided the model into subject areas to present an understandable view to the business as well as to data model reviewers.
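The star schema design with surrogate keys described above can be sketched in miniature; all table and column names here (dim_date, dim_policy, fact_claim) are hypothetical illustrations, not taken from the actual project:

```python
import sqlite3

# Toy claims star schema: one fact table joined to dimensions via surrogate keys.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,   -- surrogate key
    cal_date TEXT NOT NULL
);
CREATE TABLE dim_policy (
    policy_key INTEGER PRIMARY KEY, -- surrogate key
    policy_no  TEXT NOT NULL        -- natural/business key
);
CREATE TABLE fact_claim (
    claim_id   INTEGER PRIMARY KEY,
    date_key   INTEGER NOT NULL REFERENCES dim_date(date_key),
    policy_key INTEGER NOT NULL REFERENCES dim_policy(policy_key),
    claim_amt  REAL NOT NULL
);
""")
con.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
con.execute("INSERT INTO dim_policy VALUES (1, 'POL-1001')")
con.execute("INSERT INTO fact_claim VALUES (1, 20240101, 1, 250.0)")

# Typical star join: roll claim amounts up by calendar date.
total = con.execute("""
    SELECT d.cal_date, SUM(f.claim_amt)
    FROM fact_claim f JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.cal_date
""").fetchall()
print(total)  # [('2024-01-01', 250.0)]
```

Surrogate keys keep the fact table narrow and insulate it from changes in the natural keys of the source systems.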
Environment: Embarcadero ER/Studio 16.0, XMLSpy, Oracle 11g, DB2 z/OS, JSON, IBM DB2 UDB, SQL Server 2012/2014, Informatica PowerCenter 9.6, TOAD, PL/SQL, XML files, Windows, MS Office tools.
Confidential, Cincinnati, OH
Sr. Data Modeler
Responsibilities:
- Analyzed the OLTP source systems and the Operational Data Store, and researched the tables/entities required for the project.
- Designed the measures, dimensions and facts matrix document to ease the design process.
- Created data flowcharts and attribute mapping documents; analyzed source meanings to retain and provide proper business names following FTB's very stringent data standards.
- Analyzed the specifications and identified the source data from disparate data sources such as Oracle, MS SQL Server, DB2, XML, flat files and COBOL files that needed to be moved to the Teradata data warehouse.
- Participated in several JAD (Joint Application Design/Development) sessions in order to track end to end flow of attributes starting from source systems to all the downstream systems.
- Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment.
- Developed working documents to support findings and assign specific tasks.
- Used the data modeling tool Embarcadero ER/Studio to create UML use case diagrams, DDLs (Data Definition Language scripts), and logical and physical data models.
- Designed and maintained the logical/physical dimensional data models, generated the DDL statements, and worked with the database team to create the tables, views and keys in the database.
- Worked with data architects and IT architects to understand the movement of data and its storage.
- Conducted design reviews with the business analysts and content developers to create a proof of concept for the reports.
- Collaborated with DBAs to ensure that any business issues, rules or concepts are reflected on the physical database design.
- Developed Source to Target Matrix which contains the transformation logic and handed it over to ETL Team.
- Implemented the SCD type 1 and type 2 mappings to insert/update records into Dimension Tables.
- Created clustered indexes on primary keys for all tables, along with additional indexes as needed.
- Range-partitioned the very large tables, and their dependent tables as well, based on account numbers to enable parallel processing.
- Created materialized views to render aggregate data and speed up performance.
- Created complex views to support the reporting needs.
- Checked the model against all modeling standards, including naming standards, entity relationships, and comments and history in the model.
- Conducted design walkthrough with project team and got it signed off.
- Created technical mapping documents for the development team to develop mapping workflows.
- Designed a star schema for the detailed data marts and the planned data marts involving shared (conformed) dimensions.
- Used Teradata utilities such as TPT (Teradata Parallel Transporter), FastLoad and MultiLoad for handling various tasks.
- Created and maintained Logical Data Model (LDM) for the project. Included documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Developed Logical data model using Erwin and created physical data models using forward engineering.
- Validated and updated the appropriate LDMs to reflect process mappings, screen designs, use cases, the business object model, and the system object model as they evolved and changed.
- Worked on snow-flaking the Dimensions to remove redundancy.
- Worked with the implementation team to ensure a smooth transition from the design to the implementation phase.
- Coordinated with the QA team to test and validate the reporting system and its data.
- Suggested effective implementation approaches for the applications being developed.
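The SCD Type 2 handling mentioned above (expire the current dimension row and insert a new version when a tracked attribute changes) can be sketched as follows; the table, columns and helper function are hypothetical illustrations, not from the actual project:

```python
import sqlite3

# Minimal SCD Type 2 sketch: dim_customer keeps full history of each customer,
# with is_current flagging the active version and end_date closing old ones.
con = sqlite3.connect(":memory:")
con.execute("""
CREATE TABLE dim_customer (
    cust_key   INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    cust_id    TEXT NOT NULL,                      -- natural key
    city       TEXT NOT NULL,                      -- tracked attribute
    eff_date   TEXT NOT NULL,
    end_date   TEXT,                               -- NULL = still current
    is_current INTEGER NOT NULL DEFAULT 1
)""")

def scd2_upsert(cust_id, city, as_of):
    cur = con.execute(
        "SELECT cust_key, city FROM dim_customer WHERE cust_id=? AND is_current=1",
        (cust_id,)).fetchone()
    if cur and cur[1] == city:
        return                       # no change: nothing to do
    if cur:                          # change detected: expire the current version
        con.execute("UPDATE dim_customer SET end_date=?, is_current=0 WHERE cust_key=?",
                    (as_of, cur[0]))
    con.execute("INSERT INTO dim_customer (cust_id, city, eff_date) VALUES (?,?,?)",
                (cust_id, city, as_of))

scd2_upsert("C1", "Cincinnati", "2024-01-01")
scd2_upsert("C1", "Madison",    "2024-06-01")   # move -> new row, old one expired
rows = con.execute(
    "SELECT city, is_current FROM dim_customer WHERE cust_id='C1' ORDER BY cust_key"
).fetchall()
print(rows)  # [('Cincinnati', 0), ('Madison', 1)]
```

An SCD Type 1 mapping would instead UPDATE the city in place, losing history.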
Environment: Erwin 8.0, Teradata 13, TOAD, Oracle 10g/11g, MS SQL Server 2008, Teradata SQL Assistant, XML files, flat files, SQL/PL-SQL, UNIX shell scripts and Autosys.
Confidential
Data Modeler
Responsibilities:
- Analyzed the business requirements by dividing them into subject areas and understood the data flow within the organization.
- Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
- Created data trace map and data quality mapping documents.
- Created Use Case Diagrams using UML to define the functional requirements of the application.
- Initiated and conducted JAD sessions inviting various teams to finalize the required data fields and their formats.
- Extensively used the Agile methodology, as the organization standard, to implement the data models.
- Worked with Erwin Data Modeler to design and maintain the logical/physical dimensional data models, generated the DDL statements, and worked with the database team to create the tables, views and keys in the database.
- Created physical data models with the details necessary for a complete physical data model, including the appropriate specifications for keys, constraints, indexes and other physical model attributes.
- Reverse-engineered physical data models from databases and SQL scripts.
- Compared data models with physical databases and kept changes in sync.
- Developed and maintained the data dictionaries, Naming Conventions, Standards, and Class words Standards Document.
- Applied Master Data Management to create and maintain consistent, complete, contextual, and accurate business data for all stakeholders.
- Developed and maintained enterprise data naming standards by interacting with data stewards and the data governance team.
- Divided the model into subject areas to present an understandable view to the business as well as to data model reviewers.
- Performed UAT before the production phase of the database components being built; introduced a data dictionary for the process, which simplified much of the work on the project.
- Worked with DBA extensively and recommended Oracle bitmap indexes and bitmap joins for star schema optimization and Oracle table partitioning for performance. Implemented materialized views.
- Identified and tracked the slowly changing dimensions (SCD I, II, III & Hybrid/6) and determined the hierarchies in dimensions.
- Used Erwin Model Mart for effective model management, sharing, dividing and reusing model information and designs for productivity improvement.
- Assisted the ETL Developers and Testers during the development and testing phases.
Environment: Erwin Data Modeler r7.3, Erwin Model Manager, TOAD, Oracle 9i/10g, XML files, flat files, SQL/PL-SQL and UNIX shell scripts
Confidential
Data Modeler
Responsibilities:
- Attended and participated in information and Requirements Gathering sessions.
- Ensured that Business Requirements can be translated into Data Requirements.
- Created Business Requirement Documents (BRDs) along with SRS and FRS documents, and integrated the requirements with the underlying platform functionality.
- Translated Business Requirements into working Logical and Physical Data Models.
- Developed the Logical and physical data model and designed the data flow from source systems to Oracle tables and then to the Target system.
- Attended architecture and data governance meetings to understand the project.
- Identified and mapped various data sources and their targets successfully to create a fully functioning data repository.
- Designed the technical specifications document for Oracle ETL processing of data into the master data warehouse, and strategized the integration test plan and implementation.
- Used advanced data modeling concepts such as family of stars, conformed dimensions, and the bus matrix to handle complex situations.
- Used a degenerate dimension to carry the unique policy number for insurance claims.
- Designed and Developed Use Cases, Activity Diagrams, and Sequence Diagrams using UML.
- Involved in the analysis of the existing claims processing system, mapping phase according to functionality and data conversion procedure.
- Normalized the existing OLTP systems to Third Normal Form (3NF) to speed up DML statement execution.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ER Studio.
- Data modeling in Erwin; design of target data models for enterprise data warehouse (Oracle).
- Created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Worked extensively with transaction-grain, periodic snapshot grain and accumulating snapshot grain fact tables while designing dimensional models.
- Created an accumulating snapshot table to analyze end-of-year financial reporting of all claims collected and due.
- Validated and updated the appropriate LDMs to reflect process mappings, screen designs, use cases, the business object model, and the system object model as they evolved and changed.
- Designed the Database Tables & Created Table and Column Level Constraints using the suggested naming conventions for constraint keys.
- Maintained Data Model and synchronized it with the changes to the database.
- Involved with all the phases of Software Development Life Cycle (SDLC) methodologies throughout the project life cycle.
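The 3NF normalization work above amounts to removing transitive dependencies. A toy sketch, with entirely hypothetical claim data: the holder name depends on the policy number, not on the claim, so it is split into its own relation:

```python
# Denormalized rows: claim -> policy_no -> holder_name is a transitive dependency,
# so the holder name repeats for every claim on a policy (update anomaly risk).
denorm = [
    ("CL1", "POL-1", "Alice", 100.0),
    ("CL2", "POL-1", "Alice", 250.0),   # 'Alice' stored twice
    ("CL3", "POL-2", "Bob",    75.0),
]

# 3NF decomposition: one relation per functional dependency.
policies = {p: h for (_, p, h, _) in denorm}           # policy_no -> holder_name
claims   = [(c, p, amt) for (c, p, _, amt) in denorm]  # claims keep only the FK

print(policies)  # {'POL-1': 'Alice', 'POL-2': 'Bob'}
print(claims)    # [('CL1', 'POL-1', 100.0), ('CL2', 'POL-1', 250.0), ('CL3', 'POL-2', 75.0)]
```

After the split, a holder rename touches one row instead of one row per claim.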
Environment: Sybase Power Designer, Oracle 9i, Toad, Windows XP, SQL Server
Confidential
ETL Developer / Data Analyst
Responsibilities:
- Interacted with business analysts and systems analysts during requirements gathering, design, testing and documentation to understand the business requirements.
- Involved in developing the logical and physical data models.
- Created users/groups and folders using repository manager.
- Involved in staging the data from external sources and was responsible for moving the data into the warehouse.
- Created reusable transformations and mapplets and used them in mappings.
- Created complex mappings in Power Center designer using Aggregate, Expression, Filter, and Sequence generator, Update Strategy, Union, Lookup, Joiner and Stored Procedure Transformations.
- Wrote SQL override queries in the Source Analyzer to customize mappings.
- Used the Workflow Manager to create, validate and test sequential and concurrent sessions.
- Used the Workflow Monitor to track workflow progress.
- Involved in performance tuning at the source, target, session and database connection levels.
- Created E-mail notification tasks using post-session scripts.
- Debugged mappings using the Informatica Debugger to obtain troubleshooting information about data and error conditions.
- Involved in project planning and coordinating business, source and development teams to meet the project deadlines.
- Involved in creating sales dashboards/Reports to compare current cycles and previous cycle sales using pivot table views.
- Involved in a project with OBIEE team.
- Developed Oracle forms screens for entering the customer data.
- Worked in database structure changes, index sizing and data conversion.
- Designed and developed various UNIX shell scripts.
- Extensively worked with SQL*Loader and the import and export utilities.
Environment: Informatica, Oracle 9i, PL/SQL, SQL*Loader, SQL, Windows XP, UNIX and Shell programming.