Sr. Data Modeler/Data Architect Resume

Madison, WI

SUMMARY

  • 9+ years of experience as a Data Modeler and Data Analyst, with high proficiency in requirements gathering and data modeling, including the design and support of applications in Online Transactional Processing (OLTP), Online Analytical Processing (OLAP), Data Warehousing, and ETL environments.
  • Experienced in developing conceptual, logical, and physical data models for OLTP and OLAP systems using ER/Studio, Erwin, and Sybase Power Designer.
  • Expertise in Star schema and Snowflake schema design, fact and dimension tables, and physical and logical data modeling in Erwin, Oracle Designer, Power Designer, and ER/Studio (a star schema sketch follows this list).
  • Excellence in delivering quality conceptual, logical, and physical data models for multiple projects involving new and existing enterprise applications and data warehouses.
  • Experienced in identifying entities, attributes, metrics, and relationships, as well as assigning keys and optimizing the model.
  • Expertise in Master Data Management (MDM) concepts and methodologies, with the ability to apply this knowledge in building MDM solutions and in dimensional data modeling for data warehouses and data marts.
  • Experienced in conducting JAD sessions to understand and gather business requirements, create data mapping documents, use cases, workflows, and PowerPoint presentations, and write functional specifications and queries.
  • Expert in developing transactional enterprise data models that strictly meet normalization rules, as well as enterprise data warehouses using the Kimball and Inmon data warehouse methodologies.
  • Excellent SQL programming skills; developed stored procedures, triggers, functions, and packages using SQL/PL SQL (a PL/SQL sketch follows this list).
  • Skilled in performance tuning and query optimization techniques in transactional and data warehouse environments.
  • Experienced in data migration from DB2 to Oracle.
  • Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data modeling, data mining, and advanced data processing.
  • Capable of handling very large relational databases (VLRDBs) of about 5 TB, with expert-level working knowledge of the architecture involved.
  • Expertise in Normalization/Denormalization techniques for optimum performance in relational and dimensional database environments.
  • Experience in various phases of the Software Development Life Cycle (analysis, requirements gathering, design), with expertise in documenting requirement specifications, functional specifications, test plans, source-to-target mappings, and SQL joins.
  • Highly proficient in data modeling, retaining concepts of RDBMS, logical and physical data modeling up to Third Normal Form (3NF), and multidimensional data modeling schemas (Star schema, Snowflake modeling, facts and dimensions).
  • Thorough knowledge of data warehouse methodologies (Ralph Kimball, Inmon), ODS, EDW, and metadata repositories.
  • Managed full SDLC processes involving requirements management, workflow analysis, source data analysis, data mapping, metadata management, data quality, testing strategy and maintenance of the model.
  • Consolidated and audited metadata from disparate tools and sources, including business intelligence (BI), extract-transform-load (ETL), relational databases, modeling tools, and third-party metadata, into a single repository.
  • Strong experience in the development and design of ETL methodology, designing source-to-target mappings to support data transformation and processing in a corporate-wide ETL solution using DMExpress/PowerCenter and SSIS.
  • Involved with the Data Steward team in designing, documenting, and configuring Informatica Data Director to support management of MDM data.
  • Expert-level understanding of using different databases in combination for data extraction and loading, joining data extracted from different databases, and loading it to a specific database.
  • Extensive experience with Normalization (1NF, 2NF, 3NF and BCNF) and Denormalization techniques for improved database performance in OLTP and Data Warehouse/Data Mart environments.
  • Created and modified reports and dashboards using BI analytics tools.
  • Well versed in system analysis, Relational and Dimensional Modeling, Database design and implementing RDBMS specific features.
  • Excellent analytical and communication skills, with a clear understanding of business process flow and the SDLC.
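
To make the star schema work above concrete, below is a minimal sketch of a fact table joined to two dimension tables. All table and column names are hypothetical, and the syntax is generic ANSI SQL rather than a model from any engagement described here.

    -- Hypothetical star schema: one fact table and two dimension tables.
    CREATE TABLE dim_date (
        date_key      INTEGER NOT NULL PRIMARY KEY,  -- surrogate key, e.g. 20240131
        calendar_date DATE    NOT NULL,
        fiscal_year   INTEGER NOT NULL
    );

    CREATE TABLE dim_customer (
        customer_key  INTEGER     NOT NULL PRIMARY KEY,  -- surrogate key
        customer_id   VARCHAR(20) NOT NULL,              -- natural/business key
        customer_name VARCHAR(100)
    );

    -- Grain: one row per customer per day; measures are fully additive.
    CREATE TABLE fact_daily_sales (
        date_key     INTEGER       NOT NULL REFERENCES dim_date (date_key),
        customer_key INTEGER       NOT NULL REFERENCES dim_customer (customer_key),
        sales_amount DECIMAL(12,2) NOT NULL,
        units_sold   INTEGER       NOT NULL,
        PRIMARY KEY (date_key, customer_key)
    );

Surrogate keys on the dimensions keep the fact table narrow and insulate it from changes in the natural keys.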
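Likewise, a minimal Oracle PL/SQL sketch of the kind of stored procedure work mentioned above; the accounts table, its columns, and the procedure name are assumptions for illustration only.

    -- Hypothetical PL/SQL procedure: apply a percentage adjustment to one account.
    CREATE OR REPLACE PROCEDURE adjust_balance (
        p_account_id IN accounts.account_id%TYPE,
        p_pct        IN NUMBER
    ) AS
    BEGIN
        UPDATE accounts
           SET balance = balance * (1 + p_pct / 100)
         WHERE account_id = p_account_id;

        -- Fail loudly if the account does not exist.
        IF SQL%ROWCOUNT = 0 THEN
            RAISE_APPLICATION_ERROR(-20001, 'Account not found: ' || p_account_id);
        END IF;

        COMMIT;
    END adjust_balance;
    /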

TECHNICAL SKILLS

Data Modeling: ER/Studio 16.0/9.7/9.6/9.5/9.1, Erwin 4.5/7.0/7.2, r7/r8/r9.64, XMLSpy, OLAP Cubes, Power Designer 15.0, DMExpress, Oracle SQL Developer Data Modeler 4.1/3.0

Databases: Oracle 11g/10g/9i/8i/7.3, MS SQL Server 2014/2012/2008/7.0/6.5, IBM DB2 9.8/9.7, MS Access 2007/2010/2013, SAP, AWS Redshift, Teradata TD12/TD8 with MultiLoad.

Other tools: UltraEdit, TOAD, Quest Central for DB2, SQL Developer, Informatica PowerCenter, SSIS, OBIEE, Microsoft Visual Studio 2008/2010.

Environment: UNIX, Windows 8/7

Others: Java, XML, HTML, DHTML, C++, VBScript, CSS, SQL, PL/SQL, JSON, REST

Functional Knowledge: Insurance, Banking, and Telecom

PROFESSIONAL EXPERIENCE

Confidential, Madison, WI

Sr. Data Modeler/Data Architect

Responsibilities:

  • Gathered business requirements by organizing and managing meetings with business stakeholders, development teams, and analysts on a scheduled basis.
  • Involved in the design and development of the data warehouse environment; served as liaison to business users and technical teams, gathering requirement specification documents, presenting findings, and identifying data sources, targets, and report generation needs.
  • Worked as OLTP Data Architect and Data Modeler to develop the logical and physical entity-relationship data model for the Claim system (Claims & Adjustments), defining entities and attributes and normalizing them up to Third Normal Form using ER/Studio.
  • Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
  • Involved in defining the source-to-target data mappings, business rules, and data definitions.
  • Responsible for defining the key identifiers for each mapping/interface.
  • Used different ER modeling techniques such as parent-child, associative, subtype/supertype, union, and key-value pair concepts to develop the data model.
  • Identified and analyzed various facts from the source system and business requirements to be used for the data warehouse (Kimball approach).
  • Involved in loading data from HDFS into Hive using Hive queries (see the Hive sketch after this list).
  • Used Hive to transfer data between RDBMS and HDFS.
  • Designed a big data solution for data ingestion with HDFS.
  • Translated business requirements into detailed system specifications and developed use cases and business process flow diagrams employing the Unified Modeling Language (UML).
  • Checked the model for all modeling standards, including naming standards, entity relationships, and comments and history.
  • Developed the conceptual and logical data models, generated the DDL statements, and worked with the database team to create the tables, views, and keys in the database.
  • Generated data requirements, including primary keys, foreign keys, referential integrity, etc., for the migration of existing data extracts, data structures, and data processes from the current un-architected data environment to a fully architected environment including a Data Warehouse and Operational Data Stores.
  • Created physical data models with the details necessary for a complete physical data model, including the appropriate specifications for keys, constraints, indexes, and other physical model attributes.
  • Developed and maintained the data dictionaries, naming conventions, standards, and class words standards document per the Data Architecture team standards.
  • Extensively worked with the Embarcadero ER/Studio Repository to check out and check in models and submodels, using the Compare and Merge utility and reverse engineering the data models from the database.
  • Followed enterprise standards for both design and quality management of the project data models.
  • Extensively managed various submodels according to different subject areas, allowing logical organization of the model and security.
  • Implemented the four-step design process: analyzing the business process, identifying the grain, and using star schema design to build fact and dimension tables.
  • Involved in a full data warehouse lifecycle implementation, upgrading the existing legacy data warehouse to an enterprise data warehouse using Kimball's Four Fixes approach: conforming the nonconformed dimensions, creating surrogate keys, delivering the atomic details, reducing redundancies, and designing from scratch.
  • Involved in project data requirements (including data definitions) for custom solutions, and developed the project physical data models using Embarcadero ER/Studio and XMLSpy.
  • Conducted data model peer review meetings to get approval from cross-functional teams.
  • Created valuable reports to enhance decision-making and operational capabilities.
  • Involved in projects with both Waterfall and Scrum methodologies, meeting the deadlines every sprint.
  • Divided the model into subject areas to present an understandable view to the business as well as to data model reviewers.
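
As an illustration of the HDFS-to-Hive loading mentioned in this list, here is a minimal HiveQL sketch; the HDFS paths, table names, and columns are hypothetical.

    -- Hypothetical external table over raw claim files already sitting in HDFS.
    CREATE EXTERNAL TABLE claims_raw (
        claim_id   STRING,
        claim_amt  DECIMAL(12,2),
        claim_date STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/raw/claims';

    -- Managed target table with the same layout.
    CREATE TABLE claims (
        claim_id   STRING,
        claim_amt  DECIMAL(12,2),
        claim_date STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

    -- Option 1: LOAD DATA moves files from HDFS into the table's warehouse directory.
    LOAD DATA INPATH '/data/staging/claims' INTO TABLE claims;

    -- Option 2: transform while loading with a Hive query.
    INSERT INTO TABLE claims
    SELECT claim_id, claim_amt, claim_date
    FROM   claims_raw
    WHERE  claim_amt > 0;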

Environment: Embarcadero ER/Studio 16.0, XMLSpy, Oracle 11g, DB2 z/OS, JSON, IBM DB2 UDB, SQL Server 2012/2014, AWS Redshift, Toad, PL/SQL, XML files, Windows, MS Office tools.

Confidential

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Analyzed the OLTP source systems and Operational Data Store and researched the tables/entities required for the project.
  • Designed the measures, dimensions, and facts matrix document for ease of design.
  • Created data flowcharts and attribute mapping documents, and analyzed source meanings to retain and provide proper business names following FTB's stringent data standards.
  • Developed several scripts to gather all the required data from different databases to build the LAR file monthly.
  • Implemented data quality checks on the monthly LAR file to make sure the data is within federal regulations.
  • Developed numerous reports to capture the transactional data for business analysis.
  • Developed complex SQL queries to bring data together from various systems.
  • Organized and conducted cross-functional meetings to ensure linearity of the phase approach.
  • Collaborated with a team of Business Analysts to ascertain capture of all requirements.
  • Created multiple reports on the daily transactional data, which involves millions of records.
  • Used joins such as inner joins and outer joins while creating tables from multiple tables.
  • Created multiset, temporary, derived, and volatile tables in the Teradata database (see the Teradata sketch after this list).
  • Implemented indexes, collected statistics, and applied constraints while creating tables.
  • Utilized ODBC connectivity to Teradata via MS Excel to retrieve data automatically from the Teradata database.
  • Developed various ad hoc reports based on the requirements.
  • Designed and developed various ad hoc reports for different teams in the business (Teradata and Oracle SQL, MS Access, MS Excel).
  • Developed SQL queries to fetch complex data from different tables in remote databases using joins and database links, formatted the results into reports, and kept logs.
  • Involved in writing complex SQL queries using correlated subqueries, joins, and recursive queries.
  • Delivered the artifacts within the timelines and excelled in the quality of deliverables.
  • Validated the data during UAT testing.
  • Performed source-to-target mapping.
  • Involved in metadata management, where all the table specifications were listed and implemented in the Ab Initio metadata hub per data governance.
  • Developed Korn shell scripts to extract and process data from different sources in parallel, streamlining performance and improving execution time for better time and resource management and efficiency.
  • Used Teradata utilities such as TPT (Teradata Parallel Transporter), FastLoad, and MultiLoad for handling various tasks.
  • Developed the logical data model using Erwin and created physical data models using forward engineering.
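
A minimal Teradata SQL sketch of the techniques listed above (a MULTISET table, COLLECT STATISTICS, a VOLATILE table, and a recursive query); all table and column names, including the employees hierarchy, are hypothetical.

    -- Hypothetical MULTISET table (duplicate rows allowed, unlike SET tables).
    CREATE MULTISET TABLE txn_detail (
        txn_id     INTEGER,
        account_id INTEGER,
        txn_amt    DECIMAL(12,2)
    )
    PRIMARY INDEX (account_id);

    -- Help the optimizer with demographics on the join/index column.
    COLLECT STATISTICS ON txn_detail COLUMN (account_id);

    -- Session-scoped VOLATILE table for intermediate results.
    CREATE VOLATILE TABLE daily_totals AS (
        SELECT account_id, SUM(txn_amt) AS total_amt
        FROM   txn_detail
        GROUP  BY account_id
    ) WITH DATA
    ON COMMIT PRESERVE ROWS;

    -- Recursive query walking a manager/employee hierarchy.
    WITH RECURSIVE org (emp_id, mgr_id, depth) AS (
        SELECT emp_id, mgr_id, 0
        FROM   employees
        WHERE  mgr_id IS NULL
        UNION ALL
        SELECT e.emp_id, e.mgr_id, o.depth + 1
        FROM   employees e
        JOIN   org o ON e.mgr_id = o.emp_id
    )
    SELECT emp_id, depth FROM org;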

Environment: Erwin 8.0, Teradata 13, TOAD, Oracle 10g/11g, MS SQL Server 2008, Teradata SQL Assistant, XML Files, DMExpress, Flat files, SQL/PL SQL, UNIX Shell Script.

Confidential

Data Modeler

Responsibilities:

  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Created Use Case Diagrams using UML to define the functional requirements of the application.
  • Extensively used Agile methodology as the organization standard to implement the data models.
  • Worked with Erwin Data Modeler to design and maintain the logical/physical dimensional data models, generate the DDL statements, and work with the database team to create the tables, views, and keys in the database.
  • Created physical data models with the details necessary for a complete physical data model, including the appropriate specifications for keys, constraints, indexes, and other physical model attributes.
  • Reverse engineered physical data models from databases and SQL scripts.
  • Compared data models and physical databases and kept changes in sync.
  • Developed and maintained the data dictionaries, Naming Conventions, Standards, and Class words Standards Document.
  • Applied Master Data Management to create and maintain consistent, complete, contextual, and accurate business data for all stakeholders.
  • Developed and maintained enterprise data naming standards by interacting with the data steward and data governance teams.
  • Divided the model into subject areas to present an understandable view to the business as well as to data model reviewers.
  • Performed UAT testing before the production phase of the database components being built; introduced a data dictionary for the process, which simplified much of the work on the project.
  • Identified and tracked the slowly changing dimensions (SCD Types 1, 2, 3, and hybrid/Type 6) and determined the hierarchies in dimensions (an SCD Type 2 sketch follows this list).
  • Assisted the ETL Developers and Testers during the development and testing phases.
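
A minimal sketch of SCD Type 2 maintenance as referenced above, in Oracle-flavored SQL: expire the current row, then insert the new version. The dim_customer and stg_customer tables and the customer_seq sequence are hypothetical.

    -- Step 1: expire the current dimension row when a tracked attribute changed.
    UPDATE dim_customer d
       SET end_date   = SYSDATE,
           is_current = 'N'
     WHERE d.is_current = 'Y'
       AND EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address    <> d.address);

    -- Step 2: insert the new version with an open-ended effective range.
    INSERT INTO dim_customer
           (customer_key, customer_id, address, start_date, end_date, is_current)
    SELECT customer_seq.NEXTVAL, s.customer_id, s.address,
           SYSDATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.is_current  = 'Y');

Running the expire step before the insert keeps exactly one is_current = 'Y' row per customer while preserving the full history.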

Environment: Erwin Data Modeler r7.3, Erwin Model Manager, TOAD, Oracle 9i/10g, XML files, flat files, SQL/PL SQL, and UNIX shell scripts

Confidential

Data Modeler

Responsibilities:

  • Translated Business Requirements into working Logical and Physical Data Models.
  • Developed the logical and physical data models and designed the data flow from source systems to Oracle tables and then to the target system.
  • Attended architecture and data governance meetings to understand the project.
  • Identified and mapped various data sources and their targets successfully to create a fully functioning data repository.
  • Designed the technical specifications document for Oracle ETL processing of data into the master data warehouse, and strategized the integration test plan and implementation.
  • Used advanced data modeling concepts such as Family of Stars, conformed dimensions, and the bus matrix to handle complex situations.
  • Used a degenerate dimension to carry the unique policy number for insurance claims (see the fact table sketch after this list).
  • Designed and Developed Use Cases, Activity Diagrams, and Sequence Diagrams using UML.
  • Performed normalization of the existing OLTP systems (3NF) to speed up DML statement execution.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using Sybase Power Designer.
  • Performed data modeling in Erwin, designing target data models for the enterprise data warehouse (Oracle).
  • Created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Worked extensively on transactional grain, periodic snapshot grain, and accumulating snapshot grain while designing dimensional models.
  • Validated and updated the appropriate LDMs against process mappings, screen designs, use cases, the Business Object Model, and the System Object Model as they evolved and changed.
  • Maintained the data model and synchronized it with changes to the database.
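
A minimal sketch of the degenerate dimension mentioned above: the policy number is carried directly on the fact table, with no dimension table of its own. Table names and grain are hypothetical, and the referenced dimension tables are assumed to exist.

    -- Hypothetical fact table; grain: one row per claim transaction.
    CREATE TABLE fact_claim_txn (
        date_key         INTEGER       NOT NULL REFERENCES dim_date (date_key),
        policyholder_key INTEGER       NOT NULL REFERENCES dim_policyholder (policyholder_key),
        policy_number    VARCHAR(20)   NOT NULL,  -- degenerate dimension: no lookup table
        claim_amount     DECIMAL(12,2) NOT NULL   -- additive measure
    );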

Environment: Sybase Power Designer, Oracle 9i, Toad, Windows XP, SQL Server

Confidential

ETL Developer / Data Analyst

Responsibilities:

  • Interacted with business analysts and systems analysts throughout requirements gathering, design, testing, and documentation to understand the business requirements.
  • Involved in developing the logical and physical data models.
  • Created users/groups and folders using the Repository Manager.
  • Involved in staging the data from external sources and was responsible for moving the data into the warehouse.
  • Created reusable transformations and mapplets and used them in mappings.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, and Stored Procedure transformations.
  • Wrote SQL override queries in the Source Qualifier to customize mappings (a sketch follows this list).
  • Used Workflow Manager for creating, validating, and testing sequential and concurrent sessions.
  • Used Workflow Monitor to track the progress of workflows.
  • Involved in performance tuning at the source, target, session, and database connection levels.
  • Created e-mail notification tasks using post-session scripts.
  • Debugged mappings to gain troubleshooting information about data and error conditions using the Informatica Debugger.
  • Involved in project planning and coordinating business, source and development teams to meet the project deadlines.
  • Involved in creating sales dashboards/reports to compare current-cycle and previous-cycle sales using pivot table views.
  • Involved in a project with the OBIEE team.
  • Developed Oracle Forms screens for entering customer data.
  • Worked on database structure changes, index sizing, and data conversion.
  • Designed and developed various UNIX shell scripts.
  • Extensively worked with SQL*Loader and the import and export utilities.
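
As a sketch of the SQL override work mentioned above, here is the sort of query one might place in a Source Qualifier's SQL Query property; the tables and the 30-day filter are hypothetical, and the column list must match the qualifier's ports in order.

    -- Hypothetical override: join orders to customers and filter at the source,
    -- rather than joining inside the mapping.
    SELECT o.order_id,
           o.order_date,
           o.order_amt,
           c.customer_name,
           c.region
    FROM   orders o
    JOIN   customers c
      ON   c.customer_id = o.customer_id
    WHERE  o.order_date >= TRUNC(SYSDATE) - 30   -- last 30 days (Oracle syntax)

Pushing the join and filter into the database this way reduces the rows the session has to move and cache.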

Environment: Informatica, Oracle 9i, PL/SQL, SQL*Loader, SQL, Windows XP, UNIX and Shell programming.
