
Data Warehouse Architect/Sr. Data Modeler Resume


Austin, TX

SUMMARY:

  • 9+ years of hands-on experience in Information Technology (IT) as a Data Warehouse Architect, MDM Data Architect, Sr. Data Modeler and Database Analyst across heterogeneous business domains.
  • Proficient in System and Software Development Life Cycle (SDLC) models and Project Management (PM) methodologies.
  • Experienced in Enterprise Architecture (EA) frameworks like Zachman and TOGAF.
  • Skillful in gathering business requirements and maintaining the project traceability matrix.
  • Well-versed in Normalization (1NF, 2NF and 3NF) and De-Normalization techniques for optimum performance in relational and dimensional database environments.
  • Experience in developing Entity Relationship Diagrams (ERD) and modeling Transactional Databases and Data Warehouses using tools like Erwin, Infosphere Data Architect (IDA), ER Studio and Power Designer.
  • Vast experience in developing conceptual, logical and physical data models for OLTP and OLAP systems.
  • Solid knowledge of Data Governance, Data Cleansing, Data Scrubbing and Reference Data Management (RDM) for Master Data Management (MDM) projects.
  • Good experience in Data Transformation, Data Masking, Metadata, Reference Data and Test Data Management (MDM, RDM and TDM).
  • Experienced building Master Data Management (MDM) solutions.
  • Extensively worked with Trillium and Informatica Data Quality (IDQ) tools.
  • Strong experience with Customer Relationship Management (CRM) and Enterprise Resource Planning (ERP) applications.
  • Experience in modeling OLAP systems using Kimball, Inmon and Data Vault methodologies and Business Intelligence (BI) applications.
  • Good experience in data mining, designing OLAP, ROLAP and MOLAP databases.
  • Vast experience in cloud technologies such as Azure Data Lake Store, Azure Data Lake Analytics and Azure Data Warehouse architecture.
  • Exposure to big data technologies such as Hadoop and Spark.
  • Solid knowledge in identifying Dimensions and Facts in Star and Snowflake schema approaches.
  • Experienced in generating and documenting Metadata while designing OLTP and OLAP systems.
  • Good experience in ETL and ELT concepts, Data Transformations, Data Mapping from Source to Target Databases.
  • Expertise in the Extract, Transform and Load (ETL) process; loaded data into targets from spreadsheets, database tables and other sources using Microsoft Data Transformation Services (DTS), Pentaho Kettle and Informatica.
  • Experienced in building logic to handle Slowly Changing Dimensions (SCD) and Change Data Capture (CDC) functions (a minimal SCD Type 2 sketch follows this summary).
  • Skilled in creating Indexes, Stored Procedures, Triggers, Functions and Packages.
  • Hands-on experience in Data Stewardship functions such as creating and maintaining metadata and data dictionaries.
  • Good knowledge in Metadata and Reference Data Management (RDM).
  • Experienced in writing SQL queries to perform end-to-end ETL data validations and support ad-hoc business requests.
  • Solid hands-on experience in Quality Assurance (QA); performed User Acceptance, Unit, System, Manual, Automated and ETL testing against client- and server-based applications.
  • Hands on experience in creating Dashboards, Score Cards and many other report formats.
  • Experienced with Windows, UNIX and Linux operating systems.
  • Quick learner, self-motivated and enthusiastic about working with new technologies. Possess excellent verbal and written communication skills.
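
To illustrate the SCD handling referenced above, a minimal Type 2 sketch in SQL, assuming hypothetical dim_customer and stg_customer tables and columns (surrogate-key generation omitted for brevity):

```sql
-- Minimal SCD Type 2 sketch (hypothetical dim_customer / stg_customer names).
-- Step 1: expire the current dimension row when a tracked attribute has changed.
UPDATE dim_customer d
SET    d.effective_end_date = CURRENT_DATE,
       d.current_flag       = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND   (s.customer_name    <> d.customer_name OR
                      s.customer_segment <> d.customer_segment));

-- Step 2: insert a new current row for new customers and for those just expired.
INSERT INTO dim_customer
       (customer_id, customer_name, customer_segment,
        effective_start_date, effective_end_date, current_flag)
SELECT s.customer_id, s.customer_name, s.customer_segment,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id  = s.customer_id
                   AND    d.current_flag = 'Y');
```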

CORE TECHNOLOGY:

Data Modeling Tools: Erwin 9.6, Infosphere Data Architect (IDA), ER/Studio 9, Power Designer 16.5, Oracle SQL Data Modeler, UML and Microsoft Visio 2010.

Data Quality Tools: QuerySurge 4, HPQC-ALM, Informatica Data Quality (IDQ), Data Quality Services (DQS), DataFlux, Trillium, IBM InfoSphere QualityStage and IBM InfoSphere Information Analyzer.

ETL Tools: Pentaho Data Integration (PDI), Kettle, Informatica 9.5, SQL Server Integration Services (SSIS), SQL BI, OBIEE, DataStage.

OLAP Tools: SQL Server Analysis Services (SSAS), TIBCO Spotfire, Cognos, Cognos PowerPlay Studio, Tableau, PowerBI, Crystal Reports and SQL Server Management Studio.

Database Tools: Microsoft SQL Server 2012/2008, MS Azure SQL Server, PostgreSQL, MDS MDM, Toad for DB2, Oracle SQL Developer 4.1, Hadoop, Spark, SAP HANA, DbVisualizer Pro 9.2, Oracle 11g/10g, IBM DB2, Teradata and MS Access.

Packages: Microsoft Office Suite 2013/10/07, Microsoft Project 2010, SVN Version Control, Team Foundation Server (TFS), JIRA Software, IBM Rational ClearCase and ClearQuest.

Programming Languages: ANSI SQL, XML, XSD, WSDL, NIEM schema, SAS, Python, Spark SQL, R Studio, HTML.

Operating Systems: Microsoft Windows (10, 8, 7), NT/2000/Vista, Unix, Linux, Shell scripting.

PROFESSIONAL EXPERIENCE:

Confidential, Austin, TX

Data Warehouse Architect/Sr.Data Modeler

Responsibilities:

  • Coordinated with the business architect team and the IRRIS (Institutional Reporting, Research and Information Systems) team to gather high-level business requirements.
  • Designed Business Process Diagrams (BPD) and circle diagrams using Microsoft Visio tool.
  • Defined Grain, identified Dimensions and Measures for each subject area in an Enterprise Data Warehouse (EDW) project.
  • Performed Data Analysis and Data Profile on source tables to identify Critical Data Elements (CDE).
  • Validated source data to identify data anomalies, data redundancy and understand data quality.
  • Worked with Collibra Reference Data Accelerator to create reference data.
  • Created and maintained Reference Data tables in university data environment.
  • Designed conceptual, logical and physical data models using Erwin v9.6 tool.
  • Maintained different versions of data models using Erwin Mart feature.
  • Created reference tables with reference data, conformed dimensions, Slowly Changing Dimensions (SCD) and different types of fact tables.
  • Updated Metadata and User Defined Properties (UDP) in Erwin dimensional model.
  • Performed Forward Engineering (FE) to create the physical data model DDL script and deployed the script into the target schema (a hypothetical DDL sketch follows this list).
  • Reverse Engineered (RE) legacy systems using Erwin tool.
  • Worked with the Hadoop and Spark big data implementation team.
  • Designed a centralized Data Lake repository to store structured and unstructured data.
  • Designed Data Lake system using Data and Analytics as a Service (DAaaS) model.
  • Created data model design specifications and Source to Target Mapping (STTM) documentation.
  • Created database objects such as Indexes, Views, Stored Procedures and Tablespaces.
  • Reviewed business user reports and cubes using the Cognos PowerPlay Studio tool.
  • Interacted in testing requirements gathering and participated in preparing unit test cases.
  • Executed unit test cases and performed Data Validation against target table data to measure Data Accuracy and Data Quality.
  • Worked with cross functional testing teams and developers to resolve defects.
  • Updated daily work status with all team members, including the team lead, manager and project director.
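
A hypothetical example of the kind of star-schema DDL produced by Erwin forward engineering; the dimension and fact names below are illustrative only (not the actual FASET/SIS schema):

```sql
-- Hypothetical star-schema DDL of the kind generated by forward engineering.
CREATE TABLE dim_date (
    date_key      NUMBER(8)    PRIMARY KEY,   -- surrogate key, e.g. 20170131
    calendar_date DATE         NOT NULL,
    term_code     VARCHAR2(6)  NOT NULL
);

CREATE TABLE dim_student (
    student_key     NUMBER(10)   PRIMARY KEY, -- surrogate key
    student_id      VARCHAR2(20) NOT NULL,    -- natural (business) key
    residency_code  VARCHAR2(4),
    effective_start DATE         NOT NULL,    -- SCD Type 2 tracking columns
    effective_end   DATE         NOT NULL,
    current_flag    CHAR(1)      NOT NULL
);

CREATE TABLE fact_enrollment (
    date_key     NUMBER(8)  NOT NULL REFERENCES dim_date (date_key),
    student_key  NUMBER(10) NOT NULL REFERENCES dim_student (student_key),
    credit_hours NUMBER(5,2),
    CONSTRAINT pk_fact_enrollment PRIMARY KEY (date_key, student_key)
);
```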

Environment: Erwin 9.6, ER/Studio, Oracle SQL Developer Data Modeler 4.0, Oracle 11g, Oracle Exadata X6, Hadoop, Spark, Mainframe Database, FASET System, SIS System, IBM InfoSphere QualityStage, Oracle SQL, Collibra, PowerBI, Tableau, Cognos PowerPlay Studio, JIRA Software, Unix, Linux, UML, Microsoft Visio 2010, Microsoft Project 2010, Microsoft Office Suite.

Confidential, Morrisville, NC

Data Warehouse Architect/Sr.MDM Data Modeler

Responsibilities:

  • Gathered data model business requirements and documented functional and technical specifications.
  • Designed Conceptual Data Model (CDM) and Logical Data Model (LDM) Entity Relationship Diagrams (ERD) using the Erwin tool.
  • Designed Star schema data marts and dimensional logical and physical models using Kimball and Data Vault methodologies.
  • Conducted data model review sessions with Business architect team.
  • Worked on Metadata Management across heterogeneous models.
  • Transformed logical data model into physical data model using Infosphere Data Architect (IDA) tool.
  • Compared data model to database physical schema using compare feature in Infosphere Data Architect (IDA) tool.
  • Generated physical data model from database by using reverse engineering feature in Infosphere Data Architect (IDA) tool.
  • Documented design specifications of data marts, data warehouse logical and physical data models.
  • Created and maintained Reference Data tables in clinical trials data environment.
  • Enforced enterprise naming standards and data type standards in logical and physical data models using Infosphere Data Architect (IDA) tool.
  • Hands on experience in maintaining and working on different versions of data models in a team environment.
  • Designed Slowly Changing Dimension (SCD) tables.
  • Created Source to Target Mapping (STTM) document for ETL development.
  • Deployed physical data models to the target database and to a virtualization environment using the DbVisualizer Pro tool.
  • Designed and updated metadata for Data Lake model.
  • Executed data definition and data manipulation operations for data analysis.
  • Created query pairs in the QuerySurge tool to test data in the virtualization layer.
  • Used Informatica Data Quality (IDQ 9.5.x) tool for data verification and data cleansing.
  • Developed mappings using Informatica Data Quality (IDQ 9.5.x) transformations.
  • Performed data quality comparisons between the development and virtualization environments using the QuerySurge tool.
  • Executed unit and system testing for dimensional and fact tables using the HPQC-ALM tool.
  • Developed SQL scripts to validate data in the target database (a validation sketch follows this list).
  • Executed Insert, Update and Delete (I, U, D) test cases against the target database.
  • Exported and prepared detailed unit and system test execution, bugs and defect reports using HPQC-ALM tool.
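
A minimal sketch of the target-data validation SQL referenced above, assuming hypothetical source and target tables (src_schema.study_subject, tgt_schema.dim_subject) in an Oracle-style environment:

```sql
-- 1) Row-count reconciliation between source and target.
SELECT (SELECT COUNT(*) FROM src_schema.study_subject) AS source_rows,
       (SELECT COUNT(*) FROM tgt_schema.dim_subject)   AS target_rows
FROM   dual;

-- 2) Column-level comparison: source rows missing from, or different in, the target.
SELECT subject_id, site_id, enrollment_date
FROM   src_schema.study_subject
MINUS
SELECT subject_id, site_id, enrollment_date
FROM   tgt_schema.dim_subject;
```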

Environment: Erwin 9.6, Infosphere Data Architect (IDA), Oracle SQL Developer Data Modeler 4.0, HPQC 12, QuerySurge 4.0, Oracle 11g, Oracle Exadata X6, Teradata 12, Hadoop, Spark, IBM IMS database, PostgreSQL, MS Azure SQL Server, SAP HANA, Informatica PowerCenter 9.5, OBIEE, Oracle SQL Developer 4.1, DbVisualizer Pro 9.2, PowerBI, TIBCO Spotfire, Collibra, CTMS system, Spark SQL, Oracle Clinical system, Python, R Studio, Unix, Shell script, Cisco Tidal scheduler, Microsoft Office Suite 2013.

Confidential, Raleigh, NC

MDM Data Architect/Data Modeler

Responsibilities:

  • Interacted with the project architect team on regular basis to gather requirements and discuss project milestones.
  • Created Entity Relationship Diagrams (ERD), Functional Diagrams and Data Flow Diagrams (DFDs).
  • Created Logical and Physical data models using the Erwin and Power Designer tools.
  • Documented Master Data Management (MDM) Master Data Services (MDS) model standards and naming conventions.
  • Designed the Enterprise Information Management (EIM) Logical data model using the Erwin tool and deployed the Physical data model on the target database.
  • Worked in customer and product Master Data Management (MDM) projects.
  • Deployed Master Data Services (MDS) model on target database.
  • Created XML, XSD, WSDL and NIEM schemas for Service Oriented Architecture (SOA) services.
  • Worked heavily on data validation and data quality methods; performed data profiling using SQL and other methods (a profiling sketch follows this list).
  • Worked with Data Quality Services (DQS) tool to create data quality rules.
  • Performed data stewardship roles such as implementing data standards, creating metadata and managing databases.
  • Maintained security and data integrity of the database.
  • Executed queries against the database.
  • Involved in the maintenance of the database.
  • Documented the conversion processes, ensuring data accuracy and integrity.
  • Heavily used data conversion process techniques and methods.
  • Worked with the business to help define and develop business rules, value maps and source-to-target data mapping documents, and assisted the ETL development team in development activities.
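
A minimal sketch of the data profiling referenced above, assuming a hypothetical party_source table with a tax_id critical data element:

```sql
-- Completeness and cardinality of a critical data element (hypothetical party_source table).
SELECT COUNT(*)               AS total_rows,
       COUNT(tax_id)          AS non_null_tax_id,
       COUNT(DISTINCT tax_id) AS distinct_tax_id,
       ROUND(100.0 * (COUNT(*) - COUNT(tax_id)) / COUNT(*), 2) AS pct_null_tax_id
FROM   party_source;

-- Duplicate natural keys that would complicate golden-record matching.
SELECT tax_id, COUNT(*) AS dup_count
FROM   party_source
WHERE  tax_id IS NOT NULL
GROUP  BY tax_id
HAVING COUNT(*) > 1;
```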

Environment: Erwin 9.5, ER/Studio, Oracle SQL Developer Data Modeler, SQL Server 2012, Teradata, PostgreSQL, Oracle 10g, SAP HANA, MS Azure SQL Server, SQL BI, Data Quality Services (DQS), SQL Server Management Studio 2012, Spark SQL, R Studio, MS Visio, Python, XML, XSD, WSDL and NIEM schema, Unix, Linux, Shell scripting, Microsoft Excel, Access 2010.

Confidential, Moline, IL

Data Architect/Sr.Data Modeler

Responsibilities:

  • Interacted with the business users on regular basis to consolidate and analyze project requirements.
  • Created conceptual, logical and physical data models using the Infosphere Data Architect (IDA) tool.
  • Used Normalization methods up to 3NF and De-Normalization techniques for effective performance in OLTP systems.
  • Enforced Referential Integrity (RI) for consistent relationships between parent and child tables (a constraint sketch follows this list).
  • Generated Data Definition Language (DDL) script from physical data model using Infosphere Data Architect (IDA) tool.
  • Extensively used the Infosphere Data Architect (IDA) tool to create web reports from models.
  • Reverse Engineered (RE) target databases, identified new data elements in the source systems, then added the new data elements to the existing data models.
  • Compared data with original source documents and validated for Data Accuracy.
  • Extracted large volumes of data feed from different data sources, performed transformations and loaded the data into various targets.
  • Worked with Data Stewardship team to maintain data standards, performing data profiling operations against databases.
  • Developed Data Migration and Data Cleansing rules for the integration architecture (OLAP, ODS, DW).
  • Experienced in the development of ETL mappings and scripts.
  • Heavily used Pentaho Data Integrator (PDI) and Informatica PowerCenter for migrating data from the legacy database to the new database.
  • Worked on Star and Snowflake schemas for dimensional data models.
  • Worked on multiple data marts in an Enterprise Data Warehouse (EDW) project.
  • Involved in designing OLAP data models, extensively using Slowly Changing Dimensions (SCD).
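
A small sketch of enforcing referential integrity between parent and child tables, assuming hypothetical order_header and order_line tables:

```sql
-- Orphaned child rows found before enabling the constraint (hypothetical names).
SELECT l.order_id, COUNT(*) AS orphan_rows
FROM   order_line l
WHERE  NOT EXISTS (SELECT 1 FROM order_header h WHERE h.order_id = l.order_id)
GROUP  BY l.order_id;

-- Foreign key enforcing the parent-child relationship.
ALTER TABLE order_line
    ADD CONSTRAINT fk_order_line_header
    FOREIGN KEY (order_id)
    REFERENCES order_header (order_id);
```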

Environment: Infosphere Data Architect (IDA), Power Designer 16.5, IBM DB2, Teradata, Oracle 10g, PostgreSQL, SQL Server 2012, MS Azure SQL Server, SAP HANA, Informatica PowerCenter 8.6, Pentaho Data Integrator (PDI), OBIEE, Visual Studio, Crystal Reports, SQL Server Management Studio 2012, Informatica Data Quality (IDQ), Trillium, Spark SQL, R Studio, Unix, Linux, Toad for DB2, Microsoft Excel, Access 2010.

Confidential, Columbus, OH

MDM Data Architect/Data Modeler

Responsibilities:

  • Demonstrated strong analytical skills in identifying and resolving data exchange issues.
  • Used E.F. Codd’s Normalization methods (1NF, 2NF, 3NF) and De-Normalization techniques for effective performance in OLTP and OLAP systems (a 3NF sketch follows this list).
  • Designed data models in Dimensional Modeling (D.M) environment.
  • Executed SQL queries to retrieve data from databases for analysis purpose.
  • Created data model reports and a Data Dictionary (DD) using Erwin.
  • Deep understanding of Know Your Customer (KYC) procedures and standards.
  • Executed user acceptance testing for Know Your Customer (KYC) applications and KYC projects.
  • Worked with Anti-Money Laundering (AML) partners and AML advisory team.
  • Designed CDM, LDM, PDM data models for AML applications.
  • Reviewed KYC documentation, worked as liaison between business and compliance AML teams.
  • Created relational data models and populated reference data.
  • Developed normalized Logical and Physical database models to design OLTP systems for enterprise applications.
  • Performed Forward Engineering (FE) operations to create a Physical data model and DDL that best fit the requirements of the Logical data model.
  • Developed data mapping, data governance, transformation and cleansing rules for the Master Data Management (MDM) process.
  • Executed enterprise data governance strategies in line with the organization’s business strategy and objectives.
  • Conducted meetings with DRB team for metadata approval.
  • Extensively used data quality tools to maintain consistent data.
  • Converted data from multiple sources and ran validation tests on the converted data.
  • Thorough understanding and experience with the entire data migration process from analysis of existing data, cleansing, validation, translation tables, conversions and subsequent upload into new platform.
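
A minimal 3NF sketch for the normalization work referenced above, assuming hypothetical branch, customer and account tables decomposed from a flat KYC-style extract so that each non-key column depends only on its own table's key:

```sql
-- Hypothetical 3NF decomposition of a flat extract into branch, customer and account tables.
CREATE TABLE branch (
    branch_id   VARCHAR2(10) PRIMARY KEY,
    branch_name VARCHAR2(60) NOT NULL
);

CREATE TABLE customer (
    customer_id   VARCHAR2(12) PRIMARY KEY,
    customer_name VARCHAR2(80) NOT NULL,
    risk_rating   VARCHAR2(6)
);

CREATE TABLE account (
    account_id  VARCHAR2(16) PRIMARY KEY,
    customer_id VARCHAR2(12) NOT NULL REFERENCES customer (customer_id),
    branch_id   VARCHAR2(10) NOT NULL REFERENCES branch (branch_id),
    open_date   DATE         NOT NULL
);
```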

Environment: CA Erwin 9.5, Power Designer 16.5, Oracle 10g, SQL Server 2012, PostgreSQL, IBM IMS database, IBM DB2, Teradata, Informatica PowerCenter 8.6, OBIEE, Visual Studio, Crystal Reports, SQL Server Management Studio (SSMS) 2012, Trillium, Informatica Data Quality (IDQ), IBM Discovery, R Studio, Toad for DB2, Unix, Microsoft Excel, Access 2010.

Confidential

Sr. Data Modeler/Database Analyst

Responsibilities:

  • Data analysis, development, implementation and support of BSS applications in the Telecommunications industry.
  • Implemented various interfaces across the Telecom industry.
  • Detailed understanding of Telecommunication data models.
  • Created logical and physical data models using the Power Designer tool.
  • Data mapping experience with Telecom tools Oracle BRM and RBM.
  • Good understanding of CDR concepts in Telecom domain.
  • Executed data stewardship functions to maintain quality of data.
  • Proficient in legacy conversion from Telecommunication systems to VOIP.
  • Performed data mapping between two distinct Telecom data models.
  • Solid understanding of data analysis techniques and processes.
  • Designed data models in Master Data Management (MDM) environment.
  • Proficient in Enhanced Telecom Operations Map (eTOM) and New Generation Operation Support Systems (NGOSS).
  • Detailed knowledge of Wireline/Landline/Wireless communications and DSL; solid understanding of Telecom protocols, Wi-Fi and WiMAX concepts.
  • Consolidated multiple data models into a single database.
  • Mapped data from legacy systems to target systems; involved in data migration projects with full life cycle implementation.
  • Analyzed the data and created dashboards for reporting.

Environment: ER/Studio, MS SQL Server 2008 R2, Oracle 9i, Teradata, Oracle BRM and RBM, VOIP, Wi-Fi, WiMAX, Informatica PowerCenter, Pentaho Data Integrator (PDI), Trillium, Power Designer, Unix, DTS, MS Analysis Services, Crystal Reports, Visual Studio, Microsoft Excel, Access 2007.

Confidential

Data Modeler/Database Analyst

Responsibilities:

  • Gathered high-level requirements and converted them into technical and functional requirements.
  • Involved in all phases of SDLC including analysis, designing, developing, testing, implementation and maintenance.
  • Created logical and physical data models using Erwin tool.
  • Performed forward engineering (F.E) to generate DDL script using Erwin tool.
  • Initiated use case analysis using UML tool.
  • Extensively used normalization techniques to design OLTP data models.
  • Mapped data between source and target databases.
  • Worked in data extraction and conversion in data migration project.
  • Created process flow diagrams using MS Visio tool.
  • Strong understanding of data quality assurance process and procedures.
  • Created and maintained data dictionaries.

Environment: Erwin 6.0, ER/Studio, SQL Server 2005, Visual Paradigm for UML 6.0, SQL Server Management Studio (SSMS) 2005, MS Visio, Toad for DB2, Unix, MS Access 2003, SQL Navigator, Windows NT, Microsoft Office 2003.
