Sr. Data Architect/Data Modeler Resume
Chicago, IL
PROFILE SUMMARY:
- Around 7 years of IT experience in data architecture, data modeling, database design and data analysis.
- Proficient in data mart design, creation of cubes, identifying facts & dimensions, star & snowflake schemas and canonical models.
- Solid hands-on experience administering data model repositories and documenting metadata in portals using tools such as Erwin, ER Studio and Power Designer.
- Strong working experience with Agile, Scrum, Kanban and Waterfall methodologies.
- Capture, validate and publish metadata in accordance with enterprise data governance policies and MDM taxonomies.
- Responsible for data governance rules and standards to maintain the consistency of the business element names in the different data layers.
- Extensive experience in development of T-SQL and Oracle PL/SQL scripts, stored procedures and triggers for business logic implementation.
- Proficient in normalization/de-normalization techniques in relational/dimensional database environments and have normalized models up to 3NF.
- Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS) tools.
- Experience in modeling both OLTP and OLAP systems, as well as Kimball and Inmon data warehousing environments.
- Strong understanding of the principles of Data warehousing, Fact Tables, Dimension Tables, star and snowflake schema modeling.
- Experience in backend programming including schema and table design, stored procedures, Triggers, Views, and Indexes.
- Conducted data analysis, mapping and transformation, and data modeling, applying data warehouse concepts.
- Experience in designing conceptual, logical and physical data models to build the data warehouse.
- Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
- Software Development Life Cycle (SDLC) experience including requirements, specification, analysis/design and testing.
- Solid experience in creating cloud-based solutions and architectures using Amazon Web Services (Amazon EC2, Amazon S3, Amazon RDS) and Microsoft Azure.
- Excellent knowledge of creating reports in SAP Business Objects, including web reports for multiple data providers.
- Excellent experience in writing and executing unit, system, integration and UAT scripts in data warehouse projects.
- Excellent experience in writing SQL queries to validate data movement between different layers in a data warehouse environment (see the sketch after this list).
- Proficient in designing Data Mart and Data Warehouse using Star and Snowflake Schemas.
- Extensive experience using ER modeling tools such as Erwin and ER/Studio, along with Teradata, BTEQ, and MDM.
- Experience in designing Star schema, Snowflake schema for Data Warehouse, ODS architecture.
- Hands-on experience writing and optimizing SQL queries in Oracle, SQL Server, DB2, and Netezza.
- Experience in working with RDBMS like Oracle, Microsoft SQL Server and Teradata.
- Experience in data analysis to track data quality and to detect and correct inaccurate data in databases.
- Good knowledge of SQL queries and of creating database objects such as stored procedures, triggers, packages and functions using SQL and PL/SQL to implement business logic.
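A minimal sketch of the kind of layer-to-layer validation query described above; the schema, table and column names (stg.customer, dw.dim_customer, customer_id) are illustrative assumptions rather than objects from any specific project:

```sql
-- Compare row counts between a staging table and the warehouse layer.
SELECT 'stg_count' AS metric, COUNT(*) AS value FROM stg.customer
UNION ALL
SELECT 'dw_count'  AS metric, COUNT(*) AS value FROM dw.dim_customer;

-- List keys that were loaded into staging but never reached the warehouse dimension.
SELECT s.customer_id
FROM   stg.customer s
LEFT JOIN dw.dim_customer d
       ON d.customer_id = s.customer_id
WHERE  d.customer_id IS NULL;
```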
TECHNICAL/FUNCTIONAL SKILLS:
Cloud Management: Amazon Web Services (AWS), Amazon Redshift
Data Modeling Tools: ER/Studio V17, Erwin 9.7, Sybase Power Designer 16.6.
OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9
Programming Languages: SQL, PL/SQL, UNIX shell Scripting, PERL, AWK, SED
Databases: Oracle 12c, Teradata R15, MS SQL Server 2017, DB2.
Testing and Defect Tracking Tools: HP/Mercury Quality Center, WinRunner, MS Visio 2016 & Visual SourceSafe
Operating System: Windows 10/8, Unix, Sun Solaris
ETL/Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend, Tableau
Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model
WORK EXPERIENCE:
Confidential, Chicago, IL
Sr. Data Architect/Data Modeler
Responsibilities:
- As a Sr. Data Architect/Modeler, collaborated with the data modeling architects and other data modelers on the team to design the enterprise-level standard data model.
- Interacted with users for verifying User Requirements, managing Change Control Process, updating existing Documentation.
- Worked with the architecture and development teams to help choose data-related technologies, design architectures, and model data in a manner that is efficient, scalable, and supportable.
- Worked closely with the development and database administrators to guide the development of the physical data model and database design.
- Responsible for Big data initiatives and engagement including analysis, brainstorming, POC, and architecture.
- Designed and developed architecture for a data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
- Worked on designing Conceptual, Logical and Physical data models and performed data design reviews with the Project team members.
- Designed a star schema for sales data involving shared (conformed) dimensions using Erwin Data Modeler.
- Worked on building the logical data model from scratch with XMLs as the data source.
- Worked on building data models to convert data from one application to another in a way that suits the needs of the target database.
- Involved in versioning and saving the models to the data mart and maintaining the Data mart Repository.
- Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns.
- Built a Data Lake in Azure using Hadoop (HDInsight clusters) and migrated data using Azure Data Factory pipelines.
- Designed a Lambda architecture to process streaming data using Spark; data was ingested using Sqoop for structured data and Kafka for unstructured data.
- Created Azure Event Hubs, Azure Service Bus, Azure Analysis Services and Power BI for handling IoT messages.
- Ensured the data warehouse and data mart designs efficiently support the reporting and BI team requirements.
- Performed Hive programming for applications that were migrated to big data using Hadoop.
- Involved in creating Hive tables, then loading and analyzing data using the Hive queries developed.
- Executed Hive queries on Parquet tables stored in Hive to perform data analysis and meet the business requirements (see the sketch after this list).
- Produced 3NF data models for OLTP designs using data modeling best practices and modeling skills.
- Worked with Data Stewards and Business Analysts to gather requirements for the MDM project.
- Worked on reverse engineering data models from database instances and scripts.
- Created data models for different databases such as Oracle and SQL Server.
- Responsible for defining the naming standards for the data warehouse.
- Enforced referential integrity in the OLTP data model for consistent relationships between tables and an efficient database design.
- Involved in the creation, maintenance of Data Warehouse and repositories containing Metadata.
- Created Source to Target Mapping Documents to help guide the data model design from the Data source to the data model.
- Involved in OLAP unit testing and system testing to validate the OLAP report functionality and the data displayed in the reports.
- Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
- Handled data governance of the Raw, Staging, Curated and Presentation layers in Azure Data Lake Store.
- Involved in writing T-SQL, working on SSIS, SSRS, SSAS, Data Cleansing, Data Scrubbing and Data Migration.
- Involved in data loading using PL/SQL scripts and SQL Server Integration Services (SSIS).
- Conducted and participated in JAD sessions with the users, modelers, and developers for resolving issues.
- Applied data naming standards, created the data dictionary, documented data model translation decisions and maintained DW metadata.
- Created data masking mappings to mask the sensitive data between production and test environment.
- Participated in performance tuning using Explain Plan and TKPROF.
- Performed performance tuning and stress testing of NoSQL database environments to ensure acceptable database performance in production mode.
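A minimal HiveQL sketch of the Parquet-backed Hive work described above; the database, table, columns and HDFS path are illustrative assumptions:

```sql
-- External Hive table stored as Parquet (illustrative names and location).
CREATE EXTERNAL TABLE IF NOT EXISTS sales.web_events (
  event_id    STRING,
  customer_id STRING,
  event_type  STRING,
  amount      DECIMAL(12,2),
  event_ts    TIMESTAMP
)
PARTITIONED BY (event_date STRING)
STORED AS PARQUET
LOCATION '/data/curated/sales/web_events';

-- Daily totals by event type over a partition range, for analysis.
SELECT event_date,
       event_type,
       COUNT(*)    AS events,
       SUM(amount) AS total_amount
FROM   sales.web_events
WHERE  event_date BETWEEN '2018-01-01' AND '2018-01-31'
GROUP BY event_date, event_type;
```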
Environment: Erwin 9.7, Hadoop 3.0, NoSQL, PL/SQL, T-SQL, SSIS, UNIX, Spark, Azure Data Lake, OLTP, Azure SQL DB and Azure SQL DW.
Confidential, Arlington, VA
Sr. Data Modeler/Data Analyst
Responsibilities:
- Heavily involved in the Data Modeler/Analyst role, reviewing business requirements and composing source-to-target data mapping documents.
- Gathered analysis report prototypes from business analysts belonging to different business units.
- Interacted with Business Analysts to gather the user requirements and participated in data modeling JAD sessions.
- Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
- Extensively used Agile methodology as the Organization Standard to implement the data Models.
- Actively participated in JAD sessions involving the discussion of various reporting needs.
- Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
- Interacted with Business Analyst, SMEs to understand Business needs and functionality for various project solutions.
- Planned and defined system requirements to Use Case, Use Case Scenario and Use Case Narrative using the UML (Unified Modeling Language) methodologies.
- Reverse engineered the existing data marts and identified the data elements (in the source systems), dimensions, facts and measures required for reports.
- Conducted design discussions and meetings to arrive at the appropriate data warehouse grain at the lowest level for each of the dimensions involved.
- Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
- Developed the data warehouse model (Kimball’s) with multiple data marts with conformed dimensions for the proposed central model of the Project.
- Designed a star schema for sales data involving shared (conformed) dimensions for other subject areas using E/R Studio Data Modeler; a DDL sketch of this pattern follows this list.
- Created and maintained Logical Data Model (LDM) for the project. Includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Validated and updated the appropriate LDM's to process mappings, screen designs, use cases, business object model, and system object model as they evolve and change.
- Conducted design reviews with the business analysts and content developers to create a proof of concept for the reports.
- Ensured the feasibility of the logical and physical design models.
- Worked on snowflaking the dimensions to remove redundancy.
- Wrote PL/SQL statements, stored procedures and triggers for extracting as well as writing data.
- Worked extensively on Data Migration by using SSIS.
- Developed rule sets for data cleansing and actively participated in data cleansing and anomaly resolution of the legacy application.
- Defined facts, dimensions and designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using E/R Studio.
- Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
- Developed Data mapping, Transformation and Cleansing rules for the Data Management involving OLTP, ODS and OLAP.
- Responsible for data governance rules and standards to maintain the consistency of the business element names in the different data layers.
- Created data masking mappings to mask the sensitive data between production and test environment.
- Normalized the database based on the newly developed model to bring the data warehouse tables into 3NF.
- Created SSIS package for daily email subscriptions to alert Tableau subscription failure using the ODBC driver and PostgreSQL database.
- Designed logical and physical data models, Reverse engineering, Complete compare for Oracle and SQL server objects using E/R Studio.
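A minimal DDL sketch of a sales star schema with a shared (conformed) date dimension, in the spirit of the design work above; all object names and data types are illustrative assumptions:

```sql
-- Conformed date dimension shared across subject areas (illustrative).
CREATE TABLE dim_date (
  date_key      INT          PRIMARY KEY,  -- e.g. 20180131
  calendar_date DATE         NOT NULL,
  fiscal_month  VARCHAR(10)  NOT NULL
);

-- Product dimension (illustrative).
CREATE TABLE dim_product (
  product_key  INT           PRIMARY KEY,
  product_code VARCHAR(20)   NOT NULL,
  category     VARCHAR(50)
);

-- Sales fact at the date/product grain, referencing both dimensions.
CREATE TABLE fact_sales (
  date_key     INT           NOT NULL REFERENCES dim_date (date_key),
  product_key  INT           NOT NULL REFERENCES dim_product (product_key),
  quantity     INT           NOT NULL,
  sales_amount DECIMAL(12,2) NOT NULL
);
```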
Environment: PL/SQL, E/R Studio v17, MS SQL 2016, OLTP, ODS, OLAP, SSIS, Tableau, ODBC, Transact-SQL, TOAD, Teradata r15, SQL Assistant
Confidential, St. Louis, MO
Sr. Data Modeler
Responsibilities:
- Worked as a Sr. Data Modeler to generate Data Models using E/R Studio and subsequent deployment to Enterprise Data Warehouse.
- Documented Technical & Business User Requirements during requirements gathering sessions.
- Translated conceptual models into logical data models, held JAD sessions, and communicated data-related issues and standards.
- Constructed complex SQL queries with sub-queries and inline views as per the functional needs in the Business Requirements Document (BRD); see the sketch after this list.
- Involved in designing and developing SQL server objects such as Tables, Views, Indexes (Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL.
- Developed scripts that automated the DDL and DML statements used in the creation of databases, tables, constraints, and updates.
- Involved in modeling business processes through UML diagrams.
- Developed Logical and Physical Data models for the loan Systems by using E/R Studio.
- Conversed with business analysts and developers to gather information about the data models (data definitions) and put the data dictionary in place.
- Designed different types of star schemas for detailed data marts and plan data marts in the OLAP environment.
- Supported the DBA in physically implementing the tables in both Oracle databases.
- Established the workflow process and created workflow diagrams using Microsoft Visio.
- Developed conceptual and logical data models and transformed them into physical schemas using E/R Studio.
- Developed the data mart for the base data in star and snowflake schemas and was involved in developing the data warehouse for the database.
- Forward engineered new data models, reverse engineered existing data models, and kept the data models updated.
- Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
- Identified and documented data sources and transformation rules required to populate and maintain data Warehouse content.
- Involved in extracting, cleansing, transforming, integrating and loading data into different Data Marts using Data Stage Designer.
- Developed SQL Queries to fetch complex data from different tables in remote databases using joins, database links and bulk collects.
- Worked on data governance, data quality, and data lineage establishment processes.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
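A minimal sketch of a BRD-style query built from a sub-query and an inline view, as referenced above; the loan-related table and column names are illustrative assumptions:

```sql
-- Loans whose balance exceeds the branch average (inline view b),
-- restricted to active statuses via a sub-query. Names are illustrative.
SELECT l.branch_id,
       l.loan_id,
       l.balance
FROM   loans l
       JOIN (SELECT branch_id, AVG(balance) AS avg_balance
             FROM   loans
             GROUP  BY branch_id) b
         ON b.branch_id = l.branch_id
WHERE  l.balance > b.avg_balance
  AND  l.status IN (SELECT status_code FROM loan_status WHERE is_active = 1);
```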
Environment: E/R Studio v13, SQL, SQL server 2014, Transact-SQL, DDL, DML, UML diagrams, OLAP, Microsoft Visio 2014
Advanced Auto Parts - Raleigh, NC
Sr. Data Analyst/Data Modeler
Responsibilities:
- As a Data Analyst/Modeler, responsible for the conceptual, logical and physical models for the Supply Chain project.
- Participated in JAD sessions involving the discussion of various reporting needs.
- Translated conceptual models into logical data models, held JAD sessions, and communicated data-related issues and standards.
- Interacted with the Subject Matter Experts (SME's) and Stakeholders to get a better understanding of client business processes and gather business requirements.
- Assisted in the analysis of, and recommendations on, which reporting tools to use.
- Created database tables, views, indexes, triggers and sequences and developed the database structure.
- Wrote complex SQL, PL/SQL procedures, functions, and packages to validate data and support the testing process (see the sketch after this list).
- Generated reports using SQL Server Reporting Services from OLTP and OLAP data sources.
- Designed and Developed Use Cases, Activity Diagrams, and Sequence Diagrams using Unified Modeling Language (UML).
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using PowerDesigner.
- Developed, documented and maintained logical and physical data models for development projects.
- Identified the facts and dimensions and designed star schema model for generating reports.
- Documented Technical & Business User Requirements during requirements gathering sessions.
- Involved in modeling business processes through UML diagrams.
- Created entity process association matrices, functional decomposition diagrams and data flow diagrams from business requirements documents.
- Used Sybase Power Designer tool for relational database and dimensional data warehouse designs.
- Worked alongside the database team to generate the best Physical Model from the Logical Model using Power Designer.
- Developed Cleansing and data migration rules for the Integration Architecture (OLTP, ODS, DW).
- Developed data mapping documents between Legacy, Production, and User Interface Systems.
- Used Crystal Reports to generate ad-hoc reports.
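A minimal PL/SQL sketch of the kind of validation procedure described above, flagging fact rows whose keys have no matching dimension row; object names are illustrative assumptions:

```sql
-- Validation procedure: raise an error if any fact row references an
-- unknown product key. Table and procedure names are illustrative.
CREATE OR REPLACE PROCEDURE validate_fact_product_keys IS
  v_orphans NUMBER;
BEGIN
  SELECT COUNT(*)
  INTO   v_orphans
  FROM   fact_sales f
  WHERE  NOT EXISTS (SELECT 1
                     FROM   dim_product d
                     WHERE  d.product_key = f.product_key);

  IF v_orphans > 0 THEN
    RAISE_APPLICATION_ERROR(-20001,
      'fact_sales has ' || v_orphans || ' rows with unknown product_key');
  END IF;
END validate_fact_product_keys;
/
```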
Environment: SQL, PL/SQL, OLTP, OLAP, SQL Server 2012, Sybase Power Designer 16.5
Confidential
Data Analyst
Responsibilities:
- Performed data analysis and data profiling using complex SQL on various source systems.
- Developed SAS macros for data cleaning, reporting and to support routine processing.
- Created SQL scripts to find data quality issues and to identify keys, data anomalies, and data validation issues (see the sketch after this list).
- Actively involved in T-SQL programming, implementing stored procedures, functions, cursors, and views for different tasks.
- Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirement.
- Used MS Visio for business flow diagrams and defined the workflow.
- Performed Data analysis for the existing Data warehouse and changed the internal schema for performance.
- Wrote SQL stored procedures and views, and coordinated and performed in-depth testing of new and existing systems.
- Experienced in developing business reports by writing complex SQL queries using views and volatile tables.
- Extensively used SAS procedures such as MEANS and FREQ, and other statistical calculations, for data validation.
- Performed Data Analysis and extensive Data validation by writing several complex SQL queries.
- Involved in design and development of standard and ad-hoc reporting using SQL/SSRS.
- Identified source databases and created the dimensional tables and checked for data quality using complex SQL queries.
- Responsible for data lineage, maintaining data dictionary, naming standards and data quality.
- Performed data manipulation using MS Excel pivot tables and produced various charts for creating the mock reports.
- Used SQL Server and MS Excel on a daily basis to manipulate the data for business intelligence reporting needs.
- Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
- Extracted data from different sources like Oracle and text files using SAS/Access, SAS SQL procedures and created SAS datasets.
- Developed SQL Joins, SQL queries, tuned SQL, views, test tables, scripts in development environment.
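A minimal sketch of the data quality checks described above (duplicate keys and missing required values); table and column names are illustrative assumptions:

```sql
-- Candidate keys that occur more than once in the source extract.
SELECT customer_id, COUNT(*) AS occurrences
FROM   src_customer
GROUP  BY customer_id
HAVING COUNT(*) > 1;

-- Required attributes that arrived NULL or blank.
SELECT COUNT(*) AS bad_rows
FROM   src_customer
WHERE  customer_id IS NULL
   OR  email IS NULL
   OR  LTRIM(RTRIM(email)) = '';
```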
Environment: SQL, SAS macros, T-SQL, MS Visio 2010, MS Excel 2010, SQL Server 2010