
Sr. Data Architect / Data Modeler Resume


Phoenix, AZ

SUMMARY:

  • 8+ years of IT industry experience in Application Design, Development, and Data Management - Data Governance, Data Architecture, Data Modeling, Data Warehousing and BI, Data Integration, Meta-data, Reference Data and MDM.
  • Good knowledge of Normalization and De-Normalization techniques for optimum performance in relational and dimensional database environments.
  • Experience working with Agile and Waterfall data modeling methodologies.
  • Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing and analysis of big data.
  • Proficient in data mart design, creation of cubes, identifying facts & dimensions, and star & snowflake schemas (a minimal star-schema sketch in SQL appears after this list).
  • Experience with emerging technologies such as Big Data, Hadoop, and NoSQL.
  • Excellent SQL programming skills; developed Stored Procedures, Triggers, Functions, and Packages using SQL/PL SQL.
  • Performance tuning and query optimization techniques in transactional and data warehouse environments.
  • Good experience with Normalization (1NF, 2NF and 3NF) and De-normalization techniques for improved database performance in OLTP, OLAP and Data Mart environments.
  • Strong Experience in Entity Relationships & Dimensional Data Modeling to deliver Normalized ER & Star/Snowflake schemas.
  • Experienced in Technical consulting and end-to-end delivery with architecture, data modeling, data governance and design - development - implementation of solutions.
  • Experience in modeling OLAP systems using the Kimball and Bill Inmon data warehousing methodologies.
  • Experience in using Business Intelligence tools like Tableau and Power BI for visualizing and analyzing data.
  • Experience in writing Stored Procedures, Functions, Packages, Materialized Views, Triggers.
  • Good understanding of service-oriented architecture (SOA) and web services technologies such as XML and SOAP.
  • Experience in object-oriented analysis and design (OOAD), the Unified Modeling Language (UML), and design patterns.
  • Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS) tools.
  • Strong experience in Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export.
  • Strong experience in analyzing and transforming large data sets by writing Pig scripts and Hive queries on AWS EMR and AWS RDS. Extensive knowledge of Hadoop stack components such as Apache Hive and Pig.
  • Extensively worked with data warehousing, Business Intelligence, and ETL methodologies and technologies using Informatica.
  • Expertise in performing User Acceptance Testing (UAT) and conducting end user training sessions.
  • Experience with Teradata utilities such as FastExport and MultiLoad (MLOAD) for handling various tasks.
  • Strong experience in using Excel and MS Access to extract and analyze data based on business needs.
  • Excellent understanding of MDM hub architecture styles: the registry, repository, and hybrid approaches.
  • Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Experience in analyzing data using Hadoop Ecosystem including HDFS, Hive, Spark, Spark Streaming, Elastic Search, Kibana, Kafka, HBase, Zookeeper, PIG, Sqoop, Flume.
  • Excellent experience in writing SQL queries to validate data movement between different layers in data warehouse environment.
  • Extensive experience in development of T-SQL, Oracle PL/SQL Scripts, Stored Procedures and Triggers for business logic implementation.
  • Experience designing security at both the schema level and the accessibility level in conjunction with the DBAs.
  • Experience in working with Excel Pivot and VBA macros for various business scenarios.
  • Experience working with data modeling tools like Erwin, Power Designer and ER Studio.
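
As a minimal illustration of the dimensional modeling experience listed above, the sketch below shows a hypothetical star schema with surrogate-keyed dimensions and a fact table; all table and column names are illustrative assumptions, not taken from any actual engagement.

    -- Minimal star-schema sketch (hypothetical names, ANSI SQL).
    CREATE TABLE DIM_CUSTOMER (
        CUSTOMER_KEY  INTEGER     NOT NULL PRIMARY KEY,  -- surrogate key
        CUSTOMER_ID   VARCHAR(20) NOT NULL,              -- natural/business key
        CUSTOMER_NAME VARCHAR(100),
        REGION        VARCHAR(50)
    );

    CREATE TABLE DIM_DATE (
        DATE_KEY      INTEGER NOT NULL PRIMARY KEY,      -- e.g. 20240131
        CALENDAR_DATE DATE    NOT NULL,
        MONTH_NAME    VARCHAR(10),
        YEAR_NUM      INTEGER
    );

    CREATE TABLE FACT_SALES (
        DATE_KEY     INTEGER NOT NULL REFERENCES DIM_DATE (DATE_KEY),
        CUSTOMER_KEY INTEGER NOT NULL REFERENCES DIM_CUSTOMER (CUSTOMER_KEY),
        SALES_AMOUNT DECIMAL(12,2),
        UNITS_SOLD   INTEGER
    );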

TECHNICAL SKILLS:

Data Modeling Tools: Erwin Data Modeler 9.7/9.6, Erwin Model Manager, ER Studio v17, and Power Designer.

Programming Languages: SQL, PL/SQL, HTML5, XML and VBA.

Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.

Cloud Platforms: AWS (EC2, S3, Redshift) & MS Azure

OLAP Tools: Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports 9

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.

Operating System: Windows, Unix, Sun Solaris

ETL/Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend, and Pentaho.

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model

PROFESSIONAL EXPERIENCE:

Confidential, Phoenix, AZ

Sr. Data Architect / Data Modeler

Responsibilities:

  • Designed and implemented a Data Lake to consolidate data from multiple sources, using Hadoop stack technologies like SQOOP, HIVE/HQL.
  • Designed semantic layer data model. Conducted performance optimization for BI infrastructure.
  • Worked with the developers in deciding the application architecture.
  • Participated in preparing Logical Data Models/Physical Data Models.
  • Identified source systems, their connectivity, related tables and fields, and ensured data suitability for mapping.
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Worked on the HL7 2.x file format (ADT and clinical messages) on MEDIFAX and gained a thorough understanding of how interface development projects work.
  • Developed company-wide data standards, data policies and data warehouse/business intelligence architectures.
  • Designed and documented Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object Oriented Design) using UML and Visio.
  • Performed data cleaning and data manipulation activities using a NoSQL utility.
  • Designed and Developed Oracle PL/SQL Procedures and UNIX Shell Scripts for Data Import/Export and Data Conversions.
  • Wrote complex SQL queries for validating the data against different kinds of reports generated by Business Objects XIR2.
  • Designed Logical Data Models and Physical Data Models using ER Studio.
  • Developed Conceptual and Logical Data Models and transformed them into physical schemas using ER Studio.
  • Involved in the creation, maintenance of Data Warehouse and repositories containing Metadata.
  • Performed Hive programming for applications that were migrated to big data using Hadoop.
  • As an architect, implemented an MDM hub to provide clean, consistent data for a SOA implementation.
  • Installed and configured a 3-node cluster on AWS EC2 Linux servers.
  • Designed different types of STAR schemas, such as detailed data marts, plan data marts, and monthly summary data marts, using ER Studio with various dimensions like Time, Services, and Customers and various fact tables.
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
  • Performed ETL processing using Pig and Hive on AWS EMR and S3.
  • Developed strategies for data warehouse implementations and data acquisitions, and provided technical and strategic advice and guidance to senior managers and technical resources in the creation and implementation of data architecture and data modeling.
  • Performed extensive data validation by writing several complex SQL queries, and was involved in back-end testing and working through data quality issues.
  • Data Profiling, Mapping and Integration from multiple sources to AWS S3.
  • Designed and developed ETL routines to extract data from heterogeneous sources and load it into the Actuarial Data Warehouse.
  • Served in a Data Architect role to review business requirements and compose source-to-target data mapping documents.
  • Actively involved in the Design and development of the Star schema data model.
  • Implemented slowly changing and rapidly changing dimension methodologies; created aggregate fact tables for the creation of ad-hoc reports.
  • Created and maintained surrogate keys on the master tables to handle SCD Type 2 changes effectively (see the sketch below).
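
A hedged, Oracle-style sketch of the SCD Type 2 pattern referenced above; the DIM_CUSTOMER and STG_CUSTOMER tables, their columns, and the sequence are hypothetical names introduced only for illustration.

    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE DIM_CUSTOMER d
       SET CURRENT_FLAG = 'N',
           EFFECTIVE_END_DATE = CURRENT_DATE - 1
     WHERE d.CURRENT_FLAG = 'Y'
       AND EXISTS (SELECT 1
                     FROM STG_CUSTOMER s
                    WHERE s.CUSTOMER_ID = d.CUSTOMER_ID
                      AND s.REGION <> d.REGION);

    -- Step 2: insert a new version with a fresh surrogate key from a sequence.
    INSERT INTO DIM_CUSTOMER
           (CUSTOMER_KEY, CUSTOMER_ID, REGION,
            EFFECTIVE_START_DATE, EFFECTIVE_END_DATE, CURRENT_FLAG)
    SELECT DIM_CUSTOMER_SEQ.NEXTVAL, s.CUSTOMER_ID, s.REGION,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM STG_CUSTOMER s
     WHERE NOT EXISTS (SELECT 1
                         FROM DIM_CUSTOMER d
                        WHERE d.CUSTOMER_ID = s.CUSTOMER_ID
                          AND d.CURRENT_FLAG = 'Y');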

Environment: DB2, ER Studio, Oracle 11g, MS-Office, SQL Architect, Hadoop, Hive, Pig, TOAD Benchmark Factory, Sqoop, SQL Loader, AWS S3, PL/SQL, SharePoint, SQL Server 2008/2012

Confidential, Santa Clara, CA

Sr. Data Architect/Data Modeler

Responsibilities:

  • Designed and implemented a Data Lake to consolidate data from multiple sources, using Hadoop stack technologies like SQOOP and HIVE/HQL (see the HiveQL sketch after this list).
  • Interacted with Business Analysts to gather the user requirements and participated in data modeling JAD sessions.
  • Extensively used Agile methodology as the Organization Standard to implement the data Models.
  • Developed Data mapping, Transformation and Cleansing rules for the Data Management involving OLTP and OLAP.
  • Implemented forward engineering to create tables, views, SQL scripts, and mapping documents.
  • Led the project definition, scoping, business analytics and optimization, and presentations to the Architectural Review Board and the newly created Data Governance Board.
  • Worked with data compliance and data governance teams to maintain data models, Metadata, and Data Dictionaries, and defined source fields and their definitions.
  • Performed ETL processing using Pig and Hive on AWS EMR and S3.
  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
  • Worked with Hadoop eco system covering HDFS, HBase, YARN and Map Reduce
  • Reverse engineered the existing data marts and identified the Data Elements (in the source systems), Dimensions, Facts, and Measures required for reports.
  • Conducted design discussions and meetings to arrive at the appropriate Data Warehouse design at the lowest level of grain for each of the Dimensions involved.
  • Performed the Data Accuracy, Data Analysis, Data Quality checks before and after loading the data.
  • Installed and configured a 3-node cluster on AWS EC2 Linux servers.
  • Used SAS procedures such as MEANS and FREQ and other statistical calculations for data validation.
  • Developed the data warehouse model (Kimball's) with multiple data marts with conformed dimensions for the proposed central model of the Project.
  • Designed a STAR schema for sales data involving shared dimensions (Conformed) for other subject areas using E/R Studio Data Modeler.
  • Developed rule sets for data cleansing and actively participated in data cleansing and anomaly resolution of the legacy application.
  • Involved in data analysis, data discrepancy reduction in the source and target schemas.
  • Documented all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Validated and updated the appropriate LDMs to reflect process mappings, screen designs, use cases, the business object model, and the system object model as they evolved and changed.
  • Developed and deployed quality T-SQL codes, stored procedures, views, functions, triggers and jobs.
  • Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Worked on Data Mining and data validation to ensure the accuracy of the data between the warehouse and source systems.
  • Performed Data Manipulation using MS Excel Pivot Sheets and produced various charts for creating the mock reports.
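
As a hedged illustration of the Sqoop/Hive data-lake pattern noted in the first bullet above, the HiveQL below exposes Sqoop-imported files as an external table and runs a simple check; the HDFS path, table name, and columns are assumptions made up for the example.

    -- Expose Sqoop-imported delimited files in the data lake as a Hive external table.
    CREATE EXTERNAL TABLE IF NOT EXISTS stg_orders (
        order_id     BIGINT,
        customer_id  BIGINT,
        order_date   STRING,
        order_amount DECIMAL(12,2)
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION '/data/lake/raw/orders';

    -- Quick profiling query on the newly exposed data.
    SELECT order_date, COUNT(*) AS order_cnt, SUM(order_amount) AS total_amount
    FROM stg_orders
    GROUP BY order_date;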

Environment: ER/Studio v17, SQL, PL/SQL, Agile, OLAP, OLTP, Data Mart, T-SQL, SAS, Oracle 12c, HDFS, HBase, Hadoop, YARN.

Confidential, Mt Laurel, NJ

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
  • Part of team conducting logical data analysis and data modeling JAD sessions, communicated data-related standards.
  • Performed Reverse Engineering of the current application using Erwin, and developed Logical and Physical data models for Central Model consolidation.
  • Translated logical data models into physical database models, generated DDLs for DBAs
  • Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
  • Involved in extensive data validation by writing several complex SQL queries, and involved in back-end testing and working through data quality issues (a sample validation query appears after this list).
  • Collected, analyzed, and interpreted complex data for reporting and/or performance trend analysis.
  • Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
  • Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access.
  • Involved in writing T-SQL working on SSIS, SSRS, SSAS, Data Cleansing, Data Scrubbing and Data Migration.
  • Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of Data Analysis responsibilities.
  • Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
  • Wrote complex SQL queries for validating the data against different kinds of reports generated by Business Objects XIR2.
  • Wrote SQL scripts to test the mappings and developed a Business Requirements Traceability Matrix.
  • Created SQL tables with referential integrity, constraints and developed queries using SQL, SQL*PLUS and PL/SQL.
  • Performed GAP analysis of current state to desired state and document requirements to control the gaps identified.
  • Developed the batch program in PL/SQL for the OLTP processing and used UNIX shell scripts to run it in crontab.
  • Identified and recorded defects with the required information for issues to be reproduced by the development team.
  • Worked on the reporting requirements and was involved in generating the reports for the Data Model using Crystal Reports.
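
A hedged sketch of the kind of source-to-target validation query referenced above; the schemas, tables, and the txn_amount column are hypothetical and only illustrate the row-count and total comparison.

    -- Compare row counts and amount totals between the source system and the warehouse target.
    SELECT 'SOURCE' AS layer, COUNT(*) AS row_cnt, SUM(txn_amount) AS total_amount
      FROM src_schema.transactions
    UNION ALL
    SELECT 'TARGET' AS layer, COUNT(*) AS row_cnt, SUM(txn_amount) AS total_amount
      FROM dw_schema.fact_transactions;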

Environment: Erwin 9.1, SQL, MS Excel 2012, Teradata r13, MS Access 2012, T-SQL, SSIS, SSRS, PL/SQL, UNIX

Confidential, Broadway, NY

Data Architect

Responsibilities:

  • Created dimensional model for the reporting system by identifying required dimensions and facts using Erwin
  • Participated in design discussions and assured functional specifications are delivered in all phases of SDLC in an Agile Environment.
  • Created a Data Governance Board, implemented Team Foundation Services (TFS), and integrated Salesforce data with BI platforms.
  • Developed and configured an Informatica MDM hub to support the Master Data Management (MDM), Business Intelligence (BI), and Data Warehousing platforms to meet business needs.
  • Worked with Business users for requirements gathering, business analysis and project coordination.
  • Created Logical & Physical Data Modeling on Relational (OLTP), Dimensional Data Modeling (OLAP) on Star schema for Fact & Dimension tables using Erwin.
  • Worked with data compliance teams, data governance team to maintain data models, Metadata, data Dictionaries, define source fields and its definitions.
  • Worked on Performance Tuning of the database which includes indexes, optimizing SQL Statements.
  • Wrote DDL and DML statements for creating, altering tables and converting characters into numeric values.
  • Integrated various sources into the staging area of the data warehouse to integrate and cleanse the data.
  • Worked on importing and cleansing high-volume data from various sources such as Teradata, flat files, and SQL Server.
  • Used Flume extensively in gathering and moving log data files from Application Servers to a central location in Hadoop Distributed File System (HDFS) for data science.
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Prepared complex T-SQL queries, views, and stored procedures to load data into the staging area (see the T-SQL sketch after this list).
  • Monitored data quality and maintained data integrity to ensure effective functioning of the department.
  • Connected to AWS Redshift through Tableau to extract live data for real time analysis.
  • Translated business concepts into XML vocabularies by designing XML Schemas with UML.
  • Performed data analysis and data profiling using complex SQL on various source systems including Oracle.
  • Developed Data Mapping, Data Governance, and Transformation and cleansing rules for the Master Data Management Architecture involving OLTP, ODS.
  • Designed and Developed PL/SQL procedures, functions and packages to create Summary tables.
  • Loaded data into Hive Tables from Hadoop Distributed File System (HDFS) to provide SQL-like access on Hadoop data.
  • Worked with MDM systems team with respect to technical aspects and generating reports.
  • Performed thorough data analysis for the purpose of overhauling the database using SQL Server.
  • Worked closely with the ETL SSIS Developers to explain the complex Data Transformation using Logic.
  • Enhanced all Business Intelligence and Risk Analytics and architected an in-house developed Master Data Management system.
  • Converted existing Hive queries to Spark SQL queries to reduce execution time.
  • Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
  • Developed and presented Business Intelligence reports and product demos to the team using SSRS (SQL Server Reporting Services).
  • Worked on Normalization and De-normalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon approaches.
  • Developed Map Reduce programs to cleanse the data in HDFS obtained from heterogeneous data sources to make it suitable for ingestion into Hive schema for analysis.
  • Created data flow, process documents and ad-hoc reports to derive requirements for existing system enhancements.
  • Used Microsoft Excel tools like pivot tables, graphs, charts, solver to perform quantitative analysis.
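
A hedged T-SQL sketch of the staging-load work described above; the stg and src schemas, object names, and columns are assumptions made for illustration.

    -- Reload a staging table from a source view with basic cleansing.
    CREATE PROCEDURE dbo.usp_Load_Stg_Customer
    AS
    BEGIN
        SET NOCOUNT ON;

        TRUNCATE TABLE stg.Customer;

        INSERT INTO stg.Customer (CustomerID, CustomerName, Region, LoadDate)
        SELECT  s.CustomerID,
                LTRIM(RTRIM(s.CustomerName)),   -- trim stray whitespace
                s.Region,
                GETDATE()
        FROM    src.vw_Customer AS s
        WHERE   s.CustomerID IS NOT NULL;
    END;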

Environment: Erwin 9.7, Teradata r15, Oracle 12c, SQL, PL/SQL, AWS, OLTP, OLAP, MDM, T-SQL, SSIS, SSRS, UAT, ETL, Hadoop, HBase, Hive, Agile, XML, MS Excel 2017.

Confidential, Greensboro, NC

Data Modeler

Responsibilities:

  • Conducted Design reviews with the business analysts and content developers to create a proof of concept for the reports.
  • Initiated and conducted JAD sessions inviting various teams to finalize the required data fields and their formats.
  • Interacted with Business Analysts to gather the user requirements and participated in data modeling sessions.
  • Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
  • Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
  • Worked with the Application Development team to implement data strategies, build data flows and develop data models
  • Created conceptual, logical and physical data models using best practices to ensure high data quality and reduced redundancy.
  • Designed and developed Oracle PL/SQL procedures and UNIX shell scripts for data import/export and data conversions.
  • Developed Data mapping, Data Governance, Transformation and Cleansing rules for the Data Management involving OLTP and OLAP.
  • Generated various reports using SQL Server Reporting Services (SSRS) for business analysts and the management team.
  • Used Load utilities (Fast Load & Multi Load) with the mainframe interface to load the data into Teradata.
  • Optimized and updated UML Models (Visio) and Relational Data Models for various applications.
  • Created Data stage jobs (ETL Process) for populating the data into the Data warehouse constantly from different source systems.
  • Performed extensive Data Validation, Data Verification against Data Warehouse and performed debugging of the SQL-Statements and stored procedures for business scenarios.
  • Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
  • Identified and tracked the slowly changing dimensions and determined the hierarchies in dimensions.
  • Developed data marts for the base data in Star and Snowflake schemas as part of developing the data warehouse for the database.
  • Created data masking mappings to mask the sensitive data between production and test environments (see the sketch after this list).
  • Normalized the database based on the newly developed model to bring it into 3NF for the data warehouse.
  • Created DDL scripts using Erwin and source to target mappings to bring the data from source to the warehouse.
  • Maintained data consistency and database integrity and attended team meetings to identify requirements for data loading and reporting.
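
A hedged, Oracle-style sketch of the kind of masking rule applied when refreshing test data from production; the table, columns, and masking formats are hypothetical.

    -- Mask sensitive columns on the test copy while keeping values analytically useful.
    UPDATE test_schema.customer
       SET ssn        = 'XXX-XX-' || SUBSTR(ssn, -4),          -- keep only last 4 digits
           email      = 'user' || customer_id || '@example.com',
           birth_date = TRUNC(birth_date, 'YYYY');             -- keep year only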

Environment: Erwin 9.2, Oracle 12c, SQL, PL/SQL, UNIX, 3NF, OLTP, OLAP, ETL, SSRS, UML.

Confidential, Chicago, IL

Data Analyst

Responsibilities:

  • Worked extensively in data analysis by querying in SQL and generating various PL/SQL objects.
  • Worked with Business Analysts team in requirements gathering and in preparing functional specifications and translating them to technical specifications.
  • Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
  • Involved in designing and developing SQL server objects such as Tables, Views, Indexes (Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL.
  • Involved in database development by creating Oracle PL/SQL Functions, Procedures and Collections.
  • Designed data process flows using Informatica to source data into Statements database on Oracle platform.
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Analyzed business requirements, system requirements, and data mapping requirement specifications.
  • Developed PL/SQL scripts to validate and load data into interface tables (see the sketch after this list).
  • Extensively used MS Access to pull the data from various databases and integrate the data.
  • Prepared business case for the data mart and then developed and deployed it.
  • Modified databases via table indexing, MS SQL Server 2005 design parameters, and stored procedure SQL code optimization.
  • Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of Data Analysis responsibilities.
  • Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
  • Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
  • Responsible for different Data mapping activities from Source systems to Teradata
  • Performed ad-hoc analyses as needed, with the ability to interpret and explain the analysis.
  • Involved in Teradata SQL development, unit testing, and performance tuning, and ensured testing issues were resolved using defect reports.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing.
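
A hedged PL/SQL sketch of the validate-and-load pattern referenced above; STG_TRANSACTIONS, INTF_TRANSACTIONS, and the validation rules are hypothetical placeholders.

    -- Validate staged rows and load the good ones into an interface table.
    DECLARE
        v_loaded   PLS_INTEGER := 0;
        v_rejected PLS_INTEGER := 0;
    BEGIN
        FOR rec IN (SELECT account_id, txn_amount, txn_date FROM stg_transactions) LOOP
            IF rec.account_id IS NOT NULL AND rec.txn_amount >= 0 THEN
                INSERT INTO intf_transactions (account_id, txn_amount, txn_date, load_date)
                VALUES (rec.account_id, rec.txn_amount, rec.txn_date, SYSDATE);
                v_loaded := v_loaded + 1;
            ELSE
                v_rejected := v_rejected + 1;
            END IF;
        END LOOP;

        DBMS_OUTPUT.PUT_LINE('Loaded: ' || v_loaded || ', Rejected: ' || v_rejected);
        COMMIT;
    END;
    /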

Environment: SQL, PL/SQL, Oracle 10g, MS Access 2010, MS SQL Server 2010, Teradata r12
