
Sr. Data Architect/modeler Resume

Pittsburgh, PA

SUMMARY:

  • 7 years of Information Technology experience as a Data Architect/Modeler/Analyst, designing, developing, and implementing data models for enterprise-level applications and systems.
  • Experienced in integrating various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL stores, COBOL, XML, and flat files into a Netezza database.
  • Experience in BI/DW solutions (ETL, OLAP, data marts) with Informatica and BI reporting tools such as Tableau and QlikView; experienced leading teams of application, ETL, and BI developers and testers.
  • Experienced in technical consulting and end-to-end delivery covering architecture, data modeling, data governance, and solution design, development, and implementation.
  • Logical and physical database design (tables, constraints, indexes, etc.) using Erwin, ER Studio, Toad Data Modeler, and SQL Modeler.
  • Work on background processes in Oracle architecture; also drill down to the lowest levels of systems design and construction.
  • Experience in dimensional data modeling, Star/Snowflake schemas, and fact and dimension tables.
  • Good understanding and hands-on experience with Azure and AWS.
  • Heavy use of Access queries, VLOOKUP, formulas, pivot tables, etc. Working knowledge of CRM automation with Salesforce.com and SAP.
  • Highly skilled in writing parameterized queries for generating parameterized reports and subreports using global variables, expressions, and functions, sorting data, and defining data sources and subtotals for reports using SSRS 2016/2014/2012/2008 R2/2008.
  • Specifies the overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, MDM, data warehouse, data provisioning, ETL, and BI.
  • Assist in creating communication materials based on data for key internal /external audiences.
  • Skillful in data analysis using SQL on Oracle, MS SQL Server, DB2, and Teradata.
  • Extensive knowledge in creating complex SSIS packages for ETL purposes; implemented complicated transformations for use in SSIS packages.
  • Expertise in relational data modeling (3NF) and dimensional data modeling.
  • Practical understanding of dimensional and relational data modeling concepts such as star-schema modeling, snowflake-schema modeling, and fact and dimension tables (a minimal star-schema DDL sketch follows this list).
  • Expertise in developing SSIS packages, using data transformations, and deploying them on various production servers.
  • Proficient in creating subreports and defining queries for generating drill-down reports using MS SQL Server Reporting Services (SSRS).
  • Full life-cycle data warehousing project leadership: business-driven requirements gathering, capacity planning, feasibility analysis, enterprise and solution architecture, design, construction, data quality, profiling and cleansing, source-target mapping, gap analysis, data integration/ETL, SOA, ODA, data marts, Inmon/Kimball methodology, data modeling for OLTP, canonical modeling, and dimensional modeling for data warehouse star/snowflake design.
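
To make the dimensional-modeling items above concrete, below is a minimal star-schema sketch in generic ANSI SQL. The retail subject area, table names, and columns are hypothetical, invented purely for illustration:

    -- Two dimension tables and one fact table form the "star":
    -- the fact holds measures plus foreign keys to each dimension.
    CREATE TABLE dim_customer (
        customer_key   INTEGER      PRIMARY KEY,  -- surrogate key
        customer_id    VARCHAR(20)  NOT NULL,     -- natural/business key
        customer_name  VARCHAR(100),
        region         VARCHAR(50)
    );

    CREATE TABLE dim_date (
        date_key       INTEGER      PRIMARY KEY,  -- e.g. 20240131
        calendar_date  DATE         NOT NULL,
        fiscal_quarter VARCHAR(6)
    );

    CREATE TABLE fact_sales (
        customer_key   INTEGER NOT NULL REFERENCES dim_customer (customer_key),
        date_key       INTEGER NOT NULL REFERENCES dim_date (date_key),
        sales_amount   DECIMAL(12,2),
        units_sold     INTEGER,
        PRIMARY KEY (customer_key, date_key)
    );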

TECHNICAL SKILLS:

Tools: OBIEE 10g/11g/12c, SAP ECC6 EHP5, GoToMeeting, DocuSign, InsideSales.com, SharePoint, MATLAB.

Testing and Defect Tracking Tools: HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center), RequisitePro, MS Visio & Visual SourceSafe

Operating System: Windows, Unix, Sun Solaris

Data Modeling Tools: Erwin r6/r9, Rational System Architect, IBM InfoSphere Data Architect, ER Studio, and Oracle Designer.

Database Tools: Microsoft SQL Server 12.0, Teradata 15.0, Oracle 12c/11g/9i, and MS Access

BI Tools: Tableau 7.0/8.2, Tableau Server 8.2, Tableau Reader 8.1, SAP Business Objects, Crystal Reports

Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, SharePoint Portal Server

Version Control Tools: Git, SVN

Project Execution Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD)

Quality Assurance Tools: WinRunner, LoadRunner, Test Director, QuickTest Pro, Quality Center, Rational Functional Tester.

RDBMS: Microsoft SQL Server 14.0, Teradata 15.0, Oracle 12c/11g/10g/9i, and MS Access

ETL/Data Warehouse Tools: Informatica 9.6/9.1/8.6.1/8.1, SAP Business Objects XI R3.1/XI R2, Web Intelligence, Talend, Tableau 8.2, Pentaho

Big Data Technologies: Pig, Hive, Hadoop

PROFESSIONAL EXPERIENCE:

Confidential, Pittsburgh, PA

Sr. Data Architect/Modeler

  • Working on the Regulatory Compliance IT team in a Data Architect role involving data profiling, data modeling, ETL architecture, and Oracle DBA work.
  • Work on data profiling and analysis to create test cases for new architecture evaluation.
  • Responsible for Big data initiatives and engagement including analysis, brainstorming, POC, and architecture.
  • Designed the logical data model using Erwin 9.64, with entities and attributes for each subject area.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Developed data marts for the base data in Star and Snowflake schemas; involved in developing the data warehouse for the database.
  • Managed multiple ETL development teams for business intelligence and Master data management initiatives.
  • Hands-on cloud computing using Microsoft Azure with various BI technologies.
  • Involved in creating Hive tables and loading and analyzing data using Hive queries; developed Hive queries to process the data and generate data cubes for visualization.
  • Responsible for creating conceptual, logical, and physical data models of disparate information for report development.
  • Implemented join optimizations in Pig using skewed and merge joins for large-dataset schemas.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models.
  • Designed and developed a Data Lake using Hadoop for processing raw and processed claims via Hive and Informatica.
  • Developed and implemented different Pig UDFs to write ad-hoc and scheduled reports as required by the Business team.
  • Designed the Big Data platform technology architecture, with a scope including data intake, data staging, data warehousing, and a high-performance analytics environment.
  • Involved in the process of adding a new Datacenter to existing Cassandra Cluster.
  • Used PolyBase for the ETL/ELT process with Azure SQL Data Warehouse to keep data in Blob Storage with almost no limitation on data volume (a PolyBase sketch follows this list).
  • Involved in designing logical and physical data models for different database applications using Erwin.
  • Data modeling, design, implementation, and deployment of high-performance custom applications at scale on Hadoop.
  • Implemented Data Integrity and Data Quality checks in Hadoop using Hive and Linux scripts.
  • Reverse engineered some of the databases using Erwin.
  • Created the template SSIS package that replicates about 200 processes to load data using Azure SQL.
  • Involved in loading data from the Linux file system into HDFS; imported and exported data between HDFS and Hive using Sqoop; implemented partitioning, dynamic partitions, and buckets in Hive (see the Hive sketch after this list).
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Specified the overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, MDM, data warehouse, data provisioning, ETL, and BI.
  • Exploring NoSQL options for the current back end using Azure Cosmos DB (SQL API).
  • Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
  • Involved in normalization/denormalization techniques for optimum performance in relational and dimensional database environments.
  • Implemented strong referential integrity and auditing through triggers and SQL scripts (an audit-trigger sketch follows this list).
  • Designed and developed T-SQL stored procedures to extract, aggregate, transform, and insert data.
  • Created and maintained SQL Server scheduled jobs, executing stored procedures for the purpose of extracting data from DB2 into SQL Server.
  • Developed SQL Stored procedures to query dimension and fact tables in data warehouse.
  • Experience with SQL Server Reporting Services (SSRS) to author, manage, and deliver both paper-based and interactive Web-based reports.
  • Performed migration and merging of RPDs in OBIEE.
  • Performed Hive programming for applications that were migrated to big data using Hadoop.
  • Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
  • Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
  • Created external and managed tables in Hive and used them appropriately in the different Pig scripts required for reporting.
  • Generate comprehensive analytical reports by running SQL queries against current databases to conduct data analysis pertaining to various loan products.
  • Proficient in SQL across a number of dialects, commonly writing MySQL, PostgreSQL, Redshift, SQL Server, and Oracle.
  • Routinely work with large internal and vendor data sets, performing performance tuning, query optimization, and production support for SAS and Oracle 12c.
  • Coordinating with Client and Business Analyst to understand and develop OBIEE reports.
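
A minimal sketch of the PolyBase ELT pattern mentioned above, in T-SQL for Azure SQL Data Warehouse. The storage account, data source, file format, and table names are hypothetical, and a database-scoped credential (blob_credential) is assumed to already exist:

    -- External data source pointing at a Blob Storage container.
    CREATE EXTERNAL DATA SOURCE blob_landing
    WITH (
        TYPE = HADOOP,
        LOCATION = 'wasbs://landing@examplestorage.blob.core.windows.net',
        CREDENTIAL = blob_credential
    );

    -- File format for comma-delimited files with a header row.
    CREATE EXTERNAL FILE FORMAT csv_format
    WITH (
        FORMAT_TYPE = DELIMITEDTEXT,
        FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2)
    );

    -- External table over the files; no data is moved yet.
    CREATE EXTERNAL TABLE ext_claims (
        claim_id     BIGINT,
        claim_amount DECIMAL(12,2)
    )
    WITH (LOCATION = '/claims/', DATA_SOURCE = blob_landing, FILE_FORMAT = csv_format);

    -- ELT step: CTAS pulls the external data into a distributed internal table.
    CREATE TABLE stg_claims
    WITH (DISTRIBUTION = HASH(claim_id))
    AS SELECT * FROM ext_claims;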
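
A sketch of the Hive partitioning and bucketing mentioned above, in HiveQL; the claims table and its columns are hypothetical:

    -- Managed table partitioned by load date and bucketed by claim_id.
    CREATE TABLE claims (
        claim_id     BIGINT,
        member_id    BIGINT,
        claim_amount DECIMAL(12,2)
    )
    PARTITIONED BY (load_date STRING)
    CLUSTERED BY (claim_id) INTO 32 BUCKETS
    STORED AS ORC;

    -- Dynamic partitioning: Hive derives load_date from the staging rows,
    -- so the partition column goes last in the SELECT list.
    SET hive.exec.dynamic.partition.mode=nonstrict;
    INSERT OVERWRITE TABLE claims PARTITION (load_date)
    SELECT claim_id, member_id, claim_amount, load_date
    FROM claims_staging;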
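
And a minimal T-SQL sketch of auditing via triggers; the Loan table, its columns, and the audit table are all hypothetical:

    -- Audit table capturing who changed a loan's status, and when.
    CREATE TABLE dbo.Loan_Audit (
        loan_id    INT,
        old_status VARCHAR(20),
        new_status VARCHAR(20),
        changed_by SYSNAME   DEFAULT SUSER_SNAME(),
        changed_at DATETIME2 DEFAULT SYSUTCDATETIME()
    );
    GO

    CREATE TRIGGER trg_Loan_Audit
    ON dbo.Loan
    AFTER UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- "deleted" holds the pre-update rows, "inserted" the post-update rows.
        INSERT INTO dbo.Loan_Audit (loan_id, old_status, new_status)
        SELECT d.loan_id, d.status, i.status
        FROM deleted d
        JOIN inserted i ON i.loan_id = d.loan_id
        WHERE i.status <> d.status;  -- audit only genuine status changes
    END;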

Environment: DB2, CA Erwin 9.64, Oracle 12c, MS Office, SQL Architect, TOAD Benchmark Factory, SQL*Loader, PL/SQL, SharePoint, Talend, SQL Server 2008/2012, Hive, Pig, Hadoop, Azure.

Confidential, Chicago, IL

Data Architect/Modeler

  • Responsible for delivering and coordinating data profiling, data analysis, data governance, data models (conceptual, logical, physical), data mapping, data lineage, and reference data management.
  • Worked on normalization and denormalization concepts and design methodologies such as Ralph Kimball's and Bill Inmon's data warehouse methodologies.
  • Involved in data profiling and data cleansing, making sure the data is accurate and analyzed as it moves from OLTP to the data marts and data warehouse.
  • Generated DDL (Data Definition Language) scripts using Erwin and assisted the DBA in the physical implementation of data models.
  • Designed both 3NF data models for ODS, OLTP systems and dimensional data models using Star and Snow Flake Schemas.
  • Used forward engineering to generate DDL from the Physical Data Model and handed it to the DBA.
  • Worked with the Data Vault methodology; developed normalized logical and physical database models.
  • Worked with Architecture team to get the metadata approved for the new data elements that are added for this project.
  • Extensively used Erwin r9.6 for Data modeling. Created Staging and Target Models for the Enterprise Data Warehouse.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN.
  • Worked with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Worked on a POC comparing the processing time of Impala with Apache Hive for batch applications, in order to implement the former in the project.
  • Created BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts for extracting data from various production systems.
  • Wrote JIL scripts to manage the scheduled loading of sample data into the various databases and their respective objects, through a combination of shell and SQL scripts.
  • Generated Sybase SQL scripts that included creating tables, primary keys, and foreign keys.
  • Created complex stored procedures, functions, triggers, indexes, tables, views, and SQL joins for applications.
  • Troubleshooting performance issues and fine-tuning queries and stored procedures.
  • Generated reports to retrieve data using database code objects such as stored procedures, views, functions, and T-SQL queries.
  • Involved in normalization and denormalization of existing tables for faster query retrieval; designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Developed several behavioral reports and data points creating complex SQL queries and stored procedures using SSRS and Excel.
  • Generated periodic reports based on the statistical analysis of the data using SQL Server Reporting Services (SSRS).
  • Generated reports using Global Variables, Expressions and Functions using SSRS.
  • Developed different kind of reports such as Drill down, Drill through, Sub Reports, Charts, Matrix reports, Parameterized reports and Linked reports using SSRS.
  • Created Hive tables to process advisor performance data in HDFS and exported it to downstream databases for various types of analyses using Sqoop.
  • Developed Hive scripts (using partitions, joins, and buckets) to process the data for analysis (e.g., quarterly growth of products, area-wise sales, and price predictions for new products based on market needs).
  • Used Data Services to move data into Hive and Hadoop clusters.
  • Developed Hive and MapReduce tools to design and manage HDFS data blocks and data distribution methods.
  • Implemented the Data Vault modeling concept, which handles change in the environment by separating the business keys and the associations between those business keys from the descriptive attributes of those keys, using hub and link tables and satellites (a hub/link/satellite DDL sketch follows this list).
  • Worked on AWS Redshift and RDS, implementing models and data on RDS and Redshift (see the Redshift sketch after this list).
  • Designed ER diagrams (physical and logical, using Erwin), mapped the data into database objects, identified the facts and dimensions from the business requirements, and developed the logical and physical models using Erwin.
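
A minimal DDL sketch of the hub/link/satellite separation described above, in generic SQL; the customer/account subject area and the hash-key columns are hypothetical:

    -- Hub: the business key alone, plus load metadata.
    CREATE TABLE hub_customer (
        customer_hkey CHAR(32)    PRIMARY KEY,  -- hash of the business key
        customer_id   VARCHAR(20) NOT NULL,     -- business key
        load_dts      TIMESTAMP   NOT NULL,
        record_source VARCHAR(50) NOT NULL
    );

    -- Link: only the association between business keys.
    CREATE TABLE link_customer_account (
        link_hkey     CHAR(32)  PRIMARY KEY,
        customer_hkey CHAR(32)  NOT NULL REFERENCES hub_customer (customer_hkey),
        account_hkey  CHAR(32)  NOT NULL,       -- references hub_account (omitted)
        load_dts      TIMESTAMP NOT NULL,
        record_source VARCHAR(50) NOT NULL
    );

    -- Satellite: descriptive attributes, versioned by load timestamp,
    -- so changes never rewrite the keys or the associations.
    CREATE TABLE sat_customer (
        customer_hkey CHAR(32)  NOT NULL REFERENCES hub_customer (customer_hkey),
        load_dts      TIMESTAMP NOT NULL,
        customer_name VARCHAR(100),
        address       VARCHAR(200),
        PRIMARY KEY (customer_hkey, load_dts)
    );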
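
For the Redshift work, physical design largely comes down to choosing distribution and sort keys; a hypothetical fact table might be declared as follows:

    -- Hash-distribute on the join key so fact rows co-locate with the
    -- matching dimension rows; sort by date for range-restricted scans.
    CREATE TABLE fct_sales (
        customer_key INT NOT NULL,
        date_key     INT NOT NULL,
        sales_amount DECIMAL(12,2)
    )
    DISTSTYLE KEY
    DISTKEY (customer_key)
    SORTKEY (date_key);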

Environment: AWS, SSRS, Big Data, JDBC, NOSQL, Hive, Pig, Python, Hadoop, MySQL, PostgreSQL, SQL Server, Erwin, Informatica.

Confidential, Dayton, OH

Sr. Data Analyst/Modeler

  • Analyzed the physical data model to understand the relationships between existing tables; cleansed unwanted tables and columns per the requirements as part of the Data Analyst role.
  • Developed data marts for the base data in Star and Snowflake schemas; involved in developing the data warehouse for the database.
  • Worked on unit testing for three reports and created SQL test scripts for each report as required.
  • Extensively used ER Studio as the main modeling tool, along with Visio.
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Involved in understanding and creating logical and physical data models using the ER Studio tool.
  • Created dynamic SSRS reports with the help of parameters that change according to user input (a parameterized dataset query sketch follows this list).
  • Created various SSRS dashboard reports for commission, sales and profit analysis.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ER Studio.
  • Worked on the Metadata Repository (MRM), keeping the definitions and mapping rules up to date.
  • Trained a couple of colleagues on the Spotfire tool and gave guidance in creating Spotfire visualizations.
  • Developed Contracting Business Process Model Workflows (current / future state) using Bizagi Process Modeler software.
  • Created a data mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Configured and developed triggers, workflows, and validation rules, and handled the deployment process from one sandbox to another.
  • Modified cubes and dimensions, deployed KPIs on SSRS, created different metrics, and added new measure groups and dimensions.
  • Managed Logical and Physical Data Models in ER Studio Repository based on the different subject area requests for integrated model.
  • Created automatic field updates via workflows and triggers to satisfy internal compliance requirement of stamping certain data on a call during submission.
  • Developed enhancements to the MongoDB architecture to improve performance and scalability.
  • Forward-engineered new data models, reverse-engineered existing data models, and updated data models as needed.
  • Performed data cleaning and data manipulation activities using NZSQL utility.
  • Analyzed and understood the architectural design of the project step by step, along with the data flow.
  • Created DDL scripts for implementing data modeling changes; created Erwin reports in HTML and RTF formats as required, published the data model in the model mart, created naming convention files, and coordinated with DBAs to apply the data model changes.
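
A sketch of the kind of parameterized dataset query that backs dynamic SSRS reports like those above; @Region and @StartDate would be bound to report parameters, and the Sales table and its columns are hypothetical:

    -- SSRS substitutes the user's parameter choices at run time.
    SELECT s.sale_id,
           s.sale_date,
           s.region,
           s.commission_amount
    FROM   dbo.Sales AS s
    WHERE  s.region    = @Region
      AND  s.sale_date >= @StartDate
    ORDER BY s.sale_date;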

Environment: Oracle SQL Developer, Oracle Data Modeler, Teradata 14, SSIS, Business Objects, SQL Server 2008, ER/Studio, Windows, MS Excel.

Confidential

Data Analyst

  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting as well as writing data.
  • Optimized existing procedures and SQL statements for better performance, using EXPLAIN PLAN, hints, and SQL Trace to tune SQL queries (an EXPLAIN PLAN sketch follows this list).
  • Developed interfaces able to connect to multiple databases, such as SQL Server and Oracle.
  • Assisted Kronos project team in SQL Server Reporting Services installation.
  • Developed SQL Server database to replace existing Access databases.
  • Attended and participated in information and requirements gathering sessions.
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Designed and created web applications to receive query string input from customers and facilitate entering the data into SQL Server databases.
  • Performed thorough data analysis for the purpose of overhauling the database using SQL Server.
  • Designed and implemented business intelligence to support sales and operations functions to increase customer satisfaction.
  • Converted physical database models from logical models, to build/generate DDL scripts.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Extensively used ETL to load data from DB2 and Oracle databases.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Expertise in and hands-on work with physical, logical, and conceptual data models.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
  • Extensively used ETL methodology to support data extraction, transformation, and loading in a complex EDW using Informatica.
  • Worked extensively with Star Schema, DB2, and IMS DB.
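
A minimal sketch of the Oracle EXPLAIN PLAN tuning workflow referenced above; the loans table and the index named in the hint are hypothetical:

    -- Capture the optimizer's plan for a candidate query...
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(l loan_status_idx) */ l.loan_id, l.status
    FROM   loans l
    WHERE  l.status = 'ACTIVE';

    -- ...then display it to verify the hinted index is actually used.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);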

Environment: Oracle, PL/SQL, DB2, ERWIN, UNIX, Teradata SQL Assistant, Informatica, OLTP, OLAP, Data Marts, DQ analyzer.
