
Sr. Data Architect/Data Modeler Resume


Chicago, IL

SUMMARY:

  • 9+ years of total IT experience and expertise in data modeling for data warehouse/data mart development, Data Architecture, Data Analysis, SQL, and analysis of Online Transaction Processing (OLTP), data warehouse (OLAP), and Business Intelligence (BI) applications.
  • Extensive IT experience in Data Modeling, Data Warehouse Design, ER Modeling, Business Object Modeling, and Dimensional Modeling (Star Schema and Snowflake Schema), with a good understanding of Online Transaction Processing and Online Analytical Processing (OLTP & OLAP).
  • Extensive knowledge of big data, Hadoop, MapReduce, Hive, NoSQL databases, and other emerging technologies.
  • Experienced working with AWS services, including S3 and Redshift.
  • Business Intelligence: requirements analysis, Key Performance Indicators (KPIs), metrics development, sourcing and gap analysis, OLAP concepts and methods, aggregates/materialized views and performance, rapid prototyping, tool selection, and semantic layers. Excellent experience writing SQL queries to validate data movement between layers in a data warehouse environment.
  • Experienced in ER diagramming, dimensional data modeling, logical/physical design, Star Schema modeling, and Snowflake modeling using tools such as Erwin and ER Studio; proficient in business analysis, end-user analytics, the development lifecycle, ETL, QA and testing, production support, and performance tuning.
  • Experienced in importing and exporting data between HDFS and relational database systems/mainframes using Sqoop.
  • Excellent knowledge of the waterfall and spiral methodologies of the Software Development Life Cycle (SDLC).
  • Extensive experience with data modeling and data warehousing: analysis, design, development, customization, and implementation of software applications, including ERP applications.
  • Experienced in Extract, Transform, and Load (ETL) of data from spreadsheets, database tables, and other sources using Informatica; well versed in writing SQL queries to perform end-to-end ETL validations and support ad-hoc business requests.
  • Experienced with distributed data warehousing and/or data mining systems using one or more Big Data/NoSQL technologies (Hadoop, Hive, HBase, Pig, Cassandra, MongoDB, etc.).
  • Expertise in building Enterprise Data Warehouses and data warehouse appliances from scratch using both the Kimball and Inmon approaches.
  • Experience creating derived fields using calculated functions and parameters in Tableau.
  • Efficient in dimensional data modeling for data mart design, identifying facts and dimensions, with a strong understanding of the principles of data warehousing, fact tables, dimension tables, and Slowly Changing Dimensions (SCD) Type I and Type II (see the SCD Type II sketch after this list).
  • Expertise in optimization, writing enterprise SQL queries using SAS SQL procedures.
  • Excellent experience working on Netezza/Teradata, writing complex SQL queries, with expertise in Teradata utilities such as FastLoad, MultiLoad, BTEQ, and Teradata SQL Assistant.
  • Excellent experience creating Erwin reports in HTML and RTF formats as required, publishing data models to the model mart, creating naming-convention files, and coordinating with DBAs to apply data model changes.
  • Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS).
  • Expertise in Normalization (1NF, 2NF, 3NF, and BCNF) and Denormalization techniques for effective and optimum performance in OLTP and OLAP environments.
  • Expertise in creating database objects and structures including databases, tables, indexes, views, snapshots, stored procedures, triggers, functions and cursors.
  • Experienced in developing T-SQL scripts and stored procedures to perform administrative tasks and DDL, DML, and DCL activities to carry out business requirements.
  • Well versed in system analysis, ER/Dimensional Modeling, Database design and implementing RDBMS specific features.
  • Expertise in dealing with different data sources ranging from flat files, Netezza, Teradata, Excel, Oracle, SQL Server and expert in writing SQL and PL/SQL scripts in Data warehousing applications.
  • Experienced in creating, managing, and delivering server-based reports with interactive views that provide valuable insight for business heads, using SSMS dashboards and SQL Server Reporting Services (SSRS).
  • Experienced in using dynamic management views (DMVs) and dynamic management functions (DMFs), T-SQL scripts, triggers, and stored procedures for troubleshooting and performance tuning, including identifying long-running queries and query costs (see the DMV sketch after this list).
  • Highly proficient in T-SQL DDL/DML; able to perform most SQL Server Enterprise Manager and Management Studio functionality using T-SQL scripts.
  • Expertise in writing SQL queries, dynamic queries, sub-queries, and complex joins for generating complex stored procedures, triggers, user-defined functions, views, and cursors.
  • Excellent knowledge of data validation, data cleansing, data verification, and identifying data mismatches.
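
For illustration, a minimal T-SQL sketch of the SCD Type II pattern referenced above; the dim_customer and stg_customer tables and their columns are hypothetical, not from any specific engagement:

    -- Step 1: expire the current version of rows whose attributes changed.
    UPDATE d
    SET    d.row_is_current = 0,
           d.row_end_date   = GETDATE()
    FROM   dim_customer AS d
    JOIN   stg_customer AS s
           ON s.customer_id = d.customer_id
    WHERE  d.row_is_current = 1
      AND  (d.city <> s.city OR d.segment <> s.segment);

    -- Step 2: insert a new current version for new or just-expired customers.
    INSERT INTO dim_customer (customer_id, city, segment,
                              row_start_date, row_end_date, row_is_current)
    SELECT s.customer_id, s.city, s.segment, GETDATE(), NULL, 1
    FROM   stg_customer AS s
    LEFT JOIN dim_customer AS d
           ON d.customer_id = s.customer_id
          AND d.row_is_current = 1
    WHERE  d.customer_id IS NULL;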
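
Likewise, a minimal sketch of the DMV-based troubleshooting mentioned above, using the documented sys.dm_exec_query_stats and sys.dm_exec_sql_text objects to surface the slowest cached statements:

    -- Top cached statements by average elapsed time (microseconds).
    SELECT TOP (10)
           qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
           qs.execution_count,
           SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                     ((CASE qs.statement_end_offset
                         WHEN -1 THEN DATALENGTH(st.text)
                         ELSE qs.statement_end_offset
                       END - qs.statement_start_offset) / 2) + 1) AS statement_text
    FROM   sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY avg_elapsed_us DESC;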

TECHNICAL SKILLS:

Database: Netezza, Teradata, Oracle 12c/11g/10g, MS Access, MS SQL Server 2014/2012/2008 R2/2005, MySQL, DB2.

ETL Tools: Informatica Power Center 9.1, 8.6, 7.1, SSIS.

Hadoop Ecosystem: Hive, Pig, MapReduce, HDFS, HBase, MongoDB, Sqoop, etc.

Data Modeling Tools: Erwin, Microsoft Visio 2003, ER Studio, Power Designer.

Reporting Tools: Business Objects, Crystal Reports.

Languages: PL/SQL, SQL, T-SQL, UNIX shell scripting, Visual Basic 6, C#, C++, HTML, SAS; OOP, data structures, algorithms.

Operating Systems: Windows Server 2003/2008/2008 R2/2012/2012 R2, Windows NT & 9x, UNIX.

Version Control: PVCS, VSS, CVS.

Schedulers: Dollar Universe.

Others: MS Project, MS Office, Toad, SQL Developer, Tableau, VI, Business Objects, SAS, Lotus Notes, WebEx, Netezza Aginity, Teradata SQL Assistant, AWS.

PROFESSIONAL EXPERIENCE:

Confidential, Chicago IL

Sr. Data Architect/Data Modeler

Responsibilities:

  • Developed a high-performance, scalable data architecture solution that incorporates a matrix of technologies to relate architectural decisions to business needs.
  • Led the development and implementation of the logical data model and physical data design utilizing data architecture and modeling standards.
  • Architected and designed solutions for complex business requirements, including data processing, analytics, ETL, and reporting processes, to improve the performance of data loads and processes.
  • Defined the information architecture, data interfaces serving dashboards, ETL processes, data models, and schemas for high-performance distributed processing, and analyzed functional and non-functional data elements for data profiling and mapping from source systems to the Confidential data environment.
  • Applied advanced information management and new data processing techniques on Hadoop (HDFS) to extract the value locked up in the data, processing large data sets in parallel across a Hadoop cluster using the Hadoop MapReduce framework.
  • Worked on AWS Redshift and RDS, implementing data models and data on both platforms.
  • Presented data scenarios via Erwin logical models and Excel mockups to better visualize the data; used Erwin and Visio to create 3NF and dimensional data models and published them to business users and ETL/BI teams.
  • Worked with business analysts in an Agile methodology to understand the business challenges and the data needed for reporting.
  • Involved in designing and architecting data warehouses and data lakes on regular (Oracle, SQL Server), high-performance (Netezza and Teradata), and big data (Hadoop: MongoDB, Hive, Cassandra, and HBase) databases.
  • Responsible for interacting with business stakeholders, gathering requirements, and managing delivery, covering the entire Tableau development life cycle.
  • Handled importing data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
  • Interpreted problems and provided solutions to business problems using data analysis, data mining, optimization tools, machine learning techniques, and statistics.
  • Developed T-SQL scripts to create database objects and perform DML and DDL tasks, and wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys), and physical database (capacity planning, object creation, and aggregation strategies) for Oracle and Teradata per business requirements using Erwin 9.6; troubleshot test scripts, SQL queries, ETL jobs, and data mart/data store models.
  • Processed data using HiveQL (HQL, a SQL-like language) on top of MapReduce, as shown in the sketch after this list.
  • Extensively used ETL methodology to support data extraction, transformation, and load processing in a complex EDW using Informatica; created databases, tables, indexes, stored procedures, views, database management policies, constraints, defaults, rules, functions, triggers, cursors, and dynamic SQL queries.
  • Worked on SAS for data analysis; imported and cleansed high-volume data from sources such as Teradata, Oracle, flat files, Netezza, and SQL Server; created SAS programs, performed data validation on raw data sets, and created data sets for analysis; utilized data management techniques such as merging, concatenating, and interleaving SAS data sets with MERGE and SET statements in the DATA step.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing, working data quality issues and resolving data issues and updates for multiple applications using SQL queries/scripts.
  • As an architect, implemented an MDM hub to provide clean, consistent data for a SOA implementation.
  • Identified all ERP fields, tables, and relationships for sales, finance, and management reporting.
  • Involved in creating a data lake by extracting the customer's big data from various sources into Hadoop HDFS, including data from Excel, flat files, Oracle, SQL Server, MongoDB, Cassandra, HBase, Teradata, and Netezza, as well as server log data.
  • Developed reports using SQL Server Reporting Services (SSRS), and involved in unit and system testing of the OLAP report functionality and the data displayed in the reports.
  • Populated and refreshed Teradata tables using the FastLoad, MultiLoad, and FastExport utilities for user acceptance testing and for loading history data into Teradata.
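
As a sketch of the HiveQL processing mentioned above: an external table over raw HDFS files plus an aggregation that Hive compiles down to MapReduce. The table, path, and columns are illustrative assumptions:

    -- External table over delimited files already landed in HDFS.
    CREATE EXTERNAL TABLE IF NOT EXISTS sales_raw (
        sale_date STRING,
        store_id  INT,
        amount    DOUBLE
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/landing/sales';

    -- Daily revenue per store; executed as a MapReduce job by Hive.
    SELECT sale_date, store_id, SUM(amount) AS daily_revenue
    FROM   sales_raw
    GROUP BY sale_date, store_id;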

Environment: Erwin r9.x, MS Excel, PL/SQL, SQL Server, SSRS, SSIS, Java, Business Objects XI R2, Informatica 9.5 (ETL), Oracle 12c, Tableau, Teradata V2R12, ERP, Big Data, Agile, MDM, SAS, Workday, MongoDB, AWS, Cassandra, Teradata SQL Assistant 12.0, Netezza Aginity, UNIX, Hadoop, Hive, HBase, Pig, SQL, T-SQL.

Confidential, Minneapolis MN

Sr. Data Architect/Data Modeler

Responsibilities:

  • Presented technical concepts, analysis, and solutions to both technical and non-technical audiences to ensure broad adoption of the overall data architectural plan, and maintained effective working relationships with functional owners, peers, and vendors for the implementation of IT projects.
  • Worked with EDM Analytics leaders to understand business needs and information requirements; assisted with defining optimum system requirements to meet business needs, and prioritized and managed new requirements across multiple subject areas.
  • Implemented end-to-end systems for data analytics and data automation, integrated with custom visualization tools, using R, Hadoop, MongoDB, and Cassandra.
  • Worked with the business users and the ETL architect to gather and understand the business requirements in an Agile methodology.
  • Developed and assisted with the development of robust wing-to-wing (end-to-end) data designs, including logical and physical data models, covering source-system ingestion through performance-tuned downstream data delivery; ensured that these designs fit within the overall data strategy.
  • Worked with the HDFS NameNode, where Hadoop stores all file location information and tracks the file data across the cluster of machines.
  • Analyzed business requirements, system requirements, and data mapping requirement specifications, and was responsible for documenting functional and supplementary requirements in Quality Center.
  • Proficient in data analysis, data validation, data lineage, data cleansing, data auditing, data tracing, data verification, and identifying data mismatches.
  • Designed, developed and implemented Tableau Business Intelligence reports.
  • Performed ERP-related activities such as streamlining existing MIS reports, creating new ones, and strategizing with management.
  • Worked with the Data Steward team on designing, documenting, and configuring Informatica Data Director to support management of MDM data.
  • Created conceptual, logical, and physical data models for OLTP and OLAP, and tested complex ETL mappings and sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables into Confidential tables.
  • Analyzed web log data using HiveQL to extract the number of unique visitors per day, page views, visit duration, and the most-purchased products on the website (see the HiveQL sketch after this list).
  • Worked on NoSQL databases such as MongoDB and HBase.
  • Designed new SAS programs by analyzing requirements, constructing workflow charts and diagrams, studying system capabilities, and writing specifications.
  • Performed business analysis, including analysis of the data flows and the system data, and performed extensive source-system data mapping to gain a strong understanding of the data needed for metrics and available for reporting.
  • Wrote and executed SQL queries to verify that data had been moved from the transactional system to the DSS, data warehouse, and data mart reporting systems in accordance with requirements; performed data profiling and data analysis of source and Confidential systems in Teradata and SQL Server environments.
  • Responsible for creating Hive tables, loading data and writing hive queries.
  • Designed ETL build specifications and testing specifications that ensure technical compatibility and integration with existing solutions. Ensure that each design incorporates security, change management, data quality validations, data lineage and metadata capture.
  • Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text) and performed ad hoc analyses as needed.
  • Involved in Teradata SQL development, unit testing, and performance tuning; ensured testing issues were resolved using defect reports; created UNIX scripts for file transfer and file manipulation.
  • Used Tableau Desktop to analyze and obtain insights into large data sets.
  • Tested the database for field-size validation, check constraints, and stored procedures, cross-verifying field sizes defined within the application against metadata; involved in extensive data validation using SQL queries and back-end testing.
  • Worked with different sources such as Oracle, Teradata, SQL Server 2012, Excel, flat files, complex flat files, Cassandra, MongoDB, HBase, and COBOL files.
  • Created jobs using DataStage Designer to import data from heterogeneous sources such as DB2, Oracle, flat files, and SQL Server.
  • Accessed and analyzed large volumes (millions of records) of transaction data using SAS and SAS Enterprise Guide (EG).
  • Managed and administered SQL Server 2012/2008 R2 on production and non-production servers.
  • Performed logical and physical design using both the Ralph Kimball and Bill Inmon styles of data warehouse design, and worked with different ETL tasks for loading data from Excel, flat files, and CSV files into a SQL Server database.
  • Performed verification, validation, and transformations on the input data (text files, XML files) before loading it into the Confidential database.
  • Designed complex SSAS solutions using multiple dimensions, perspectives, hierarchies, measure groups, and KPIs to analyze the performance of strategic business units as well as corporate centers.
  • Used normalization methods up to 3NF and denormalization techniques for effective performance in OLTP systems.
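
A minimal HiveQL sketch of the web-log analysis described above; the weblogs table and its columns are hypothetical stand-ins for the actual log schema:

    -- Unique visitors and page views per day from parsed web logs.
    SELECT TO_DATE(event_time)         AS visit_day,
           COUNT(DISTINCT visitor_id) AS unique_visitors,
           COUNT(*)                   AS page_views
    FROM   weblogs
    GROUP BY TO_DATE(event_time);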

Environment: Erwin r9.5, MS Excel, MS Access, Oracle 11g, UNIX, Hadoop, Hive, Pig, MongoDB, Cassandra, Big Data, MapReduce, Windows 7, SQL, PL/SQL, T-SQL, DataStage, UNICA 6.4, Agile, SSAS, MDM, Teradata, Metadata, SAS, SQL Server, Tableau, Netezza, ERP, SSRS, Teradata SQL Assistant, DB2, Netezza Aginity.

Confidential, Seattle, WA

Sr. Data Modeler/Data Analyst/Data Warehousing

Responsibilities:

  • Created logical and physical data models and metadata to support the requirements; analyzed requirements to develop design concepts and technical approaches, verifying the business requirements against manual reports.
  • Involved in fixing invalid mappings; performed testing of stored procedures and functions, and unit and integration testing of Informatica sessions, batches, and the Confidential data.
  • Understood the business process, gathered business requirements, and determined impact analysis based on ERP.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas (see the star-schema sketch after this list).
  • Used Informatica Power Center to move MDM data into the hubs.
  • Worked on data integration and workflow applications on the SSIS platform, and was responsible for testing all new and existing ETL data warehouse components.
  • Reverse-engineered the data models, identified the data elements in the source systems, added new data elements to the existing data models, and used SQL for querying the database in a UNIX environment.
  • End-to-end process involvement, from gathering client business requirements to developing dashboards in Tableau and publishing them to the server.
  • Developed data migration and cleansing rules for the integration architecture (OLTP, ODS, DW).
  • Performed business area analysis and logical and physical data modeling for a data warehouse using the Bill Inmon methodology, and designed a data mart application using Ralph Kimball's star-schema dimensional methodology.
  • Extract/Transform/Load (ETL) design and implementation in areas related to Teradata utilities such as FastExport and MultiLoad for handling numerous tasks.
  • Implemented functional requirements using Base SAS, SAS Macros, SAS SQL, UNIX, Oracle, and DB2.
  • Upgraded SQL Server databases, performed monitoring and performance tuning, and developed reports using Crystal Reports with T-SQL, MS Excel, and Access.
  • Migrated SQL Server 2005 databases to SQL Server 2008 and 2008 R2, and also migrated databases to IBM DB2.
  • Worked on multiple data marts in an Enterprise Data Warehouse (EDW) project; involved in designing OLAP data models, extensively using Slowly Changing Dimensions (SCDs).
  • Worked on all activities related to the development, implementation, administration and support of ETL processes for large-scale Data Warehouses using SQL Server SSIS.
  • Developed automated procedures to produce data files using SQL Server Integration Services (SSIS), and performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Netezza.
  • Developed ER and dimensional models using ER Studio's advanced features, and created physical and logical data models using ER Studio.
  • Coded SAS programs using Base SAS and SAS Macros for ad-hoc jobs requested by users.
  • Designed dimensional models using SSAS packages for end users; created hierarchies and defined dimension relationships.
  • Used SQL Profiler for monitoring and troubleshooting performance issues in T-SQL code and stored procedures.
  • Implemented Agile methodology for building an internal application.
  • Extracted data from Oracle, Teradata, Netezza, SQL Server, and DB2 databases using Informatica to load into a single repository for data analysis, and used SQL on a wide scale for analysis, performance tuning, and testing.
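
A minimal star-schema sketch of the kind described above: one fact table keyed to surrounding dimension tables. All table and column names are illustrative:

    CREATE TABLE dim_date (
        date_key      INT PRIMARY KEY,   -- e.g. 20240131
        calendar_date DATE,
        fiscal_period VARCHAR(10)
    );

    CREATE TABLE dim_product (
        product_key  INT PRIMARY KEY,
        product_name VARCHAR(100),
        category     VARCHAR(50)
    );

    -- Fact rows reference the dimensions through surrogate keys.
    CREATE TABLE fact_sales (
        date_key     INT NOT NULL REFERENCES dim_date (date_key),
        product_key  INT NOT NULL REFERENCES dim_product (product_key),
        quantity     INT,
        sales_amount DECIMAL(12, 2)
    );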

Environment: ER Studio, SQL Server 2008, SQL Server Analysis Services 2008, SSIS 2008, SSRS 2008, Oracle 10g, Business Objects XI, Rational Rose, Tableau, ERP, Netezza, Teradata, Excel, Pivot tables, DB2, DataStage, MS Office, MS Visio, SQL, T-SQL, UNIX, Agile, SAS, SSAS, MDM, Shell Scripting, Crystal Reports 9, Aginity.

Confidential, Memphis, TN

Sr. Data Modeler/Data Analyst/Data Warehousing

Responsibilities:

  • Interacted with business users to analyze the business process and requirements, transforming requirements into conceptual, logical, and physical data models.
  • Performed data modeling using Erwin: identified objects and relationships and how they fit together as logical entities; these were then translated into a physical design using Erwin's forward-engineering tool.
  • Worked with slowly changing dimensions (SCDs), implementing custom SCD transformations, and was involved in loading data from source tables to operational data source tables using transformation and cleansing logic.
  • Redefined attributes and relationships in the reverse-engineered model and cleansed unwanted tables/columns as part of data analysis responsibilities.
  • Coordinated data profiling/data mapping with business subject matter experts, data architects, ETL developers, and data modelers; used CA Erwin Data Modeler (Erwin) for data modeling of custom-developed information systems, including databases of transactional systems and data marts.
  • Created Entity/Relationship Diagrams, grouped and created the tables, validated the data, identified PKs for lookup tables.
  • Used BTEQ to run Teradata SQL scripts to create the physical data model, and created UNIX scripts for file transfer and file manipulation.
  • Validated the data in the 3NF data model by manually executing queries against the data models, and generated many data sets according to specifications by writing SQL queries for modeling purposes.
  • Integrated data from various data sources such as MS SQL Server, DB2, Oracle, Netezza, and Teradata using Informatica to perform extraction, transformation, and loading (ETL); worked on ETL development and data migration using SSIS, SQL*Loader, and PL/SQL.
  • Used SAS Macros and PROC SQL to extract, sort, and match test accounts, and ODS to produce HTML, PDF, and RTF output.
  • Involved in star-schema modeling, building and designing the logical data model into dimensional models, and performed query tuning to improve performance along with index maintenance.
  • Conducted data cleaning to understand the current applications relevant to IRB RWA calculation, PPNR forecast, and CCAR stress test.
  • Assisted development teams with data cleansing, migration, and loading, in addition to supporting them in reviewing, building, testing, and maintaining database structures and SQL scripts.
  • Responsible for creating and modifying T-SQL stored procedures/triggers for validating the integrity of the data (see the validation sketch after this list).
  • Created Tableau reports based on existing Excel reports, combining data sources by joining multiple tables and using data blending; based on client requirements, created subscriptions, snapshots, and linked reports.
  • Developed metric checks for various source-system files to analyze monthly and quarterly data trends using PROC SQL and SAS/STAT procedures.
  • Performed extensive data analysis to identify various data quality issues with data coming from external systems.
  • Created stored procedures to transform the data and worked extensively in T-SQL on the various transformations needed while loading the data.
  • Created a number of standard and complex reports to analyze data using slice-and-dice, drill-down, and drill-through in SSRS.
  • Created the Physical Data Model (PDM) and worked closely with the application development team to implement required performance-enhancement features (such as adding indexes and splitting tables) in the PDM.
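
A minimal T-SQL sketch of a data-integrity validation procedure of the kind described above; the orders/customers tables and the specific checks are hypothetical:

    CREATE PROCEDURE dbo.usp_validate_orders
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Report orders that reference a missing customer (orphan keys).
        SELECT o.order_id
        FROM   dbo.orders AS o
        LEFT JOIN dbo.customers AS c
               ON c.customer_id = o.customer_id
        WHERE  c.customer_id IS NULL;

        -- Raise an error if any order amount is negative.
        IF EXISTS (SELECT 1 FROM dbo.orders WHERE order_amount < 0)
            RAISERROR('Validation failed: negative order amounts found.', 16, 1);
    END;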

Environment: Erwin, Informatica Power Center, Windows, Teradata, Netezza, Toad, Oracle 9i, SQL Server 2000, SQL*Loader, AIX 5.0, PL/SQL, SAS, Tableau, SSRS, Excel, Windows XP, UNIX, Linux, Visio, T-SQL, SSIS.

Confidential

Data Analyst

Responsibilities:

  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
  • Maintained the Enterprise Metadata Library with any changes or updates.
  • Evaluated data profiling, cleansing, integration, and extraction tools (e.g., Informatica).
  • Wrote SQL prototypes against new strategic sources to confirm that the numbers matched.
  • Implemented a metadata repository, data cleanup procedures, transformations, data standards, a data governance program, scripts, stored procedures, and triggers, and executed test plans.
  • Used ER/Studio to transform data requirements into data models.
  • Performed tuning of SQL queries to improve response time.
  • Designed and developed SQL procedures, functions, and packages to create summary tables (see the summary-refresh sketch after this list).
  • Performed data manipulation using BASIC functions and DataStage transforms, and performance tuning of DataStage ETL jobs and Teradata tables.
  • Involved in extensive Analysis on the Teradata and Oracle Systems.
  • Assisted in creation, verification and publishing of metadata including identification, prioritization, definition and data lineage capture for Key Data Elements (KDE) and Important Data Elements (IDE).
  • Acted as a strong data analyst, analyzing data at a low level, and generated reports using SSRS and Excel spreadsheets.
  • Wrote T-SQL procedures to generate DML scripts that modified database objects dynamically based on user inputs; involved in performance tuning and monitoring of both T-SQL and PL/SQL blocks.
  • Created various parameterized, cascaded, linked, drill-through, and drill-down reports in SSRS, including reports of monthly customer account details and profits.
  • Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools (PL/SQL, SQL*Plus, SQL*Loader), and handled exceptions.
  • Extensively used ER/Studio for developing data models using star-schema methodologies.
  • Performed normalization, tuned indexes, and optimized the existing database design.
  • Converted data stored in flat files into Oracle tables.
  • Involved in extensive SQL scripting; built and maintained complex SQL queries for data analysis and data extraction.
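
A minimal Oracle PL/SQL sketch of a summary-table procedure of the kind described above; table and column names are illustrative assumptions:

    CREATE OR REPLACE PROCEDURE refresh_sales_summary AS
    BEGIN
        -- Rebuild the monthly summary from the detail table.
        DELETE FROM sales_summary;

        INSERT INTO sales_summary (region, sale_month, total_amount)
        SELECT region, TRUNC(sale_date, 'MM'), SUM(amount)
        FROM   sales_detail
        GROUP BY region, TRUNC(sale_date, 'MM');

        COMMIT;
    END refresh_sales_summary;
    /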

Environment: SQL Server, Oracle 9i, MS Office, Teradata, DataStage, ER Studio, XML, Business Objects, T-SQL, PL/SQL, SSRS.
