
Sr. Data Architect/ Data Modeler Resume


Atlanta, GA

SUMMARY

  • Around 7 years of industry experience in IT, with a solid understanding of Data Modeling, Data Analysis, Data Architecture, and evaluating data sources, and a strong understanding of Data Warehouse/Data Mart design, ETL, BI, OLAP, OLTP, and client/server applications.
  • Experienced in writing and optimizing SQL queries in Oracle, SQL Server, Netezza, and Teradata.
  • Experienced in dimensional and relational data modeling using ER/Studio, ERwin, and Sybase PowerDesigner, including star/snowflake schema modeling, fact and dimension tables, and conceptual, logical, and physical data models.
  • Experienced in big data analysis and developing data models using Hive, Pig, MapReduce, and SQL, with strong data architecture skills for designing data-centric solutions.
  • Experienced in the management and implementation of database models, data flow diagrams, database schemas, DB scripts, DTD schemas, structures, and data standards to support a robust data management infrastructure.
  • Experienced in Data Analysis and Data Profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Very good knowledge of and experience with AWS, including Redshift, S3, and EMR.
  • Extensive experience in Normalization (1NF, 2NF, 3NF and BCNF) and De-normalization techniques for improved database performance in OLTP, OLAP and Data Warehouse/Data Mart environments.
  • Experienced in metadata definition, implementation, and maintenance; identification of new business rules and their implementation as data rules; transformation program library maintenance; XML file generation; and data quality.
  • Experienced in MapReduce and big data work on Hadoop and other NoSQL platforms.
  • Experienced in designing standards for using normalized, de-normalized, and dimensional data structures, and in defining common design patterns for modeling various types of relationships.
  • Experienced in batch processes, import, export, backup, database monitoring tools, and application support.
  • Experienced in Teradata SQL queries and Teradata indexes, and in utilities such as MultiLoad, TPump, FastLoad, and FastExport.
  • Experienced in using databases such as DB2, Teradata (and its utilities), Netezza, Oracle, and SQL Server, along with SQL Server Integration Services (SSIS).
  • Experienced in loading data from various data sources/business systems, including MS Excel, MS Access, and flat files, into SQL Server using SSIS features such as data conversion.
  • Experienced in Oracle, Netezza, Teradata, SQL Server, and DB2 database architecture.
  • Expertise in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Extensive experience in developing T-SQL, DTS, OLAP, PL/SQL, stored procedures, triggers, functions, and packages, and in performance tuning and optimization for business logic implementation.
  • Good knowledge of Data Marts, Operational Data Store (ODS), OLAP, and dimensional data modeling with the Ralph Kimball methodology (star schema and snowflake modeling for fact and dimension tables) using Analysis Services; see the DDL sketch after this list.
  • Excellent at performing data transfer between SAS and various databases and data file formats such as XLS, CSV, DBF, and MDB.
  • Experienced in ER/Studio and in building dimensional models and conceptual, logical, and physical data models using ERwin's advanced features.
  • Experienced in developing and supporting Oracle SQL, PL/SQL, and T-SQL queries.
  • Experienced in testing integration solutions for data import, export, and migration using EIM (Enterprise Integration Manager).
  • Experienced in Data Scrubbing/Cleansing, Data Quality, Data Mapping, Data Profiling, Data Validation in ETL.
  • Excellent knowledge of Ralph Kimball's and Bill Inmon's approaches to data warehousing.
  • Excellent knowledge in developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads from various sources such as Oracle, Flat Files, DB2, SQL Server etc.
  • Experienced in writing UNIX shell scripts, with hands-on experience scheduling shell scripts using Control-M.
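
For illustration, here is a minimal DDL sketch of the Kimball-style star schema pattern referenced above: one fact table joined to dimension tables on surrogate keys. All table and column names are hypothetical, not drawn from any client engagement.

    -- Dimension tables carry descriptive attributes keyed by surrogate keys.
    CREATE TABLE dim_date (
        date_key       INTEGER       NOT NULL PRIMARY KEY,  -- e.g. 20240131
        calendar_date  DATE          NOT NULL,
        fiscal_quarter CHAR(6)
    );

    CREATE TABLE dim_customer (
        customer_key   INTEGER       NOT NULL PRIMARY KEY,  -- surrogate key
        customer_id    VARCHAR(20)   NOT NULL,              -- natural/business key
        customer_name  VARCHAR(100),
        state_code     CHAR(2)
    );

    -- The fact table holds additive measures at a declared grain
    -- (here, one row per customer per day) plus foreign keys to each dimension.
    CREATE TABLE fact_sales (
        date_key       INTEGER       NOT NULL REFERENCES dim_date (date_key),
        customer_key   INTEGER       NOT NULL REFERENCES dim_customer (customer_key),
        sales_amount   DECIMAL(12,2) NOT NULL,
        units_sold     INTEGER       NOT NULL
    );

Snowflaking would further normalize dim_customer (for example, moving state attributes into their own table) at the cost of extra joins at query time.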

TECHNICAL SKILLS

Analysis and Modeling Tools: ERwin r9.6/r9.5/r9.1, Sybase PowerDesigner, Oracle Designer, BPwin, ER/Studio, MS Access 2000, star-schema and snowflake-schema modeling, fact and dimension tables, pivot tables.

OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9.

Oracle: Oracle 12c/11g/10g/9i/8.x R2 database servers with RAC, ASM, Data Guard, Grid Control, Oracle GoldenGate, Oracle Enterprise Manager, SQL*Net, SQL*Loader, SQL*Plus, AWR, ASH, ADDM, and Explain Plan.

ETL Tools: SSIS, Pentaho, and Informatica PowerCenter 9.7/9.6/9.5/9.1.

Programming Languages: Java, Base SAS, SSIS, SAS/SQL, SQL, T-SQL, HTML/XHTML (HTML 4.01/3.2), JavaScript, CSS3/CSS2/CSS1, UNIX shell scripting, PL/SQL.

Database Tools: Microsoft SQL Server 2014/2012/2008/2005, Teradata, MS Access, PostgreSQL, Netezza, Oracle.

Reporting Tools: Business Objects, Crystal Reports, and SSRS.

Operating Systems: Microsoft Windows and UNIX

Tools & Software: TOAD, MS Office, BTEQ, Teradata 15/14.1/14/13.1/13, SQL Assistant

Big Data: Hadoop, HDFS 2, Hive, Pig, HBase, Sqoop, Flume.

Other Tools: TOAD, SQL*Plus, SQL*Loader, MS Project, MS Visio, and MS Office; have also worked with C++, UNIX, PL/SQL, and Microsoft Team Foundation Server.

PROFESSIONAL EXPERIENCE

Confidential, Atlanta, GA

Sr. Data Architect/ Data Modeler

Responsibilities:

  • Worked as a Data Architect/Modeler generating data models using ERwin, developed relational database systems, and was heavily involved in reviewing business requirements and composing source-to-target data mapping documents.
  • Analyzed the reverse engineered Enterprise Originations (EO) physical data model to understand the relationships between already existing tables and cleansed unwanted tables and columns as part of Data Analysis responsibilities.
  • Responsible for Big data initiatives and engagement including analysis, brainstorming, POC, and architecture.
  • Designed the Logical Data Model using ERwin 9.64, with entities and attributes for each subject area.
  • Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
  • Created conceptual, logical, and physical models for OLTP, Data Warehouse, Data Vault, and Data Mart (star/snowflake schema) implementations.
  • Applied normalization/de-normalization techniques for optimum performance in relational and dimensional database environments.
  • Designed Logical & Physical Data Model /Metadata/ data dictionary using Erwin for both OLTP and OLAP based systems.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Identified required dimensions and Facts using Erwin tool for the Dimensional Model.
  • Worked on setting up AWS DMS and SNS for data transfer and replication.
  • Implemented join optimizations in Pig, using skewed and merge joins for large datasets.
  • Designed and developed a Data Lake using Hadoop for processing raw and processed claims via Hive and Informatica.
  • Extensively used the Erwin design tool & Erwin model manager to create and maintain the versions of the data model. Created integrity rules and defaults and applied naming conventions as per client standards.
  • Implemented forward engineering in ERwin, generating DDL scripts and indexing strategies from the logical data model.
  • Migrated existing on-premise applications and services to AWS.
  • Loaded data from the Linux file system into HDFS, imported and exported data into HDFS and Hive using Sqoop, and implemented partitioning, dynamic partitions, and buckets in Hive (see the HiveQL sketch after this list).
  • Designed and developed the architecture for a data services ecosystem spanning relational, NoSQL, and big data technologies.
  • Modeled, designed, implemented, and deployed high-performance custom applications at scale on Hadoop/Spark.
  • Migrated an Oracle database of 100 TB from data centers to the AWS cloud.
  • Specified the overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, MDM, data warehouse, data provisioning, and BI.
  • Performance tuning and stress-testing of NoSQL database environments in order to ensure acceptable database performance in production mode.
  • Created screen designs, use cases, and ER diagrams for the project using ERwin and Visio, and implemented database procedures, triggers, and SQL scripts for development teams.
  • Designed and developed T-SQL stored procedures to extract, aggregate, transform, and insert data.
  • Developed SQL Stored procedures to query dimension and fact tables in data warehouse.
  • Performed Hive programming for applications that were migrated to big data using Hadoop.
  • Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
  • Used SSRS to create standard, customized, on-demand, and ad-hoc reports, and was involved in analyzing multi-dimensional reports in SSRS.
  • Extensive knowledge of data loading using PL/SQL scripts and SQL Server Integration Services (SSIS).
  • Focused on architecting NoSQL databases such as MongoDB, Cassandra, and Caché.
  • Performed point-in-time backup and recovery in MongoDB using MMS, and modeled data moving from RDBMSs to MongoDB for optimal reads and writes.
  • Implemented Data Integrity and Data Quality checks in Hadoop using Hive and Linux scripts.
  • Proficient in SQL across a number of dialects, commonly writing MySQL, PostgreSQL, Redshift, SQL Server, and Oracle.
  • Routinely dealt with large internal and vendor data, performing performance tuning, query optimization, and production support for SAS and Oracle 12c.
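
As a sketch of the Hive partitioning and bucketing approach described above, the following HiveQL is illustrative; the table names, columns, and bucket count are hypothetical choices, not project specifics.

    -- Enable dynamic partitioning so a single INSERT can route rows
    -- to the right partitions.
    SET hive.exec.dynamic.partition = true;
    SET hive.exec.dynamic.partition.mode = nonstrict;

    -- Partition by year/month so queries prune whole directories;
    -- bucket by member_id so joins and sampling on that key are cheaper.
    CREATE TABLE claims_processed (
        claim_id     STRING,
        member_id    STRING,
        claim_amount DECIMAL(12,2)
    )
    PARTITIONED BY (claim_year INT, claim_month INT)
    CLUSTERED BY (member_id) INTO 32 BUCKETS
    STORED AS ORC;

    -- Dynamic-partition load: the trailing SELECT columns supply the
    -- claim_year and claim_month partition values for each row.
    INSERT OVERWRITE TABLE claims_processed
    PARTITION (claim_year, claim_month)
    SELECT claim_id, member_id, claim_amount, claim_year, claim_month
    FROM claims_raw;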

Environment: DB2, CA ERwin 9.6, Oracle 12c, MS Office, SQL Architect, TOAD Benchmark Factory, SQL*Loader, PL/SQL, SharePoint, Talend, Redshift, SQL Server 2008/2012/2014, Hive, Pig, Hadoop, Spark, AWS.

Confidential, Minneapolis, MN

Data Modeler

Responsibilities:

  • As an architect, implemented an MDM hub to provide clean, consistent data for an SOA implementation.
  • Used an Agile methodology for data warehouse development, managed with Kanbanize.
  • Worked on the enterprise level modules for the insurance client to perform the data modeling and data architectural related tasks.
  • Extensively used Agile methodology as the Organization Standard to implement the data Models.
  • Provided solutions for data migration across platforms, from hierarchical to relational to unstructured databases.
  • Developed the data warehouse model (Kimball methodology) with multiple data marts and conformed dimensions for the proposed central model of the project.
  • Actively participated in JAD sessions involving the discussion of various reporting needs.
  • Worked on OLAP models for data warehouse and data mart development using the Ralph Kimball methodology, as well as OLTP models.
  • Worked with Business Analyst during requirements gathering and business analysis to prepare high level Logical Data Models and Physical Data Models using E/R Studio.
  • Generated the physical model using forward engineering and deployed the DDL once it was approved by the business.
  • Reverse engineered reports from old systems and identified required Data Elements in the source systems for Dimensions, Facts and Measures.
  • Created ERD diagrams using ER Studio and implemented concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Experienced in multiple MPP and SMP Informix 8.x parallel database data warehousing projects as a development DBA and analyst/developer.
  • Performed data modeling using ER/Studio: identified objects and relationships and how they fit together as logical entities, then translated these into a physical design using ER/Studio's forward engineering.
  • Designed a star schema for the detailed data marts and conformed dimensions.
  • Created and maintained the Data Model repository as per the standards.
  • Created dimensional bus matrix by consolidating all the models.
  • Worked on snowflaking the dimensions to remove redundancy.
  • Created ETL Jobs to load data from Source Systems to target data marts.
  • Defined business rules, aggregations and lookups.
  • Collected technical and business metadata and maintained naming standards by working with architects, data governance, business analysts, developers, SMEs, etc.
  • Created PL/SQL packages and database triggers, developed user procedures, and prepared user manuals for the new programs (see the trigger sketch after this list).
  • Coordinated with data architects to design big data/Hadoop projects and provided idea-driven design input.
  • Configured Hadoop ecosystem components to read transaction data from HDFS and Hive.
  • Performed ad-hoc analyses as needed.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
  • Collaborated with the report development team to design monthly/summary level cubes to support the further aggregated level of detailed reports.
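
A minimal PL/SQL sketch of the kind of database trigger mentioned above; the policy and policy_audit tables and their columns are hypothetical.

    -- Audit premium changes: after each row update, record the old and
    -- new values alongside a timestamp.
    CREATE OR REPLACE TRIGGER trg_policy_premium_audit
    AFTER UPDATE OF premium_amount ON policy
    FOR EACH ROW
    BEGIN
        INSERT INTO policy_audit
            (policy_id, old_premium, new_premium, changed_on)
        VALUES
            (:OLD.policy_id, :OLD.premium_amount, :NEW.premium_amount, SYSDATE);
    END;
    /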

Environment: E/R Studio 9.5, SQL Server 2014, OLAP, OLTP, Data Warehouse, Data Vault, Data Mart, Apache Hadoop & Big Data.

Confidential, Eden Prairie, MN

Sr. Data Analyst/ Modeler

Responsibilities:

  • Handled all day-to-day operations on databases, including incidents, changes, and alerts.
  • Developed a conceptual model using ERwin based on requirements analysis.
  • Developed normalized Logical and Physical database models to design OLTP system for insurance applications.
  • Created dimensional model for the reporting system by identifying required dimensions and facts using Erwin 9.5.
  • Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model.
  • Generated various presentable reports and documentation using report designer and pinned reports in ERWIN.
  • Identified entities and attributes and developed conceptual, logical and physical models using ERWIN.
  • Designed and Developed Use Cases, Activity Diagrams, and Sequence Diagrams using Unified Modeling Language (UML).
  • Involved in the analysis of the existing claims processing system, mapping phase according to functionality and data conversion procedure.
  • Normalized the existing OLTP systems to third normal form (3NF) to speed up DML statement execution; a before/after sketch follows this list.
  • Performed data modeling in ERwin, designing target data models for the enterprise data warehouse (Teradata).
  • Created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Developed the required data warehouse model using a star schema for the generalized model.
  • Experienced in Oracle installations, upgrades, migrations, logical/physical architecture design, tuning, capacity planning, database access, security, and auditing.
  • Possess strong conceptual and logical data modeling skills, with experience in JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications.
  • Performed database backup, recovery, and cloning using RMAN.
  • Identified performance problems using wait event statistics, monitoring session-level wait events and collecting historical data for root cause analysis; supported the offshore development team with issue fixing, code migrations, and other support activities.
  • Collaborated with BI and DBA teams to analyze and provide solutions to data issues and other challenges while implementing the OLAP model.
  • Developed the Logical and physical data model and designed the data flow from source systems to Teradata tables and then to the Target system.
  • Cleansed and organized various tables in a presentable manner to aid understanding of the existing models.
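
To illustrate the 3NF normalization referenced above, here is a hypothetical before/after sketch: agent attributes that repeated on every policy row move into their own table, keyed by agent_id.

    -- Before (denormalized): policy(policy_id, agent_id, agent_name,
    -- agent_phone, premium) repeats agent details on every policy row.
    -- After (3NF): agent attributes depend only on the agent key.
    CREATE TABLE agent (
        agent_id    INTEGER      NOT NULL PRIMARY KEY,
        agent_name  VARCHAR(100) NOT NULL,
        agent_phone VARCHAR(20)
    );

    CREATE TABLE policy (
        policy_id   INTEGER       NOT NULL PRIMARY KEY,
        agent_id    INTEGER       NOT NULL REFERENCES agent (agent_id),
        premium     DECIMAL(12,2) NOT NULL
    );

Updating an agent's phone number now touches one agent row instead of every policy row, which is the DML speed-up the normalization targets.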

Environment: ERwin, OLTP, DDL, UML, star schema, Oracle 11g, Informatica, Teradata.

Confidential, Washington DC

Data Modeler

Responsibilities:

  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting as well as writing data.
  • Optimized the existing procedures and SQL statements for better performance, using EXPLAIN PLAN, hints, SQL Trace, etc. to tune SQL queries (see the sketch after this list).
  • Developed interfaces able to connect to multiple databases, such as SQL Server and Oracle.
  • Assisted Kronos project team in SQL Server Reporting Services installation.
  • Developed SQL Server database to replace existing Access databases.
  • Attended and participated in information and requirements gathering sessions.
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Designed and created web applications to receive query string input from customers and facilitate entering the data into SQL Server databases.
  • Performed thorough data analysis for the purpose of overhauling the database using SQL Server.
  • Designed and implemented business intelligence to support sales and operations functions to increase customer satisfaction.
  • Converted physical database models from logical models, to build/generate DDL scripts.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Designed 3NF data models for ODS and OLTP systems, as well as dimensional data models using star and snowflake schemas.
  • Wrote and executed unit, system, integration, and UAT scripts for data warehouse projects.
  • Extensively used ETL methodology to support data extraction, transformation, and loading in a complex EDW using Informatica.
  • Worked with star schemas, DB2, and IMS DB.
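
A small Oracle-style sketch of the EXPLAIN PLAN and hint workflow described above; the table, index, and bind variable names are hypothetical.

    -- Capture the optimizer's plan for a hinted query into PLAN_TABLE.
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(c idx_claims_member) */ c.claim_id, c.claim_amount
    FROM   claims c
    WHERE  c.member_id = :member_id;

    -- Render the captured plan for inspection.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

If the hinted plan is cheaper, the hint (or, better, a supporting index or statistics fix) can be applied to the production statement.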

Environment: Oracle, PL/SQL, DB2, ERwin 7.0, UNIX, Teradata SQL Assistant, Informatica, OLTP, OLAP, Data Marts, DQ Analyzer.
