
Sr. Data Architect/modeler Resume

Peoria, IL

SUMMARY

  • Around 8 years of strong IT experience in Data Architecture, Data Modeling, and Big Data Reporting design and development.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export.
  • Experienced in technical consulting and end-to-end delivery covering architecture, data modeling, data governance, and the design, development, and implementation of solutions.
  • Extensive experience in logical and physical database design (tables, constraints, indexes, etc.) using Erwin, ER Studio, TOAD Modeler, and SQL Modeler.
  • Practical understanding of dimensional and relational data modeling concepts such as Star-Schema Modeling, Snowflake Schema Modeling, and Fact and Dimension tables (see the star-schema sketch at the end of this summary).
  • Experienced in integrating various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL, COBOL, XML, and flat files into a Netezza database.
  • Experience in BI/DW solutions (ETL, OLAP, data marts), Informatica, and BI reporting tools such as Tableau and QlikView; also experienced in leading teams of application, ETL, and BI developers and testers.
  • Worked on background processes in Oracle architecture, and drilled down to the lowest levels of systems design and construction.
  • Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2 & Teradata.
  • Expertise on Relational Data modeling (3NF) and Dimensional data modeling.
  • Heavy use of Access queries, VLOOKUP, formulas, Pivot Tables, etc.; working knowledge of CRM automation with Salesforce.com and SAP.
  • Data Warehousing: full life-cycle project leadership, business-driven requirements gathering, capacity planning, feasibility analysis, enterprise and solution architecture, design, construction, data quality, profiling and cleansing, source-target mapping, gap analysis, data integration/ETL, SOA, ODA, data marts, Inmon/Kimball methodology, data modeling for OLTP, canonical modeling, and dimension modeling for data warehouse star/snowflake design.
  • Knowledge of Apache Spark with Cassandra.
  • Good understanding and hands-on experience with Azure, AWS S3, and EC2.
  • Good experience with programming languages Python and Scala.
  • Specifies overall Data Architecture for all areas and domains of the enterprise, including Data Acquisition, ODS, MDM, Data Warehouse, Data Provisioning, ETL, and BI.
  • Assist in creating communication materials based on data for key internal/external audiences.
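
A minimal star-schema sketch illustrating the dimensional-modeling concepts above; all table and column names are hypothetical:

    -- Dimension tables hold descriptive attributes (hypothetical names).
    CREATE TABLE dim_date (
        date_key       INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20180131
        calendar_date  DATE NOT NULL,
        fiscal_quarter CHAR(6) NOT NULL
    );

    CREATE TABLE dim_product (
        product_key   INTEGER PRIMARY KEY,   -- surrogate key
        product_name  VARCHAR(100) NOT NULL,
        category      VARCHAR(50) NOT NULL   -- a snowflake design would normalize
    );                                       -- this out into its own table

    -- The fact table holds measures plus foreign keys to each dimension.
    CREATE TABLE fact_sales (
        date_key      INTEGER NOT NULL REFERENCES dim_date (date_key),
        product_key   INTEGER NOT NULL REFERENCES dim_product (product_key),
        units_sold    INTEGER NOT NULL,
        sales_amount  DECIMAL(12,2) NOT NULL
    );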

TECHNICAL SKILLS

Data Modeling Tools: Erwin R6/R9, Rational System Architect, IBM InfoSphere Data Architect, ER Studio and Oracle Designer.

Big Data Technologies: Pig, Hive, Spark, Scala.

Cloud Platforms: AWS, Azure.

ETL/Data warehouse Tools: Informatica 9.6/9.1/8.6.1/8.1, SAP Business Objects XI R3.1/XI R2, Web Intelligence, Talend, Tableau 8.2, Pentaho.

Database Tools: Microsoft SQL Server 12.0, Teradata 15.0, Oracle 12c/11g/9i and MS Access.

BI Tools: Tableau 7.0/8.2, Tableau Server 8.2, Tableau Reader 8.1, SAP Business Objects, Crystal Reports.

Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, SharePoint Portal Server.

RDBMS: Microsoft SQL Server 14.0, Teradata 15.0, Oracle 12c/11g/10g/9i, and MS Access

Version Control Tools: Git, SVN

Project Execution Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD)

Tools: OBIEE 10g/11g/12c, SAP ECC 6 EHP 5, GoToMeeting, DocuSign, InsideSales.com, SharePoint, MATLAB.

Operating System: Windows, Unix, Sun Solaris

PROFESSIONAL EXPERIENCE

Confidential, Peoria, IL

Sr. Data Architect/Modeler

Responsibilities:

  • Responsible for developing and supporting a data model and architecture that supports and enables the overall strategy of expanded data deliverables, services, process optimization and advanced business intelligence.
  • Worked with the Data Vault methodology; developed normalized logical and physical database models.
  • Worked with Architecture team to get the metadata approved for the new data elements that are added for this project.
  • Extensively used Erwin r9.6 for Data modeling. Created Staging and Target Models for the Enterprise Data Warehouse.
  • Used Data Services to move data into Hive and Hadoop clusters.
  • Developed Hive and MapReduce tools to design and manage HDFS data blocks and data distribution methods.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN.
  • Worked with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Worked on a POC comparing the processing time of Impala with Apache Hive for batch applications, to decide on implementing the former in the project.
  • Worked on AWS and architecting a solution to load data, create data models.
  • Created BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts for extracting data from various production systems (a BTEQ sketch follows this list).
  • Wrote JIL scripts to manage the scheduled loading of sample data into the various databases and respective objects, through a combination of shell and SQL scripts.
  • Created data models for AWS and Hive from dimensional data models and involved in the Configuration of Hadoop Ecosystems with developers to read data transaction from HDFS and Hive.
  • Generated Sybase SQL scripts that include creating tables, Primary Keys, and Foreign Keys.
  • Created complex stored procedures, Functions, Triggers Indexes, Tables, Views and SQL joins for applications.
  • Troubleshooting performance issues and fine-tuning queries and stored procedures.
  • Generated reports to retrieve data using database code objects such as stored procedures, views, functions, and T-SQL queries.
  • Selecting the appropriate AWS service based on data, compute, database, or security requirements.
  • Developed several behavioral reports and data points creating complex SQL queries and stored procedures using SSRS and Excel.
  • Generated periodic reports based on the statistical analysis of the data using SQL Server Reporting Services (SSRS).
  • Generated reports using Global Variables, Expressions and Functions using SSRS.
  • Developed different kind of reports such as Drill down, Drill through, Sub Reports, Charts, Matrix reports, Parameterized reports and Linked reports using SSRS.
  • Created Hive tables to process advisor performance data in HDFS and exported it to downstream databases for various types of analyses using Sqoop.
  • Developed Hive scripts (using partitions, joins, and buckets) to process the data for analysis, such as quarterly growth of products, area-wise sales, and price prediction for new products based on market needs; a partitioning and bucketing sketch follows this list.
  • Implemented the Data Vault modeling concept, which solved the problem of dealing with change in the environment by separating the business keys, and the associations between those business keys, from the descriptive attributes of those keys, using hub and link tables and satellites (sketched after this list).
  • Designed ER diagrams (physical and logical, using Erwin), mapped the data into database objects, identified the Facts and Dimensions from the business requirements, and developed the logical and physical models.
  • Developed mapping spreadsheets for the ETL team with source-to-target data mapping, including physical naming standards, data types, volumetrics, domain definitions, and corporate metadata definitions.
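
A minimal Teradata BTEQ sketch of the kind of extract script listed above; the logon string, database, and table names are placeholders, and FastExport/MultiLoad/TPump/FastLoad each use their own, similar script formats:

    /* Placeholder TDPID and credentials */
    .LOGON tdprod/etl_user,password;

    /* Write query output to a flat file */
    .EXPORT REPORT FILE = orders_extract.txt;

    SELECT order_number, order_date, total_amount
    FROM   prod_db.orders          /* hypothetical source table */
    WHERE  order_date >= DATE '2017-01-01';

    /* Close the export file and end the session */
    .EXPORT RESET;
    .LOGOFF;
    .QUIT;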
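
As referenced in the Hive bullets above, a minimal HiveQL sketch of a partitioned, bucketed table with a dynamic-partition load and an area-wise sales aggregate; table and column names are hypothetical:

    -- Partition by sales quarter so queries prune whole directories;
    -- bucket by product_id to speed up joins and sampling.
    CREATE TABLE product_sales (
        product_id   BIGINT,
        area         STRING,
        sale_amount  DECIMAL(12,2)
    )
    PARTITIONED BY (sale_quarter STRING)
    CLUSTERED BY (product_id) INTO 32 BUCKETS
    STORED AS ORC;

    -- Dynamic-partition load from a raw staging table (hypothetical).
    SET hive.exec.dynamic.partition = true;
    SET hive.exec.dynamic.partition.mode = nonstrict;
    INSERT OVERWRITE TABLE product_sales PARTITION (sale_quarter)
    SELECT product_id, area, sale_amount, sale_quarter
    FROM   raw_sales_staging;

    -- Area-wise sales for one quarter; the partition filter limits the scan.
    SELECT area, SUM(sale_amount) AS total_sales
    FROM   product_sales
    WHERE  sale_quarter = '2017-Q3'
    GROUP  BY area;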
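
A minimal sketch of the hub/link/satellite separation described in the Data Vault bullet, using a hypothetical customer/order example:

    -- Hub: just the business key plus load metadata.
    CREATE TABLE hub_customer (
        customer_hk   CHAR(32)    PRIMARY KEY,  -- hash of the business key
        customer_id   VARCHAR(20) NOT NULL,     -- business key
        load_dts      TIMESTAMP   NOT NULL,
        record_source VARCHAR(50) NOT NULL
    );

    CREATE TABLE hub_order (
        order_hk      CHAR(32)    PRIMARY KEY,
        order_number  VARCHAR(20) NOT NULL,
        load_dts      TIMESTAMP   NOT NULL,
        record_source VARCHAR(50) NOT NULL
    );

    -- Link: the association between business keys, nothing else.
    CREATE TABLE link_customer_order (
        customer_order_hk CHAR(32) PRIMARY KEY,
        customer_hk   CHAR(32) NOT NULL REFERENCES hub_customer (customer_hk),
        order_hk      CHAR(32) NOT NULL REFERENCES hub_order (order_hk),
        load_dts      TIMESTAMP NOT NULL,
        record_source VARCHAR(50) NOT NULL
    );

    -- Satellite: descriptive attributes versioned by load timestamp,
    -- so attribute changes never touch the hub or link rows.
    CREATE TABLE sat_customer (
        customer_hk   CHAR(32)  NOT NULL REFERENCES hub_customer (customer_hk),
        load_dts      TIMESTAMP NOT NULL,
        customer_name VARCHAR(100),
        customer_tier VARCHAR(20),
        PRIMARY KEY (customer_hk, load_dts)
    );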

Environment: Erwin, Informatica, AWS, SSRS, JDBC, Cassandra, NoSQL, Hive, Pig, Spark, Scala, Python, Hadoop, MySQL, PostgreSQL, SQL Server.

Confidential, Monroe, LA

Sr. Data Modeler

Responsibilities:

  • Worked as a Data Architect/Modeler generating data models using Erwin, developed relational database systems, and was heavily involved in reviewing business requirements and composing source-to-target data mapping documents.
  • Worked in the Regulatory Compliance IT team in a Data Architect role involving Data Profiling, Data Modeling, ETL Architecture, and Oracle DBA work.
  • Responsible for Big data initiatives and engagement including analysis, brainstorming, POC, and architecture.
  • Designed the Logical Data Model using Erwin 9.64, with the entities and attributes for each subject area.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Experience with Big Data and Big Data on Cloud, Master Data Management and Data Governance.
  • Hands-on cloud computing using Microsoft Azure with various BI technologies.
  • Involved in creating Hive tables and loading and analyzing data using Hive queries; developed Hive queries to process the data and generate data cubes for visualization.
  • Implemented join optimizations in Pig using Skewed and Merge joins for large dataset schemas.
  • Designed and developed a Data Lake using Hadoop for processing raw and processed claims via Hive and Informatica.
  • Developed and implemented different Pig UDFs to write ad-hoc and scheduled reports as required by the Business team.
  • Design of Big Data platform technology architecture. The scope includes data intake, data staging, data warehousing, and high performance analytics environment.
  • Involved in the process of adding a new Datacenter to existing Cassandra Cluster.
  • Used PolyBase for ETL/ELT with Azure SQL Data Warehouse to keep data in Blob Storage with almost no limitation on data volume (see the external-table sketch after this list).
  • Created the template SSIS package that replicates about 200 processes to load the data using Azure SQL.
  • Performed data modeling and designed, implemented, and deployed high-performance custom applications at scale on Hadoop/Spark.
  • Implemented Data Integrity and Data Quality checks in Hadoop using Hive and Linux scripts.
  • Involved in loading data from the Linux file system to HDFS; imported and exported data into HDFS and Hive using Sqoop; implemented partitioning, dynamic partitions, and buckets in Hive.
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Specified the overall Data Architecture for all areas and domains of the enterprise, including Data Acquisition, ODS, MDM, Data Warehouse, Data Provisioning, ETL, and BI.
  • Explored NoSQL options for the current back end using Azure Cosmos DB (SQL API).
  • Developed Data Mapping, Data Governance, Transformation, and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
  • Involved in normalization/denormalization techniques for optimum performance in relational and dimensional database environments.
  • Performance tuning and stress-testing of NoSQL database environments in order to ensure acceptable database performance in production mode.
  • Implemented Spark solution to enable real time reports from Cassandra data.
  • Implemented strong referential integrity and auditing by the use of triggers and SQL Scripts.
  • Designed and developed T-SQL stored procedures to extract, aggregate, transform, and insert data (a sketch follows this list).
  • Created and maintained SQL Server scheduled jobs, executing stored procedures for the purpose of extracting data from DB2 into SQL Server.
  • Developed SQL Stored procedures to query dimension and fact tables in data warehouse.
  • Experience with SQL Server Reporting Services (SSRS) to author, manage, and deliver both paper-based and interactive Web-based reports.
  • Performed Hive programming for applications that were migrated to big data using Hadoop.
  • Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
  • Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
  • Created External and Managed tables in Hive and used them appropriately for the different Pig scripts required for reporting.
  • Focused on architecting NoSQL databases such as MongoDB, Cassandra, and Caché.
  • Performed routine management operations for MongoDB, including configuration and performance analysis, and diagnosed MongoDB performance issues.
  • Managed multiple ETL development teams for business intelligence and Master Data Management initiatives.
  • Performed point-in-time backup and recovery in MongoDB using MMS; modeled data moving from RDBMS to MongoDB for optimal reads and writes.
  • Involved in designing logical and physical data models for different database applications using Erwin.
  • Reverse engineered some of the databases using Erwin.
  • Proficient in SQL across a number of dialects (commonly MySQL, PostgreSQL, SQL Server, and Oracle).
  • Coordinated with the client and Business Analysts to understand and develop OBIEE reports.
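
As referenced in the PolyBase bullet above, a hedged T-SQL sketch of exposing Blob Storage files to Azure SQL Data Warehouse as an external table and materializing them with CTAS; the storage account, credential, and all object names are placeholders:

    -- Assumes a database master key already exists to protect the credential.
    CREATE DATABASE SCOPED CREDENTIAL blob_cred
    WITH IDENTITY = 'storage_user',            -- placeholder identity
         SECRET   = '<storage-account-key>';   -- placeholder key

    CREATE EXTERNAL DATA SOURCE claims_blob
    WITH (
        TYPE = HADOOP,
        LOCATION = 'wasbs://claims@examplestorage.blob.core.windows.net',
        CREDENTIAL = blob_cred
    );

    -- Assumed pipe-delimited text files.
    CREATE EXTERNAL FILE FORMAT pipe_delimited
    WITH (
        FORMAT_TYPE = DELIMITEDTEXT,
        FORMAT_OPTIONS (FIELD_TERMINATOR = '|')
    );

    -- External table over the files; nothing is copied until it is queried.
    CREATE EXTERNAL TABLE ext_claims (
        claim_id     BIGINT,
        member_id    BIGINT,
        claim_amount DECIMAL(12,2)
    )
    WITH (
        DATA_SOURCE = claims_blob,
        LOCATION    = '/raw/claims/',
        FILE_FORMAT = pipe_delimited
    );

    -- ELT step: land the data in the warehouse with CREATE TABLE AS SELECT.
    CREATE TABLE stg_claims
    WITH (DISTRIBUTION = HASH(claim_id))
    AS SELECT * FROM ext_claims;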
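
A minimal sketch of an extract/aggregate/insert stored procedure of the kind described above (T-SQL; schema and names are hypothetical):

    CREATE PROCEDURE dbo.usp_load_daily_sales_summary
        @load_date DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Re-run safety: clear any prior load for the same day.
        DELETE FROM dbo.daily_sales_summary
        WHERE  sales_date = @load_date;

        -- Extract from staging, aggregate, and insert into the summary table.
        INSERT INTO dbo.daily_sales_summary (sales_date, product_id, units, revenue)
        SELECT @load_date,
               s.product_id,
               SUM(s.quantity),
               SUM(s.quantity * s.unit_price)
        FROM   dbo.stg_sales AS s
        WHERE  s.sale_date = @load_date
        GROUP  BY s.product_id;
    END;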

Environment: DB2, CA Erwin r9.64, Oracle 12c, MS Office, SQL Architect, TOAD Benchmark Factory, SQL Loader, PL/SQL, SharePoint, Talend, Redshift, SQL Server 2008/2012, Hive, Pig, Hadoop, Spark, Azure.

Confidential, Fairfield, NJ

Sr. Data Analyst/Modeler

Responsibilities:

  • Performed Data Analysis, Data Migration, and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Used forward engineering to generate DDL from the Physical Data Model and handed it to the DBA.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct Data Analysis.
  • Created a list of domains in Erwin and worked on building up the data dictionary for the company.
  • Created DDL scripts for implementing data model changes; created Erwin reports in HTML and RTF format as required, published the data model in Model Mart, created naming convention files, and coordinated with DBAs to apply the data model changes.
  • Performed logical data model design, including normalization/de-normalization, referential integrity, data domains, primary and foreign key assignments, and data element definitions, as applied to both relational and dimensional modeling.
  • Worked on Unit Testing for three reports and created SQL Test Scripts for each report as required.
  • Modified cubes and dimensions, deployed KPIs on SSRS, created different metrics, and added new measure groups and dimensions.
  • Created dashboard SSRS reports under report server projects and published SSRS reports to the report server.
  • Used SAS, SQL, XML, PL/SQL, and Windows batch programming techniques to code the technical specifications, apply business logic, and produce automated reporting solutions.
  • Involved in Data Architecture, data profiling, data analysis, data mapping, and design of Data Architecture artifacts.
  • Modified SSIS packages manipulating source data coming from AS/400 to the Back Office system; altered the stored procedures and joins for efficient usage of mappings.
  • Created various SSRS dashboard reports for commission, sales and profit analysis.
  • Developed Complex Stored Procedures, Views and Temporary Tables as per the requirement.
  • Created and developed the stored procedures, triggers to handle complex business rules, history data and audit analysis.
  • Created custom Visualforce pages, Apex triggers, Apex classes, and REST API web services.
  • Developed several Apex Triggers, Classes and Apex API.
  • Wrote SOQL and SOSL statements within custom controllers, extensions and triggers.
  • Worked with the ETL team to document the transformation rules for data migration from OLTP to Warehouse environment for reporting purposes.
  • Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle, flat files, and SQL Server 2005.
  • Created data masking mappings to mask the sensitive data between production and test environments (a masking sketch follows this list).
  • Responsible for all metadata relating to the EDW’s overall data architecture, descriptions of data objects, access methods and security requirements.
  • Performed logical and physical OLAP/OLTP schema design.
  • Designed STAR schemas for the detailed data marts and plan data marts, consisting of conformed dimensions.
  • Used Erwin model mart for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Used Model Manager Option in Erwin to synchronize the data models in Model Mart approach.
  • Created views and dashboards on the end client's data; produced powerful dashboards telling the story behind the data in an easy-to-understand format, such as pie, bar, geo, and line charts, viewed daily by senior management.
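
The masking itself was built as Informatica mappings; as a hedged illustration only, the same effect in plain SQL while copying production rows to a test schema might look like this (all table and column names are hypothetical):

    -- Copy customer rows into the test schema with sensitive fields obfuscated.
    INSERT INTO test_db.customers (customer_id, full_name, ssn, email)
    SELECT customer_id,
           'Customer_' || CAST(customer_id AS VARCHAR(10)),  -- replace the real name
           'XXX-XX-' || SUBSTR(ssn, 8, 4),                   -- keep only the last 4 digits
           'user' || CAST(customer_id AS VARCHAR(10)) || '@example.com'
    FROM   prod_db.customers;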

Environment: Windows, Erwin, OLAP, OLTP, Teradata, SQL Server, SSRS, Informatica PowerCenter 6.1/7.1, QTP 9.2, Test Director 7.x, Load Runner 7.0, Oracle 11g, UNIX AIX 5.2, Perl, Shell Scripting.

Confidential, Triangle Park, NC

Data Analyst

Responsibilities:

  • Analyzed the physical data model to understand the relationships between existing tables; cleansed unwanted tables and columns per the requirements as part of the Data Analyst role.
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ER Studio.
  • Worked on the Metadata Repository (MRM) to keep definitions and mapping rules up to the mark.
  • Trained a couple of colleagues on the Spotfire tool and gave guidance in creating Spotfire visualizations.
  • Created DDL scripts for implementing data model changes; created ER Studio reports in HTML and RTF format as required, published the data model in the model mart, created naming convention files, and coordinated with DBAs to apply the data model changes.
  • Developed Contracting Business Process Model Workflows (current / future state) using Bizagi Process Modeler software.
  • Developed data marts for the base data in Star Schema and Snowflake Schema; involved in developing the data warehouse for the database.
  • Worked on Unit Testing for three reports and created SQL Test Scripts for each report as required.
  • Extensively used ER Studio as the main tool for modeling, along with Visio.
  • Configured & developed the triggers, workflows, validation rules & having hands on the deployment process from one sandbox to other.
  • Managed Logical and Physical Data Models in ER Studio Repository based on the different subject area requests for integrated model.
  • Created automatic field updates via workflows and triggers to satisfy internal compliance requirement of stamping certain data on a call during submission.
  • Developed enhancements to the MongoDB architecture to improve performance and scalability.
  • Forward engineered data models, reverse engineered the existing data models, and updated the data models.
  • Performed data cleaning and data manipulation activities using the NZSQL utility (a cleansing sketch follows this list).
  • Analyzed and understood the architectural design of the project in a step-by-step process, along with the data flow.
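
A minimal sketch of the kind of cleansing SQL run through the NZSQL utility, per the bullet above; table and column names are hypothetical:

    -- Standardize casing and whitespace, and null out placeholder values.
    UPDATE stage.customer_feed
    SET    customer_name = INITCAP(TRIM(customer_name)),
           phone_number  = NULLIF(TRIM(phone_number), 'N/A');

    -- Drop exact duplicates, keeping the earliest row per business key
    -- (assumes a monotonically increasing row_id column).
    DELETE FROM stage.customer_feed
    WHERE  row_id NOT IN (
        SELECT MIN(row_id)
        FROM   stage.customer_feed
        GROUP  BY customer_id
    );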

Environment: Oracle SQL Developer, Oracle Data Modeler, Teradata 14, SSIS, Business Objects, SQL Server 2008, ER/Studio, Windows, MS Excel.
