Sr. Data Architect/Modeler Resume
Tampa, FL
SUMMARY
- 9+ years in Information Technology as a Data Architect/Modeler/Analyst, experienced in data analysis, data modeling, and data architecture, and in designing, developing, and implementing data models for enterprise-level applications and systems.
- Experienced in technical consulting and end-to-end delivery covering architecture, data modeling, data governance, and the design, development, and implementation of solutions.
- Extensive experience in logical and physical database design (tables, constraints, indexes, etc.) using Erwin, ER Studio, TOAD Data Modeler and SQL Modeler.
- Practical understanding of data modeling (dimensional and relational) concepts such as star-schema modeling, snowflake-schema modeling, and fact and dimension tables.
- Experienced in integrating various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL, COBOL, XML, and flat files into a Netezza database.
- Experience with BI/DW solutions (ETL, OLAP, data marts) using Informatica and BI reporting tools such as Tableau and QlikView; also experienced leading teams of application, ETL, and BI developers and testers.
- Worked on background processes in Oracle architecture; able to drill down to the lowest levels of systems design and construction.
- Full life-cycle data warehousing project leadership: business-driven requirements gathering, capacity planning, feasibility analysis, enterprise and solution architecture, design, construction, data quality, profiling and cleansing, source-target mapping, gap analysis, data integration/ETL, SOA, ODA, data marts, Inmon/Kimball methodology, data modeling for OLTP, canonical modeling, and dimensional modeling for data warehouse star/snowflake design.
- Knowledge of Apache Spark with Cassandra.
- Experience in dimensional data modeling, star/snowflake schemas, and fact and dimension tables.
- Good understanding of and hands-on experience with Azure, AWS S3, and EC2.
- Good experience with the programming languages Python and Scala.
- Heavy use of Access queries, VLOOKUP, formulas, pivot tables, etc. Working knowledge of CRM automation with Salesforce.com and SAP.
- Specified the overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, MDM, data warehouse, data provisioning, ETL, and BI.
- Assisted in creating communication materials based on data for key internal/external audiences.
- Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2 & Teradata.
- Expertise in relational data modeling (3NF) and dimensional data modeling.
TECHNICAL SKILLS
Data Modeling Tools: Erwin R6/R9, Rational System Architect, IBM Infosphere Data Architect, ER Studio and Oracle Designer.
ETL/Data warehouse Tools: Informatica 9.6/9.1/8.6.1/8.1, SAP Business Objects XIR3.1/XIR2, Web Intelligence, Talend, Tableau 8.2, Pentaho.
Database Tools: Microsoft SQL Server 12.0, Teradata 15.0, Oracle 12c/11g/9i and MS Access.
Big Data Technologies: Pig, Hive, Spark, Scala.
BI Tools: Tableau 7.0/8.2, Tableau Server 8.2, Tableau Reader 8.1, SAP Business Objects, Crystal Reports
Packages: Microsoft Office 2010, Microsoft Project 2010, SAP and Microsoft Visio, SharePoint Portal Server
Cloud Platforms: AWS, Azure.
Tools: OBIEE 10g/11g/12c, SAP ECC6 EHP5, GoToMeeting, DocuSign, InsideSales.com, SharePoint, MATLAB.
Operating System: Windows, Unix, Sun Solaris
RDBMS: Microsoft SQL Server 14.0, Teradata 15.0, Oracle 12c/11g/10g/9i, and MS Access
Version Control Tools: Git, SVN
Project Execution Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD)
PROFESSIONAL EXPERIENCE
Confidential, Tampa, FL
Sr. Data Architect/Modeler
Responsibilities:
- Involved in the Data Architecture and in developing the overall ETL architecture, including solution architecture as needed.
- Worked on the Regulatory Compliance IT team in a Data Architect role involving data profiling, data modeling, ETL architecture, and Oracle DBA work.
- Responsible for Big data initiatives and engagement including analysis, brainstorming, POC, and architecture.
- Designed the logical data model using Erwin 9.64, with the entities and attributes for each subject area.
- Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
- Experience with Big Data and Big Data on Cloud, Master Data Management and Data Governance.
- Hands-on cloud computing using Microsoft Azure with various BI technologies.
- Created Hive tables, loaded and analyzed data using Hive queries, and developed Hive queries to process the data and generate data cubes for visualization.
- Implemented join optimizations in Pig using skewed and merge joins for large datasets.
- Designed and developed a Data Lake using Hadoop for processing raw and processed claims via Hive and Informatica.
- Created OOP constructs such as collections and interface implementations using VB.NET generics for extracting data.
- Created applications using VB.NET and enhanced existing .NET projects.
- Developed and implemented different Pig UDFs to write ad-hoc and scheduled reports as required by the Business team.
- Design of Big Data platform technology architecture. The scope includes data intake, data staging, data warehousing, and high-performance analytics environment.
- Involved in the process of adding a new datacenter to an existing Cassandra cluster.
- Used PolyBase for ETL/ELT processing with Azure SQL Data Warehouse to keep data in Blob Storage with almost no limitation on data volume (a PolyBase sketch follows at the end of this section).
- Created the template SSIS package that replicates about 200 processes to load the data using Azure SQL.
- Data modeling, design, implementation, and deployment of high-performance custom applications at scale on Hadoop/Spark.
- Implemented Data Integrity and Data Quality checks in Hadoop using Hive and Linux scripts.
- Loaded data from the Linux file system into HDFS, imported and exported data between HDFS and Hive using Sqoop, and implemented partitioning, dynamic partitions, and buckets in Hive (a Hive partitioning sketch follows at the end of this section).
- Designed and developed the architecture for a data services ecosystem spanning relational, NoSQL, and big data technologies.
- Specified the overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, MDM, data warehouse, data provisioning, ETL, and BI.
- Explored NoSQL options for the current backend using Azure Cosmos DB (SQL API).
- Developed data mapping, data governance, and transformation and cleansing rules for the master data management architecture involving OLTP and ODS.
- Applied normalization/denormalization techniques for optimum performance in relational and dimensional database environments.
- Performance tuning and stress-testing of NoSQL database environments in order to ensure acceptable database performance in production mode.
- Implemented a Spark solution to enable real-time reports from Cassandra data.
- Implemented strong referential integrity and auditing by the use of triggers and SQL Scripts.
- Designed and developed T-SQL stored procedures to extract, aggregate, transform, and insert data.
- Created and maintained SQL Server scheduled jobs, executing stored procedures for the purpose of extracting data from DB2 into SQL Server.
- Developed SQL Stored procedures to query dimension and fact tables in data warehouse.
- Experience with SQL Server Reporting Services (SSRS) to author, manage, and deliver both paper-based and interactive Web-based reports.
- Performed migration and merging of RPDs in OBIEE.
- Performed Hive programming for applications that were migrated to big data using Hadoop.
- Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
- Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
- Created external and managed tables in Hive and used them appropriately for the different Pig scripts required for reporting.
- Developed, and scheduled variety of reports like cross-tab, parameterized, drill through and sub reports with SSRS.
- Focused on architecting NoSQL databases such as MongoDB, Cassandra, and Caché.
- Performed routine management operations for MongoDB, including configuration and performance analysis, and diagnosed MongoDB performance issues.
- Managed multiple ETL development teams for business intelligence and master data management initiatives.
- Performed point-in-time backup and recovery in MongoDB using MMS; modeled data moving from RDBMS to MongoDB for optimal reads and writes.
- Involved in designing logical and physical data models for different database applications using Erwin.
- Reverse engineered some of the databases using Erwin.
- Proficient in SQL across a number of dialects (commonly MySQL, PostgreSQL, SQL Server, and Oracle).
- Coordinated with the client and business analysts to understand requirements and develop OBIEE reports.
Environment: DB2, CA ERwin r9.64, Oracle 12c, MS Office, SQL Architect, TOAD Benchmark Factory, SQL*Loader, PL/SQL, SharePoint, Talend, Redshift, SQL Server 2008/2012, Hive, Pig, Hadoop, Spark, Azure.
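The Hive work above (tables, dynamic partitions, buckets, Sqoop loads) followed roughly this pattern; a minimal sketch in PySpark SQL, assuming hypothetical claims_raw/claims_part tables rather than the actual project schema.

```python
# Minimal sketch of a partitioned, bucketed Hive table via PySpark SQL.
# Table and column names are illustrative assumptions; upstream, the raw
# data would have landed in HDFS via Sqoop imports (not shown here).
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive-partitioning-sketch")
         .enableHiveSupport()
         .getOrCreate())

# Target table: partitioned by load date, bucketed by claim id.
# (How bucketed writes are enforced varies by Hive/Spark version.)
spark.sql("""
    CREATE TABLE IF NOT EXISTS claims_part (
        claim_id  BIGINT,
        member_id BIGINT,
        amount    DECIMAL(12,2)
    )
    PARTITIONED BY (load_date STRING)
    CLUSTERED BY (claim_id) INTO 32 BUCKETS
    STORED AS ORC
""")

# Dynamic-partition insert from an assumed raw staging table.
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
spark.sql("""
    INSERT OVERWRITE TABLE claims_part PARTITION (load_date)
    SELECT claim_id, member_id, amount, load_date
    FROM claims_raw
""")
```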
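The PolyBase bullet above refers to the usual ELT shape on Azure SQL Data Warehouse: an external table over Blob Storage, then a CTAS into a hash-distributed internal table. A hedged sketch via pyodbc; the server, login, and object names (AzureBlobStore, CsvFormat, ext.Claims) are illustrative assumptions, and the external data source and file format are assumed to exist already.

```python
# Sketch of a PolyBase external table plus CTAS on Azure SQL DW via pyodbc.
# Connection details and all object names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydw;"
    "UID=loader;PWD=..."
)
conn.autocommit = True  # DDL/CTAS is simplest outside an explicit transaction
cur = conn.cursor()

# External table over files in Blob Storage (data stays in blob until queried).
cur.execute("""
    CREATE EXTERNAL TABLE ext.Claims (
        ClaimId  BIGINT,
        MemberId BIGINT,
        Amount   DECIMAL(12,2)
    )
    WITH (
        LOCATION    = '/claims/',
        DATA_SOURCE = AzureBlobStore,  -- assumed pre-created EXTERNAL DATA SOURCE
        FILE_FORMAT = CsvFormat        -- assumed pre-created EXTERNAL FILE FORMAT
    )
""")

# CTAS materializes the external data into a distributed internal table.
cur.execute("""
    CREATE TABLE dbo.Claims
    WITH (DISTRIBUTION = HASH(ClaimId))
    AS SELECT * FROM ext.Claims
""")
```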
Confidential, Chicago, IL
Sr. Data Architect/Modeler
Responsibilities:
- Responsible for developing and supporting a data model and architecture that supports and enables the overall strategy of expanded data deliverables, services, process optimization and advanced business intelligence.
- Worked with the Data Vault methodology; developed normalized logical and physical database models.
- Worked with Architecture team to get the metadata approved for the new data elements that are added for this project.
- Extensively used Erwin r9.6 for Data modeling. Created Staging and Target Models for the Enterprise Data Warehouse.
- Used Data Services to move data into Hive and Hadoop clusters.
- Developed Hive and MapReduce tools to design and manage HDFS data blocks and data distribution methods.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN.
- Worked with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
- Worked on a POC comparing the processing time of Impala with Apache Hive for batch applications, in order to implement the former in the project.
- Created BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts for extracting data from various production systems.
- Wrote JIL scripts to manage the scheduled loading of sample data into the various databases and respective objects, through a combination of shell and SQL scripts.
- Generated Sybase SQL scripts that included creating tables, primary keys, and foreign keys.
- Created complex stored procedures, functions, triggers, indexes, tables, views, and SQL joins for applications.
- Troubleshot performance issues and fine-tuned queries and stored procedures.
- Generated reports that retrieve data using database code objects such as stored procedures, views, functions, and multiple T-SQL statements.
- Developed several behavioral reports and data points, creating complex SQL queries and stored procedures using SSRS and Excel.
- Experience with AWS ecosystem (EC2, S3, RDS, Redshift).
- Generated periodic reports based on the statistical analysis of the data using SQL Server Reporting Services (SSRS).
- Generated reports using Global Variables, Expressions and Functions using SSRS.
- Developed different kind of reports such as Drill down, Drill through, Sub Reports, Charts, Matrix reports, Parameterized reports and Linked reports using SSRS.
- Created Hive tables to process advisor performance data in HDFS and exported it to downstream databases for various types of analyses using Sqoop.
- Developed Hive scripts (using partitions, joins, and buckets) to process the data for analysis, such as quarterly product growth, area-wise sales, and price prediction for new products based on market needs (a sketch follows at the end of this section).
- Implemented the Data Vault modeling concept, which solves the problem of dealing with change in the environment by separating the business keys and the associations between those business keys from the descriptive attributes of those keys, using hub and link tables and satellites (see the hub/link/satellite sketch at the end of this section).
- Worked on AWS Redshift and RDS for implementing models and data on RDS and Redshift.
- Designed physical and logical ER diagrams in Erwin, mapped the data into database objects, identified the facts and dimensions from the business requirements, and developed the logical and physical models using Erwin.
- Developed mapping spreadsheets for the ETL team with source-to-target data mapping, including physical naming standards, data types, volumetrics, domain definitions, and corporate metadata definitions.
Environment: AWS Redshift, RDS, SSRS, Big Data, JDBC, Cassandra, NoSQL, Hive, Pig, Spark, Scala, Python, Hadoop, MySQL, PostgreSQL, SQL Server, Erwin, Informatica.
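The Data Vault bullet above separates business keys (hubs), associations between keys (links), and descriptive attributes (satellites); this is the basic table shape. SQLite is used only so the sketch runs anywhere, and the customer/order entities are hypothetical.

```python
# Self-contained sketch of the Data Vault hub / link / satellite split.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hub: only the business key plus load metadata.
    CREATE TABLE hub_customer (
        customer_hk TEXT PRIMARY KEY,      -- hash key derived from the business key
        customer_bk TEXT NOT NULL UNIQUE,  -- the business key itself
        load_dts    TEXT NOT NULL,
        record_src  TEXT NOT NULL
    );

    -- Link: associations between business keys, nothing descriptive.
    CREATE TABLE link_customer_order (
        link_hk     TEXT PRIMARY KEY,
        customer_hk TEXT NOT NULL REFERENCES hub_customer(customer_hk),
        order_hk    TEXT NOT NULL,         -- hub_order omitted for brevity
        load_dts    TEXT NOT NULL,
        record_src  TEXT NOT NULL
    );

    -- Satellite: descriptive attributes, versioned by load timestamp, so
    -- source changes never restructure the hub or the link.
    CREATE TABLE sat_customer (
        customer_hk TEXT NOT NULL REFERENCES hub_customer(customer_hk),
        load_dts    TEXT NOT NULL,
        name        TEXT,
        address     TEXT,
        PRIMARY KEY (customer_hk, load_dts)
    );
""")
```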
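The quarterly-growth analysis mentioned above boils down to a window function over per-quarter aggregates. A sketch in Spark SQL against an assumed Hive table named sales; all names are placeholders, not the project schema.

```python
# Quarter-over-quarter revenue growth per product.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("quarterly-growth-sketch")
         .enableHiveSupport()
         .getOrCreate())

growth = spark.sql("""
    WITH q AS (
        SELECT product_id,
               CONCAT(YEAR(sale_date), '-Q', QUARTER(sale_date)) AS qtr,
               SUM(amount) AS revenue
        FROM sales
        GROUP BY product_id, YEAR(sale_date), QUARTER(sale_date)
    )
    SELECT product_id, qtr, revenue,
           revenue - LAG(revenue) OVER (
               PARTITION BY product_id ORDER BY qtr
           ) AS qoq_growth
    FROM q
""")
growth.show()
```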
Confidential, NYC, NY
Sr. Data Analyst/Modeler
Responsibilities:
- Performed data analysis, data migration, and data profiling using complex SQL on various source systems, including Oracle and Teradata.
- Used forward engineering to generate DDL from the Physical Data Model and handed it to the DBA.
- Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
- Created a list of domains in Erwin and worked on building up the data dictionary for the company.
- Created DDL scripts for implementing data modeling changes; created Erwin reports in HTML and RTF format as required, published the data model in the model mart, created naming convention files, and coordinated with DBAs to apply the data model changes.
- Performed logical data model design, including normalization/denormalization, referential integrity, data domains, primary and foreign key assignments, and data element definitions, as applied to both relational and dimensional modeling.
- Worked on Unit Testing for three reports and created SQL Test Scripts for each report as required.
- Modified cube and dimensions, deployed KPIs on SSRS and created different metrics, added new measure groups and dimensions.
- Created dashboard SSRS reports under report server projects and published SSRS reports to the report server.
- Used SAS, SQL, XML, PL/SQL, and Windows batch programming techniques to code the technical specifications, apply business logic, and produce automated reporting solutions.
- Involved in data architecture, data profiling, data analysis, data mapping, and the design of data architecture artifacts.
- Modified SSIS packages for manipulating the source data coming from AS/400 to the back-office system; altered stored procedures and joins for more efficient mappings.
- Created various SSRS dashboard reports for commission, sales and profit analysis.
- Developed Complex Stored Procedures, Views and Temporary Tables as per the requirement.
- Created and developed the stored procedures, triggers to handle complex business rules, history data and audit analysis.
- Created custom Visualforce pages, Apex triggers, Apex classes, and REST API web services.
- Developed several Apex triggers, classes, and Apex APIs.
- Wrote SOQL and SOSL statements within custom controllers, extensions and triggers.
- Worked with the ETL team to document the transformation rules for data migration from OLTP to Warehouse environment for reporting purposes.
- Worked in importing and cleansing of data from various sources like Teradata, Oracle, flat files, SQL Server 2005 with high volume data.
- Integrated Spotfire visualizations into the client's Salesforce environment.
- Created data masking mappings to mask sensitive data between the production and test environments (a masking sketch follows at the end of this section).
- Responsible for all metadata relating to the EDW’s overall data architecture, descriptions of data objects, access methods and security requirements.
- Perform logical and physical OLAP / OLTP schema design.
- Designed the star schema for the detailed data marts and plan data marts, consisting of conformed dimensions.
- Used Erwin model mart for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
- Used the Model Manager option in Erwin to synchronize the data models in the model mart.
- Created views and dashboards on the end client's data; produced dashboards telling the story behind the data in easy-to-understand formats such as pie, bar, geo, and line charts, viewed daily by senior management.
Environment: Windows, Erwin, OLAP, OLTP, Teradata, SQL Server, SSRS, Informatica PowerCenter 6.1/7.1, QTP 9.2, Test Director 7.x, LoadRunner 7.0, Oracle 11g, UNIX AIX 5.2, Perl, Shell Scripting.
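The masking mappings mentioned above replace sensitive values with deterministic, irreversible surrogates so that keys still join consistently between the production copy and the test copy. A small Python sketch of that idea; the column names and salt handling are illustrative assumptions, not the project's actual rules.

```python
# Deterministic, non-reversible masking so masked keys still join.
import hashlib

SECRET_SALT = "rotate-me"  # in practice, kept outside source control


def mask(value: str) -> str:
    """Stable surrogate: the same input always maps to the same output."""
    digest = hashlib.sha256((SECRET_SALT + value).encode("utf-8")).hexdigest()
    return digest[:12]


row = {"ssn": "123-45-6789", "email": "jane@example.com", "balance": "1200.50"}
masked = {
    "ssn":     mask(row["ssn"]),
    "email":   mask(row["email"]) + "@test.invalid",  # keep an email-like shape
    "balance": row["balance"],                        # non-sensitive, pass through
}
print(masked)
```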
Confidential, Dayton, OH
Sr. Data Analyst/Modeler
Responsibilities:
- Analyzed the physical data model to understand the relationships between existing tables; cleansed unwanted tables and columns as per the requirements, as part of the data analyst duties.
- Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ER Studio.
- Worked on the Metadata Repository (MRM), keeping the definitions and mapping rules up to date.
- Trained a couple of colleagues on the Spotfire tool and gave guidance on creating Spotfire visualizations.
- Created DDL scripts for implementing data modeling changes; created Erwin reports in HTML and RTF format as required, published the data model in the model mart, created naming convention files, and coordinated with DBAs to apply the data model changes.
- Developed contracting business process model workflows (current/future state) using Bizagi Process Modeler software.
- Developed data marts for the base data in star and snowflake schemas as part of developing the data warehouse (a star-schema sketch follows at the end of this section).
- Worked on unit testing for three reports and created SQL test scripts for each report as required.
- Extensively used ER Studio as the main modeling tool, along with Visio.
- Configured and developed triggers, workflows, and validation rules, and handled the deployment process from one sandbox to another.
- Managed Logical and Physical Data Models in ER Studio Repository based on the different subject area requests for integrated model.
- Created automatic field updates via workflows and triggers to satisfy internal compliance requirement of stamping certain data on a call during submission.
- Developed enhancements to the MongoDB architecture to improve performance and scalability.
- Forward engineered data models, reverse engineered existing data models, and updated data models.
- Performed data cleaning and data manipulation activities using the nzsql utility.
- Analyzed and understood the architectural design of the project, step by step, along with the data flow.
Environment: Oracle SQL Developer, Oracle Data Modeler, Teradata 14, SSIS, Business Objects, SQL Server 2008, ER/Studio, Windows, MS Excel.
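The star-schema marts described above come down to a fact table holding surrogate keys into conformed dimensions plus additive measures. A minimal sketch (SQLite is used only for self-containment; all table and column names are illustrative).

```python
# Bare-bones star schema: two dimensions and one fact table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key  INTEGER PRIMARY KEY,  -- e.g. 20180131
        full_date TEXT NOT NULL,
        quarter   TEXT NOT NULL
    );

    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT NOT NULL,
        category     TEXT NOT NULL
    );

    -- Fact: surrogate foreign keys plus additive measures only.
    CREATE TABLE fact_sales (
        date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
        product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
        units_sold  INTEGER NOT NULL,
        revenue     NUMERIC NOT NULL
    );
""")
```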
Confidential
Data Analyst
Responsibilities:
- Designed and created web applications to receive query-string input from customers and facilitate entering the data into SQL Server databases.
- Performed thorough data analysis for the purpose of overhauling the database using SQL Server.
- Designed and implemented business intelligence to support sales and operations functions to increase customer satisfaction.
- Converted physical database models from logical models, to build/generate DDL scripts.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Extensively used ETL to load data from DB2 and Oracle databases.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
- Worked with business users during requirements gathering and prepared conceptual, logical, and physical data models.
- Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting as well as writing data.
- Optimized the existing procedures and SQL statements for better performance, using EXPLAIN PLAN, hints, and SQL Trace to tune SQL queries (an EXPLAIN PLAN sketch follows at the end of this section).
- Developed the interfaces to be able to connect to multiple databases, such as SQL Server and Oracle.
- Assisted Kronos project team in SQL Server Reporting Services installation.
- Developed SQL Server database to replace existing Access databases.
- Attended and participated in information and requirements gathering sessions.
- Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
- Worked on physical, logical, and conceptual data models.
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
- Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
- Extensively used ETL methodology to support data extraction, transformation, and loading in a complex EDW using Informatica.
- Worked with star schemas, DB2, and IMS DB.
Environment: Oracle, PL/SQL, DB2, ERWIN, UNIX, Teradata SQL Assistant, Informatica, OLTP, OLAP, Data Marts, DQ analyzer.
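The EXPLAIN PLAN / hint tuning mentioned above is a two-step loop on Oracle: explain the statement, then read the plan back with DBMS_XPLAN. Sketched here with the python-oracledb driver; the connection details, the orders table, and the ord_cust_ix index are placeholders, not the project's actual objects.

```python
# Generate and display an Oracle execution plan for a statement under tuning.
import oracledb

conn = oracledb.connect(user="scott", password="...", dsn="dbhost/orclpdb")
cur = conn.cursor()

# 1. Ask the optimizer to explain the statement (here with an index hint).
cur.execute("""
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(o ord_cust_ix) */ o.order_id, o.amount
    FROM orders o
    WHERE o.customer_id = 42
""")

# 2. Render the plan table in readable form.
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur:
    print(line)
```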