
Sr. Data Modeler/Data Analyst Resume


Phoenix, AZ

SUMMARY

  • 8+ years of experience as a Data Analysis, Data Modeling, Data Warehousing & Business Intelligence professional in applied information technology.
  • Experience in Design, Development, Testing and Maintenance of various Data Warehousing and Business Intelligence (BI) applications in complex business environments.
  • Well versed in Conceptual, Logical/Physical, Relational and Multi-dimensional modeling, Data analysis for Decision Support Systems (DSS), Data Transformation (ETL) and Reporting.
  • Proficient in developing Entity-Relationship diagrams, Star/Snow Flake Schema Designs, and expert in modeling Transactional Databases and Data Warehouse.
  • Developed a solution to copy an on-premises data warehouse to Azure using Data Factory, Logic Apps, and Azure SQL.
  • Hands-on experience working with AWS RDS: creating, updating, and backing up databases.
  • Efficient in all phases of the development lifecycle, including Data Cleansing, Data Conversion, Data Profiling, Data Mapping, Performance Tuning and System Testing.
  • Proficient in Normalization/De-normalization techniques in relational/dimensional database environments and have done normalizations up to 3NF.
  • Efficient in Dimensional Data Modeling for Data Mart design, identifying Facts and Dimensions, creation of cubes.
  • Hands on experience in analyzing data using the Hadoop ecosystem including HDFS, Hive, Streaming, Elasticsearch, Kibana, Kafka, HBase, Zookeeper, Pig, Sqoop, and Flume.
  • Experience in developing MapReduce programs using Apache Hadoop to analyze big data per requirements.
  • Experienced in technical consulting and end-to-end delivery spanning architecture, data modeling, data governance, and design, development, and implementation of solutions.
  • Hands on experience in SQL, PL/SQL programming, performed End-to-End ETL validations and supported Ad-hoc business requests. Developed Stored Procedures and Triggers and extensively used Quest tools like TOAD.
  • Good understanding of Ralph Kimball (Dimensional) & Bill Inmon (Relational) model Methodologies.
  • Designed and implemented Azure Stream Analytics (ASA) jobs and Power BI reports for an IoT analytics platform.
  • 3 years of Azure developer and Azure admin experience, including managing Azure SQL with DBA activities.
  • Experience with Teradata utilities such as FastExport and MultiLoad (MLOAD) for handling various tasks.
  • Strong experience in Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration and Metadata Management Services and Configuration management.
  • Strong Experience in working with Excel Pivot and VBA Macros for various business scenarios.
  • Experience in generating DDL (Data Definition Language) Scripts and creating Indexing strategies.
  • Excellent SQL programming skills; developed Stored Procedures, Triggers, Functions, and Packages using SQL/PL/SQL, with performance tuning and query optimization in transactional and data warehouse environments.
  • Experience in Data analysis and Data profiling using complex SQL on various source systems including Oracle and Teradata.
  • Experience in dashboard reports using SQL Server Reporting Services (SSRS).
  • Expert in Agile/Scrum and waterfall methodologies.
  • Worked with AWS infrastructure across numerous services such as S3, EC2, RDS, CloudWatch, IAM, VPC, and Route 53.
  • Experienced with event-driven and scheduled AWS Lambda functions to trigger various AWS resources (see the sketch after this list).
  • Experience in developing Entity Relationship diagrams and modeling Transactional Databases and Data Warehouse using tools like ERWIN, ER/Studio and Power Designer.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Good understanding of Access queries, VLOOKUP, formulas, Pivot Tables, etc.
  • Working knowledge of CRM Automation Salesforce.com, SAP.
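As a hedged illustration of the event-driven Lambda pattern referenced above, here is a minimal sketch of an S3-triggered function; the bucket names are hypothetical placeholders, not resources from any of the engagements below.

    import json
    import boto3

    # Hypothetical example: an S3-triggered Lambda that copies each newly
    # created object into an archive bucket. Names are placeholders.
    s3 = boto3.client("s3")
    ARCHIVE_BUCKET = "example-archive-bucket"  # assumed bucket name

    def lambda_handler(event, context):
        # S3 put events carry the bucket and key inside the Records array
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            # Copy the new object into the archive bucket
            s3.copy_object(
                Bucket=ARCHIVE_BUCKET,
                Key=key,
                CopySource={"Bucket": bucket, "Key": key},
            )
        return {"statusCode": 200, "body": json.dumps("archived")}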

TECHNICAL SKILLS

Data Modeling Tools: Erwin Data Modeler 9.7/9.6, ER Studio v17, and Power Designer.

Big Data technologies: HBase 1.2, HDFS, Sqoop 1.4, Spark, Hadoop 3.0, Hive 2.3

AWS tools: EC2, S3 Bucket, AMI, RDS, Redshift

ETL/Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend and Pentaho.

Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.

Languages: UNIX Shell Scripting, HTML, Confidential -SQL, Data Structures, Algorithms.

OLAP Tools: Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports 9

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.

Operating System: Windows, Unix, Sun Solaris

Other tools: SQL*Plus, SQL*Loader, MS Project, MS Visio, UNIX, PL/SQL, etc.

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model.

BI Tools: Tableau 7.0/8.2, Tableau Server 8.2, Tableau Reader 8.1, SAP Business Objects, Crystal Reports

Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, SharePoint Portal Server

Project Execution Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD)

Testing and defect tracking Tools: HP/Mercury (Quality Center), Quick Test Professional, Performance Center, Requisite, MS Visio.

PROFESSIONAL EXPERIENCE

Confidential, Phoenix, AZ

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Created Logical & Physical Data Models for relational (OLTP) systems and Star schema Fact and Dimension tables using Erwin.
  • Performed GAP analysis to analyze the difference between the system capabilities and business requirements.
  • Developed SSIS packages and integrated them into Azure Data Factory V2.
  • Followed the Agile method with daily scrums to discuss project-related information.
  • Prepared ETL technical Mapping Documents along with test cases for each Mapping for future developments to maintain Software Development Life Cycle (SDLC).
  • Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
  • Created sampling specification, performed quality assurance, created multiple datasets to communicate with Medicare Advantage plans and analyzed Medicare plans response.
  • Created and maintained data model standards, including master data management (MDM) and Involved in extracting the data from various sources like Oracle, SQL, Teradata, and XML.
  • Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
  • Used AWS CloudWatch and Nagios as performance monitoring and analytics tools.
  • Worked on Master data Management (MDM) Hub and interacted with multiple stakeholders.
  • Worked with medical claim data in the Oracle database for Inpatient/Outpatient data validation, trend and comparative analysis.
  • Designed and implemented an Azure Data Lake.
  • Designed and developed the data warehouse ETL jobs using Azure Data Factory and Azure Databricks.
  • Developed Entity-Relationship diagrams and Star/Snowflake schema designs, and modeled transactional databases and the Data Warehouse.
  • Worked on normalization techniques, normalized the data into 3rd Normal Form (3NF).
  • Worked with Normalization and De-normalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon approaches, and implemented Slowly Changing Dimensions.
  • Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
  • Responsible for all BI team Azure-based solutions, and developed the implementation strategy.
  • Used reverse engineering to connect to existing database and create graphical representation (E-R diagram).
  • Designed and developed architecture for a data services ecosystem spanning Relational, NoSQL, and Big Data technologies under the guidance of several Data Architects.
  • Developed Azure Logic Apps to trigger workflows on Azure SQL DB.
  • Performance tuning and stress-testing of NoSQL database environments in order to ensure acceptable database performance in production mode.
  • Developed Data Mapping, Data Governance, and Transformation and cleansing rules for the Master Data Management Architecture involving OLTP, ODS.
  • Assisted in defining business requirements and created BRD (Business Requirements Document) and functional specifications documents.
  • Developed automated Python data pipelines from various external data sources (web pages, APIs, etc.) into the internal data warehouse (SQL Server), then exported to reporting tools such as Datorama (see the sketch after this list).
  • Connected to AWS Redshift through Tableau to extract live data for real time analysis.
  • Performed data analysis and data profiling using complex SQL on various source systems including Oracle and Netezza.
  • Monitored data quality and maintained data integrity to ensure effective functioning of the department.
  • Managed database design and implemented a comprehensive Star-Schema with shared dimensions.
  • Analyzed the data consuming the most resources and made changes in the back-end code using PL/SQL stored procedures and triggers.
  • Developed and maintained stored procedures, implemented changes to database design including tables and views and Documented Source to Target mappings as per the business rules.
  • Conducted Design reviews with the business analysts and the developers to create a proof of concept for the reports.
  • Performed detailed data analysis to analyze the duration of claim processes and created the cubes with Star Schemas using facts and dimensions through SQL Server Analysis Services (SSAS).
  • Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
  • Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
  • Developed, and scheduled variety of reports like cross-tab, parameterized, drill through and sub reports with SSRS.
  • Directed and oversaw data quality tests, including providing input to quality assurance team members.
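A minimal sketch of the external-source-to-warehouse pipeline pattern described above; the API endpoint, connection string, and staging table are hypothetical placeholders, not the client's actual systems.

    import requests
    import pyodbc

    # Hypothetical endpoint and connection details -- placeholders only.
    API_URL = "https://example.com/api/sales"
    CONN_STR = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=dw-server;DATABASE=warehouse;Trusted_Connection=yes;"
    )

    def load_api_to_warehouse():
        # Extract: pull JSON rows from the external API
        rows = requests.get(API_URL, timeout=30).json()
        # Load: parameterized bulk insert into an assumed staging table
        with pyodbc.connect(CONN_STR) as conn:
            cur = conn.cursor()
            cur.executemany(
                "INSERT INTO stg.sales (order_id, amount) VALUES (?, ?)",
                [(r["order_id"], r["amount"]) for r in rows],
            )
            conn.commit()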

Environment: Erwin v9.7, Agile, NoSQL, Big Data, Azure, Redshift, MDM, Oracle 12c, SQL, Teradata R15, XML, Python 3.6, PL/SQL, Tableau, SSRS.

Confidential, Atlanta, GA

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Worked with Database Administrators, Business Analysts and Content Developers to conduct design reviews and validate the developed models.
  • Involved in data model reviews with internal data architect, business analysts, and business users with explanation of the data model to make sure it is in-line with business requirements.
  • Worked with delivery of Data & Analytics applications involving structured and unstructured data on Hadoop-based platforms on AWS EMR.
  • Independently coded new programs and designed tables to load and test the programs effectively for the given POCs using Big Data/Hadoop.
  • Installed and configured other open-source software such as Pig, Hive, HBase, Flume, and Sqoop.
  • Translated business needs into data models that integrate with the Enterprise Data Model.
  • Architected and modeled Enterprise Data Hub (EDH) cloud-based solutions using AWS Redshift for the analytical platform and AWS Oracle RDS for reference and master data.
  • Developed data models according to enterprise data management practices.
  • Ensure the quality of data submitted to regulatory agencies by developing detailed metadata for submissions.
  • Worked on MapReduce and query optimization for the Hadoop Hive and HBase architecture.
  • Worked extensively on Data Governance, i.e., metadata management, Master Data Management, data quality, and data security.
  • Developed Spark/Scala and Python code for a regular expression (regex) project in the Hadoop/Hive environment on Linux/Windows for big data resources.
  • Extracted, transformed, and loaded data sources to generate CSV data files using Python programming and SQL queries.
  • Provided day-to-day data administration and security-related support for the ETL team on AWS Redshift and AWS Oracle RDS.
  • Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
  • Worked on data cleaning and reshaping, and generated segmented subsets using NumPy and pandas in Python (see the sketch after this list).
  • Designed data models for development of new transactional systems as well as the data warehouse and data marts.
  • Tested the database to check field size validation, check constraints, stored procedures and cross verifying the field size defined within the application with metadata.
  • Reverse-engineered data models for existing databases and systems.
  • Worked with data analysts to identify data issues within the organization and assisted in developing plans to resolve them.
  • Created dimensional model for the reporting system by identifying required dimensions and facts using Erwin r9.8.
  • Worked on data pre-processing and cleaning the data to perform feature engineering and performed data imputation techniques for the missing values in the dataset using Python.
  • Created Data Quality scripts using SQL and Hive to validate successful data loads and the quality of the data. Created various types of data visualizations using Python and Tableau.
  • Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model.
  • Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy DB2 and SQL Server database systems.
  • Utilized Apache Spark with Python to develop and execute Big Data analytics and machine learning applications; executed machine learning use cases under Spark ML and MLlib.
  • Used Erwin for reverse engineering to connect to existing database and Operational data store (ODS) to create graphical representation in the form of Entity Relationships and elicit more information.
  • Coordinated with DBAs on database builds and table normalizations and de-normalizations.
  • Tested the ETL process both before and after the data validation step. Tested the messages published by the ETL tool and the data loaded into various databases.
  • Used Informatica Designer, Workflow Manager and Repository Manager to create source and target definition, design mappings, create repositories and establish users, groups and their privileges.
  • Extracted data from the databases (Oracle and SQL Server, DB2, FLAT FILES) using Informatica to load it into a single data warehouse repository.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
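A minimal sketch of the pandas/NumPy cleaning, imputation, and segmentation work described above; the input file and column names are hypothetical, illustrative stand-ins only.

    import numpy as np
    import pandas as pd

    # Hypothetical input file and column names -- placeholders only.
    df = pd.read_csv("claims.csv")

    # Impute missing numeric values with the column median
    df["claim_amount"] = df["claim_amount"].fillna(df["claim_amount"].median())

    # Reshape: one row per member with claim amounts summed by month
    monthly = df.pivot_table(
        index="member_id", columns="month", values="claim_amount", aggfunc="sum"
    )

    # Segment rows into quartile-based subsets using NumPy
    quartiles = np.quantile(df["claim_amount"], [0.25, 0.5, 0.75])
    df["segment"] = np.searchsorted(quartiles, df["claim_amount"])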

Environment: Erwin 9.8, Oracle 12c, HBase, HDFS, Sqoop, PIG, SQL Assistant, OLAP, OLTP, IBM DB2 UDB, ODS, AWS Redshift, OBIEE, AWS, SAP, SQL Server 2012/14, Informatica Power Center 9.6, Toad, PL/SQL, XML Files, XML Spy, MS Office Tools.

Confidential, Monroe LA

Sr. Data Modeler / Data Engineer/ Data Analyst

Responsibilities:

  • Responsible for Data Modeling, Data Integration, Data quality & Metadata management solution design and delivery for Enterprise EDW and Hadoop environment.
  • As a Modeler/Analyst, implemented an MDM hub to provide clean, consistent data for a SOA implementation.
  • Installed and configured other open-source software such as Pig, Hive, HBase, Flume, and Sqoop.
  • Independently coded new programs and designed tables to load and test the programs effectively for the given POCs using Big Data/Hadoop.
  • Developed ETL pipelines on the AWS platform using Python and AWS services such as S3 buckets, Lambda, API Gateway, and SQS queues (see the sketch after this list).
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Defined and deployed monitoring, metrics, and logging systems on AWS.
  • Translated business and data requirements into logical data models in support of Enterprise Data Models, Operational Data Stores (ODS), OLAP, OLTP, operational data structures, and analytical systems.
  • Full life cycle of Data Lake and Data Warehouse with big data technologies such as Hive and Hadoop.
  • Responsible for technical data governance, enterprise-wide data modeling, and database design.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Worked extensively on ER Studio for multiple Operations across Hartford in both OLAP and OLTP applications.
  • Involved in OLAP modeling based on dimensions and facts for efficient data loads, using multi-dimensional models such as Star and Snowflake schemas across levels of reports.
  • Performed data mapping and data design (data modeling) to integrate the data across multiple databases into the EDW.
  • Prepared new ERwin data model templates for Postgres and DynamoDB (NoSQL database) in a cloud environment, Amazon Web Services (AWS).
  • Created logical and physical data models to feed Amazon Aurora and Amazon DynamoDB in the AWS environment.
  • Performed administrative tasks, including creation of database objects such as databases, tables, and views, using SQL DCL, DDL, and DML requests.
  • Work with Enterprise Data Governance team to review enterprise standards for data quality playbook.
  • Prepared data flows and wrote SQL queries for data profiling/data quality checklists on source/target data, and formalized Data Governance rules. Presented data profiling results from QlikView for decision making.
  • Participated in several project activities including data architecture design, ETL design, QA support, and code review.
  • Designed the data marts using Ralph Kimball's dimensional data mart modeling methodology with ER/Studio.
  • Experience using MapReduce and big data workloads on Hadoop and other NoSQL platforms.
  • Review system architecture, data flow, data warehouse dimensional model, DDL to identify the area for improvement to reduce the loading & reporting time for a meter reading system.
  • Performed data analysis using SQL and discovered data insights using Python (pandas and NumPy).
  • Incorporated business requirements into quality conceptual and logical data models using ER Studio, and created physical data models using forward engineering techniques to generate DDL scripts.
  • Designed normalized and star schema data architectures using ER Studio and forward-engineered these structures into Teradata.
  • Responsible for big data initiatives and engagement including analysis, brainstorming, POCs, and architecture.
  • Worked with Data Transformation Services (DTS), DataStage, and ETL package design, and RDBMS systems such as SQL Server, Oracle, and DB2.
  • Identifying Data Governance issues and formulating refined business process and data flow for long term solution.
  • Prepared a web UI for the HBase database for CRUD operations such as put, get, scan, delete, and update.
  • Generated periodic reports based on the statistical analysis of the data using SQL Server Reporting Services (SSRS).
  • Worked on NoSQL databases including HBase, MongoDB, and Cassandra. Implemented a multi-data-center, multi-rack Cassandra cluster.
  • Selecting the appropriate AWS service based on data, compute, database, or security requirements.
  • Used Flume extensively in gathering and moving log data files from Application Servers to a central location in Hadoop Distributed File System (HDFS) for data science.
  • Worked on Amazon Redshift and AWS, architecting a solution to load data, create data models, and run BI on top of it.
  • Migrated DTS packages to SSIS and modified using DTS editor wizard.
  • Developed and configured the Informatica MDM hub to support the Master Data Management (MDM), Business Intelligence (BI), and Data Warehousing platforms to meet business needs.
  • Worked with the Hadoop ecosystem covering HDFS, HBase, YARN, and MapReduce.
  • Developed a dashboard solution for analyzing STD statistics by building SSIS cubes and Tableau dashboards.
  • Developed, implemented & maintained the Conceptual, Logical & Physical Data Models using ER/Studio 9, with forward/reverse engineered databases.
  • Involved in Agile project management environment.
  • Developed PL/SQL scripts to validate and load data into interface tables
  • Involved in maintaining data integrity between Oracle and SQL databases.
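A minimal sketch of one stage of the S3/SQS-based ETL pattern listed above; the bucket, key, queue URL, and filter rule are hypothetical placeholders.

    import csv
    import io
    import boto3

    # Hypothetical resource names -- placeholders only.
    s3 = boto3.client("s3")
    sqs = boto3.client("sqs")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/etl-queue"

    def etl_from_s3(bucket, key):
        # Extract: read a CSV object from S3
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        reader = csv.DictReader(io.StringIO(body))
        for row in reader:
            # Transform: keep only rows with a positive amount (assumed rule)
            if float(row.get("amount", 0)) > 0:
                # Hand each clean record to SQS for downstream loaders
                sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=str(row))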

Environment: ER/Studio 9.8, Teradata 15, Star Schema, Snowflake Schema, AWS, HBase, Pig, Hive, Sqoop, OLAP, OLTP, Oracle 12c, ODS, Business Objects, MDM, Hadoop, SQL Server 2015, NoSQL, Cassandra, Python.

Confidential, Dorchester, MA

Sr. Data Analyst/ Modeler

Responsibilities:

  • Used SQL extensively to query Databases
  • Performed sales data analysis and maintained the pricing tables that feed the live website.
  • Conducted data assessments of legacy data
  • Used Excel, Access and VBA in the development of robust and flexible reporting systems
  • Retrieved data from data warehouse and generated a series of meaningful business reports using SSRS and Cognos.
  • Deployed and maintained the production environment using AWS EC2 instances and Elastic Container Service with Docker.
  • Monitored resources and applications using AWS CloudWatch, including creating alarms to monitor metrics such as EBS, EC2, ELB, RDS, S3, and SNS, and configured notifications for alarms generated based on defined events (see the sketch after this list).
  • Performed troubleshooting and monitoring of the Linux servers on AWS.
  • Maintained user accounts (IAM) and the RDS, Route 53, SES, and SNS services in the AWS cloud.
  • Validated and tested reports, then published the reports to the report server.
  • Performed and conducted complex custom analytics as needed by clients.
  • Designed and developed specific databases for collection, tracking and reporting of data.
  • Established, maintained and distributed daily, weekly and monthly reports
  • Created financial models using Excel Pivot Tables and formulas to develop best- and worst-case scenarios.
  • Created user accounts, policies, and roles on Amazon Web Services (AWS) using IAM.
  • Performed data auditing, created data reports, and monitored all data for accuracy.
  • Performed data cleansing: checked for data redundancy and duplicates and reported on all findings.
  • Assisted in the development of Metrics to measure the business data cleansing progress
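A minimal sketch of the kind of CloudWatch alarm described above, created with boto3; the instance ID and SNS topic ARN are hypothetical placeholders.

    import boto3

    # Hypothetical instance ID and SNS topic ARN -- placeholders only.
    cloudwatch = boto3.client("cloudwatch")

    cloudwatch.put_metric_alarm(
        AlarmName="example-ec2-high-cpu",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        Statistic="Average",
        Period=300,                 # evaluate in 5-minute windows
        EvaluationPeriods=2,        # two consecutive breaches trigger the alarm
        Threshold=80.0,             # alarm when average CPU exceeds 80%
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
    )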

Environment: Erwin 9.0, AWS, Redshift, Oracle 11g, SQL Server 2010, Teradata 14, XML, OLTP, PL/SQL, Linux, UNIX, MLOAD, BTEQ, UNIX shell scripting

Confidential, Triangle Park, NC

Data Analyst

Responsibilities:

  • Played an active and lead role in gathering, analyzing and documenting the Business Requirements, the business rules and technical requirements from the Business Group and the Technology Group.
  • Drafted enterprise security standards and guidelines
  • Performed and created procedures for system security audits
  • Assisted in the development of database access controls and separation of duties
  • Assisted in incident response and disaster recovery
  • Responsible for the documentation and execution of remediation plans
  • Performed periodic security review of privileged accounts
  • Performed a thorough analysis of the Sales Order Entry process to eliminate bottlenecks
  • Captured the As-Is process, designed the To-Be process, and performed Gap Analysis
  • Gathered Requirements from different sources like Stakeholders, Existing Systems, and Subject Matter Experts by conducting Workshops, Interviews, Use Cases, Prototypes
  • Created Functional Requirement Specification documents, including UML use case diagrams, scenarios, activity and workflow diagrams, and data mapping. Performed process and data modeling with MS Visio.
  • Worked with the DW, ETL teams to create Order entry systems Business Objects reports.
  • Worked in a cross functional team of Business Analysts, Architects and Developers to implement new features in various custom applications.
  • Developed and maintained various User Manuals and Application Documentation Manuals on Share Point tool.
  • Created Test Plans and Test Strategies to define the Objective and Approach of testing.
  • Used Quality Center to track and report system defects and bug fixes, wrote modification requests for the bugs in the application and helped developers to track and resolve the problems.
  • Developed and Executed Manual, Automated Functional, GUI, Regression, UAT Test cases using QTP.
  • Gathered, documented and executed Requirements-based, Business process (workflow/user scenario), Data driven test cases for User Acceptance Testing.
  • Created Test Matrix, Used Quality Center for Test Management, track & report system defects and bug fixes.
  • Performed load and stress testing and analyzed performance/response times.
  • Created/developed parameterized SQL queries for backend database testing (see the sketch after this list).
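A minimal sketch of the parameterized backend test queries mentioned above, shown here in Python with pyodbc; the DSN, table, and date range are hypothetical placeholders.

    import pyodbc

    # Hypothetical ODBC data source and table -- placeholders only.
    conn = pyodbc.connect("DSN=orders_db")
    cur = conn.cursor()

    # Parameterized query: count orders for a given status and date range
    cur.execute(
        """
        SELECT COUNT(*) FROM dbo.orders
        WHERE status = ? AND order_date BETWEEN ? AND ?
        """,
        ("SHIPPED", "2008-01-01", "2008-12-31"),
    )
    (count,) = cur.fetchone()

    # Simple backend check: the query should return a usable count
    assert count >= 0, "query returned no result"
    print(f"shipped orders in 2008: {count}")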

Environment: Erwin 8.0, MS Excel, MS Access, UNIX, Confidential -SQL, MS SQL 2008, SSIS, SSRS, MS Visio.
