Data Analyst Resume

Denver, CO

SUMMARY

  • Over 9 years of experience in data analysis, data modeling, data development, and the implementation and maintenance of databases and software applications.
  • Good understanding of the Software Development Life Cycle (SDLC), including planning, analysis, design, development, testing, and implementation.
  • Technical proficiency in designing and data modeling for online applications; solution lead for architecting data warehouse/business intelligence applications.
  • Good knowledge of and working experience with AWS tools such as Amazon S3 and Amazon Redshift.
  • Extensive experience using ER modeling tools such as Erwin, ER/Studio, and PowerDesigner.
  • Experience in conducting Joint Application Development (JAD) sessions with SMEs, stakeholders, and other project team members for requirements gathering and analysis.
  • Excellent experience writing and executing unit, system, integration, and UAT scripts in data warehouse projects.
  • Created tables in Snowflake using SQL, PuTTY, WinSCP, and Python to produce financial, contract, and project data analysis (see the Snowflake sketch after this list).
  • Knowledge of R for predictive modeling and data visualization, using MySQL databases and text files as sources.
  • Designed and developed visualizations and dashboards in Tableau and Microsoft Power BI.
  • Experience in developing conceptual and logical models and physical database designs for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems.
  • Good experience in technical consulting and end-to-end delivery with data modeling and data governance.
  • Experience with DBA tasks involving database creation, performance tuning, creation of indexes, and creating and modifying tablespaces for optimization purposes.
  • Excellent proficiency in Agile/Scrum and waterfall methodologies.
  • Good experience in working with different reporting tool environments like SQL Server Reporting Services (SSRS), and Business Objects.
  • Constructed dashboards and ad-hoc reports in Excel and Tableau.
  • Proficient in MySQL (RDBMS, SQL Server, PL/SQL), Python (NumPy, Pandas, Scikit-Learn), R (ddply, ggplot2, Random Forest), Core Java, web scraping, A/B testing, Google Analytics, web analytics, and marketing analytics.
  • Efficient in all phases of the development lifecycle, including data cleansing, data conversion, data profiling, data mapping, performance tuning, and system testing.
  • Excellent experience in writing SQL queries to validate data movement between different layers in data warehouse environment.
  • Experience using Jupyter Notebook in Anaconda, an open-source distribution for Python programming.
  • Good understanding of the Ralph Kimball (dimensional) and Bill Inmon (relational) modeling methodologies.
  • Experience with Teradata utilities such as FastExport and MultiLoad (MLOAD) for handling various tasks.
  • Good working experience with Data Vault modeling, used to maintain historical data in the enterprise data warehouse.
  • Proficient in creating different Tableau dashboards to perform margin analysis for e-rate deals across the different funnel stages.
  • Good knowledge of SQL queries and of creating database objects such as stored procedures, triggers, packages, and functions using SQL and PL/SQL to implement business logic.
  • Experience in generating DDL (Data Definition Language) Scripts and creating Indexing strategies.
  • Proficient in Normalization/De-normalization techniques in relational/dimensional database environments and have done normalizations up to 3NF.
  • Strong experience in Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration and Metadata Management Services and Configuration management.
  • Experience in data analysis to track data quality and for detecting/correcting inaccurate data from the databases.
  • Excellent knowledge of creating reports in SAP Business Objects for multiple data providers.
  • Conducted data analysis, mapping and transformation, and data modeling, applying data warehouse concepts.
  • Capture, validate and publish metadata in accordance with enterprise data governance policies and MDM taxonomies.
  • Experienced with PyTorch and TensorFlow; able to use PyTorch's nn module to build CNNs on training data sets (see the CNN sketch after this list).
  • Strong experience using MS Excel and MS Access to load and analyze data based on business needs.
  • Supported ad-hoc business requests and developed stored procedures and triggers.
  • Experience in designing error and exception handling procedures to identify, record and report errors.
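
A minimal sketch of the kind of CNN referenced in the PyTorch bullet above, built with the nn module; the layer sizes and the 28x28 single-channel input are illustrative assumptions, not details from any project listed here.

    import torch
    import torch.nn as nn

    class SmallCNN(nn.Module):
        """Toy CNN for 28x28 single-channel images (sizes are illustrative)."""
        def __init__(self, num_classes=10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1 -> 16 channels
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 28x28 -> 14x14
                nn.Conv2d(16, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 14x14 -> 7x7
            )
            self.classifier = nn.Linear(32 * 7 * 7, num_classes)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    model = SmallCNN()
    logits = model(torch.randn(8, 1, 28, 28))  # batch of 8 dummy images
    print(logits.shape)                        # torch.Size([8, 10])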
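
And a hedged sketch of creating a Snowflake table from Python, per the Snowflake bullet above; the account, credentials, and table definition are placeholders (assumes the snowflake-connector-python package).

    import snowflake.connector  # pip install snowflake-connector-python

    # Placeholder connection details -- substitute real values.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="...",
        warehouse="ANALYTICS_WH", database="FINANCE", schema="PUBLIC",
    )
    # Illustrative table for contract/project analysis.
    conn.cursor().execute("""
        CREATE TABLE IF NOT EXISTS contract_summary (
            contract_id  INTEGER,
            project_name STRING,
            amount_usd   NUMBER(12, 2),
            signed_on    DATE
        )
    """)
    conn.close()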

TECHNICAL SKILLS

  • MySQL
  • Power BI
  • Tableau
  • Snowflake
  • Data Warehouse
  • Microsoft Excel
  • Python
  • MATLAB
  • Data Modeling
  • Erwin
  • Data Analysis

PROFESSIONAL EXPERIENCE

Confidential

Data Analyst

Responsibilities:

  • Worked as a Sr. Data Modeler/Analyst generating data models using Erwin and deploying them to the enterprise data warehouse.
  • Participated in design discussions and assured functional specifications are delivered in all phases of SDLC in an Agile Environment.
  • Performed Data analysis using Python Pandas.
  • Researched and developed hosting solutions on MS Azure for the service solution.
  • Worked on Master Data Management (MDM) Hub and interacted with multiple stakeholders.
  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Worked extensively on Tableau for data visualization producing interactive graphs.
  • Created various Physical Data Models based on discussions with DBAs and ETL developers.
  • Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
  • Compiled, studied, and drew inferences from large amounts of data, modeling information to drive auto policy pricing.
  • Used Python to save and retrieve data files from Amazon S3 buckets (see the S3 sketch after this list).
  • Used SAS procedures such as MEANS and FREQ and other statistical calculations for data validation.
  • Developed the data warehouse model (Kimball's) with multiple data marts with conformed dimensions for the proposed central model of the Project.
  • Constructed dashboards and ad-hoc reports in Excel and Tableau.
  • Responsible for the design and implementation of a Business Intelligence solution for the entire company, including a data warehouse, automated reports and user interface for ad hoc reports and analytics.
  • Designed dashboards utilizing custom filters and DAX expressions with Power BI.
  • Developed quality analysis dashboards on Microsoft Power BI.
  • Involved in writing queries and stored procedures using MySQL and SQL Server.
  • Designed the data marts in dimensional data modeling using star schemas and snowflake schemas.
  • Reverse-engineered the existing data marts and identified the data elements (in the source systems), dimensions, facts, and measures required for reports.
  • Involved in data modeling to resolve issues with dimension-to-dimension joins.
  • Wrote SQL queries on MySQL to perform testing of tables used in dashboard creation.
  • Developed various reports using Tableau Desktop, Tableau Prep and published on Tableau Online.
  • Developed Data mapping, Transformation and Cleansing rules for the Data Management involving OLTP and OLAP.
  • Wrote DDL and DML statements for creating and altering tables and converting characters into numeric values.
  • Developed rule sets for data cleansing and actively participated in data cleansing and anomaly resolution of the legacy application.
  • Recreated the detection layer's upsample function by applying TensorFlow ops (transpose, identity, image, concat); see the upsample sketch after this list.
  • Imported the training weight file using Python (NumPy) and TensorFlow (assign), and created a function to output the detection boxes with Python.
  • Documented data dictionaries and business requirements for key workflows and process points.
  • Performed the Data Accuracy, Data Analysis, Data Quality checks before and after loading the data.
  • Maintained and implemented Data Models for Enterprise Data Warehouse using Erwin.
  • Involved in the development and implementation of SSIS, SSRS and SSAS application solutions for various business units across the organization.
  • Worked with project management, business teams and departments to assess and refine requirements to design BI solutions using MS Azure.
  • Responsible for data analysis and business analysis to enable the creation, enhancement, and maintenance of the data warehouse.
  • Built and published customized interactive reports and dashboards, with report scheduling, using Tableau Server.
  • Wrote DAX functions in Power BI to perform calculations.
  • Involved in data analysis, data discrepancy reduction in the source and target schemas.
  • Developed and deployed quality T-SQL codes, stored procedures, views, functions, triggers and jobs.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Worked on Data Mining and data validation to ensure the accuracy of the data between the warehouse and source systems.
  • Created a PHP/MySQL back-end for data entry from Flash and assisted the Flash developer in sending the correct data via query strings.
  • Involved in migration projects to migrate data from Oracle data warehouses to Teradata.
  • Performed Data Manipulation using MS Excel Pivot Sheets and produced various charts for creating the mock reports.
  • Created data flow, process documents and ad-hoc reports to derive requirements for existing system enhancements.
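
A minimal sketch of saving and retrieving S3 data files from Python, as in the S3 bullet above; the bucket and key names are invented for illustration (assumes boto3 with AWS credentials already configured).

    import boto3

    s3 = boto3.client("s3")

    # Upload a local data file to an (illustrative) bucket and key.
    s3.upload_file("claims.csv", "example-analytics-bucket", "raw/claims.csv")

    # Retrieve it later for analysis.
    s3.download_file("example-analytics-bucket", "raw/claims.csv", "claims_copy.csv")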
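
And a hedged sketch of the upsample-and-concatenate step described in the detection-layer bullets; the tensor shapes are illustrative, and tf.image.resize with nearest-neighbor interpolation stands in for whichever upsampling the original network used (TensorFlow 2.x API).

    import tensorflow as tf

    # Illustrative feature maps from two stages of a detector backbone.
    coarse = tf.random.normal((1, 13, 13, 256))  # deeper, low-resolution features
    fine = tf.random.normal((1, 26, 26, 128))    # earlier, high-resolution features

    # Upsample the coarse map to the fine map's size, then concatenate on the
    # channel axis, as in YOLO-style detection heads.
    upsampled = tf.image.resize(coarse, size=(26, 26), method="nearest")
    merged = tf.concat([upsampled, fine], axis=-1)
    print(merged.shape)  # (1, 26, 26, 384)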

Confidential - Denver, CO

Data Modeler/Data Analyst

Responsibilities:

  • Worked as a Data Modeler/Analyst responsible for Conceptual, Logical and Physical model for Supply Chain Project.
  • Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Analyzed the data that consumed the most resources and made changes to the back-end code using PL/SQL stored procedures and triggers.
  • Used insights from descriptive analysis to perform predictive modeling.
  • Connected to AWS Redshift through Tableau to extract live data for real time analysis.
  • Performed thorough data analysis for the purpose of overhauling the database using SQL Server.
  • Used Informatica & SAS to extract transform & load source data from transaction systems.
  • Created logical and physical data models for relational (OLTP) systems and star schemas for fact and dimension tables using Erwin.
  • Performed data manipulation and aggregation from different sources using Business Objects, Power BI, and SmartView.
  • Created and maintained data model standards, including master data management (MDM).
  • Involved in extracting the data from various sources like Oracle, SQL, and Teradata.
  • Extensively used Agile methodology as the Organization Standard to implement the data Models.
  • Conducted Design reviews with the business analysts and the developers to create a proof of concept for the reports.
  • Developed scripts that automated DDL and DML statements used in creations of databases, tables, constraints, and updates.
  • Used Python's Matplotlib package to visualize and graphically analyze the data (see the plotting sketch after this list).
  • Worked on development of data warehouse, data lake, and ETL systems using relational and non-relational tools such as SQL and NoSQL.
  • Designed different types of star schemas for detailed data marts and plan data marts in the OLAP environment.
  • Created SSIS package for daily email subscriptions to alert Tableau subscription failure using the PostgreSQL database.
  • Used Microsoft Visio for data modeling; managed metadata and related tools.
  • Imported the claims data into Python using the Pandas library and performed various data analyses (see the claims sketch after this list).
  • Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
  • Identified and documented data sources and transformation rules required to populate and maintain data Warehouse content.
  • Transformed raw data into MySQL with custom-made ETL application to prepare unruly data for machine learning.
  • Communicated with all business functions to maintain a comprehensive understanding of data quality requirements.
  • Performed detailed data analysis to analyze the duration of claim processes and created the cubes with Star Schemas using facts and dimensions through SQL Server Analysis Services (SSAS).
  • Worked on normalization techniques, normalized the data into 3rd Normal Form (3NF).
  • Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle.
  • Performed GAP analysis to analyze the difference between the system capabilities and business requirements.
  • Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
  • Involved in understanding the business functionality and data workflows, along with preparing monthly and quarterly dashboards/reports using Excel, Tableau, and Power BI.
  • Worked on Metadata exchange among various proprietary systems using XML.
  • Worked with medical claim data in the Oracle database for data validation, trend and comparative analysis.
  • Developed Entity-Relationship diagrams, Star/Snow Flake Schema Designs, and expert in modeling Transactional Databases and Data Warehouse.
  • Performed data cleaning, feature scaling, feature engineering and exploratory data analysis to maximize insights, detect outliers and extract important features for modelling.
  • Involved in different phases of Analytics using Python and Jupyter Notebook.
  • Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements.
  • Monitored data quality and maintained the integrity of data to ensure the effective functioning of the department.
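
A minimal sketch of the pandas-based claims analysis mentioned above; the file name and columns (opened, closed, status) are illustrative assumptions.

    import pandas as pd

    # Illustrative claims extract -- file and column names are placeholders.
    claims = pd.read_csv("claims_extract.csv", parse_dates=["opened", "closed"])
    claims["duration_days"] = (claims["closed"] - claims["opened"]).dt.days

    # Simple descriptive analysis of claim processing time by status.
    summary = claims.groupby("status")["duration_days"].agg(["count", "mean", "median"])
    print(summary)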
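
And a short Matplotlib sketch of the kind of graphical analysis referenced in the Matplotlib bullet; the data is random, purely for illustration.

    import matplotlib.pyplot as plt
    import numpy as np

    # Dummy monthly values -- illustration only.
    months = np.arange(1, 13)
    values = np.random.default_rng(0).uniform(50, 100, size=12)

    fig, ax = plt.subplots()
    ax.bar(months, values)
    ax.set_xlabel("Month")
    ax.set_ylabel("Value")
    ax.set_title("Illustrative monthly summary")
    plt.show()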

Confidential - North Chicago, IL

Data Modeler/Data Analyst

Responsibilities:

  • Worked with Business Analyst during requirements gathering and business analysis to prepare high level Logical Data Models and Physical Data Models using E/R Studio.
  • Developed the data warehouse model (Kimball's) with multiple data marts with conformed dimensions for the proposed central model of the Project.
  • Worked on OLAP models for data warehouse and data mart development using the Ralph Kimball methodology, as well as OLTP models.
  • Used SQL*Loader, external tables, and import/export utilities to load data into Oracle.
  • Generated DDL statements for the creation of new ER/Studio objects such as tables, views, indexes, packages, and stored procedures.
  • Automated the streaming of data through multiple processing stages (ingestion, data lake, atomic DW, and a few dozen data marts) into final modeling and analytical platforms using combinations of Redshift, Hive, EMR & EC2, Lambda, gsutil, Docker, and PySpark.
  • Worked on predictive analytics use-cases using Python language.
  • Developed a 360-degree business dashboard in Tableau with multiple panels and parameters for the Salesforce team.
  • Created Project Plan documents, Software Requirement Documents, Environment Configuration and UML diagrams.
  • Performed hypothesis testing using SAS to check whether the difference in the population means is significant (a Python equivalent is sketched after this list).
  • Developed process methodology for the Reverse Engineering phase of the project.
  • Prepared complex T-SQL queries, views and stored procedures to load data into staging area.
  • Worked in importing and cleansing of data from various sources like Teradata, flat files, SQL Server with high volume data.
  • Troubleshot data-related and implementation issues in SQL, on the web UI, and in Power BI to facilitate timely data releases.
  • Provided solutions for the data migration on various platforms from hierarchical to relational databases to unstructured databases.
  • Wrote Python routines to log into the websites and fetch data for selected options (see the login-and-fetch sketch after this list).
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Databases and data warehouses: PostgreSQL, MySQL, SQL Server, Snowflake, Amazon Redshift.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using E/R Studio.
  • Identified the Facts & Dimensions Tables and established the Grain of Fact for Dimensional Models.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
  • Involved in studying the business logic and understanding the physical system and the terms and conditions for the sales data mart.
  • Worked with Power BI and created layouts to view reports and dashboards effectively.
  • Gathered and documented requirements of a QlikView application from users.
  • Created Informatica mappings with PL/SQL Procedures/Functions to build business rules to load data.
  • Involved in the creation, maintenance of Data Warehouse and repositories containing Metadata.
  • Worked with normalization and de-normalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon approaches.
  • Performed ad-hoc analyses as needed, with the ability to interpret and communicate the results.
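
A hedged sketch of a login-and-fetch routine like the one described above; the URL, form fields, and query parameters are invented for illustration (assumes the requests package).

    import requests

    # Illustrative site and form fields -- not a real endpoint.
    with requests.Session() as session:
        session.post(
            "https://example.com/login",
            data={"username": "analyst", "password": "..."},
        )
        # Fetch data for a selected option once the session is authenticated.
        response = session.get("https://example.com/reports", params={"region": "west"})
        response.raise_for_status()
        print(response.text[:200])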
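
The hypothesis testing above was done in SAS; as a stand-in, here is the equivalent two-sample t-test in Python with SciPy, on made-up samples.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    group_a = rng.normal(loc=100.0, scale=15.0, size=200)  # made-up samples
    group_b = rng.normal(loc=103.0, scale=15.0, size=200)

    # Two-sample t-test: is the difference in population means significant?
    t_stat, p_value = stats.ttest_ind(group_a, group_b)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")  # reject H0 if p < 0.05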

Confidential - Mooresville, NC

Data Analyst

Responsibilities:

  • Defined actionable key performance indicators (KPI) to support business goals.
  • Developed various complex SQL queries and views for generating reports.
  • Responsible for retention, engagement, and churn analysis.
  • Responsible for operational analytics and tracking quality score (TQS).
  • Responsible for inbound and outbound campaign analysis.
  • Responsible for building automated insights and alerts.
  • Extensively used the open-source tools Anaconda (Python) and Jupyter Notebooks for statistical analysis.
  • Created action filters, parameters, and calculated sets for preparing dashboards and worksheets in Tableau.
  • Responsible for creating SQL datasets for Power BI and Ad-hoc reports.
  • Designed, developed, and maintained MySQL databases for cross-sell engagements and to provide structured data formats for different business units, ensuring a smooth flow of payments.
  • Gathered the functional and business requirements by conducting JAD sessions and participatory Design Sessions involving major leads from the Technical Department.
  • Identified and developed Use Cases from the business and systems requirements.
  • Documented high level and detailed Use Cases to include all the functionalities of the new system.
  • Involved in integrated adjacencies analytics for Risk optimization, cross-sell, and up-sell.
  • Involved in reports generation for path, funnel, and cohort analysis.
  • Developed Python programs to read data from various Teradata tables, manipulate it, and combine it into one CSV file (see the sketch after this list).
  • Developed Python scripts for data cleanup and automation.
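
A minimal sketch of the Teradata-to-CSV consolidation described above; the host, credentials, and table names are placeholders (assumes the teradatasql driver and pandas).

    import pandas as pd
    import teradatasql  # Teradata's Python driver

    tables = ["sales_2021", "sales_2022"]  # illustrative table names

    # Pull each table and combine the results into one CSV file.
    with teradatasql.connect(host="tdhost", user="analyst", password="...") as conn:
        frames = []
        for table in tables:
            cur = conn.cursor()
            cur.execute(f"SELECT * FROM {table}")
            cols = [d[0] for d in cur.description]
            frames.append(pd.DataFrame(cur.fetchall(), columns=cols))

    pd.concat(frames, ignore_index=True).to_csv("combined_sales.csv", index=False)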

Confidential - NYC, NY

Data Analyst

Responsibilities:

  • Extensively used PL/SQL Developer to create database objects and run command scripts to insert configuration data items.
  • Automated data loading, extraction, and report generation using UNIX shell scripting, and loaded data into custom tables using SQL*Loader.
  • Worked on materialized views, DBMS_SCHEDULER for scheduling jobs, and UTL_MAIL for sending mail using Oracle 10g.
  • Implemented shell scripts for troubleshooting Oracle errors with the help of trace files, and scheduled them.
  • Configured database performance tools such as Oracle Enterprise Manager and SQL Navigator and used them for tuning applications.
  • Performed statistical data analysis and data visualization using Python (see the sketch after this list).
  • Managed access to reports and data for individual users by utilizing roles and embedding Power BI reports.
  • Used Oracle Enterprise Manager and TOAD for developing and managing PL/SQL scripts.
  • Designed Web-based Internet applications linked to firm-wide SQL databases.
  • Created and debugged packages, procedures, and functions, and troubleshot jobs using the debugging tool.
  • Analyzed business process workflows and developed ETL procedures for moving data from source to target systems.
  • Involved in Analysis, Coding, Testing and Implementation of various modules and reverse engineering of the application using Erwin. Used Erwin to transform data requirements into data models.
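
A minimal sketch of the statistical analysis and visualization work mentioned above; the data set is synthetic and the column name is an assumption.

    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    # Synthetic response-time data -- illustration only.
    rng = np.random.default_rng(1)
    df = pd.DataFrame({"response_ms": rng.lognormal(3, 0.4, 1000)})

    print(df["response_ms"].describe())                          # summary statistics
    print("95th percentile:", df["response_ms"].quantile(0.95))

    df["response_ms"].plot.hist(bins=40, title="Response times (synthetic)")
    plt.xlabel("milliseconds")
    plt.show()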
