Data Analyst Resume

Charlotte, NC

SUMMARY

  • 8+ years of experience in data modeling, analysis, migration, and ETL, with specialization in Python, RStudio, Tableau, and SQL.
  • Solid experience using SQL, Python, Tableau, MATLAB, Excel, and Google Cloud.
  • Worked on relational databases (MySQL, PostgreSQL) and non-relational databases (MongoDB, DynamoDB).
  • Experience creating tables and views and writing complex SQL queries, stored procedures, and functions in MySQL.
  • Used Python for ETL and data manipulation processes against MySQL databases.
  • Hands-on experience creating complex NoSQL databases in MongoDB and connecting with PyMongo to run ETL processes in Python.
  • Worked on data cleaning, data mining, and data wrangling of unorganized, inconsistent, and incorrect data with NumPy and Pandas in Python.
  • Experience building statistical models in Python with Statsmodels and Scikit-Learn.
  • Implemented logistic regression, random forest, cross-validation, and k-nearest neighbors ML models in Python.
  • Experience creating data visualizations with Matplotlib, Seaborn, and Plotly in Python.
  • Experience connecting various local and live data sources, such as Excel, JSON, and MySQL databases, in Tableau.
  • Used Tableau to draw various charts and maps and generate reports with multiple sheets, dashboards, and stories.
  • Experience applying ML models such as the Lasso model, SVM, and SVD in MATLAB.
  • Worked with pivot tables, VLOOKUP, and functions to perform data analysis in Excel.
  • Hands-on experience using workflow tools like Jira to view, manage, and report on work.
  • Expertise in creating, configuring, and fine-tuning ETL workflows between homogeneous and heterogeneous systems using SSIS in MS SQL Server.
  • Experience cooperating with multiple groups to communicate and negotiate on projects.
  • Combine patience, determination, and persistence to troubleshoot client issues; strong problem-solving and analytical skills.
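The modeling work summarized above (logistic regression and k-nearest neighbors with cross-validation in Scikit-Learn) can be sketched as follows. This is a minimal, illustrative example on a synthetic dataset, not code from any of the projects listed:

```python
# Minimal sketch: logistic regression and k-nearest neighbors evaluated
# with 5-fold cross-validation in scikit-learn, on synthetic data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic binary-classification data (illustrative only)
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

for name, model in [("logistic", LogisticRegression(max_iter=1000)),
                    ("knn", KNeighborsClassifier(n_neighbors=5))]:
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validation
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```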

TECHNICAL SKILLS

Tools: Tableau, Spotfire, Informatica 9.6.0, MicroStrategy, Test Management Tools (QC 9.2, ALM 11), SVN, Shiny with R

Languages: SQL, R, Python, C, ESQL/C, UNIX shell scripting

Database: DB2, Informix, NoSQL (MongoDB), MySQL, WinSQL, DBeaver

Data Engineering: Data Mining (Python, R, SQL)

Packages: NumPy, Pandas, Seaborn, Matplotlib, Sklearn

Analysis Methods: A/B testing, multiple linear regression, logistic regression, time series analysis, k-nearest neighbors, cross-validation, and hypothesis testing

PROFESSIONAL EXPERIENCE

Confidential, Charlotte NC

Data Analyst

Responsibilities:

  • Used Python to pre-process data and uncover insights.
  • Performed logical and physical data modeling and delivered normalized, de-normalized, and dimensional schemas.
  • Imported data from SQL Server and Excel into Power BI to generate reports.
  • Wrote Python modules to extract and load asset data from the MySQL source database.
  • Developed advanced SQL queries to extract, manipulate, and calculate information to fulfill data and reporting requirements.
  • Generated various graphical capacity-planning reports using Python packages such as NumPy, Pandas, and Matplotlib.
  • Developed a conceptual model in Erwin based on requirements analysis.
  • Troubleshot, fixed, and deployed many Python bug fixes for the two main applications that were the primary source of data for both customers and the internal customer service team.
  • Used SQL to analyze, query, sort, and manipulate data according to defined business rules and procedures.
  • Built REST APIs to make it easy to add new analytics or issuers to the model.
  • Developed Python programs to read data from various Teradata tables, consolidate it into CSV files, and update the content in database tables.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and MySQL.
  • Used Erwin for reverse engineering: connected to the existing database and ODS to create graphical entity-relationship representations and elicit more information.
  • Developed data visualizations and dashboards using Tableau.
  • Performed data analysis and data profiling on raw data using Python.
  • Performed data extractions, data conversions, and data imports using SQL, SQL Server, and the Import Utility.
  • Generated and analyzed SQL queries to troubleshoot report issues.
  • Analyzed data using SQL and MS Excel to identify and monitor patterns in the transactional history data.
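A Python module that extracts data from a SQL source and lands it in a CSV file, as the bullets above describe, could be sketched as below. The `assets` table, its columns, and the use of the standard-library sqlite3 driver (standing in for a MySQL connector) are all illustrative assumptions:

```python
import csv
import sqlite3  # stand-in for a MySQL driver; the connect call would differ

def extract_to_csv(conn, query, out_path):
    """Run a SQL query and dump the result set, with a header row, to CSV."""
    cur = conn.execute(query)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # column names
        writer.writerows(cur.fetchall())

# Demo with an in-memory database and an invented "assets" table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE assets (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO assets VALUES (?, ?)", [(1, "srv1"), (2, "srv2")])
extract_to_csv(conn, "SELECT id, name FROM assets ORDER BY id", "assets.csv")
```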

Environment: R, Python, Tableau, NumPy, Pandas, Matplotlib, SQL Server, OLTP, OLAP, Erwin, CSV, Windows.

Confidential, Atlanta GA

Data Analyst

Responsibilities:

  • Involved in generating various graphs and charts for analyzing data using Python libraries.
  • Performed data cleaning, feature scaling, and feature engineering using the Pandas and NumPy packages in Python.
  • Handled unstructured data to derive information.
  • Wrote complex SQL queries to identify granularity issues and relationships between data sets, and created recommended solutions based on analysis of the query results.
  • Built and published customized interactive reports and dashboards, and scheduled reports, using Tableau Server.
  • Built various graphs for business decision-making using the Python Matplotlib library; created reusable Python scripts using modules such as NumPy, Pandas, SciPy, and datetime to perform extensive data analysis.
  • Generated data extracts in Tableau by connecting to views with the Tableau MySQL connector.
  • Created complex calculations with parameters in extracted Tableau data.
  • Performed weekly data testing across all environments using SQL and Tableau.
  • Drew upon the full range of Tableau platform technologies to design and implement proof-of-concept solutions and create advanced BI visualizations.
  • Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographic maps, and Gantt charts.
  • Created various data visualization dashboards using Python and Tableau.
  • Participated in all phases of research, including data collection, data cleaning, data mining, model development, and visualization.
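The cleaning and feature-scaling work described above can be sketched with Pandas and NumPy. The column names and values here are invented to show the pattern (filling missing values, normalizing casing, min-max scaling):

```python
import numpy as np
import pandas as pd

# Invented raw data with the kinds of problems the bullets mention:
# missing values and inconsistent casing.
raw = pd.DataFrame({
    "region": ["East", "west", "East", None],
    "revenue": [100.0, 250.0, np.nan, 80.0],
})

clean = raw.copy()
clean["region"] = clean["region"].fillna("unknown").str.lower()
clean["revenue"] = clean["revenue"].fillna(clean["revenue"].median())

# Min-max feature scaling of the numeric column onto [0, 1]
rev = clean["revenue"]
clean["revenue_scaled"] = (rev - rev.min()) / (rev.max() - rev.min())
```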

Environment: Python, R, SQL, Tableau, MATLAB, Linux

Confidential, Hartford CT

Data Analyst

Responsibilities:

  • Used Agile methodology throughout the project; involved in weekly and daily release management.
  • Worked with SciPy, NumPy, and Matplotlib to develop various algorithms.
  • Performed data collection, data transformation, and data loading using ETL systems such as SSIS.
  • Generated various graphical capacity-planning reports using Python packages such as NumPy and Matplotlib.
  • Applied data warehousing principles such as fact tables, dimension tables, and dimensional data modeling (star schema and snowflake schema).
  • Developed Tableau data visualizations using scatter plots, geographic maps, pie charts, bar charts, and density charts.
  • Developed business process models in Waterfall to document existing and future business processes.
  • Imported customer data into Python using the Pandas library and performed various analyses; found patterns in the data that informed key decisions for the company.
  • Created ETL documents covering data types, data definitions, and business/transformation rules based on the requirements or formatting needs of the data, for both source and target, using MS Office.
  • Involved in the entire data science project life cycle and actively participated in all phases, including data cleaning, data extraction, and data visualization with large sets of structured and unstructured data; created ER diagrams and schemas.
  • Assessed and implemented advanced web analytics tracking; monitored KPIs using Google Analytics and created Google Analytics reports and dashboards.
  • Used Python APIs to extract daily data from multiple vendors.
  • Performed data analysis and data profiling using complex SQL on various source systems, including MySQL.
  • Used JIRA for bug/issue tracking and project management.
  • Created business metric KPIs (key performance indicators) to evaluate factors across different modules.
  • Created both list and drill-down reports identifying the data hierarchy.
  • Wrote SQL queries to update the database and extract data for analysis.
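A business-metric KPI of the kind mentioned above, say a conversion rate per module, can be computed with a short Pandas groupby. The metric, the `module`/`converted` columns, and the event rows are all illustrative:

```python
import pandas as pd

# Illustrative event log: one row per user session per module
events = pd.DataFrame({
    "module": ["search", "search", "checkout", "checkout", "checkout"],
    "converted": [0, 1, 1, 1, 0],
})

# KPI: fraction of sessions that converted, per module
kpi = events.groupby("module")["converted"].mean().rename("conversion_rate")
print(kpi)
```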

Environment: Agile, Python, SciPy, NumPy, Matplotlib, ETL, SSIS, Google Analytics, Tableau, JIRA, KPI, SQL.

Confidential

Data Analyst

Responsibilities:

  • Developed views and templates with Python, using Django's view controller and template language to create a user-friendly website interface.
  • Developed the UI using CSS, HTML, JavaScript, AngularJS, jQuery, and JSON.
  • Used IMAT to connect the data and execute the code.
  • Designed and developed DB2 SQL procedures and UNIX shell scripts for data import/export and conversions.
  • Validated previously developed Python reports; fixed the identified bugs and re-deployed them.
  • Created a Django dashboard with a custom look and feel for end users after a careful study of the Django admin site and dashboard.
  • Used the Python unittest library to test many programs in Python and other code.
  • Worked with the Selenium testing framework.
  • Used JIRA to manage work in the development environment.
  • Performed different testing methodologies, including unit testing, integration testing, web application testing, and Selenium testing; used the Django framework for application development.
  • Cleaned data and processed third-party spending data into manageable deliverables in specific formats using Excel macros and Python libraries.
  • Used Python and Django for creating graphics, XML processing, data exchange, and business logic implementation.
  • Updated and manipulated content and files using Python scripts.
  • Used several Python libraries, including wxPython, NumPy, and Matplotlib.
  • Built all database mapping classes using Django models and Cassandra.
  • Used the Pandas API to structure data as time series and in tabular format for easy timestamp-based data manipulation and retrieval.
  • Designed and developed a data management system using MySQL.
  • Created unit-test/regression-test frameworks for working and new code.
  • Coded test programs and evaluated existing engineering processes.
  • Designed and configured databases and back-end applications and programs.
  • Performed research to explore and identify new technological platforms.
  • Collaborated with internal teams to convert end-user feedback into meaningful and improved solutions.
  • Resolved ongoing problems and accurately documented project progress.
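The timestamp-based manipulation with the Pandas API mentioned in the bullets above might look like the following. The hourly series is synthetic and the aggregation choices are illustrative:

```python
import pandas as pd

# Synthetic measurements at hourly resolution
idx = pd.date_range("2024-01-01", periods=6, freq="h")
series = pd.Series([1, 2, 3, 4, 5, 6], index=idx)

# Timestamp-indexed data supports direct resampling and time-of-day slicing
daily_total = series.resample("D").sum()         # aggregate hours to days
morning = series.between_time("00:00", "02:00")  # rows between midnight and 2am
```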

Environment: Python 2.7, SciPy, Pandas, Bugzilla, SVN, C++, Java, jQuery, MySQL, Linux, Eclipse, Shell Scripting, HTML5/CSS, Red Hat Linux, Apache, Ruby, Cassandra

Confidential

Software Engineer

Responsibilities:

  • Analyzed user requirements and performed white-box testing accordingly.
  • Provided technical and functional support.
  • Coordinated with other programmers on the team to ensure that all modules complemented each other well.
  • Interacted with the client to define business requirements and project scope.
  • Documented business requirements by constructing easy-to-understand data and process models.
  • Prepared reports to help keep test and production environments in sync configuration-wise.
  • Promoted configurations in production and non-production environments.
  • Automated jobs by building shell scripts and VBA macros.
  • Understood functional requirements and prepared technical design documents.
  • Performed code reviews and implemented code review tools.
  • Drafted and maintained business requirements and aligned them with the functional and technical requirements.
