
Data Analyst Resume


SUMMARY

  • Professional and insightful data analyst with over 5 years of experience and an MS in Computer Information Systems from BU.
  • Experienced in data collection and aggregation from sources such as AWS (S3, SQS), MongoDB, and MySQL using Python, and in cleaning, transforming, and analyzing data with the NumPy, Pandas, and Matplotlib packages (see the sketch after this list).
  • Strong skills in managing data and relational databases and in interpreting data using MySQL.
  • Solid knowledge of statistics, including regression analysis and hypothesis testing such as A/B tests.
  • Adept at using R for statistical computing and R packages such as ggplot2 for data visualization.
  • Skilled in data visualization using Tableau and Power BI to generate graphical representations of data.
  • Practiced in data mining with R, Weka, and JMP to find predictive correlations, patterns, or rules.
  • Familiar with machine learning techniques using Python and Weka.
  • Worked with the AWS cloud platform and services such as SQS, S3, CloudWatch, ElastiCache, DynamoDB, and IAM.
  • Proven ability to learn unfamiliar material and techniques quickly, exhibit great attention to detail, execute under aggressive deadlines, and adapt to changing circumstances and systems while maintaining the organization's strategic vision and linking it to everyday work.
  • A team player who values collaboration and communication, possesses a strong work ethic, and works effectively with diverse groups.
  • An enthusiastic individual with extensive experience in data analysis, data manipulation, data extraction, research, and reporting in different formats and from various sources.
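
A minimal sketch of the collection-and-cleaning workflow described above, assuming a CSV export pulled from S3 into Pandas; the bucket, key, and column names are hypothetical placeholders, not the actual project data.

import io

import boto3
import pandas as pd

# Hypothetical S3 location; replace with the real bucket and key.
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="example-analytics-bucket", Key="exports/orders.csv")
raw = pd.read_csv(io.BytesIO(obj["Body"].read()), parse_dates=["order_date"])

# Basic cleaning: drop duplicates, normalize categories, fill missing margins.
clean = (
    raw.drop_duplicates()
       .assign(
           category=lambda df: df["category"].str.strip().str.lower(),
           gross_margin=lambda df: df["gross_margin"].fillna(df["gross_margin"].median()),
       )
)

# Aggregate monthly revenue per category for downstream reporting.
monthly = (
    clean.groupby([pd.Grouper(key="order_date", freq="M"), "category"])["revenue"]
         .sum()
         .reset_index()
)
print(monthly.head())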

TECHNICAL SKILLS

Programming Languages: MySQL, Python, R, Hadoop (Hive, Pig), Flask.

Data visualization: Tableau, Power BI, R, Python (Pandas, NumPy, Matplotlib)

Cloud Computing: AWS

MS Office: Word, Excel, PowerPoint

Operating systems: Windows, Mac, Linux

Project management: Agile, Scrum, Waterfall

ETL: Informatica PowerCenter, Informatica PowerExchange, Google Dataflow, SSIS, AWS Glue

PROFESSIONAL EXPERIENCE

Confidential

Data Analyst

Responsibilities:

  • Collected data from the website and AWS, and performed data cleaning and wrangling with Python.
  • Aggregated data into MySQL tables and organized data for specific campaign periods with advanced queries.
  • Performed correlation and regression analysis with Python to explore possible trends or relationships among revenue, product category, time, product number, gross margins, sales campaign metrics, location, etc. (a short analysis sketch follows this list).
  • Responsible for requirements gathering, business process flow, business process modeling, and analysis.
  • Determined Confidential's and calibrated quality by presenting value additions and prospective services.
  • Worked as a liaison between the technical team and the clients in understanding the system.
  • Coordinated with various stakeholders, experience owners, and data teams at the enterprise level to build a centralized data store that helps the Sr. Decision Science Analyst deep dive and provide broader insights.
  • Performed reporting, data mining, and ad hoc analysis.
  • Designed and developed Tableau dashboards covering metrics from various sources.
  • Extensively used data blending across multiple data sources to extract maximum business insight.
  • Analyzed existing Tableau dashboards and recommended ways to improve performance so business partners could access dashboards faster.
  • Transitioned Tableau dashboards to the enterprise IT team for post-production maintenance.
  • Designed and developed infographic BI solutions to convey business stories that captivate stakeholders and executives.
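
A minimal sketch of the correlation and regression step mentioned in the list above, using Pandas and statsmodels; the file name and column names are hypothetical stand-ins for the campaign data pulled from the MySQL tables.

import pandas as pd
import statsmodels.api as sm

# Hypothetical campaign-period extract from the MySQL tables.
df = pd.read_csv("campaign_metrics.csv")

# Correlation matrix across the numeric metrics of interest.
numeric_cols = ["revenue", "gross_margin", "campaign_spend", "units_sold"]
print(df[numeric_cols].corr())

# Ordinary least squares: how well do spend and margin explain revenue?
X = sm.add_constant(df[["campaign_spend", "gross_margin"]])
model = sm.OLS(df["revenue"], X).fit()
print(model.summary())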

Confidential

Web Analytics and Data Analyst

Responsibilities:

  • Analyzed the current performance of Confidential's e-commerce checkout process using Google Analytics (GA); confirmed business goals and formulated A/B test hypotheses, variations, etc.
  • Performed multipage A/B testing and analyzed the resulting data with Python, considering metrics such as percentage increase (a minimal significance-test sketch follows this list).
  • Retrieved, transformed, and cleaned customer data with Python from various sources such as the website, MySQL, Excel, AWS S3 buckets, AWS SQS, AWS CloudWatch, Google Stackdriver, MongoDB, internal apps, Kafka, and Redis.
  • Developed use case and relational diagrams; created database tables with constraints, primary keys, foreign keys, etc., using MySQL.
  • Loaded data from Python and data files into MySQL tables for storage; wrote advanced queries, indexes, triggers, and stored procedures for data extraction, transformation, etc.
  • Created data visualization dashboards with different kinds of charts and reports in Tableau.
  • Performed regression analysis with Python to understand relationships among the parameters of interest, time, and categorical features using correlation matrices, scatterplots, and boxplots.
  • Generated comprehensive reports on the performance of the current checkout process, predictions of future trends, and analytical insights for process operations and marketing strategy.
  • Tracked and monitored checkout process performance by continuously collecting GA and customer data with cron jobs.
  • Implemented best practices, standards, and procedures to encourage team consistency, efficiency, and maintainability.
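
A minimal sketch of the A/B significance check described in the list above, using a two-proportion z-test from statsmodels; the conversion counts and visitor totals are hypothetical placeholders.

from statsmodels.stats.proportion import proportions_ztest

# Hypothetical checkout conversions: (conversions, visitors) per variant.
conversions = [312, 368]      # control, variant
visitors = [10_000, 10_050]

# Two-proportion z-test on the conversion rates.
stat, p_value = proportions_ztest(conversions, visitors)

lift = conversions[1] / visitors[1] - conversions[0] / visitors[0]
print(f"absolute lift: {lift:.4%}, z = {stat:.2f}, p = {p_value:.4f}")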

Confidential

Database Design and Implementation with SQL

Responsibilities:

  • Analyzed the workflow of the client company's selling process and determined structural business rules based on use cases.
  • Designed and developed conceptual and logical entity relationship diagrams with Lucidchart.
  • Set up and developed databases by creating MySQL tables with primary and foreign keys based on the ER diagram.
  • Wrote SQL queries for data extraction, including parameterized stored procedures, indexes, and triggers.
  • Digitally marked each part of the website that potentially generates data, such as links, photos, videos, and forms.
  • Input data into the database through the web application and extracted data via RESTful APIs using Python for data cleaning and analytics with the NumPy, Pandas, and Matplotlib packages (see the sketch after this list).
  • Involved in backups, disaster recovery methods, high availability methods, creating emails, jobs, and alerts, and in performance tuning; used SQL Profiler.
  • Provided SQL support, evaluated changes to the database schema, and assisted in updating the ER diagram.
  • Leveraged the power of Excel (VLOOKUPs, formulas, pivot tables and charts, Power Query, Power Pivot, etc.) for data visualization and analysis to provide insight into the data.
  • Used Tableau and Power BI Server as front-end BI tools and MySQL as a back-end database to design and develop workbooks, dashboards, global filter pages, and complex parameter-based calculations.
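
A minimal sketch of the RESTful extraction-and-analysis step mentioned in the list above, using requests, Pandas, and Matplotlib; the endpoint URL and field names are hypothetical placeholders for the web application's API.

import matplotlib.pyplot as plt
import pandas as pd
import requests

# Hypothetical internal endpoint exposing the MySQL-backed records as JSON.
resp = requests.get("https://example.internal/api/orders", timeout=30)
resp.raise_for_status()
df = pd.DataFrame(resp.json())

# Light cleaning before analysis.
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
df = df.dropna(subset=["amount"])

# Quick look at the distribution of order amounts.
df["amount"].plot(kind="hist", bins=30, title="Order amounts")
plt.xlabel("amount")
plt.tight_layout()
plt.savefig("order_amounts.png")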
