Data Analyst Resume
OBJECTIVE
- I'm an experienced and insightful Data Analyst with a passion for business growth.
- I have overseen key performance indicators and tracked data from multiple sources.
- I have built skills across platforms such as Python, Power BI, Tableau, SQL, ETL, and Azure ML. I'm self-driven and always looking for new challenges in company improvement and problem solving.
- My experience in the field has made me an effective communicator and a valuable team player.
- I love bringing new ideas to the table and learning from others.
SUMMARY
- Demonstrates professionalism, effective communication, and strong interpersonal skills
- Solid foundation in workplace time management, meeting deadlines, and presentation skills
- Team player who embraces collaboration but can also work independently, staying determined and motivated in high-pressure situations
- Always willing to learn new technologies, techniques, and skills
- 4+ years of experience with Business Intelligence tools and Data Analytics, supporting business growth and company development
- Worked with upper management across Finance, Marketing, Production, and Technology on OKRs (Objectives and Key Results) and KPIs (Key Performance Indicators)
- Proficient in building interactive dashboards and visualizations in Power BI, including scatter plots, maps, donut, bubble, gauge, and funnel charts, while applying filters and local and global variables
- Proficient in building dashboards and visualizations in Tableau Desktop and Tableau Public, including Gantt, bubble, pie, line, and bar charts, as well as density maps, scatter plots, and tree maps
- Utilized drill-down and drill-through in Power BI to break complex data into smaller, more manageable views, and used the Pages shelf in Tableau for the same purpose
- Wrote complex SQL queries to deliver the key components of the projects I was working on
- Applied reporting features such as bins, sets, groups, parameters, calculated fields, measures, quick measures, and dimensions to uncover more of the data
- Reduced data redundancy while maintaining data normalization and data integrity
- Executed data cleansing in multiple databases, eliminating null values, incorrect values, inconsistencies, and duplicates so the data was ready for reporting
- Utilized CTEs, derived tables, subqueries, unions, and joins when writing more complex SQL queries (see the SQL sketch at the end of this summary)
- Wrote multiple DAX functions to build formulas and expressions in my Power BI reports
- Examined companies' long-term KPIs while developing strategic long-term goals for their financial and operational performance, segmenting progress versus decline
- Wrote Python code for data analysis in Google Colab, using libraries such as Matplotlib, Pandas, NumPy, and Seaborn to visualize the main components and characteristics of the data (see the Python sketch at the end of this summary)
- Implemented multiple Python libraries in Google Colab, including Matplotlib, Seaborn, Pandas, NumPy, scikit-learn, and SciPy, alongside Python's native functions
- Experience in Python with machine learning, deep learning, linear regression, feature selection and engineering, model building, and train/test evaluation
- Created predictive models in Python using Artificial Intelligence and deep learning
- Performed ETL (Extract, Transform, Load) with MS SQL Server Integration Services (SSIS) for data flow procedures and data warehousing, migrating and transforming data to its destination
- Consolidated historical data from multiple sources in the data warehouse to support reporting and ad hoc queries, leading to well-grounded analysis and resolutions
- Loaded data from SQL Server into SSIS and executed SQL queries to verify data accuracy
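Illustrative SQL sketch of the CTE, join, and subquery pattern described above. The table and column names (orders, customers, order_total) are hypothetical placeholders assumed for this example, and T-SQL syntax is assumed; this is a sketch of the technique, not a query from any actual engagement.

    -- Hypothetical tables: orders(customer_id, order_date, order_total), customers(customer_id, customer_name)
    WITH monthly_revenue AS (
        SELECT customer_id,
               DATEFROMPARTS(YEAR(order_date), MONTH(order_date), 1) AS order_month,
               SUM(order_total) AS revenue
        FROM orders
        GROUP BY customer_id, DATEFROMPARTS(YEAR(order_date), MONTH(order_date), 1)
    )
    SELECT c.customer_name,
           m.order_month,
           m.revenue
    FROM monthly_revenue AS m
    JOIN customers AS c
        ON c.customer_id = m.customer_id
    WHERE m.revenue > (SELECT AVG(revenue) FROM monthly_revenue)  -- subquery keeps above-average months
    ORDER BY m.order_month, m.revenue DESC;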
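Illustrative Python sketch of the Pandas/Matplotlib/Seaborn exploration described above, assuming a hypothetical projects.csv extract with goal, pledged, backers, and category columns; it shows the general workflow only, not code from any actual project.

    import pandas as pd
    import seaborn as sns
    import matplotlib.pyplot as plt

    # Hypothetical extract; the file and column names are placeholders
    df = pd.read_csv("projects.csv")

    # Summary statistics for the key numeric fields
    print(df[["goal", "pledged", "backers"]].describe())

    # Distribution of pledged amounts by project category
    sns.boxplot(data=df, x="category", y="pledged")
    plt.xticks(rotation=45)
    plt.tight_layout()
    plt.show()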
TECHNICAL SKILLS
Data Analytics: Data Cleansing, Normalization, Data Mining, Big Data, Data Reporting
Repository Systems: GitHub
Data Science: Artificial Intelligence, Machine Learning, Modeling, Regression, Popular Packages, Workflows, Deep Learning
Python: Pandas, NumPy, scikit-learn, Matplotlib, Seaborn, SciPy
Mathematics/Statistics: Probability, Standard Deviation, Correlation, Mean, Normal Distribution, Z-Scores
ETL: SSIS (SQL Server Integration Services)
BI Reporting Tools: Tableau, Power BI
Databases: SQL Server, MySQL, Oracle
Cloud Platforms: AWS Redshift, Azure
Languages: SQL, T-SQL, Basic Python
IDE: SQL Developer, SQL Server Management Studio, Jupyter, Google Colab
Operating Systems: Windows & MacOS
Extra Products: MS PowerPoint, MS Outlook, MS Teams, MS Excel, MS Word
PROFESSIONAL EXPERIENCE
Confidential
Data Analyst
Responsibilities:
- Utilized Power BI as a business intelligence tool for visualizations
- Teamed up with management to create department-level key performance indicators for sectors such as marketing, finance, sales, and production
- Utilized quick measures, measures, and DAX functions to implement aggregated fields in my reports (see the DAX sketch at the end of this role)
- Used Python with the Pandas and NumPy libraries to manipulate and visually analyze the data
- Met with multiple company stakeholders daily, weekly, and quarterly to review new findings within the data
- Created CTEs (Common Table Expressions) when building complex joins and subqueries
- Took a deeper dive into creators and backers to identify trends across projects
- Identified where the majority of Confidential's marketing was getting attention versus where it lacked exposure
- Analyzed the platform's consumers, identifying common issues and the areas where Confidential performed best
- Mined data from multiple sources within the business
- Utilized different joins, subqueries and unions when working with multiple fields, rows, and tables
- Identified gaps between departments within the company and where certain areas showed weaknesses
- Improved the performance of the platform while working with a team, finding deeper relationships within the data and ways to resolve numerous issues
- Performed comparative ratio analysis against competitors in the same industry
- Highlighted various measures of performance such as revenue, client retention rate, profit margin, and average daily website visits
- Created 3D map visualizations to see where the majority of our backers and creators were located
- Modeled data within the data view, allowing me to join tables within my report
- Utilized the Power BI mobile app for quick analysis and reporting
- Used DAX functions such as date and time, logical, filter, financial, and text functions to analyze the data
- Put together strong, interactive reports on the Power BI canvas within the Report view
- Formatted the reports using diverging color scales based on the minimum and maximum of the aggregated values
Environment: Power BI Desktop, Power BI mobile app, SQL, Python, NumPy, Pandas
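Illustrative DAX sketch of the kind of measure described above (a simple aggregation plus a time-intelligence comparison). The Sales and Date table names are hypothetical placeholders and a marked date table is assumed; this is a generic example, not a measure from the actual reports.

    Total Revenue = SUM ( Sales[Revenue] )

    Revenue YoY % =
    VAR PrevYearRevenue =
        CALCULATE ( [Total Revenue], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
    RETURN
        DIVIDE ( [Total Revenue] - PrevYearRevenue, PrevYearRevenue )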
Confidential
Data Analyst
Responsibilities:
- Attended multiple meetings to review new findings, the data, and statistics, and to clarify what upper management was looking for in the project and what its end goal was
- Performed data analysis using Tableau as my business intelligence tool
- Blended data from multiple sources in Tableau, such as Excel, Oracle SQL, and MS SQL, to see the bigger picture
- Cleansed data by removing duplicates, filtering outliers, handling missing data, and removing unwanted observations for each specific report (see the cleansing sketch at the end of this role)
- Produced machine learning models with scikit-learn to forecast the success of projects before they were put on the platform (see the regression sketch at the end of this role)
- Implemented various dashboards in Tableau based on the information stakeholders advised was needed
- Built multiple dashboards highlighting where the business was already growing and where it could grow further, covering revenue, the most-funded projects, and overall financial statistics
- Performed yearly and quarterly financial statistics on the business
- Created interactive dashboards and filters in my reports in Tableau, using dimensions, measures, context filters, data source filters and extract filters
- Utilized the Keep Only and Exclude options on data points in Tableau to filter my data view
- Dragged and dropped dimensions, measures, and date fields onto the Filters shelf to expand my reporting
- Filtered categorical data using the general, wildcard, condition, and top filter options on dimensions
- Used Tableau Desktop and Tableau Online to enable a schedule of data refreshes
- Created calculations and parameters on reports to create a more interactive presentation
- Generated trend lines and forecasting for predictions regarding specific topics
- Wrote multiple Ad-Hoc calculations to perform specified business requirements
- Connected individual data points using line charts, providing visualizations of trends over time
- Utilized global and local filters across multiple worksheets within a workbook
- Paid close attention to project requirements and gathered raw data for cleansing and reporting
- Matched visualization types to the data being shown
- Monitored data clean-up processes and analyzed data integrity issues
- Tested, modified, and created visualizations based on users' requirements
- Deployed drill-down and drill-through methods in Tableau to navigate hierarchies
- Used a statistical approach, linear regression, to relate dependent and independent variables for predictions, e.g., the amount of money raised by a specific project or forecast sales for a future month
- Utilized Pandas DataFrames and NumPy arrays to test the accuracy of the ML model
- Applied statistical methods to better understand the data, using summary statistics to describe distributions and relationships between variables
- Eliminated irrelevant features to avoid overfitting and an unreliable model
- Used the Microsoft Azure ML platform to deploy ML models to the cloud
- Performed ETL (Extract, Transform, Load) in Microsoft SSIS, extracting data from different sources and moving it into a centralized data warehouse
Environment: MySQL, SQL Server, Azure, Oracle, Tableau, Jupyter Hub, Slack, Pandas, NumPy, Scikit-Learn, Microsoft SSIS, Python
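Illustrative Pandas sketch of the cleansing steps described above (duplicates, missing values, outliers). The file and column names are hypothetical placeholders, and a simple z-score rule for outliers is assumed rather than whatever rule a given report actually required.

    import pandas as pd

    # Hypothetical raw extract; file and column names are placeholders
    df = pd.read_csv("projects_raw.csv")

    # Remove exact duplicate rows
    df = df.drop_duplicates()

    # Drop rows missing required fields; fill an optional field with a default
    df = df.dropna(subset=["project_id", "goal"])
    df["country"] = df["country"].fillna("Unknown")

    # Filter outliers more than 3 standard deviations from the mean pledged amount
    z = (df["pledged"] - df["pledged"].mean()) / df["pledged"].std()
    df = df[z.abs() <= 3]

    df.to_csv("projects_clean.csv", index=False)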
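Illustrative scikit-learn sketch of the linear-regression forecasting workflow described above (train/test split, fit, evaluate). The features (goal, backers, campaign_days) and target (pledged) are hypothetical placeholders, not the actual model.

    import pandas as pd
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    # Hypothetical cleansed dataset; column names are placeholders
    df = pd.read_csv("projects_clean.csv")

    X = df[["goal", "backers", "campaign_days"]]  # independent variables
    y = df["pledged"]                             # dependent variable to predict

    # Hold out 20% of the rows for testing the fitted model
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    model = LinearRegression()
    model.fit(X_train, y_train)

    # Evaluate accuracy on the held-out test set
    predictions = model.predict(X_test)
    print("R^2 on test data:", r2_score(y_test, predictions))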