Data Analyst Resume
TX
SUMMARY
- Data analyst with 4+ years of industry experience in collecting, organizing, interpreting, and disseminating various types of statistical data.
- Good understanding of software development methodologies such as Agile and Waterfall.
- Skilled in Python, SQL, R, and Object-Oriented Programming (OOP) concepts such as inheritance, polymorphism, abstraction, and encapsulation.
- Experience using various packages in R and Python, such as ggplot2, Gurobi, spaCy, pandas, NumPy, Seaborn, SciPy, Matplotlib, and scikit-learn.
- Developed Tableau visualizations and dashboards using Tableau Desktop.
- Strong combination of leadership and hands-on skills in MS SQL Server and MySQL.
- Used version control tools such as Git.
- Extensive experience in RDBMS implementation and development using SQL, PL/SQL stored procedures and query optimization.
- Experience in developing cubes using partitions, KPIs, perspectives, and slowly changing dimensions.
- Experience in developing data applications with Python in Linux/Windows and Teradata environments.
PROFESSIONAL EXPERIENCE
Confidential, TX
Data Analyst
Responsibilities:
- Assisted in developing the VegaLytics Confidential analytics platform for Confidential payers, ACOs, and ASOs. The platform provides descriptive and prescriptive analytics support.
- The platform offers more than 150 business insights that help improve member experience and health while reducing the total cost of care.
- Involved in building predictive models for osteoporosis and MSK.
- Worked with the clinical team to convert rule sets into a deterministic model for early detection and for identifying potential fraud and waste.
- Used collections in Python for manipulating and looping through different user-defined objects.
- Used pandas, NumPy, Matplotlib, and SciPy in Python and R for developing various algorithms.
- Performed data cleaning, feature scaling, and feature engineering using the pandas and NumPy packages in Python.
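A minimal sketch of the kind of cleaning, scaling, and feature-engineering workflow described above; the column names and data are invented for illustration.

```python
import numpy as np
import pandas as pd

# Invented example data with missing values.
df = pd.DataFrame({
    "age":    [34, np.nan, 51, 29, 44],
    "income": [48000, 61000, np.nan, 39000, 72000],
})

# Data cleaning: fill missing values with each column's median.
df = df.fillna(df.median())

# Feature scaling: standardize each column to zero mean and unit variance.
scaled = (df - df.mean()) / df.std(ddof=0)

# Feature engineering: derive a simple ratio feature from the raw columns.
df["income_per_year_of_age"] = df["income"] / df["age"]

print(scaled.round(2))
```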
Confidential
Data Analyst
Responsibilities:
- Built a quadratic optimization model using an interior-point algorithm in Python with the cvxpy package to distribute technician work hours across categories based on skill set more efficiently, improving service level by 15 percent and reducing response time by 2 days.
- Developed an improved forecast with an ensemble model based on ARIMA, exponential smoothing, and moving-average methods, increasing accuracy by 7 percent.
- Created multiple reports in Excel by importing SQL tables using ODBC add-ins, updating senior management on changes in key metrics at different timelines and granularities.
- Performed ad-hoc analyses that found the root causes of existing problems, providing evidence backed by data.
- Developed named entity recognition and part-of-speech models using the spaCy package in Python, and assisted in creating a graph database in Neo4j to capture relationships, allowing users to trace back the preparatory materials and attributes used to create a final product.
- Assisted in creating an EC2 instance to store information in a Neo4j graph database. Retrieved information stored in S3 and other data sources to create a data lake in Azure and to develop a basic chatbot using AWS Lex and Lambda functions.
- Developed a chatbot in Microsoft Azure as a proof of concept that transformed layman questions into Neo4j queries using a trained Language Understanding (LUIS) bot framework to retrieve the requested information stored in a graph database.
- Created a data lake in Azure to store research data about biofuels from different sources such as AWS endpoints, Excel files, text files, PDFs, and digital images, and consolidated the information in MongoDB to analyze the interactions between different parameters affecting growth.
- Used VBA programming in Excel to automate weekly reports tracking key metrics on customer satisfaction rates and service orders.
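The ensemble-forecast idea above can be sketched as an equal-weight average of simple component forecasts. This is an illustrative sketch only: the data, smoothing parameter, and window are invented, and the ARIMA component (typically fit with statsmodels) is omitted for brevity.

```python
import numpy as np

# Invented monthly demand series for illustration.
series = np.array([112., 118., 132., 129., 121., 135., 148., 148., 136., 119.])

def exp_smooth_forecast(y, alpha=0.3):
    """One-step-ahead forecast from simple exponential smoothing."""
    level = y[0]
    for obs in y[1:]:
        level = alpha * obs + (1 - alpha) * level
    return level

def moving_avg_forecast(y, window=3):
    """One-step-ahead forecast as the mean of the last `window` points."""
    return y[-window:].mean()

# Equal-weight ensemble of the component forecasts.
forecasts = [exp_smooth_forecast(series), moving_avg_forecast(series)]
ensemble = float(np.mean(forecasts))
print(round(ensemble, 1))
```

In practice the component weights would be tuned on a holdout period rather than fixed at equal weights.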
Confidential
Data Analyst
Responsibilities:
- Served as the Product Owner in a Waterfall development process, representing and understanding the needs of the manufacturing and product-servicing departments, determining data-mapping issues, and defining the features of each iteration release.
- Cleaned and processed third-party spending data into maneuverable deliverables in specified formats using Excel macros and Python libraries such as NumPy, pandas, and Matplotlib.
- Established connections between legacy systems using MySQL and SQL Server.
- Generated PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
- Provided continued maintenance and bug fixes for existing and new Power BI reports.
- Responded to all incoming questions and inquiries related to JIRA applications.
- Created Power BI visualizations of dashboards and scorecards (KPIs) for the Finance Department.
- Created databases for OLAP Metadata catalog tables using forward engineering of models.
