Data Scientist Resume
WA
PROFESSIONAL SUMMARY:
- A competent professional with 7 years of industry experience in Data Science, Business Analysis, and Solution Architecture.
- Identified areas of improvement in existing business processes by unearthing insights from vast amounts of data using machine learning techniques.
- Worked with business units (partners) to identify, prioritize, clearly define, and document analytical needs while maintaining perspective on the company's ultimate KPIs.
- Utilized analytical applications such as R and Python to identify trends and relationships between different pieces of data, draw appropriate conclusions, and translate analytical findings into risk management and marketing strategies that drive value.
- Designed and implemented statistical/predictive models utilizing diverse sources of data to predict demand, risk, and price elasticity.
- Interpreted problems and provided solutions to business problems using data analysis, data mining, optimization tools, machine learning techniques, and statistics.
- Designed and deployed data science and technology-based algorithmic solutions to address business needs for the customer service business; identified, understood, and evaluated new commerce data technologies to determine each solution's effectiveness and its feasibility of integration with the product environment.
- Conducted in-depth analysis and predictive modeling to uncover hidden opportunities; communicated insights to the product, sales, and marketing teams.
- Strong analytical and problem-solving skills for getting at the core issues underneath complexity and ambiguity; excellent communication and troubleshooting skills.
TECHNICAL SKILLS:
Machine Learning: Classification, Regression, Clustering, Recommendation System, Association Rules
Operations Research: Optimization techniques such as linear and integer programming
Statistical Methods: Hypothesis Testing & Confidence Intervals, Principal Component Analysis
Prog. Languages: R, Python, SQL, PL/SQL, COBOL
Technologies/Tools: Azure Machine Learning
Data Visualization: QlikView, Tableau 8.0, ggplot2 (R), MATLAB
DBMS: DB2, IMS-DB, Oracle
PROFESSIONAL EXPERIENCE
Data Scientist
Confidential, WA
Responsibilities:
- Created churn models to identify customers at high risk of churning on Microsoft Azure.
- Refactored multiple Azure cloud customer churn analysis and prediction projects (Visual Studio Team Services).
- Performed customer profiling by combining billing status with features extracted from time series usage data.
- Significantly improved model performance using neural networks and explained its predictions using local interpretable model explanations.
- Scaled and productionized training and scoring pipelines using open source tools on Microsoft cloud services
- Proposed business-focused metrics to measure model impact and optimized ROI for sales and marketing teams to take effective actions.
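The churn-scoring work above can be sketched as a minimal logistic-regression model; the feature names, synthetic labels, and training data here are illustrative assumptions, not the actual Azure project code.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=500):
    """Plain stochastic-gradient-descent logistic regression (no regularization)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of the log-loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def churn_probability(w, b, features):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, features)) + b)

# Hypothetical per-customer features: [usage hours, support tickets, tenure],
# each scaled to roughly 0-1; label 1 = churned, 0 = retained.
X = [[0.10, 0.90, 0.20], [0.20, 0.80, 0.10], [0.15, 0.70, 0.30],
     [0.90, 0.10, 0.80], [0.80, 0.20, 0.90], [0.85, 0.05, 0.70]]
y = [1, 1, 1, 0, 0, 0]
w, b = train_logistic(X, y)

high_risk = churn_probability(w, b, [0.10, 0.95, 0.10])  # low usage, many tickets
low_risk = churn_probability(w, b, [0.90, 0.05, 0.90])   # heavy, long-tenured user
```

Customers are then ranked by predicted probability so retention offers can target the highest-risk segment first.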
Confidential, Mountain View, CA
Data Scientist
Responsibilities:
- Created advanced supervised and unsupervised machine learning algorithms for time series analysis (ARIMA approach) built on an anomaly detection algorithm.
- Wrote custom Python methods to identify features for application areas and reorganized the logic and inputs to fit business-metric scenarios, including sentiment prediction and machine translation (deep learning).
- Used SQL to pull data from production, compiled the code into mapper/reducer functions with each group ID as the mapper key, and set up the MapReduce data pipeline job using AWS data sets.
- Applied benchmark machine learning algorithms such as SVMs, logistic regression, and HMMs to numerous tasks in natural language processing, and built a network model for the same.
- Evaluated business characteristics of over 8 million enterprise records; developed and integrated big data sets using the open-source statistical tool R.
- Created predictive models using machine learning algorithms, and aggregated reports and visualizations to analyze business-owner data sets. Performed quantitative analysis and data mining on them to identify client interaction with our business products, and proposed strategies to increase revenue.
- Worked on Data Infrastructure to perform ETL, requirement analysis on the tool to build data sets for operational analysis.
- Managed Product Operations and improved internal tool analysis by setting requirements, designing key metrics and change management on these metrics.
- Led the product management team in following best practices and identifying opportunities at different levels for continuous improvement of the sales channel through communication with stakeholders.
Environment: Python, SQL, Hadoop, Tableau, R programming, DB2, Windows, Linux, Machine Learning.
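The time series anomaly detection described above can be illustrated with a simplified residual-based detector: forecast each point from a rolling mean and flag residuals that exceed a multiple of the recent residual spread. This is a hand-rolled stand-in for ARIMA residual diagnostics, and the signal is synthetic.

```python
import statistics

def detect_anomalies(series, window=5, threshold=3.0):
    """Flag indices whose residual against a rolling-mean forecast exceeds
    `threshold` times the standard deviation of recent residuals.
    A simplified stand-in for ARIMA residual diagnostics."""
    clean = list(series)   # working copy; flagged points get smoothed over
    residuals = []
    anomalies = []
    for t in range(window, len(series)):
        forecast = statistics.fmean(clean[t - window:t])  # naive one-step forecast
        resid = series[t] - forecast
        if len(residuals) >= window:
            sigma = statistics.pstdev(residuals) or 1e-9
            if abs(resid) > threshold * sigma:
                anomalies.append(t)
                clean[t] = forecast  # replace the outlier so it doesn't skew later forecasts
                continue
        residuals.append(resid)
    return anomalies

# Illustrative usage signal: mild alternation with one injected spike.
signal = [10 + i % 2 for i in range(20)]
signal[12] = 50
spikes = detect_anomalies(signal)
```

Replacing a flagged point with its forecast keeps one spike from contaminating the rolling window and triggering a run of false alarms on the points that follow it.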
Confidential, Santa Clara, CA
Operations Analyst
Responsibilities:
- Data Analytics: Integrated a data set of about 3 million records across multiple clients for better understanding and more efficient analysis of their product portfolios.
- Remodeled dashboards and delineated the company's product family, customer information, and data integrity verification.
- The above project was used as a reference to gain more business in the visualization area (MS SQL, Tableau).
- Evaluated the client's in-house revenue recognition system that captures transactional data. Developed scripts to link revenue system data to Confidential's payment gateway amounts and created meaningful revenue models for Confidential management.
- Audit - Data Analytics: Performed extract, transform, and load (ETL) operations using ACL Analytics 10 as part of analyzing P&L, balance sheet, and trial balance for substantive audit testing of ERP systems (Oracle, SAP).
- Performed detailed data analysis to identify potential financial misstatements due to errors or fraud.
- Received the Outstanding Contributor & Innovative Ideas award for performance in the above areas.
- Designed the database for the entire system, developed the front end for the client's system, and provided solo support on project management, risk assessment, and quality assurance.
- Provided financial analysis as part of an investment mutual fund strategy. Made recommendations for shortlisting the best mutual funds based on company balance sheets and mutual fund reports.
- Created a project management plan, work breakdown structure, and HR plan with a team of 5 for a consulting company.
- Performed risk identification, mitigation, and contingency planning, reducing risk impact by 20%.
Environment: SQL, SAS modeling, SAP Data Services, Windows.
Confidential
Lead Software Engineer
Responsibilities:
- Using proprietary statistical tools, designed quality assurance metrics to deliver reliable and robust source code, cutting product cost by 16%.
- Optimized the activities of team members by assigning the tasks of each module according to their complexity.
- Developed algorithms and implemented features to resolve complex functionality of the product model's interface, using C++ on the front end and SQL Server on the back end to extract APIs.
- Evaluated the modularity of specific module code before delivery to the client.
- Analyzed the requirements and designed the state chart for the feature implementations in view development (human interface).
- Led the project by applying a structured methodology, implemented a change management strategy, and built the tool using Java.
- Created a list of approximately 15 potential clients, with a 22% success rate in setting up initial meetings.
- Communicated with potential clients, resulting in a quantified needs evaluation via a customized program.
- Drove a major change in the company by introducing a mobile ticketing application that reduced the complexity of creating tickets and was a huge success.
- Used a time series approach to predict the stock market, forecasting whether a stock price would go up or down.
- Used collaborative machine learning algorithms and global latitude/longitude values to predict the stock market.
Environment: Python, R, machine learning algorithms, various time series models, SQL.
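The up/down stock forecasting above can be sketched as a toy momentum-based direction classifier; the lookback choice and price series are illustrative assumptions, and the collaborative and geographic features are not reproduced here.

```python
def predict_direction(prices, lookback=3):
    """Toy direction forecast: predict 'up' when the mean return over the
    last `lookback` steps is positive, else 'down'. A simplified stand-in
    for the time series models described above."""
    if len(prices) < lookback + 1:
        raise ValueError("need at least lookback + 1 prices")
    returns = [(prices[i] - prices[i - 1]) / prices[i - 1]
               for i in range(len(prices) - lookback, len(prices))]
    momentum = sum(returns) / len(returns)
    return "up" if momentum > 0 else "down"

# Illustrative price paths.
uptrend = [100, 101, 103, 104, 106]
downtrend = [106, 104, 103, 101, 100]
```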
