To become the best product manager by shipping end-to-end, data-driven products and enhancing my product/program management skills.
- Product manager with strong leadership skills and 5+ years of industry experience across data science, big data, software engineering, and data analytics, solving challenging business problems.
- Hands-on experience leveraging machine learning, SQL, Python, data mining, and text mining to uncover business insights and understand consumer behavior, along with Python scripting and automating Tableau dashboards.
- Full-stack Python developer with a strong attitude and work ethic for creating business value.
- AWS Certified Developer.
- Strong communication and leadership skills for working with cross-functional teams; a vivid storyteller who uses data to surface business insights for growth and strategy development. Product manager operating at the intersection of business, design, data, and technology.
- Passionate about learning and implementing new technologies in software and business development.
- Highly proficient in English.
- 5+ years of industry experience as a data scientist.
- Excellent product management and digital analytics skills, with a strong inclination toward building data-driven products.
- Full-stack Python web developer using frameworks such as Flask and Django.
- Excellent understanding of machine learning algorithms such as naive Bayes, decision trees, logistic regression, linear regression, clustering, SVMs, k-nearest neighbors, and random forests.
TECHNICAL SKILLS AND TALENTS:
Data Science / Analytical: Statistics, advanced Excel, SQL, Python, pandas, NLP, NLTK, text mining, chatbots, machine learning, deep learning, Tableau, AWS, IPython, data structures, spaCy, TensorFlow, Keras, Flask, Spark.
Product management: Google Analytics, Tableau analytics, digital analytics, marketing analytics, product analytics, business & data analytics, soft skills, project management, Lean Six Sigma Green Belt, UX/UI, advanced MS Excel, JIRA, MS Word, MS PowerPoint, Google Docs, Google Sheets, Scrum Master, Agile and waterfall methodologies.
Machine Learning / Deep Learning: Supervised learning, unsupervised learning, random forests, linear regression, logistic regression, natural language processing, support vector machines, naive Bayes classifiers, ensemble methods, data wrangling, predictive models.
Confidential, San Jose, CA
- Data preprocessing and feature exploration across multiple datasets; building chatbots using NLTK and NLP.
- Used Python libraries such as scikit-learn, statsmodels, and matplotlib for pre-modeling steps: handling missing values and variable types, outlier detection, multicollinearity, interaction terms, and visualizing variable distributions.
- Built chatbots from scratch using natural language processing and scikit-learn.
- Data visualization by creating Tableau dashboards.
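A minimal sketch of the pre-modeling steps described above (missing-value imputation and outlier detection), using pandas with hypothetical column names not taken from the resume:

```python
import pandas as pd
import numpy as np

# Hypothetical dataset: column names and values are illustrative only.
df = pd.DataFrame({
    "sessions": [12, 15, np.nan, 9, 240, 11, 14],
    "segment": ["a", "b", "a", None, "b", "a", "b"],
})

# Impute missing values: median for numeric, mode for categorical.
df["sessions"] = df["sessions"].fillna(df["sessions"].median())
df["segment"] = df["segment"].fillna(df["segment"].mode()[0])

# Flag outliers with the 1.5 * IQR rule.
q1, q3 = df["sessions"].quantile([0.25, 0.75])
iqr = q3 - q1
df["is_outlier"] = (df["sessions"] < q1 - 1.5 * iqr) | (df["sessions"] > q3 + 1.5 * iqr)
print(df["is_outlier"].sum())  # the 240-session row is flagged
```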
Confidential, Santa Clara, CA
- Data collection with a thorough understanding of SQL, Google Cloud (BigQuery), Omniture hit data, clickstream data, and Athena.
- Conducted data cleansing, manipulation, analysis, and validation using Python and pandas.
- Analyzed data using different attribution models.
- Extensive Python programming with libraries such as pandas, NumPy, matplotlib, scikit-learn, and SciPy; ad-hoc analysis of site performance.
- Automated Tableau dashboards; consumer/customer segmentation; AWS; Python scripting; SEO, SEM, and marketing channels; Facebook retargeting; segmentation analysis; text mining; regular expressions.
- Extensive use of Python to develop a pattern recognition tool to improve the ranking/CTR of Confidential on Google; developed models and scripts in Python to be tested against validation datasets.
- Used regular expressions and NLP text mining to create and analyze patterns.
- Used IPython (Jupyter) notebooks to develop the pattern recognition model.
- Developed the idea, strategy, code, and production deployment of the tool.
- Created Tableau dashboards for executives depicting ranks, CTR, impressions, phrases, and search patterns.
- The pattern recognition tool is used every day to track the real-time performance of the patterns users search for on Google organic search.
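The actual pattern recognition tool is not described in detail; one plausible sketch of the regex-based pattern mining mentioned above, with entirely hypothetical search phrases, is:

```python
import re
from collections import Counter

# Hypothetical search phrases; the real queries are not in the resume.
queries = [
    "homes for sale in san jose ca",
    "condos for rent in austin tx",
    "homes for sale in denver co",
]

# Generalize each query into a template by abstracting the location tail.
pattern = re.compile(r"^(?P<item>\w+) for (?P<mode>sale|rent) in (?P<loc>.+)$")
templates = Counter()
for q in queries:
    m = pattern.match(q)
    if m:
        templates[f"{m['item']} for {m['mode']} in <location>"] += 1

print(templates.most_common(1))  # [('homes for sale in <location>', 2)]
```

Counting how often each template occurs is one simple way to surface the search patterns that dashboards like the ones above could then track.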
- Formulated the logic and algorithm to find the estimated number of listings, called the “Magic Number,” for each city and state to improve median price prediction.
- Successfully applied the concept of a moving average in this model.
- Generated 52 magic numbers, one per city/state combination, used every month as the minimum number of listings on Confidential.
- Performed data analysis in IPython notebooks.
- Used Google Cloud Platform services such as BigQuery to write advanced SQL.
- Used advanced window functions and rank functions in the SQL code.
- Made the graphs for each city/state combination smoother, so users can see the monthly price trend.
- Performed data exploration and cleansing to remove outliers and duplicate values.
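The exact "Magic Number" formula is not given, so the sketch below only assumes it is derived from a trailing moving average of monthly listing counts per city/state; the data and window size are hypothetical:

```python
# Illustrative sketch: assumes the magic number is the latest value of a
# trailing moving average over monthly listing counts (an assumption).
def moving_average(counts, window=3):
    """Trailing moving average over the last `window` values."""
    out = []
    for i in range(len(counts)):
        lo = max(0, i - window + 1)
        out.append(sum(counts[lo:i + 1]) / (i + 1 - lo))
    return out

monthly_listings = [120, 110, 130, 150, 140, 160]  # hypothetical data
smoothed = moving_average(monthly_listings)
magic_number = round(smoothed[-1])  # latest smoothed value as the monthly floor
print(magic_number)  # 150
```

Smoothing with a moving average is also what makes the per-city/state graphs less jagged month to month, as noted above.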
- Successfully completed a project based on Confidential with more than 300 variables to create a multi-dimensional view of Confidential users and derive insights and business recommendations.
- Applied various clustering techniques to segment consumers based on behavior and patterns.
- Ran machine learning algorithms on the dataset for each iteration.
- Developed a strategy to determine the minimum number of clusters required for a given dataset.
- Used libraries such as NumPy, scikit-learn, pandas, matplotlib, and Plotly for data exploration.
- Used advanced Excel techniques for data analysis and conditional formatting.
- Data visualization using advanced Excel and IPython (Jupyter) notebooks.
- Delivered the project across teams including product, user research, software engineering, and marketing.
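The resume does not say how the cluster count was chosen; one standard approach (not necessarily the one used) is to pick k by silhouette score with scikit-learn, shown here on synthetic data:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Synthetic consumer features around three behavioral centers (illustrative).
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in (0, 3, 6)])

# Choose k by silhouette score -- one common way to set a cluster count.
scores = {}
for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = silhouette_score(X, labels)

best_k = max(scores, key=scores.get)
print(best_k)  # 3 for this synthetic data
```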
- Automated roughly 30-35 graphs used every week at Confidential to view top metrics and inform business decisions; hands-on experience with Tableau for data visualization.
- Created the automation flow and wrote the scripts in Python and SQL.
- Developed the idea, strategy, and code to automate graphs on Tableau Server for daily analysis.
- Used AWS Athena to create tables with metadata.
- Developed the flow of files from the database to S3 buckets to AWS Athena tables.
- Self-taught Tableau Desktop and Tableau Server; created graphs that are 100% automated.
- Developed backup queries in AWS Athena and Google BigQuery.
- Performed Bash commands, shell scripting, Python coding, and Tableau Server scheduling.
- The product is successfully used by executives to make daily business decisions.
- Segmented users based on city and state searched and on distance, to understand the potential of remote buyers.
- Wrote SQL scripts to calculate distance from longitude and latitude, plus data validations.
- Used tools such as IPython notebooks, AWS Athena, and Google BigQuery for data gathering.
- Developed presentations in PowerPoint and Tableau and delivered them to product and marketing teams.
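The original distance calculation was done in SQL; a Python equivalent of the same idea is the haversine formula for great-circle distance between two latitude/longitude points (coordinates below are approximate and illustrative):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/long points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))  # 3959 = Earth's mean radius in miles

# San Jose to Santa Clara (approximate coordinates).
d = haversine_miles(37.3382, -121.8863, 37.3541, -121.9552)
print(round(d, 1))  # roughly 4 miles
```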
- Wrote advanced, analytically minded SQL scripts for Confidential.
- Used relational database management systems such as Microsoft SQL Server and Redshift.
- Collaborated cross-functionally with product and marketing teams to understand customer requirements.
- Used Google Cloud Platform services such as BigQuery to write scripts.
- Incorporated advanced queries using datetime functions, window functions, and analytic functions.
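As a small illustration of the window/rank functions mentioned above, the sketch below runs a `RANK() OVER (PARTITION BY ...)` query through Python's built-in sqlite3 (standing in for BigQuery or SQL Server; requires the SQLite ≥ 3.25 bundled with Python 3.7+). Table and column names are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE listings (city TEXT, price REAL)")
con.executemany("INSERT INTO listings VALUES (?, ?)",
                [("san jose", 900), ("san jose", 750), ("austin", 400), ("austin", 550)])

# Rank listings by price within each city -- the same shape as the rank queries above.
rows = con.execute("""
    SELECT city, price,
           RANK() OVER (PARTITION BY city ORDER BY price DESC) AS price_rank
    FROM listings
    ORDER BY city, price_rank
""").fetchall()
print(rows)
```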
Data Science Engineer
Confidential, Brisbane, CA
- Conducted data mining, data validation, and statistical modeling on large datasets.
- Extensive use of SQL, R, and Python, along with Tableau and advanced Excel.
- Developed presentations and used Lean Six Sigma processes to increase efficiency by 80%.
- Worked on a wide variety of data science, machine learning, and text mining projects.
- Developed a time series model to identify when a warehouse member has moved across multiple rooms.
- Worked with Keras and TensorFlow (LSTM, Dense, and Sequential layers) in IPython (Jupyter) notebooks.
- The data consisted of time series of varying lengths, so heavy preprocessing was required.
- Built training, test, and validation datasets.
- Built an LSTM (long short-term memory) network.
- Monitored validation accuracy using scikit-learn's accuracy score.
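The preprocessing details are not spelled out; a common step for variable-length series feeding an LSTM is padding/truncating to a fixed length (what Keras's `pad_sequences` does), sketched here in plain NumPy with hypothetical sensor traces:

```python
import numpy as np

def pad_sequences(seqs, maxlen, value=0.0):
    """Left-truncate / right-pad variable-length series to a fixed length,
    the (batch, timesteps) shape an LSTM layer expects."""
    out = np.full((len(seqs), maxlen), value, dtype=float)
    for i, s in enumerate(seqs):
        s = s[-maxlen:]              # keep only the most recent steps
        out[i, :len(s)] = s
    return out

# Hypothetical sensor traces of different lengths.
traces = [[1, 2, 3], [4, 5], [6, 7, 8, 9, 10]]
batch = pad_sequences(traces, maxlen=4)
print(batch.shape)  # (3, 4)
```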
- Performed sentiment analysis on tweets to understand what is being said about the company.
- Framed it as a binary classification problem, with sentiment labeled either 1 or 0.
- Used libraries such as NumPy, scikit-learn, SciPy, and NLTK.
- Used methods such as Keras with TensorFlow, logistic regression, MLPs, CNNs, LSTMs, and RNNs.
- Used the xgboost library and the Anaconda distribution of Python.
- Preprocessed the dataset and generated statistics in pickle files showing the frequency distributions of unigrams and bigrams.
- Trained naive Bayes, maximum entropy, decision tree, random forest, XGBoost, SVM, RNN, and CNN models.
- Used modules such as keras, utils, Sequential, Dense, Embedding, and ModelCheckpoint.
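A stdlib-only sketch of the unigram/bigram frequency step mentioned above, persisted to a pickle file as described; the tweets and filename are hypothetical:

```python
import pickle
from collections import Counter

# Hypothetical preprocessed tweets (lowercased, punctuation stripped).
tweets = ["great carrier service", "slow carrier service", "great support"]

unigrams, bigrams = Counter(), Counter()
for tweet in tweets:
    tokens = tweet.split()
    unigrams.update(tokens)
    bigrams.update(zip(tokens, tokens[1:]))

# Persist the frequency distributions, as in the resume's pickle files.
with open("freqs.pkl", "wb") as f:
    pickle.dump({"unigrams": unigrams, "bigrams": bigrams}, f)

print(unigrams.most_common(2))  # [('great', 2), ('carrier', 2)]
```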
- Predicted new clients based on historical usage patterns in relation to time, climate, and other data.
- Companies use Expeditors' carrier service to transport goods from one location to another.
- Developed hypotheses relating to hourly trends, daily trends, rain, temperature, pollution, time, and traffic.
- Explored the dataset in R to understand it.
- Imported the training and test datasets and combined them to understand the distributions of the independent variables together.
- Examined the distributions of the numerical variables and generated frequency tables for them.
- Performed hypothesis testing and multivariate analysis for the hypotheses generated.
- Performed feature engineering to improve the model's predictive power.
- Ran random forest, decision tree, and conditional inference algorithms.
- Collected data for more than 1,500 partners across 10 Expeditors locations.
- Developed hypotheses around company type, company density, marketing, location, customer behavior, and competitors.
- Performed data exploration to gather inferences about the data.
- Performed feature engineering and created training and test datasets.
- Used describe() to generate summary statistics for the numerical variables.
- Cleaned the data by treating missing values and outliers; imputed missing values and performed feature engineering.
- Performed numerical and one-hot encoding of categorical variables.
- Created linear regression, ridge regression, decision tree, and random forest models.
- Calculated RMSE and mean cross-validation error.
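The two metrics named above can be sketched as follows; `fit_predict` is a hypothetical stand-in for whichever model (linear, ridge, tree, forest) is being evaluated:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mean_cv_rmse(X, y, fit_predict, k=5):
    """Mean RMSE over k folds; fit_predict(X_tr, y_tr, X_te) returns predictions."""
    folds = np.array_split(np.arange(len(y)), k)
    errs = []
    for i, test_idx in enumerate(folds):
        train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
        preds = fit_predict(X[train_idx], y[train_idx], X[test_idx])
        errs.append(rmse(y[test_idx], preds))
    return float(np.mean(errs))

print(rmse([3, 5], [1, 5]))  # sqrt(mean([4, 0])) ~ 1.414
```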