- Over 6 years of experience as a Web Application Developer and Data Scientist using Python.
- Hands-on experience in Python web application development using Django, Flask, and the Serverless framework.
- Experience architecting data warehouses and Extraction, Transformation, and Loading (ETL) processes that move data from various sources into the data warehouse for complex businesses.
- Good experience with Amazon EC2 and Simple Storage Service (S3).
- Expertise in working with server-side technologies including databases, RESTful APIs, and the MVC design pattern.
- Experience in writing subqueries, stored procedures, triggers, cursors, and functions on MySQL, Dremel SQL, and NoSQL (MongoDB, Cassandra) databases.
- Good knowledge of Bash shell scripts to automate routine activities.
- Experience in data analysis and machine learning techniques using Python (Pandas, scikit-learn, NumPy, SciPy, Keras, TensorFlow), Tableau, R, SAS, and MS Excel (Macros, VLOOKUP, Pivot Tables, Solver, Descriptive Statistics).
- Good understanding of machine learning algorithms and techniques: customer churn prediction, predictive analytics, natural language processing and text mining, Naive Bayes classifiers, Gaussian distributions, KNN, sentiment analysis, A/B testing and null-hypothesis testing, Principal Component Analysis, linear and logistic regression, clustering and cluster analysis, time series analysis, statistical modeling and forecasting, simulations and optimization, factor analysis, discriminant analysis, neural networks, and decision trees.
- Result-oriented team player with strong problem-solving skills and the ability to grasp new concepts quickly.
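The A/B testing and null-hypothesis work listed above can be sketched as a two-proportion z-test; this is a minimal standard-library illustration, and the conversion counts below are made-up example numbers, not results from any actual project.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative split: variant B converts 120/1000 vs. variant A's 100/1000.
z, p = two_proportion_z_test(100, 1000, 120, 1000)
```

With these illustrative numbers the test fails to reject the null hypothesis at the usual 0.05 level, which is exactly the kind of call an A/B analysis has to make before declaring a winner.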
Confidential, Sunnyvale, CA
- Involved in enhancing the iDAA framework for Data Quality Management (DQM), which supports data cleansing, processing, and metadata-quality maintenance for multiple applications.
- Created an API and framework for data ingestion and pipelines between various database platforms using Python and Flask.
- Wrote several Python scripts and Java programs to verify data recency, completeness, reconciliation, and accuracy by connecting to various platforms and environments.
- Used PySpark and Kafka topics to extract files from HDFS and insert the data into a Druid database, with Amazon S3 as the middle layer storing event and workflow details.
- Reviewed and corrected Scala code, and converted some modules into Python for various requirements.
- Created tables and keyspaces in Cassandra to store event audit data for the flash pipeline.
- Involved in the complete ETL process for the data flow from Hadoop to Oracle and PostgreSQL.
- Wrote Bash shell scripts to run Python scripts on the server, and executed the Bash files from Python.
- Created a wrapper script to run the iDAA codebase on different clusters.
- Analyzed data and created views on the Oracle database for Tableau requirements.
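The executing-Bash-from-Python pattern mentioned above can be sketched with the standard-library `subprocess` module. The `run_bash` helper and the stand-in wrapper script are hypothetical names for illustration, not the actual iDAA scripts.

```python
import os
import subprocess
import tempfile

def run_bash(script_path, *args):
    """Run a Bash script with arguments; return stdout, raise on non-zero exit."""
    result = subprocess.run(
        ["bash", script_path, *args],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# Demo: write a stand-in wrapper script to a temp file and execute it.
with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
    f.write('echo "cluster-check $1"\n')
    script = f.name
out = run_bash(script, "ok")
os.unlink(script)
```

Using `check=True` turns a failing script into a Python exception, which keeps error handling in one place when a wrapper has to target several clusters.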
Confidential, San Jose, CA
- Collaborated with stakeholders in the Local Inventory Ads (LIA) team to understand business analytics requirements, and designed and developed tools to analyze, monitor, and visualize key business performance metrics.
- Steered dashboard requirements gathering and created a prototype to give a clear picture of the assortment on Confidential Express, helping to capitalize on lost opportunities.
- Analyzed and verified the data quality of inventory, product, and store-related information from the LIA feed provider through predictive modeling and analysis.
- Worked with partner solution teams to develop tools that address their technological and business needs and identify opportunities to grow Confidential's partner business.
- Investigated and troubleshot issues/bugs and provided technical support for LIA operations.
Confidential, East Palo Alto, CA
- Integrated third-party APIs (Confidential, Facebook, Stripe, PayPal, Confidential's Natural Language API) for the Zebo bot platform and app.
- Created visualizations using Tableau and advanced Excel; designed and developed a data management system using PostgreSQL.
- Collaborated to extract structured and unstructured data from various systems and performed exploratory data analysis (EDA) for statistical modeling.
- Conducted data cleansing, variable identification, univariate analysis, outlier detection and treatment, missing-value treatment, variable transformation, and creation of analytical datasets for further analysis.
- Visualized descriptive analytics KPIs through charts and dashboards in Tableau for customer churn and brand forecasting for the Marketing and Sales teams.
- Developed various Python code to automate model results.
- Wrote object-oriented Python code for manufacturing quality, monitoring, logging, debugging, and code optimization.
- Played a leading role in database migration projects from Oracle to MongoDB.
- Installed and maintained Tomcat and Apache HTTP web servers.
- Worked on automation, setup, and administration of build and deployment tools such as Jenkins.
- Used RESTful APIs with JSON to extract network traffic and memory performance information.
- Created a database using MySQL and wrote several queries and Django APIs to extract data from it.
- Built a relevant business story out of every model and presented it in an accessible form.
- Developed business understanding of the drivers and their relationships with the category and divisions, and built business sense into every forecast.
- Involved in the design, development, and testing phases of applications using Agile methodology; experienced in building reusable code and libraries for future use.
- Designed a web application in Python on the Django web framework to make it extensible and flexible, and implemented Python code to retrieve and manipulate data.
- Used MVC frameworks to build modular and maintainable web applications.
- Created and executed various MySQL database queries from Python using the Python MySQL Connector and MySQLdb packages.
- Maintained and improved data security; responsible for security standards implementation and data protection.
- Used Python and Django for creating graphics, XML document processing, data exchange, and business-logic implementation between servers.
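The MySQL query work above follows Python's DB-API 2.0 pattern of parameterized `execute`/`fetchall` calls. As a runnable sketch that assumes no MySQL server, the standard-library `sqlite3` module stands in here; with `mysql.connector` the code shape is the same, though the placeholder style is `%s` rather than `?`, and the table and values are illustrative.

```python
import sqlite3

# sqlite3 stands in for a MySQL connection so the sketch runs anywhere;
# mysql.connector exposes the same cursor/execute/fetchall DB-API surface.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, name TEXT)")

# Parameterized inserts: the driver escapes values, avoiding SQL injection.
cur.executemany("INSERT INTO events (name) VALUES (?)",
                [("ingest",), ("reconcile",)])
conn.commit()

cur.execute("SELECT name FROM events WHERE name = ?", ("ingest",))
rows = [r[0] for r in cur.fetchall()]
conn.close()
```

Keeping every query parameterized, rather than string-formatted, is the main discipline the DB-API encourages regardless of which backend driver is used.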