Sr. Python Developer Resume
San Jose, CA
SUMMARY:
- 9 years of experience in the full project lifecycle: design, development, testing, and deployment
- Expert-level experience in Python, building web crawlers and REST APIs
- Intermediate-level experience in Java, C, and C++, including machine learning libraries
- Intermediate experience with cloud technologies (AWS and Azure)
- High-intermediate experience with FluentD, building data pipelines to support real-time analytics
- Strong analytical and problem-solving skills; able to handle multiple tasks in a fast-paced environment, both independently and in a team
TECHNICAL SKILLS:
Programming Languages: Python 2.7, Java, C/C++
Data Mining Libraries: scikit-learn, Weka, Beautiful Soup, NumPy, SciPy, requests, PyMySQL
Search: SOLR
Big Data/Cloud: FluentD, Hadoop, Redis, Aurora (MySQL), Kinesis, AWS Lambda
Version control: Git, SVN
WORK EXPERIENCE:
Confidential, San Jose, CA
Sr. Python Developer
Responsibilities:
- Senior member of the data mining team; built a domain classification system and pipelines to store user behavioral data
- Built web crawling infrastructure combining multiprocessing and multithreading in Python, capable of crawling and classifying over 1 million URLs per day
- Achieved a turnaround time of approx. 2 seconds per URL
- Built a machine learning model to classify crawled data and stored the results in Redis
- Wrote Python scripts to generate daily stats and sanitize Redis by removing dead domains
- Wrote scripts to submit specific keywords to Microsoft Bing and classify the resulting domains, and set up cron jobs to run this periodically
- Wrote scripts to automate testing of the entire crawling infrastructure and maintained the codebase in Git
- Built an end-to-end pipeline using FluentD to process and store user logs from multiple products in Elasticsearch and Splunk in real time
- Installed and configured FluentD on production systems in high-availability mode
- Worked on modifying existing FluentD plugins to write to Splunk
- Configured FluentD to write to multiple data stores based on the type of data, and triggered AWS Lambda jobs from FluentD
- Set up and developed AWS Lambda jobs in Python to generate alerts on suspicious user activity
- Wrote an AWS Lambda job that consumed user logs, extracted relevant information, and inserted it into Aurora (MySQL)
- Used AWS CloudWatch to monitor and store logging information
- The Lambda job created a case for each suspicious activity and assigned it to a student safety analyst using round-robin logic
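The crawler design above can be sketched roughly as follows: threads overlap the I/O waits of fetching within each worker process, while multiple processes spread the CPU-bound classification across cores. `fetch` and `classify` are hypothetical placeholders standing in for the real HTTP client and ML model:

```python
# Minimal sketch (not the production code): multiprocessing for CPU-bound
# classification, multithreading for I/O-bound fetching.
from concurrent.futures import ThreadPoolExecutor
from multiprocessing import Pool

def fetch(url):
    # Placeholder; the real crawler fetched pages over HTTP with a timeout.
    return f"<html>content of {url}</html>"

def classify(html):
    # Placeholder; the real system applied a trained ML model to the page.
    return "news" if "news" in html else "other"

def crawl_batch(urls, threads=8):
    # Threads in one process overlap network waits across many URLs.
    with ThreadPoolExecutor(max_workers=threads) as pool:
        pages = list(pool.map(fetch, urls))
    return [(url, classify(page)) for url, page in zip(urls, pages)]

def crawl(urls, processes=4, batch=100):
    # Split the URL list into batches and classify them across processes.
    batches = [urls[i:i + batch] for i in range(0, len(urls), batch)]
    with Pool(processes) as pool:
        results = pool.map(crawl_batch, batches)
    return [item for chunk in results for item in chunk]
```

At roughly 2 seconds of turnaround per URL, this kind of fan-out is what makes a 1M-URLs-per-day throughput plausible on a modest number of machines.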
Confidential
Python Developer
Responsibilities:
- Worked in the search and data mining team, designing, developing, and testing RESTful APIs using Apache and Python
- Set up and configured FluentD in the production environment to store user history for RediffMail in Hypertable
- Built a Newsletter API system serving third-party advertisements to RediffMail users based on their history, delivering up to 3 million ads per day and improving revenue by 15%
- Wrote the algorithm to compute user interest in real time and match it with third-party advertisements
- Set up and configured FluentD in the production environment to store user history for RediffShopping in Hypertable
- Built a self-updating caching layer (Redis) on top of Solr for the Confidential Shopping product catalog
- Wrote an API showing recently viewed products for Confidential Shopping by fetching user history from Hypertable and product metadata from Redis
- Wrote an API for generating trending books for Confidential Books
- Wrote a script to download pan-India book sales data from a third-party source and process it to generate trending scores for books
- Wrote an API for generating trending queries for Confidential Shopping
- Wrote MapReduce jobs to generate day-wise query counts, assigning scores via KL divergence
- Set up and configured FluentD in the production environment to store user history for RediffNews in Hypertable
- Wrote an API to compute user interest from history and recommend news to users based on that interest
- Wrote MapReduce jobs to process user logs and generate association scores between products
- Built an API to show "also viewed" products using the association scores
- Wrote an API for vendors on Confidential.com to search products in a market catalog acquired by Confidential
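The KL-divergence scoring mentioned above can be illustrated with a small sketch (hypothetical function name; the real version ran as MapReduce over day-wise counts): each query is scored by its contribution to the KL divergence between today's query distribution and the historical background distribution, so queries spiking above their baseline score highest.

```python
import math

def trending_scores(today_counts, background_counts, eps=1e-9):
    """Score each query q by its per-term KL contribution:
    p_today(q) * log(p_today(q) / p_background(q)).
    Positive scores mean q is over-represented today."""
    t_total = sum(today_counts.values())
    b_total = sum(background_counts.values())
    scores = {}
    for q, count in today_counts.items():
        p = count / t_total
        b = background_counts.get(q, 0) / b_total
        if b == 0:
            b = eps  # smooth unseen queries so the log stays finite
        scores[q] = p * math.log(p / b)
    return scores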
Confidential
Software Engineer
Responsibilities:
- Worked on developing clicker an android app for classrooms.
- Worked on developing Intulearn, an android app for teaching algorithms and computer science concepts through
- Animations
- Performed design, development, Unit testing and Integration Testing using the spec.
- Performed research to explore and identify new technological platforms.
- Resolved ongoing problems and accurately documented progress of a project
Confidential
Software Engineer
Responsibilities:
- Post-acquisition of Wachovia by Confidential, designed a data migration solution to merge their product lines with those of Wachovia
- Solution was designed to seamlessly integrate the businesses of Confidential and Wachovia
- Comprehensive data migration error handling and recovery was implemented (in Informatica Power Centre)
- Wrote ETL mappings using Informatica Power Centre
- Worked on the Forensic Toolkit Product for windows platform for US based client or data extraction and transmission over a secured network
- Built the entire solution on windows platform using VC++
- Used SVN for version control