Sr. Python Developer Resume

Petaluma, CA

SUMMARY:

  • 7+ years of experience in the analysis, design, development and implementation of various stand-alone and client-server enterprise applications.
  • Experience with the full software development life cycle, architecting scalable platforms, object-oriented programming, database design and agile methodologies.
  • Extensive experience with Django, a high-level Python Web Framework.
  • Experience in Object Oriented Programming (OOP) concepts using Python, Core Java.
  • Extensive experience in designing and developing web-based applications using Python, Django, HTML, CSS, JavaScript, jQuery, XML and JSON.
  • Experience in implementing RESTful web services and MVC design patterns using Django and the Django REST Framework.
  • Good experience in handling errors/exceptions and debugging issues in large-scale applications.
  • Hands on experience working in WAMP (Windows, Apache, MySQL and Python) and LAMP (Linux, Apache, MySQL and Python) Architecture.
  • Experienced in implementing caching for large-scale applications using Memcached and Redis.
  • Expertise in working with relational databases such as MySQL, PostgreSQL and Oracle, with very good knowledge of the NoSQL databases MongoDB and Redis.
  • Proficient in developing complex SQL queries, stored procedures and functions, and in performing DDL and DML operations on the database.
  • Excellent working knowledge of UNIX and Linux shell environments using command-line utilities.
  • Expertise in production support and knowledge of deployment using Jenkins and Ansible.
  • Experience working in both Waterfall and Agile software methodologies.
  • Familiarity with development best practices such as code reviews, Unit Testing, System Integration Testing and User Acceptance Testing (UAT).
  • Hands-on experience working with various version control systems, mostly Git, Subversion (SVN), CVS and Mercurial.
  • Involved in all the phases of Software Development Life Cycle (SDLC) using Project Management tools JIRA and Redmine.
  • Expert-level user of several project management tools, including JIRA and Trello.
  • Well versed with Agile and Test Driven Development methodologies.
  • Followed Python best practices such as the PEP 8 style guide.
  • Performed code reviews and implemented best Python Programming Practices.
  • Experience in writing test scripts, test cases, test specifications and test coverage.
  • 2+ years of hands-on experience with Amazon Web Services (AWS).
  • Possess good interpersonal skills; able to work both independently and as a team player.
  • Willing to take initiative, able to learn quickly, and able to apply new tools and technologies to projects.

TECHNICAL SKILLS:

Programming Languages: Python (2.x, 3.x), Java, Shell Scripting

Python Framework(s): Django, Django REST Framework, Flask, Pyramid, Scrapy, Tornado

Python Libraries: pandas, NumPy, unittest, json, csv, xml, xlrd/xlwt, Selenium, Boto3

Databases & ORMs: MySQL, SQLite3, Django ORM, SQLAlchemy, Oracle

NoSQL Database Systems: MongoDB, Redis

Web Development Languages: HTML, CSS, JavaScript, jQuery, JSON, Node.js

Operating Systems: Linux (Ubuntu, CentOS), Windows® 7, 10, XP

Development Methodologies: Agile Methodology, Scrum Framework, OOP, MVC Architecture

Deployment Tools: Jenkins, Ansible

Version Control: Git, SVN, CVS, Mercurial

Development Tools / IDEs: Eclipse, NetBeans, Vi/Vim, Sublime Text, Komodo Edit, PyCharm, IDLE

Tracking Tools: JIRA, Redmine, Trello, Confluence

Amazon Web Services (AWS): EC2, S3, EBS, Lambda, DynamoDB, Redshift, SNS, CloudWatch

PROFESSIONAL EXPERIENCE:

Sr. Python Developer

Confidential, Petaluma, CA

Responsibilities:

  • Understood the project documentation and translated it into technical requirements.
  • Performed requirement analysis and estimation of project timelines.
  • Participating in Sprint Planning and Releases.
  • Delivered code efficiently based on the principles of Test Driven Development (TDD) and continuous integration, in line with Agile methodology and the Scrum process.
  • Implemented Utility Services for collecting and visualizing the Energy data.
  • Used Pandas Library for statistical analysis.
  • Designed the functionality to read the data from DynamoDB and clean the data and arrange the data into timeframes using Pandas.
  • Used Numpy along with Pandas for computing min, max and mean power voltages.
  • Implemented REST API calls to access the production data.
  • Implemented Lambda functions to perform tasks when subscribed rules matched.
  • Created S3 buckets and added permissions to store and access data files from external scripts.
  • Added support to sync all static/media files to AWS S3 bucket from Django.
  • Developed custom modules to parse XML and JSON files using lxml and simplejson, and load the data into the database.
  • Designed the entire web application using Python, Django, MySQL, MongoDB and Amazon Web Services such as S3, RDS and DynamoDB.
  • Utilized the Elastic Beanstalk service for deploying the Django application on AWS using the EB command line interface tool.
  • Implemented custom configuration files to install system and python dependency packages into EC2 instance.
  • Created CloudWatch alarms to monitor CPU utilization, network and bandwidth usage.
  • Added support to send e-mail alerts when errors occurred in the Django application.
  • Utilized JIRA for bug reports, status updates and monitoring the team's work progress.
  • Integrated Confluence pages into JIRA tool for maintaining the Project planning, Project documentation and meeting notes.
  • Used JIRA for creating and estimating stories, building sprint backlogs and reporting on team progress.
  • Used Jenkins to automate the Deployment process and run Unit Test cases.
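
The timeframe statistics described above might look like the following sketch (the column names, 15-minute window and sample values are assumptions for illustration, not taken from the project):

```python
import pandas as pd

def summarize_power(readings, freq="15min"):
    """Arrange raw voltage readings into timeframes and compute the
    min/max/mean per window (column names are illustrative)."""
    df = pd.DataFrame(readings)
    df["timestamp"] = pd.to_datetime(df["timestamp"])
    df = df.set_index("timestamp").sort_index()
    return df["voltage"].resample(freq).agg(["min", "max", "mean"])

if __name__ == "__main__":
    sample = [
        {"timestamp": "2017-01-01 00:01", "voltage": 229.8},
        {"timestamp": "2017-01-01 00:07", "voltage": 230.4},
        {"timestamp": "2017-01-01 00:16", "voltage": 231.0},
    ]
    print(summarize_power(sample))
```

In a setup like the one above, the raw readings would come from a DynamoDB scan via Boto3 before being handed to pandas.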

Environment: Python 3.5, Linux, Eclipse, Vim, Django 1.9, Boto3, HTML, CSS, JavaScript, jQuery, Bootstrap, Django REST Framework, Agile, Scrum Framework, JIRA, Jenkins, Amazon Web Services, Git, MongoDB, MySQL, pandas, EC2, S3, EBS, Lambda, DynamoDB, Redshift, CloudWatch.

Sr. Python Developer

Confidential, New Jersey

Responsibilities:

  • Gathering requirements specifications and analyzing the requirements.
  • Created entire framework using Python, Scrapy, MySQL and Linux.
  • Implemented and customized Web Scraping Framework using Python’s Scrapy Framework.
  • Utilized the Python libraries Requests, urllib, urlparse, MySQLdb, xlrd, xlwt and json, along with Selenium and the Facebook and Twitter APIs.
  • Developed spiders for collecting metadata from specified websites.
  • Created databases and tables using MySQL, wrote several queries to extract/store data.
  • Implemented functionality to get timeline information for specified doctors from Facebook and Twitter using their Python APIs.
  • Developed spiders using the Python Selenium library to collect metadata.
  • Developed functionality to verify data quality and apply sanity checks on top of the collected data.
  • Developed independent scripts to read data from MySQL and generate Excel-formatted reports using the xlrd, xlwt and csv Python libraries.
  • Followed the PEP 8 coding standard to maintain code quality.
  • Used JIRA tool for creating and estimating stories, building a sprint backlog, reporting on team progress.
  • Utilized JIRA tool for bug tracking.
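
The data-quality step above could be sketched as a small validation helper that a Scrapy item pipeline (or a standalone script) would call before storing a record; the field names here are illustrative assumptions:

```python
def sanity_check(item):
    """Return True when a scraped record passes basic quality checks.
    The required fields below are assumed for illustration."""
    required = ("name", "specialty", "url")
    # every required field must be present and non-empty
    if any(not item.get(field) for field in required):
        return False
    # the source URL must be a plausible web address
    if not item["url"].startswith(("http://", "https://")):
        return False
    return True
```

Records failing the check would be dropped or logged for review rather than loaded into MySQL.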

Environment: Python 2.7, Ubuntu, Vim, Scrapy, Requests, urllib, MySQL, Shell Scripting, json, csv, xlrd, xlwt, Selenium, APIs (Facebook, Twitter), JIRA, Scrum Framework, Agile.

Python Developer

Confidential

Responsibilities:

  • Analyzed the requirements specifications and interacted with the client during requirements gathering.
  • Developed views and templates with Python and Django view controllers and template language to create a user friendly interface.
  • Used Django configuration to manage URLs and application parameters.
  • Created the entire application using Python, Django, MySQL and Windows.
  • Developed presentation layer using HTML, CSS, JavaScript, and jQuery.
  • Used jQuery libraries for all client-side JavaScript manipulations and validations.
  • Utilized the Python libraries gzip, Selenium, unittest, SQLAlchemy and json.
  • Implemented SQLAlchemy, a Python SQL toolkit and ORM that gives complete control over SQL.
  • Implemented functionality to validate the selected or uploaded input file extension (must be .zip).
  • Developed backend functionality to unzip the uploaded input file and read the input files one by one.
  • Utilized existing scripts to parse input lines into scopes (BGP, interface, access-list, route-map) and instantiate a corresponding object class.
  • Implemented new feature to download the output into output.zip file.
  • Created database using MySQL, wrote several queries to extract/store data.
  • Used Selenium Library to automate the testing process with multiple browsers.
  • Deployed the application into Apache using Mercurial Version Control System.
  • Responsible for debugging and troubleshooting the web application.
  • Followed the PEP 8 coding standard to maintain code quality.
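
The upload-validation and unzip steps described above can be sketched with the standard library; the function names are assumptions, not the project's actual API:

```python
import io
import os
import zipfile

def validate_extension(filename):
    """Accept only .zip uploads, as the application requires."""
    return os.path.splitext(filename)[1].lower() == ".zip"

def read_members(zip_bytes):
    """Unzip an uploaded archive in memory and yield (name, text)
    pairs so each input file can be processed one by one."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        for name in archive.namelist():
            yield name, archive.read(name).decode("utf-8")
```

The parsed members would then be fed to the scope-parsing scripts, and the results repackaged into an output.zip for download.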

Environment: Python 2.7, Komodo Edit, Windows 7, Django, Pyramid, Gzip, HTML, CSS, JavaScript, jQuery, Selenium, Mercurial, MySQL, SQLAlchemy, unittest, Django ORM.

Lead Python Developer

Confidential

Responsibilities:

  • Responsible for gathering requirements specifications and for client interaction during requirements gathering.
  • Involved in system analysis, design, development, testing and deployment process.
  • Participated in the complete SDLC process.
  • Implemented and customized Web Scraping Framework using Python’s Scrapy Framework.
  • Created database using MySQL, wrote several queries to extract/store data.
  • Developed functionality to create new table schemas automatically when they do not exist.
  • Designed the framework to store and retrieve scraping URLs from the corresponding URL queue table.
  • Wrote pipeline functionality to store the scraped items into files in JSON format.
  • Implemented functionality to validate scraped items; invalid items are dropped automatically.
  • Developed functionality to send alert e-mails when errors occurred in the framework.
  • Wrote scripts to read data from files and populate MySQL tables.
  • Wrote shell scripts to move files from one location to another and remove empty data files.
  • Developed a new "Dead System" to check the health status of specific assets on the source site and remove assets from the database when they no longer exist.
  • Developed a new REST API using the Django REST Framework to display the records available in the database.
  • Followed the PEP 8 coding standard to maintain code quality.
  • Frequently used tools such as Pyflakes, Pylint and PyChecker to check code quality and find bugs at the code level.
  • Implemented functionality for creating/deleting buckets on Amazon S3.
  • Wrote automation scripts to download/upload data files between the local machine and S3 using Boto, the Python AWS API.
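
A minimal sketch of the item-validation and JSON-storage pipeline described above (the class, field names and file path are assumptions; a real Scrapy pipeline would raise scrapy.exceptions.DropItem where ValueError is used here):

```python
import json

class JsonLinesPipeline:
    """Scrapy-style pipeline: drop items missing required fields,
    append valid items to a JSON-lines file."""

    def __init__(self, path="items.jl"):
        self.path = path

    def process_item(self, item, spider=None):
        if not item.get("url"):
            # Scrapy would raise DropItem here to discard the record
            raise ValueError("invalid item dropped: missing url")
        with open(self.path, "a") as fh:
            fh.write(json.dumps(item) + "\n")
        return item
```

A loader script would later read these JSON-lines files and populate the MySQL tables.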

Environment: Python 2.7, Vim, Ubuntu, Scrapy, Requests, urllib, MySQL, Shell Scripting, json, CVS, Django REST Framework, Django ORM, pip, PyChecker, Pylint, AWS S3, Boto.

Lead Python Developer

Confidential

Responsibilities:

  • Implemented Login through various social networking sites - Facebook, Twitter, Netflix and Google+.
  • Fetched user profile info, shares, posts, likes, friends and followers using the social networking sites' APIs.
  • Implemented algorithms to get wiki merges for respective post/like/shares.
  • Used the Selenium library to write a fully functioning test automation process that simulated submitting different web requests from multiple browsers to the web application.
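
The multi-browser automation above might be structured like this sketch, which takes driver factories (e.g. webdriver.Firefox, webdriver.Chrome) so the same check runs in each browser; the URL and page-title check are assumptions:

```python
def run_login_checks(driver_factories, base_url="https://example.com"):
    """Run the same page check in several browsers.

    driver_factories maps a browser name to a zero-argument callable
    returning a Selenium-style driver (e.g. webdriver.Firefox).
    """
    results = {}
    for name, factory in driver_factories.items():
        driver = factory()
        try:
            driver.get(base_url + "/login")          # submit the request
            results[name] = "Login" in driver.title  # crude success check
        finally:
            driver.quit()
    return results
```

Because the drivers are injected, the same harness covers Firefox and Chrome, and can be unit-tested with a fake driver.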

Environment: Python 2.7, Vim, Django, Django ORM, OAuth2, Requests, urllib, APIs (Facebook, Twitter, Netflix), Selenium, CVS, pip.

Lead Python Developer

Confidential

Responsibilities:

  • Implemented backend functionality to add, modify and delete tracking keywords.
  • Supported asynchronous keyword tracking using Tornado.
  • Gathered large amounts of data from the Internet for the given keywords.
  • Built Pipelines to store the data in Redis Server.
  • Applied sentiment and Klout scores to enrich the data.
  • Wrote functionality to store the enriched data in Elasticsearch.
  • Implemented automated scripts to back up old records using the mongoexport command and transfer the backup files to a backup machine using ftplib.
  • Maintained multiple copies of data on different database servers using MongoDB replication.
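
The backup step above could be sketched as follows; the database, collection, paths and host names are invented for illustration, while the mongoexport flags are standard:

```python
import subprocess
from ftplib import FTP

def mongoexport_cmd(db, collection, out_path, query=None):
    """Build the mongoexport command used to dump old records."""
    cmd = ["mongoexport", "--db", db, "--collection", collection,
           "--out", out_path]
    if query:
        cmd += ["--query", query]  # e.g. restrict to records older than N days
    return cmd

def backup_and_ship(db, collection, out_path, host, user, password):
    """Dump a collection and push the file to the backup machine."""
    subprocess.check_call(mongoexport_cmd(db, collection, out_path))
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(out_path, "rb") as fh:
            ftp.storbinary("STOR " + out_path.rsplit("/", 1)[-1], fh)
```

A cron job would typically call backup_and_ship on a schedule, after which the exported records can be removed from the live database.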

Environment: Python 2.7, Vim, Django, Django ORM, Scrapy, Requests, urllib, APIs (Facebook, Twitter, Klout), MongoDB, Redis, Elasticsearch, Tornado, ftplib, GitHub, pip.

Lead Python Developer

Confidential

Responsibilities:

  • Lead developer on a small team overseeing all levels of support, including maintenance, improvements and new development.
  • Daily tasks included both front-end and backend development.
  • Implemented a stats table that displays the pass/fail statuses of the test cases.
  • Implemented a feature to download the test-case report in JSON/Excel format.
  • Responsible for deploying source code to production after adding new features and verifying performance.
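
The report download might be sketched as a serialization helper like this; the field names are assumptions, and a CSV branch stands in here for the project's Excel (xlwt) output:

```python
import json

def report_payload(stats, fmt="json"):
    """Serialize pass/fail test-case stats for download.
    The real view would stream this as an attachment response."""
    if fmt == "json":
        return json.dumps({"results": stats}, indent=2)
    if fmt == "csv":  # the project's Excel branch used xlwt instead
        lines = ["test_case,status"]
        lines += ["%s,%s" % (s["name"], s["status"]) for s in stats]
        return "\n".join(lines)
    raise ValueError("unsupported format: %s" % fmt)
```
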

Environment: Python 2.7, Vim, Django, Django ORM, MySQL, Requests, urllib, lxml, HTML, CSS, JavaScript, jQuery, CVS, pip.

Python Developer

Confidential

Responsibilities:

  • Email formats vary from agency to agency for travel mails and product-order mails. First checked how many different email patterns were in use, then maintained a config file mapping each email format to the template filename that parses that type of email content.
  • Wrote scripts to save email content as HTML files for easier parsing.
  • Wrote scripts to identify the email format, select the template filename from the config file, run the template and write the parsed email content into data files.
  • Implemented loader functionality that loads the parsed data files into MySQL tables.
  • The email-parsing template code was developed by the crawling team.
  • Wrote an automated script that checks whether emails in a new format have appeared and reports them by mail; the crawling team is informed, and once the template code is developed, the new format's details are added to the config file so those emails also get parsed.
  • Based on the city names between which the person is traveling, retrieved the latitude/longitude details using internal tools.
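
The config-driven template selection described above could be sketched like this; the patterns and template filenames are invented for illustration:

```python
import re

# Illustrative format-to-template mapping; the real config file mapped
# each known email pattern to the template that parses that format.
EMAIL_TEMPLATES = {
    r"Your flight itinerary": "travel_agency_a.py",
    r"Order confirmation #\d+": "product_store_b.py",
}

def pick_template(subject):
    """Return the parser template for a known email format, or None
    so an alert mail can be sent to the crawling team."""
    for pattern, template in EMAIL_TEMPLATES.items():
        if re.search(pattern, subject):
            return template
    return None
```

When pick_template returns None, the alert script flags the unknown format; after a new template is written, one config entry makes it parseable.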

Environment: Python 2.7, Vim, Beautiful Soup, Requests, MySQL, CVS, Redmine.

Python Developer

Confidential

Responsibilities:

  • Found websites related to media.
  • Analyzed each website against the client's requirements.
  • Wrote code templates and implemented them with different methods to get the metadata.
  • Pumped records into the database and verified the stored records.
  • Scheduled the scripts to run regularly.

Environment: Python 2.7, Urllib, lxml, Requests, MySQL, Vim, CVS, pip.
