
Sr Python Developer Resume


Seattle, WA

SUMMARY

  • 6 years of IT experience in Analysis, Design, Development, Implementation and Testing of various stand-alone and client-server architecture-based enterprise application software in Python across various domains.
  • Expertise in implementing Object-Oriented technologies, web-based client-server architecture, service-oriented architecture and Object Relational Mappings (ORM).
  • Experience in AWS Cloud services such as EC2, S3, EBS, VPC, ELB, Route53, CloudWatch, Security Groups, CloudTrail, IAM, CloudFront, Snowball, RDS and Glacier.
  • Experienced in developing web-based applications using Python, CSS, HTML, JavaScript, AngularJS and jQuery.
  • Skilled in Python and in adopting new tools and technical developments (libraries used: Beautiful Soup, Jasy, NumPy, SciPy, matplotlib, Pickle, PySide, Pandas DataFrame, SQLAlchemy, NetworkX, urllib2, PyChart, Highcharts, PyTables) to drive improvements throughout the entire SDLC.
  • Experience with OpenVMS 6.2, 7.1 & 7.2, Unix (Solaris), Linux, and NT systems. Performed system tuning functions and installed software on NT, Unix, and Alpha servers.
  • Expertise in working with server-side technologies including databases, RESTful APIs and MVC design patterns.
  • Experience in creating OpenStack services for Identity (Keystone), Compute, Image Service, Block Storage and Networking (Neutron).
  • Superior Troubleshooting and Technical support abilities with Migrations, Network connectivity, Security, and Database applications.
  • Experience in using third-party tools like Telerik, DevExpress and Kendo controls; worked on containerizing applications using Docker and Vagrant; familiar with JSON-based REST, SOAP, and Confidential Web Services.
  • Good experience in Shell Scripting, SQL Server, UNIX and Linux, and OpenStack. Involved in Unit testing, Integration testing, User-Acceptance testing, and Functional testing.
  • Experience in working with different operating systems: Windows 98 / NT / 2000 / XP / Vista / 7 / 8, UNIX, and Mac OS X.
  • Experience with automation/configuration management using tools like Ansible, Puppet, Chef and SaltStack.
  • Experienced in developing API services in Python/Tornado, NodeJS while leveraging AMQP and RabbitMQ for distributed architectures.
  • Expertise in working with GUI frameworks (Pyjamas, Jython, guidata, PyGUI, PyQt, PyWebKitGtk) and experienced with Elasticsearch, Logstash and Kibana (ELK).
  • Expertise in a crypto/blockchain (Bitcoin, Monero, Bitcoin Cash) e-commerce platform built using Python with a Flask back-end and a Jinja/JavaScript front-end.

TECHNICAL SKILLS

  • Python
  • Scala
  • SQL
  • PL/SQL
  • SAS
  • PEP8
  • PIP
  • Spark
  • Requests
  • Scrapy
  • SQLAlchemy
  • BeautifulSoup
  • NumPy
  • SciPy
  • matplotlib
  • PyGame
  • Pyglet
  • PyQT
  • PyGTK
  • pywin32
  • NLTK
  • nose
  • OpenCV
  • SymPy
  • Ipython
  • Caffe
  • Torch
  • TensorFlow
  • Django
  • Flask
  • Pyramid
  • Twisted
  • Muffin
  • CherryPy
  • TastyPie
  • Pyjamas
  • gui2py
  • PySide
  • TkInter
  • PyForms
  • CVS
  • Git
  • Mercurial
  • SVN
  • GitHub
  • Jenkins
  • Chef
  • Puppet
  • Ansible
  • Docker
  • Kubernetes
  • PyUnit
  • PyTest
  • PyMock
  • Mocker
  • Antiparser
  • Webunit
  • webtest
  • Selenium
  • Splinter
  • PyChecker
  • Komodo
  • PyCharm
  • PyDev
  • PyScripter
  • PyShield
  • Spyder
  • Jupyter
  • MySQL
  • Teradata
  • SQL Server
  • InfluxDB
  • MongoDB
  • IntelliJ
  • Cassandra
  • PostgreSQL
  • Splunk
  • Bugzilla
  • Jira
  • HP ALM
  • HP Quality Center
  • Software Development Life Cycle (SDLC)
  • Agile
  • Waterfall
  • Hybrid
  • TDD
  • XP
  • BDD
  • EDD
  • Pair Programming
  • Scrum
  • ELK (Elasticsearch, Logstash, Kibana)
  • Solr
  • Kanban
  • Kafka
  • Swagger
  • OpenStack
  • Confidential Web Services (AWS)
  • Microsoft Azure
  • Boto3
  • Jinja
  • Mako
  • AMQP
  • Celery
  • Apache Tomcat
  • RabbitMQ
  • Heroku
  • Samba
  • Confluence
  • Bamboo
  • AJAX
  • jQuery
  • JSON
  • XML
  • XSLT
  • LDAP
  • OAuth
  • SOAP
  • REST
  • Microservices
  • Active Directory design patterns
  • HTML/HTML5
  • CSS/CSS3
  • JavaScript
  • PhosphorJS
  • AngularJS
  • NodeJS
  • EmberJS
  • ReactJS
  • Bootstrap
  • Big Data
  • Hadoop technologies
  • Linux
  • Unix

PROFESSIONAL EXPERIENCE

Sr Python Developer

Confidential - Seattle, WA

Responsibilities:

  • Created Python and Bash tools to increase the efficiency of application systems and operations, data conversion scripts, AMQP/RabbitMQ, REST, JSON, and CRUD scripts for API integration. Worked on an AJAX framework to transform Datasets and DataTables into HTTP-serializable JSON strings. Placed data into JSON files using Python to test Django websites.
  • Added support for Confidential AWS S3 and RDS to host static/media files and the database in the Confidential Cloud.
  • Developed Kafka producers and consumers, HBase clients, Spark, Shark, Streams and Hadoop MapReduce jobs along with components on HDFS and Hive.
  • Leveraged AWS cloud services such as EC2, auto scaling and VPC to build secure, highly scalable and flexible systems that handled load on the servers.
  • Implemented TFS Build Archival to AWS Simple Storage Service (S3) and created lifecycle policies for managing the files in S3. Implemented CloudWatch to set alarms for monitoring the EC2 instances.
  • Used Kubernetes to deploy, scale and load-balance services, and worked on Docker Engine, Docker Hub, Docker images and Docker Compose for handling images for installations and domain configurations.
  • Extracted data from the database using SAS/Access and SAS SQL procedures and created SAS data sets. Responsible for data backups, restores and recovery using tape.sys software.
  • Used the Python packages cx_Oracle, pyodbc and MySQLdb for working with Oracle, SQL Server and MySQL databases respectively. Pushed for PEP8 and Pylint compliance in code conventions; analyzed the data in the existing PIP schema and modified data using SAS/BASE and SAS/MACROS.
  • Created a server-monitoring daemon with psutil, supported by an Elasticsearch analytics app I created. Also researched big data solutions with the Cassandra database.
  • Worked on migration of Splunk to AWS (cloud) instances. Involved in standardizing Splunk forwarder deployment, configuration and maintenance across UNIX and Windows platforms.
  • Configured Elastic Load Balancer and Auto scaling to design cost effective, fault tolerant and highly available systems.
  • Wrote Python code embedded with JSON and XML to produce HTTP GET requests and parse HTML data from websites. Implemented SOAP/RESTful web services in JSON format.
  • Using Chef, deployed and configured Elasticsearch, Logstash and Kibana (ELK) for log analytics, full text search, application monitoring in integration with AWS Lambda and CloudWatch.
  • Built Elasticsearch, Logstash and Kibana (ELK) to store logs and metrics in an S3 bucket using a Lambda function.
  • Worked on developing CRUD applications using MERN stack (MongoDB, ExpressJS, ReactJS and NodeJS) and REST based API.
  • Wrote Python scripts using Boto3 to automatically spin up instances in AWS EC2 and OpsWorks stacks, and integrated them with Auto Scaling to automatically spin up servers with configured AMIs (see the Boto3 sketch after this list).
  • Developed Python programs using the Boto3 SDK to implement security with the AWS Cognito service.
  • Used Python with Boto3 to supplement automation provided by Ansible and Terraform for tasks such as encrypting EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks.
  • Used ReactJS to create controllers to handle events triggered by clients and send requests to the server. Maintained state in the stores and dispatched the actions using Redux.
  • Developed Docker images to support the Development and Testing teams and their pipelines, distributed images such as Jenkins, Selenium, JMeter and Elasticsearch, Logstash and Kibana (ELK), and handled the containerized deployment using Kubernetes.
  • Monitored logs and generated visual representations of logs using the ELK stack. Implemented CI/CD tool upgrades, backup, restore, DNS, LDAP and SSL setup.
  • Worked on data migration from SQLite3 to the Apache Cassandra database. Designed, implemented, maintained and monitored the Cassandra data model using DSE, DevCenter and DataStax OpsCenter.
  • Built the Silent Circle Management System (SCMC) in Elasticsearch, Python and Node.js while integrating with infrastructure services.
  • Automated various infrastructure activities like continuous deployment, application server setup and stack monitoring using Ansible playbooks, and integrated Ansible with Rundeck and Jenkins.
  • Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of AWS nodes and tested playbooks on AWS instances using Python; ran Ansible scripts to provision dev servers.
  • Used SQLAlchemy as an Object Relational Mapper (ORM) for writing ORM queries.
  • Developed a module to build Django ORM queries that can pre-load data to greatly reduce the number of database queries needed to retrieve the same amount of data.
  • Created and configured virtual development environments with Chef and VirtualBox as part of the SOA (Service-Oriented Architecture) team, enforcing best practices for services (REST and SOAP).
  • Built application interfaces and web scraping scripts using OO design, UML modeling and dynamic data structures.
  • Implemented discretization and binning, data wrangling: cleaning, transforming, merging and reshaping data frames.
  • Determined optimal business logic implementations, applying best design patterns.
  • Increased speed and memory efficiency by migrating Python code to C/C++ using Cython.
  • Wrote and maintained automated Salt scripts for Elasticsearch, Logstash, Kibana and Beats.
  • Developed a full-stack Python web framework with an emphasis on simplicity, flexibility and extensibility, built atop existing components rather than reinventing them: WSGI, routing, templating, forms, data, plugins, config, events, SQLAlchemy, Storm, CouchDB, OpenID, App Engine, jQuery, etc.
  • Validated BI support events and transformed and batched events sent to HNM and Kafka, triggering these events using Kafka and Mesos.
  • Developed a fully automated continuous integration system using Git, Gerrit, Jenkins, MySQL and custom tools developed in Python and Bash.
  • Involved in front-end development and utilized Bootstrap and AngularJS for page design. Created data tables utilizing PyQt to display customer and policy information and to add, delete and update customer records.
  • Used PyQuery for selecting particular DOM elements when parsing HTML. Used Wireshark, Live HTTP Headers and the Fiddler debugging proxy to debug the Flash object and help the developer create a functional component. Used the Pandas API to put the data in time-series and tabular format for easy timestamp data manipulation and retrieval.
  • Worked with python editors like PyCharm, PyScripter, PyStudio, PyDev, Wing IDE and Spyder.
  • Developed tools using Python, Shell scripting, XML to automate some of the menial tasks.
  • Worked in a team of architects and developers to build and deploy a Python Flask/Peewee application on Linux in AWS.
  • Used RAD 7.0 for implementing Static and Dynamic web services for consuming and providing services related to the business.
  • Worked on the Jenkins continuous integration tool for deployment of the project. Played a key role in a development-wide transition from Subversion to Git, which resulted in increased efficiency for the development community.
  • Programmatically controlled the COMSOL Multiphysics model in the MATLAB® graphical user interface to perform case studies and customize plots and data processing.
  • Used Tkinter to implement a GUI for the user to create, modify and view reports based on client data.
  • Used the Python library Beautiful Soup for web scraping. Responsible for debugging and troubleshooting the web application.
  • Worked on cloud platform engineering: Kubernetes, Spinnaker, Docker, Terraform, Consul, Drone, Jenkins, Chef, Kitchen. Scheduled, deployed and managed container replicas onto a node cluster using Kubernetes.
  • Used the Pandas library for statistical analysis and for flexible reshaping and pivoting of data sets.
  • Used Celery as the task queue and RabbitMQ and Redis as messaging brokers to execute asynchronous tasks (see the sketch after this list).
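
As a rough illustration of the Celery/RabbitMQ/Redis bullet above, the sketch below wires a Celery app to a local RabbitMQ broker with a Redis result backend. The broker URLs, the generate_report task and its retry policy are hypothetical stand-ins, not code from the actual project.

```python
from celery import Celery

# Placeholder broker/backend URLs: RabbitMQ as the message broker,
# Redis as the result backend, as described in the bullet above.
app = Celery(
    "reports",
    broker="amqp://guest:guest@localhost:5672//",
    backend="redis://localhost:6379/0",
)

def build_report(customer_id: int) -> str:
    # Stand-in for the real report-building work.
    return f"report for customer {customer_id}"

@app.task(bind=True, max_retries=3)
def generate_report(self, customer_id: int) -> str:
    """Run a long report asynchronously; retry on transient failures."""
    try:
        return build_report(customer_id)
    except ConnectionError as exc:
        raise self.retry(exc=exc, countdown=30)

# Caller side: enqueue the task without blocking the web request.
# generate_report.delay(customer_id=42)
```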
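
The Boto3 EC2 automation mentioned earlier in this list could look roughly like the following; the AMI ID, instance type, tags and region are placeholders for illustration only.

```python
import boto3

def launch_app_servers(ami_id: str, count: int = 1, region: str = "us-west-2") -> list:
    """Spin up EC2 instances from a pre-baked AMI and wait until they are running."""
    ec2 = boto3.resource("ec2", region_name=region)
    instances = ec2.create_instances(
        ImageId=ami_id,                 # configured AMI, e.g. "ami-0123456789abcdef0"
        InstanceType="t3.medium",       # illustrative instance type
        MinCount=count,
        MaxCount=count,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "Name", "Value": "app-server"}],
        }],
    )
    # Block until each instance reports the "running" state, then return IDs.
    for inst in instances:
        inst.wait_until_running()
    return [inst.id for inst in instances]

if __name__ == "__main__":
    print(launch_app_servers("ami-0123456789abcdef0", count=2))
```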

Python Developer

Confidential - Seattle, WA

Responsibilities:

  • Developed the presentation layer using HTML, CSS, JavaScript, jQuery and AJAX. Utilized the Python libraries wxPython, NumPy, Pandas, Twisted and matplotlib.
  • Worked with Python libraries: requests, python-ldap, suds, pexpect, pip, subprocess.
  • Developed front-end and back-end modules using Python on Django, including the Tastypie web framework, using Git.
  • Implemented SQLAlchemy, a Python library that provides full access over SQL.
  • Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
  • Used the Pandas library for statistical analysis and for flexible reshaping and pivoting of data sets (see the reshaping sketch after this list).
  • Developed cross-browser/platform pages with ReactJS, NodeJS, jQuery, AJAX and HTML5/CSS3 to desired design specs for a single-page layout using code standards. Created the UI from scratch using ReactJS.
  • Used Pandas, NumPy, seaborn, SciPy, Matplotlib, Scikit-learn, NLTK in Python for developing various machine learning algorithms and utilized machine learning algorithms such as linear regression.
  • Used Django configuration to manage URLs and application parameters.
  • Installed, configured and managed AWS servers; used AWS Data Pipeline for data extraction, transformation and loading from homogeneous or heterogeneous data sources.
  • Accessed database objects using Django database APIs. Worked on Python-based test frameworks and test-driven development with automation tools. Worked with real-time streaming applications and batch-style large-scale distributed computing applications using tools like Spark Streaming.
  • Implemented advanced procedures like text analytics and processing using the in-memory computing capabilities like Apache Spark written in Scala.
  • Responsible for debugging and troubleshooting the web application. Managed the configurations of multiple servers using Ansible.
  • Deployed microservices, including provisioning AWS environments, using Ansible playbooks.
  • Provisioned the load balancer, auto-scaling group and launch configuration for microservices using Ansible.
  • Used Ansible playbooks to set up a Continuous Delivery pipeline. This primarily consists of Jenkins and Sonar servers, the infrastructure to run these packages, and various supporting software components such as Maven.
  • Created cloud services using AWS; managed virtual machines and websites using AWS EC2, ELB, Auto Scaling and Lambda.
  • Developed installer scripts using Python (Boto3) for various products to be hosted on application servers. Wrote Python utilities and scripts to automate tasks in AWS using Boto3 and the AWS SDK. Automated backups using the AWS SDK (Boto3) to transfer data into S3 buckets.
  • Wrote playbooks for Ansible and deployed applications using Ansible. Automated various infrastructure activities like continuous deployment, application server setup and stack monitoring using Ansible playbooks, and integrated Ansible with Rundeck and Jenkins.
  • Provisioned and patched servers regularly using Ansible. Implemented Ansible to manage all existing servers and automate the build/configuration of new servers.
  • Developed an Ansible role for the Zabbix agent, which was integrated into the CI/CD pipeline.
  • Used Ansible to document all infrastructures into version control. Used Ansible to document application dependencies into version control.
  • Developed Python code to automate the ingestion of common formats such as JSON and CSV using Logstash, from Elasticsearch to a Kibana dashboard to be viewed by clients.
  • Responsible for designing and deploying new ELK clusters (Elasticsearch, Logstash, Graphite, Kibana, Beats, Kafka, ZooKeeper, etc.).
  • Designed, built and managed the ELK (Elasticsearch, Logstash, Graphite, Kibana) cluster for centralized logging and search functionality for the app.
  • Designed and automated the process of installation and configuration of a secure DataStax Enterprise Cassandra cluster using Puppet.
  • Configured internode communication between Cassandra nodes and client using SSL encryption.
  • Deployed microservices in Docker containers and scaled the deployment using Kubernetes.
  • Developed ChatOps interfaces with Slack and Kubernetes on GKE.
  • Worked on the Spinnaker platform for multi-cloud continuous delivery (bake, test and deploy/container pipelines) using Packer, Terraform, Kubernetes, AWS and GCP.
  • Responsible for onboarding application teams to build and deploy their code using GitHub, Jenkins, Nexus and Ansible.
  • Migrated our core repository from Subversion to Git. Managed GitHub projects and migrated from SVN to GitHub with history.
  • Used CloudTrail, TESSA, CloudPassage, Checkmarx and Qualys scan tools for AWS security and scanning.
  • Automated various service and application deployments with Ansible on CentOS and RHEL in AWS.
  • Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of AWS nodes and tested playbooks on AWS instances using Python. Ran Ansible scripts to provision dev servers.
  • Created monitors, alarms and notifications for EC2 hosts using CloudWatch. Performed web testing automation using the Selenium API.
  • Worked on Redux, creating to-do list reducers and reducer functions and implementing the store method, along with API services, JavaScript, Bootstrap, Git and JSON.
  • Responsible for configuring Kafka consumer and producer metrics to visualize Kafka system performance and monitoring. Wrote and maintained automated Salt scripts for Elasticsearch, Logstash, Kibana and Beats.
  • Performed parameterization of the automated test scripts in Selenium to check how the application performs against multiple sets of data (see the parameterized-test sketch after this list). Contributed to developing an automation framework that uses Java, Selenium WebDriver and TestNG; wrote automation test cases and fixed automation script bugs. Worked with migration to Confidential Web Services (AWS).
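
A minimal Python/pytest take on the Selenium parameterization bullet above; the login URL, element locators and credential data are invented for illustration (the framework itself is described as Java-based), and a local Chrome driver is assumed.

```python
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

# Placeholder credential/data sets to run the same test against multiple inputs.
TEST_DATA = [
    ("standard_user", "secret1", True),
    ("locked_user", "secret2", False),
]

@pytest.fixture
def driver():
    # Assumes Chrome is available; Selenium Manager resolves the driver binary.
    drv = webdriver.Chrome()
    yield drv
    drv.quit()

@pytest.mark.parametrize("username,password,should_pass", TEST_DATA)
def test_login(driver, username, password, should_pass):
    driver.get("https://example.com/login")            # placeholder URL
    driver.find_element(By.ID, "username").send_keys(username)
    driver.find_element(By.ID, "password").send_keys(password)
    driver.find_element(By.ID, "submit").click()
    # Each data set asserts the expected outcome of the same workflow.
    assert ("Welcome" in driver.page_source) == should_pass
```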
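
For the Pandas reshaping/pivoting bullet earlier in this list, a small self-contained sketch with made-up policy data; the column names and values are illustrative only.

```python
import pandas as pd

# Tiny invented dataset standing in for policy/premium records.
df = pd.DataFrame({
    "region":  ["West", "West", "East", "East"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "premium": [120.0, 135.5, 98.0, 101.2],
})

# Pivot long data into a region-by-quarter table.
wide = df.pivot_table(index="region", columns="quarter", values="premium", aggfunc="sum")

# Melt it back to long form, the inverse reshaping step.
long_again = wide.reset_index().melt(id_vars="region", value_name="premium")

print(wide)
print(long_again)
```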

Python Developer

Confidential

Responsibilities:

  • Worked on developing Web Services using - SOAP, WSDL and developing DTDs, XSD schemas for XML (parsing, processing, and design).
  • Implemented a logging mechanism to capture exceptions and errors using the Log4j tool. Used TortoiseSVN as a version-control client for Subversion.
  • Responsible for entire data migration from Sybase ASE server to Oracle. Migration of API code written for Sybase to Oracle.
  • Debugged the application using Firebug to traverse the documents and manipulated the nodes using the DOM and DOM functions, using Firefox and the IE Developer Toolbar for IE.
  • Developed web applications in Django Framework's model view control (MVC) architecture.
  • Created REST web services for the management of data using Apache CXF.
  • Worked on Python to place data into JSON files for testing Django Websites. Created scripts for data modeling and data import and export.
  • Worked on ReactJS for its code reusability and integrated Bootstrap. Used Redux architecture in the whole process to connect Actions.
  • Designed and developed the input/output data formats in XSD for the WSDL files and implemented services accordingly using Apache Axis2; used the NetBeans IDE to develop the application.
  • Implemented Docker containers to create images of the applications and dynamically provision slaves to Jenkins CI/CD pipelines.
  • Developed Hadoop integrations for data ingestion, data mapping and data processing. Used XML for dynamic display of options in select box and description on web page.
  • Implemented the MVC architecture using the Apache Struts framework and gained experience with Python OpenStack APIs. Worked with Rational ClearCase to provide sophisticated version control, workspace management and parallel development support. Built and maintained a Selenium regression test suite.
  • Involved in building database Model, APIs and Views utilizing Python, to build an interactive web-based solution.
  • Wrote Python code using Ansible Python API to automate cloud deployment process. Managed Web applications, configuration files, users, file systems and packages using Ansible.
  • Built various graphs using Matplotlib package which helped in taking business decisions.
  • Created RESTful APIs to integrate and enhance functionality of the application. Also utilized RESTful APIs in communicating with third parties.
  • Worked on automation using the Python scripting language, Git on Cygwin32 and XML.
  • Worked on monitoring tools like Nagios, Zabbix and AWS CloudWatch to health-check the various deployed resources and services.
  • Involved and played a leading role in database migration projects from Oracle to MongoDB.
  • Designed and managed API system deployment using a fast HTTP server and Confidential AWS architecture.
  • Designed and developed Use-Case, Class and Object diagrams using UML in Rational Rose for Object-Oriented Analysis/Object-Oriented Design techniques.
  • Excellent understanding and knowledge of Hadoop Distributed File System data modelling, architecture and design principles; developed Python mapper and reducer scripts and implemented them using Hadoop Streaming (a minimal mapper/reducer sketch follows this list).
  • Involved in writing application level code to interact with APIs, Web Services using JSON and involved in AJAX driven application by invoking web services/API and parsing the JSON response.
  • Developed views and templates with Python and the Django view controller and templating language to create a user-friendly interface using the MVC architecture (a hedged view/URL sketch follows this list). Worked on the resulting reports of the application and Tableau reports, and was involved in modifying data using SAS/BASE and SAS/MACROS.
  • Involved in installing software using the pip command for Python libraries like Beautiful Soup, NumPy, SciPy, python-twitter, RabbitMQ, Celery, matplotlib and Pandas DataFrame, and used the PEP8 coding convention.
  • Migrated API code written for Sybase to Oracle and was involved in overseeing the migration of PL/SQL programs.
  • Built the back-end application with Python/Django; worked on Docker, RabbitMQ, Celery and Jenkins. Involved in migrating the libraries written using Sybase APIs to the Oracle OCCI API.
  • Used Python Libraries Pandas and NumPy, SQL and Tableau to procure, clean and aggregate data from Relational database to generate status reports and dashboards.
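
A minimal Hadoop Streaming style mapper/reducer in Python, as referenced in the HDFS bullet above; the real jobs' logic and file layout would differ, and the single-file mode switch here is only for compactness.

```python
#!/usr/bin/env python
"""Word-count style mapper/reducer runnable under Hadoop Streaming or locally."""
import sys
from itertools import groupby

def mapper(lines):
    # Emit <word, 1> pairs for every token on stdin.
    for line in lines:
        for word in line.strip().split():
            yield word, 1

def reducer(pairs):
    # Hadoop Streaming delivers mapper output sorted by key, so groupby works.
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    mode = sys.argv[1] if len(sys.argv) > 1 else "map"
    if mode == "map":
        for word, count in mapper(sys.stdin):
            print(f"{word}\t{count}")
    else:  # "reduce"
        split_lines = (line.rstrip("\n").split("\t") for line in sys.stdin)
        typed = ((word, int(count)) for word, count in split_lines)
        for word, total in reducer(typed):
            print(f"{word}\t{total}")
```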
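
And a hedged sketch of the Django view/URL wiring pattern described in the views-and-templates bullet above; the Report model, template path and URL name are hypothetical, and the pieces would normally live in their own models.py, views.py and urls.py files inside a Django app.

```python
# models.py (hypothetical model used by the view below)
from django.db import models

class Report(models.Model):
    title = models.CharField(max_length=200)
    created_at = models.DateTimeField(auto_now_add=True)

# views.py: a function-based view rendering a Django template
from django.shortcuts import render

def report_list(request):
    """Fetch reports via the ORM and hand them to the template context."""
    reports = Report.objects.order_by("-created_at")
    return render(request, "reports/report_list.html", {"reports": reports})

# urls.py: wire the view into Django's URL dispatcher
from django.urls import path

urlpatterns = [
    path("reports/", report_list, name="report-list"),
]
```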
