
Python/Cloud Developer Resume


SUMMARY

  • 7 years of professional experience as a Software Developer and proficient coder in multiple languages, with experience in the design, development, and implementation of applications based on Python, Django, Flask, Pyramid, and client-server technologies, as well as RESTful services, AWS, Azure, and SQL.
  • Experienced in working with various stages of Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC) and QA methodologies from project definition to post-deployment documentation.
  • Experience with design, coding, and debugging operations, reporting, data analysis, and web applications utilizing Python.
  • Experienced in implementing object-oriented Python, hash tables (dictionaries), multithreading, exception handling, and collections, along with Django and MySQL.
  • Worked with MVW frameworks such as Django and AngularJS, along with HTML, CSS, XML, JavaScript, jQuery, and Bootstrap.
  • Good experience in writing Spark applications using Python and Scala.
  • Experience in writing JSON REST APIs using Golang.
  • Strong experience in software development in Python (libraries used: Beautiful Soup, NumPy, SciPy, matplotlib, python-twitter, pandas DataFrames, urllib2, and MySQLdb for database connectivity) and IDEs such as Sublime Text, Spyder, and Emacs.
  • Hands-on experience working with various Relational Database Management Systems (RDBMS) such as MySQL, Microsoft SQL Server, and Oracle, as well as non-relational (NoSQL) databases like MongoDB and Cassandra.
  • Proficient in writing unit and integration test cases using the Python unittest, pytest, and Selenium WebDriver frameworks.
  • Hands-on experience in installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, Hive, Sqoop, Pig, ZooKeeper, and Flume.
  • Experienced in developing web services in Python, implementing JSON-based RESTful and XML-based SOAP web services.
  • Proficient in performing Data analysis and Data Visualization using Python libraries.
  • Proficient in scaling out job processing and reducing job run times using Celery with RabbitMQ (see the sketch at the end of this summary).
  • Experience in using Version Control Systems like GIT, SVN and CVS to keep the versions and configurations of the code organized.
  • Experience with VMware to build cloud-based applications.
  • Experience in UNIX/Linux shell scripting for job scheduling, batch-job scheduling, automating batch programs, and forking and cloning jobs.
  • Good experience with CI/CD tools: Jenkins for continuous integration and Ansible for continuous deployment.
  • Extensively worked on Jenkins build pipelines for continuous integration and end-to-end automation of all builds and deployments.
  • Experienced with containerizing applications using Docker.
  • Experience in maintaining and executing build scripts to automate development and production builds.
  • Experience with the Amazon Web Services (AWS) cloud platform, including EC2, Virtual Private Clouds (VPCs), storage models (EBS, S3, instance storage), and Elastic Load Balancers (ELBs).
  • Experienced in developing API services in Python/Tornado, while leveraging AMQP and RabbitMQ for distributed architectures.
  • Good experience with bug-tracking tools such as JIRA and Bugzilla.
  • Excellent interpersonal and communication skills, efficient time management and organization skills, and the ability to handle multiple tasks and work well in a team environment.
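Below is a minimal, illustrative sketch of the Celery/RabbitMQ pattern referenced above for offloading long-running jobs; the application name, broker URL, and task are assumptions for illustration rather than code from a specific project.

    # Minimal Celery sketch: tasks are published to a RabbitMQ broker so that
    # long-running work executes asynchronously on worker processes.
    from celery import Celery

    # Assumed broker URL; a real deployment would read this from configuration.
    app = Celery("jobs", broker="amqp://guest:guest@localhost:5672//")

    @app.task(bind=True, max_retries=3)
    def convert_record(self, record):
        """Hypothetical data-conversion task, retried on transient failures."""
        try:
            return {key.lower(): value for key, value in record.items()}
        except Exception as exc:
            raise self.retry(exc=exc, countdown=5)

    # Caller side: enqueue the work and return immediately.
    # convert_record.delay({"ID": 1, "Name": "example"})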

TECHNICAL SKILLS

Primary Languages: Python, JavaScript, Golang, C, C++

Python Libraries: NumPy, SciPy, matplotlib, pandas, urllib2, PySpark

Frameworks: Bootstrap, Django, Node.js, Flask

Databases: SQLite3, MS SQL Server, MySQL, MongoDB, Oracle 11g, Hive

IDEs: PyCharm, Eclipse, MS Visual Studio Code

Cloud Technologies: MS Azure, Amazon Web Services (EC2, S3, EBS, Lambda, API Gateway)

Web Technologies: HTML, CSS, DHTML, XML, JavaScript

Operating Systems: Windows, Mac, Fedora Linux, Red Hat Linux, Ubuntu

SDLC Methods: Scrum, Agile, Waterfall

Bug Tracking Tools: JIRA, Azure DevOps (ADO), Bugzilla

Version Control: GitHub, Git, SVN

PROFESSIONAL EXPERIENCE

Confidential

Python/Cloud Developer

Responsibilities:

  • Gathered requirements and translated business details into technical design.
  • Participated in all stages of the software development life cycle (SDLC), including design, development, testing, and implementation.
  • Created Python and Bash tools to increase the efficiency of the call center application system and operations, along with data conversion scripts and REST, JSON, and CRUD scripts for API integration.
  • Developed all frontend and backend modules using Python on the Django web framework, implementing the MVC architecture.
  • Developed REST APIs using the Django framework, wrote extensive test cases in pytest, and integrated them with the Jenkins pipeline (see the API sketch after this list).
  • Used several Python libraries such as wxPython, NumPy, matplotlib, and PySpark.
  • Implemented a responsive user interface and standards throughout the development and maintenance of the website using HTML, CSS, JavaScript, Bootstrap, and jQuery.
  • Extensively worked with Spark DataFrames to ingest data from flat files into RDDs and transform both unstructured and structured data.
  • Designed and developed a data management system using PostgreSQL.
  • Worked on MongoDB features such as locking, transactions, indexes, sharding, replication, and schema design.
  • Developed applications using Java 8 and its new language features.
  • Used sbt to develop Scala-based Spark projects and executed them using spark-submit.
  • Involved in application development for cloud platforms using technologies such as Java/J2EE, Spring Boot, Spring Cloud, microservices, and REST.
  • Experience in creating Docker containers leveraging existing Linux containers and AMIs, in addition to creating Docker containers from scratch.
  • Setup Docker on Linux and configured Jenkins to run under Docker host.
  • Developed various APIs for Django applications using the Django REST framework and Tastypie.
  • Used JIRA for tracking development process.
  • Developed a wrapper in Python for instantiating multi-threaded applications (see the threading sketch after this list).
  • Created RESTful web services for Catalog and Pricing with Django MVT, MySQL, and MongoDB.
  • Fixed bugs, provided production support, and enhanced applications by improving code reuse and performance through effective use of various design patterns.
  • Experience in managing and reviewing Hadoop log files.
  • Worked on freeform calculations, lasso selection, and radial selection using Tableau.
  • Deployed and monitored scalable infrastructure on Amazon web services (AWS).
  • Also worked on infrastructure and application tasks in the Azure DevOps environment.
  • Performed bug tracking and created the CI/CD pipeline process in Azure DevOps.
  • Implemented monitoring and established best practices around using Elasticsearch, and used AWS Lambda to run code without managing servers.
  • Front-end web development using HTML/CSS, jQuery, and Bootstrap, as well as back-end development using Golang and SQL.
  • Wrote UNIX/Linux shell scripts for various miscellaneous tasks.
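A condensed, illustrative sketch of the Django REST API plus pytest pattern noted above; the CatalogItem model, field names, and URL are hypothetical placeholders, not code from the actual project.

    # Hypothetical catalog endpoint built with Django REST framework, plus a
    # pytest-django style check; model, fields, and URL are placeholders.
    import pytest
    from rest_framework import routers, serializers, viewsets
    from myapp.models import CatalogItem  # assumed Django model

    class CatalogItemSerializer(serializers.ModelSerializer):
        class Meta:
            model = CatalogItem
            fields = ["id", "name", "price"]

    class CatalogItemViewSet(viewsets.ModelViewSet):
        queryset = CatalogItem.objects.all()
        serializer_class = CatalogItemSerializer

    router = routers.DefaultRouter()
    router.register(r"items", CatalogItemViewSet)  # wired into urls.py as /api/items/

    @pytest.mark.django_db
    def test_list_items(client):
        # Smoke test: the list endpoint should respond successfully.
        assert client.get("/api/items/").status_code == 200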
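And a small sketch of the kind of multi-threading wrapper mentioned above, built on the standard library; the helper name and default worker count are assumptions.

    # Thread-pool wrapper for fanning out I/O-bound work; the helper name and
    # default worker count are illustrative assumptions.
    from concurrent.futures import ThreadPoolExecutor, as_completed

    def run_threaded(func, items, max_workers=8):
        """Run func over items concurrently; return results in completion order."""
        results = []
        with ThreadPoolExecutor(max_workers=max_workers) as pool:
            futures = [pool.submit(func, item) for item in items]
            for future in as_completed(futures):
                results.append(future.result())
        return results

    # Example usage:
    # print(run_threaded(lambda x: x * 2, range(10)))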

Environment: Python, Django, CherryPy, Golang, HTML5, CSS, Bootstrap, JSON, JavaScript, AJAX, RESTful web services, MongoDB, MySQL, jQuery, SQLite, Elasticsearch, Docker, RHEL, AWS (EC2, S3), PyTest, Jenkins, Selenium Automation Testing.

Confidential

Python Developer

Responsibilities:

  • Designed the front end and back end of the application using Python on the Django web framework.
  • Designed and developed Use-Case Diagrams, Class Diagrams, and Object Diagrams using UML Rational Rose for OOA/OOD techniques.
  • Used HTML, CSS, and JavaScript to develop the user interface of the website.
  • Experience in developing views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
  • Expertise in developing consumer-based features and applications with Python, Django, HTML, Behavior Driven Development (BDD) and pair-based programming.
  • Created Servlets and Beans to implement Business Logic.
  • Used SAX/DOM parsers for parsing data into the Oracle database.
  • Used Python modules such as csv, itertools, and Jinja for development.
  • Wrote custom User Defined Functions (UDF) in Python for Hadoop (Hive and Pig).
  • Modified existing Python/Django modules to deliver data in specific formats.
  • Working knowledge of JIRA (Agile) for bug tracking on the project.
  • Worked on deployment to an AWS EC2 instance with Postgres RDS and S3 file storage.
  • Developed Chef Cookbooks to install and configure Apache Tomcat, Jenkins and deployment automation.
  • Worked with HDFS file formats such as Avro and SequenceFile and compression formats such as Snappy and bzip2.
  • Used Celery with the RabbitMQ message broker to process larger jobs with multiple dependencies.
  • Consumed the data from Kafka using Apache Spark.
  • Experience with Streaming platforms like Apache Kafka.
  • Worked on Python scripts to parse JSON documents and load the data into the database.
  • Built SQL queries for performing various CRUD operations.
  • Developed full stack Python web framework with an emphasis on simplicity, flexibility, and extensibility.
  • Worked with Sqoop jobs to import data from RDBMS and used various optimization techniques to optimize Hive, Pig and Sqoop.
  • Worked on the AWS cloud platform and its features, including EC2, S3, and EBS.
  • Worked on Spark using Python and Spark SQL for faster testing and processing of data (see the PySpark sketch after this list).
  • Integrated a Redis cache with Django REST Framework for faster data reads (see the caching sketch after this list).
  • Involved in Design, Development, Deployment, Testing, and Implementation of the application.
  • Implemented the application in a Linux environment and am comfortable with its commands.
  • Tested all frontend and backend modules using Python on the Django web framework.
  • Responsible for handling the integration of the database system.
  • Cleansed data generated from weblogs with automated scripts in Python.
  • Used an object-relational mapping (ORM) solution, a technique for mapping the data representation from the MVC model to the Oracle relational data model with an SQL-based schema.
  • Implemented Spark using Scala, utilizing DataFrames and the Spark SQL API for faster data processing.
  • Helped the Big Data analytics team with implementation of Python scripts for Sqoop, Spark and Hadoop batch data streaming.
  • Implemented Performance tuning and improved the Performance of Stored Procedures and Queries.
  • Used the Selenium library to write a fully functioning test automation process that simulated submitting different web requests from multiple browsers to the web application.
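An illustrative PySpark sketch of the Spark SQL work described above; the input path, application name, and column names are placeholders rather than details from the project.

    # Read a flat file into a Spark DataFrame and query it with Spark SQL.
    # Path, app name, and columns are illustrative placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("batch-etl").getOrCreate()

    df = spark.read.csv("/data/input/events.csv", header=True, inferSchema=True)
    df.createOrReplaceTempView("events")

    daily_counts = spark.sql(
        "SELECT event_date, COUNT(*) AS cnt FROM events GROUP BY event_date"
    )
    daily_counts.show()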
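And a hedged sketch of caching a read-heavy Django REST Framework endpoint in Redis through Django's cache framework; the Price model, cache key, backend settings, and five-minute timeout are assumptions.

    # Cache a read-heavy list endpoint in Redis through Django's cache API.
    # The Price model, cache key, and timeout are illustrative assumptions.
    from django.core.cache import cache
    from rest_framework.response import Response
    from rest_framework.views import APIView
    from myapp.models import Price  # assumed Django model

    class PriceListView(APIView):
        def get(self, request):
            data = cache.get("price_list")
            if data is None:
                data = list(Price.objects.values("sku", "amount"))
                cache.set("price_list", data, timeout=300)  # five minutes
            return Response(data)

    # settings.py (assumed): CACHES = {"default": {
    #     "BACKEND": "django.core.cache.backends.redis.RedisCache",
    #     "LOCATION": "redis://127.0.0.1:6379"}}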

Environment: Python, MySQL, Hive, JSON, RESTful, Pandas, Machine Learning, Version One, Linux, Shell Scripting, IBM Netezza, JavaScript, AngularJS, Toad Data Point (SQL), Putty (Linux), Informatica.
