
Senior Python Developer Resume


Austin, TX

SUMMARY

  • IT experience in the design, development, testing, and implementation of various stand-alone and client-server, architecture-based enterprise application software using various technologies, analyzing complex business requirements and mapping them to system specifications.
  • Experienced in developing web applications, implementing Model View Template architecture using Django web framework.
  • Experience in developing web-based applications using Python 3.x (3.6/3.7), Django 2.x and Flask.
  • Developed web applications and web APIs using frameworks and libraries such as Flask/Flask-RESTPlus, Django, and Django REST Framework.
  • Designed and developed web services and RESTful APIs for mobile apps using the Django REST Framework and Flask on Nginx and uWSGI servers.
  • Hands-on experience fetching live stream data from DB2 into HDFS using Spark Streaming and Apache Kafka.
  • Experience ingesting real-time data from various sources through Kafka data pipelines and applying transformations to normalize the data stored in the HDFS data lake.
  • Expertise with different tools in Hadoop Environment including Pig, Hive, HDFS, MapReduce, Sqoop, Spark, Kafka, Yarn, Oozie, and Zookeeper.
  • Expertise in generating Python/PostgreSQL forms to record data of online users, and in using Python and PostgreSQL for graphics, data exchange, and XML processing.
  • Experience working with relational databases such as Oracle, SQLite, PostgreSQL, MySQL, and DB2, and with NoSQL databases such as Apache Cassandra and MongoDB.
  • Familiar with the Rust programming language, which runs blazingly fast, prevents segfaults, and guarantees thread safety.
  • Experienced in developing web-based applications following the Model-View-Controller (MVC) architecture.
  • Experience developing applications with Python 3.6/3.7 and the Flask web framework, backed by MS SQL/PostgreSQL databases using SQLAlchemy as the Object-Relational Mapper (ORM).
  • Designed and developed APIs to share data with cross-functional teams using the Hug and FastAPI frameworks.
  • Experienced in working with various Python IDEs such as PyCharm, Spyder, Microsoft Visual Studio, and Sublime Text.
  • Experience in job workflow scheduling and monitoring tools like Airflow and Autosys.
  • Expertise in Python scripting with a focus on DevOps tools, CI/CD, and AWS cloud architecture, plus hands-on engineering experience with deep learning tools such as TensorFlow, H2O Flow, and Theano.
  • Good experience working with various Python integrated development environments such as PyCharm, Spyder, Jupyter Notebook, and Anaconda on Ubuntu.
  • Handled business logic in backend Python code to achieve optimal results; wrote Python scripts to parse XML, CSV, and text files, load the data into AWS S3 buckets, and query those files from Tableau via the AWS Athena service (see the S3 load sketch after this list).
  • Good experience in using Object-oriented design patterns, multi-threading, multi-processing, exception handling and knowledge in client server environment.
  • Extensively involved in developing and consuming web services/APIs/microservices using the requests library in Python, implementing security with the OAuth2 protocol.
  • Working experience with high-performance scientific and data visualization libraries such as pandas, NumPy, SciPy, Matplotlib, Seaborn, Bokeh, and statsmodels for statistical modeling.
  • Writing well designed, testable and efficient code in Python 3 by following best software development practices and standards.
  • Knowledge of the Hadoop ecosystem, HDFS, and MapReduce functionality; also worked on processing large data sets using the PySpark library in Python applications.
  • Designing the user interactive web pages/ templates as the front-end part of the application using various technologies like HTML, CSS, JavaScript, jQuery, JSON and implementing Bootstrap framework for better user experience.
  • Proficient in writing SQL Queries, Stored procedures, functions, packages, tables, views, triggers using relational databases like PostgreSQL, Oracle, MS-SQL etc.
  • Implemented backend asynchronous task queue systems for data processing pipelines using libraries/frameworks such as Celery and Flask (a minimal sketch follows this list).
  • Implemented automated data processing using the subprocess module, and sent notifications to users with Python's smtplib and the Flask-Mail extension.
  • Performed mapping of JSON/XML-formatted data to relational databases such as MySQL, PostgreSQL, and SQLite.
  • Experience developing applications using Amazon Web Services such as EC2, CloudSearch, Elastic Load Balancer (ELB), S3, CloudFront, and Route 53.
  • Experience working in Agile environments using a CI/CD methodology.
  • Experience in working with continuous deployment using Heroku and Jenkins.
  • Proficient in writing unit tests using unittest/pytest and integrating the test code with the build process.
  • Well versed with Agile, SCRUM and Test-driven development methodologies.
  • Hands-on experience with version control systems such as Git, GitHub, and GitLab.
  • Development Experience in Linux (CentOS, Debian, and Ubuntu), Mac OS and Windows environments.
  • Solving day to day data wrangling/data munging challenges using high performance scientific library stack.
  • Using Django Evolution and manual SQL modifications, modified Django models while retaining all data on a site running in production.
  • Worked with Terraform to create AWS components such as EC2, IAM, VPC, ELB, and security groups.
  • Designed, developed, implemented, and maintained solutions for using Docker, Jenkins, Git, and Puppet for microservices and continuous deployment.
  • Hands on experience with bug tracking tools JIRA and Bugzilla.
  • Used tools such as Jupyter Notebook to test and accomplish day-to-day data challenges, measuring and improving performance wherever required.
  • Skilled in debugging/troubleshooting issues in complex applications.
  • Experience in Agile development processes ensuring rapid and high-quality software delivery.
  • Developing or updating the technical documentation to accurately represent application design for user support.
  • Reviewed requirement documents with the business and development team to understand the architecture and functionality of the application and consolidated these requirements in appropriate modules in Test strategy.
  • Highly motivated and quality minded developer with proven ability to deliver applications against tight deadlines.
  • Excellent interpersonal and communication skills, efficient time management and organization skills, ability to handle multiple tasks and work well in a team environment.
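
The asynchronous task-queue work mentioned above can be illustrated with a minimal sketch in Python. It assumes a local Redis broker; the broker URL, application names, and the process_records task are illustrative placeholders rather than the production setup.

    # Minimal sketch: Flask endpoint handing work to a Celery worker (assumed Redis broker).
    from celery import Celery
    from flask import Flask, jsonify, request

    flask_app = Flask(__name__)
    celery_app = Celery("pipeline", broker="redis://localhost:6379/0")  # assumed broker URL

    @celery_app.task
    def process_records(records):
        # Placeholder transformation: normalize keys before downstream loading.
        return [{key.lower(): value for key, value in row.items()} for row in records]

    @flask_app.route("/ingest", methods=["POST"])
    def ingest():
        # Queue the work and return immediately instead of blocking the request.
        result = process_records.delay(request.get_json())
        return jsonify({"task_id": result.id}), 202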
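
Similarly, the CSV-to-S3 loading described above can be sketched as follows, assuming default AWS credentials; the bucket and key names are hypothetical.

    # Minimal sketch: parse a CSV file and load the rows to S3 with boto3.
    import csv
    import json

    import boto3

    def load_csv_to_s3(path, bucket="example-reporting-bucket", key="staging/users.json"):
        with open(path, newline="") as handle:
            rows = list(csv.DictReader(handle))
        # Store the parsed rows as JSON so Athena/Tableau can query them later.
        boto3.client("s3").put_object(
            Bucket=bucket, Key=key, Body=json.dumps(rows).encode("utf-8")
        )
        return len(rows)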

TECHNICAL SKILLS

Primary Languages: Python, JavaScript, C/C++

Python Libraries: Beautiful Soup, SciPy, Matplotlib, pandas DataFrame, urllib2, requests, json

Frameworks: Bootstrap, Django, Flask, Hug, FastAPI

Databases: SQLite3, MySQL, PostgreSQL, MongoDB

IDEs: PyCharm, MS Visual Studio, Atom

Deployment tools: MS Azure, Heroku, Amazon Web Services (EC2, S3)

Web Technologies: HTML, CSS, DHTML, XML, JavaScript, Bootstrap

Operating Systems: Windows, Ubuntu, Fedora Linux, Red Hat Linux

SDLC Methods: SCRUM, Agile

CI/CD tools: Jenkins, Docker

Testing Frameworks: pytest, unittest, Robot, Lettuce

Bug Tracking Tools: JIRA, Bugzilla

Version Control Tools: VSS, SVN, GitHub, Git, GitLab

PROFESSIONAL EXPERIENCE

Confidential, Austin, TX

Senior Python Developer

Responsibilities:

  • Developed a web-based reporting system with Java, J2EE, Servlets, EJB, and JSP using the Spring framework, HTML, and JavaScript.
  • Developed Python code to retrieve data from an Oracle database and to pass data between different data models.
  • Wrote and executed various MySQL database queries from Python using the Python MySQL Connector and MySQLdb packages.
  • Developed a framework for converting existing PowerCenter mappings to PySpark (Python and Spark) jobs.
  • Worked on HTML5, CSS3, JavaScript, AngularJS, Node.JS, Git, REST API, MongoDB.
  • Created a PySpark data frame to bring data from DB2 to Amazon S3 (a minimal PySpark sketch follows this list).
  • Built numerous Lambda functions using Python and automated the process using event triggers.
  • Used the Python os module in a Linux environment to implement job cloning and forking.
  • Worked on Python scripts to parse JSON documents and load the data in database.
  • Developed tools using Python, Shell scripting, XML to automate some of the menial tasks.
  • Used the pandas API to put the data into time series and tabular formats for easy timestamp manipulation and retrieval.
  • Worked in the Anaconda Python environment.
  • Setting up the CI/CD pipeline using GitHub, Jenkins, Maven, Chef, Terraform and AWS.
  • Involved in writing HiveQL scripts on Beeline, Impala, and the Hive CLI for consumer data analysis to meet business requirements.
  • Developed programs to automate the testing of controllers in a CI/CD environment using Python, Bash scripts, Git, and the Linux command line.
  • Experienced in NoSQL technologies like MongoDB, CouchDB, Cassandra, Redis and relational databases like Oracle, SQLite, PostgreSQL and MySQL databases.
  • Designed and developed machine learning/ deep learning algorithms to classify the behavioral patterns of the insurance claimers using random forests and decision trees.
  • Developed custom Airflow operators in Python to generate CSV files from SQL Server and Oracle databases and load them into GS.
  • Updating the Test Automation suite regularly to ensure its accuracy and usefulness.
  • Designed and created backend data access modules using PL/SQL stored procedures and Oracle
  • Wrote JUNIT Test cases for Spring Controllers and Web Service Clients in Service Layer using Mockito.
  • Working primarily with Ruby on Rails and MySQL in UNIX environment. Extensive experience with the Rails MVC framework including complex model relationships, controllers, views, and helpers.
  • Created Kafka producers and consumers for Spark Streaming.
  • Developed an information pipeline utilizing Kafka and Storm to store data into HDFS.
  • Maintained and developed Docker images for a tech stack including Cassandra, Kafka, Apache, and several in house written Java services running in Google Cloud Platform (GCP) on Kubernetes.
  • Worked with the ElementTree XML API in Python to parse XML documents and load data into the database, and from the database back into XML documents.
  • Performance tuning by analyzing and comparing the turnaround times between SQL and Cognos.
  • Improved overall AD replication health by developing an automated process using PowerShell to ensure that the organization's site-link topology was consistent with the intended design, resulting in a stable and efficient replication environment.
  • Responsible for design and development of Spark SQL scripts using Python, based on functional specifications, to load data to Snowflake. Implemented the workflows using Airflow in Python to automate tasks.
  • Used Celery with RabbitMQ, MySQL, Django, and Flask to create a distributed worker framework.
  • Designed some of the SAS data models using Base SAS and SAS Macros.
  • Modified and created SAS datasets from various input sources like flat files, CSV, and other formats, created reports and tables from existing SAS datasets.
  • Responsible for manipulating, transferring, managing, and processing financial data in SAS using SAS Enterprise Guide under UNIX Platform.
  • Wrote PL/SQL views, Stored Procedures, database triggers & Packages.
  • Worked on AWS EC2/VPC/S3/SQS/SNS automation based on Terraform, Ansible, Python, and Bash scripts.
  • Worked on validating data resulting from data source migration from Netezza to SAS.
  • Modified existing SAS programs and created new programs using SAS macro variables to improve ease and speed of modification as well as consistency of results.
  • Developed microservices onboarding tools leveraging Python and VSTS, allowing for easy creation and maintenance of build jobs and Kubernetes deployments and services.
  • Generated JUnit test cases for testing various Java components.
  • Developed the PySpark code for AWS Glue jobs and for EMR.
  • Handled the Java multi-threading part in the back-end component; one thread runs for each user.
  • Developed project-specific Java APIs for new requirements with effective usage of data structures, algorithms, Core Java, and OOP concepts.
  • Designed the Analytical application using Python, Spark, HDFS, AWS EMR.
  • Extracted Data from Multiple Systems and Sources using Python and Loaded the Data into AWS EMR.
  • Worked closely with the QA Manager, Team leads and developer to evaluate and enhance automation script to cover test area and test cases.
  • Involved in maintenance of QlikView services, servers, and log files.
  • Developed testing steps for GUI components.
  • Involved in test execution and performed system Integration testing and regression testing.
  • Involved in pre-UAT.
  • Utilized the Airflow backfill feature to (re)populate past data.
  • Developed programs to automate the testing of controllers in a CI/CD environment using Python, Java, Bash scripts, Git, the Linux command line, and JavaScript.
  • Developed machine learning strategies for risk analysis using Multiple Regression.
  • Experience with Django and Flask, high-level Python web frameworks.
  • Used Test driven approach (TDD) for developing services required for the application.
  • Response Time was monitored while running Baseline, Performance, Load, Stress and Endurance testing.
  • Good experience in database backup and recovery strategies, with expert experience in hot and cold database backups. Configured the Eureka service registry on PCF for each service to enable communication via the cloud.
  • Developed Complex transformations, Mapplets using Informatica to Extract, Transform and load (ETL) data into Data marts, Enterprise Data warehouse (EDW) and Operational data store (ODS).
  • Set up Oracle read-only replication with materialized views and developed scripts to automate the rebuild process of replication.
  • Worked with Oracle utilities such as SQL*Loader, Export/Import, Data Pump, and external tables.
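
A minimal sketch of the DB2-to-S3 transfer described in this list, assuming a JDBC-reachable DB2 instance and S3 write access; the hostname, table, credentials, and bucket names are illustrative placeholders, not the actual environment values.

    # Minimal sketch: read a DB2 table over JDBC and land it in S3 as Parquet.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("db2-to-s3").getOrCreate()

    claims = (
        spark.read.format("jdbc")
        .option("url", "jdbc:db2://db2-host:50000/CLAIMSDB")  # assumed host and database
        .option("dbtable", "CLAIMS.DAILY_FEED")               # assumed source table
        .option("user", "etl_user")
        .option("password", "***")
        .option("driver", "com.ibm.db2.jcc.DB2Driver")
        .load()
    )

    # Parquet output in S3 keeps the extract queryable by downstream EMR jobs.
    claims.write.mode("overwrite").parquet("s3a://example-landing-bucket/claims/")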

Environment: Python 2.7, Base SAS, SAS Macros, CI/CD, Flask, Oracle Database, SAS Enterprise Guide, PuTTY, jQuery, WinSCP, Cognos, MySQL, HTML5, CSS3, Impala, Hive, JavaScript, Toad, XML, RESTful Web Services, JSON, Terraform, IBM Sterling, EMR, Bootstrap, PL/SQL, SQL, Jenkins, Jira, Confluence, Eclipse, IntelliJ, Spark, Linux.

Confidential

Senior Python Developer

Responsibilities:

  • Developed entire frontend and backend modules using Python on Django Web Framework.
  • Worked on designing, coding, and developing the application in Python using Django MVC.
  • Experience in working with Python ORM libraries, including the Django ORM.
  • Worked on integrating python with Web development tools and Web Services.
  • Experience with Django and Flask, high-level Python web frameworks.
  • Responsible for writing code in Object Oriented Programming supported by Ruby on Rails in Agile SCRUM environment.
  • Created custom fully automated solution using Windows PowerShell to export individual mailboxes from the Exchange environment and save them as separate PST files. (Confidential)
  • Developed tools using Python, Shell scripting, XML to automate some of the menial tasks. Interfacing with supervisors, artists, systems administrators, and production to ensure production deadlines are met.
  • Designed User Interfaces using jQuery, Bootstrap, JavaScript, CSS3, XML and HTML5.
  • Created a handler function in Python using AWS Lambda that is invoked when the service is executed (see the Lambda handler sketch at the end of this list).
  • Developed server side using PHP in both WAMP and LAMP server framework.
  • Developed Business Logic using Python on Django Web Framework.
  • Developed views and templates with Python and Django’s view controller and templating language to create a user-friendly website interface.
  • Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming, and loading data into data warehouse. Wrote PL/SQL stored procedures & triggers, cursors for implementing business rules and transformations.
  • Writing REST APIs, as part of developing web-based applications for insurance premium calculations, using Django’s REST framework.
  • Correspondingly involved in writing REST APIs using Django framework for data exchange and business logic implementation.
  • Designed a simulation model with machine learning algorithms and deep learning models for an automation system.
  • Developed REST microservices (APIs) used for home automation, which also keep data synchronized between two database services.
  • Configured auto-scalable and highly available microservice sets with monitoring and logging using AWS, Docker, Jenkins, and Splunk.
  • Created, activated, and programmed in Anaconda environments.
  • Developed Spark applications in Python (PySpark) on a distributed environment to load huge numbers of CSV files with different schemas into Hive ORC tables.
  • Worked on reading and writing multiple data formats like JSON, ORC, Parquet on HDFS using PySpark.
  • Troubleshot IBM WebSphere, IBM Commerce & Sterling Order Management System (OMS), and Apache web servers, supporting activities such as cloning, patching, and deployment-related issues on IBM WebSphere applications, IBM HTTP web servers, and IBM Sterling agent servers.
  • Designed and developed integration methodologies between client web portals and existing software infrastructure using SOAP API's and vendor specific frameworks.
  • Used Django Database API's to access database objects.
  • Used jQuery and Ajax calls for transmitting JSON data objects between frontend and controllers.
  • Created a simple AWS Lambda function using python for deployment management in AWS.
  • Design, investigation, and implementation of public facing websites on Amazon Web Services (AWS).
  • Designed web UI components for various modules and used JavaScript for client-side validation.
  • Involved in building database Model, APIs and Views utilizing Python, to build an interactive web-based solution.
  • Monitoring spark jobs using Yarn application.
  • Developed Spark/Scala code to ingest data, leveraging memory and optimizing performance.
  • Assisted in the migration of existing SAS programs from SAS 9.2 to SAS 9.4 and validated the resultant datasets.
  • Used Golang to log different host system event and alert information to a Cassandra database.
  • Deployed core Kubernetes clusters to manage Docker containers in the production environment with lightweight Docker images as base files.
  • Modified and created SAS datasets from various input sources such as flat files, CSV, and other formats; created reports and tables from existing SAS datasets.
  • Worked on different data formats such as JSON and XML and implemented machine learning algorithms in Python.
  • Used pandas, NumPy, seaborn, matplotlib, scikit-learn, SciPy, and NLTK in Python for developing various machine learning algorithms.
  • Worked in the OAuth group to support 2-legged and 3-legged OAuth, including the OIDC protocol.
  • Used Python-based GUI components for front-end functionality such as selection criteria.
  • Created test harness to enable comprehensive testing utilizing Python.
  • Used Amazon Web Services (AWS) for improved efficiency of storage and fast access.
  • Added support for Amazon AWS S3 and RDS to host static/media files and the database in the Amazon cloud; deployed on AWS EC2 using Nginx and Gunicorn.
  • Built a scalable, cost-effective, and fault-tolerant data warehouse system on the Amazon EC2 cloud. Developed MapReduce/EMR jobs to analyze the data and provide heuristics and reports; the heuristics were used to improve campaign targeting and efficiency.
  • Developed a functional design of AWS Elastic MapReduce (EMR) specifications with respect to business requirements and technology alternatives.
  • Configured AWS EC2 Auto Scaling groups and auto scaling policies.
  • Developed PySpark code to read data from Hive, group the fields and generate XML files.
  • Involved in front end and utilized Bootstrap and AngularJS for page design.
  • Involved in developing RESTful API services using the Python Flask framework.
  • Worked with the Rust programming language, which is blazingly fast like C and C++.
  • Created data tables utilizing PyQt to display customer and policy information and to add, delete, and update customer records.
  • Used Scala to convert Hive/SQL queries into RDD transformations in Apache Spark.
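
A minimal sketch of a Python Lambda handler of the kind described in this list, assuming an S3 event trigger; the processing logic is an illustrative placeholder rather than the actual deployment-management code.

    # Minimal sketch: Lambda handler invoked by an S3 event.
    import json
    import urllib.parse

    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        # Read the object referenced by the triggering S3 event record.
        record = event["Records"][0]["s3"]
        bucket = record["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["object"]["key"])
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        # Return a small summary so the invocation result is easy to inspect.
        return {"statusCode": 200, "body": json.dumps({"key": key, "bytes": len(body)})}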

Environment: Python 3.0, PyCharm, Django, Docker, Amazon Web Services, AWS Lambda, AWS S3, jQuery, PyQuery, MySQL, HTML5, CSS3, JavaScript, Ajax, XML, Rust Programming, Restful Web Services, C and C++, IBM Sterling, JSON, EMR, Bootstrap, AngularJS, NodeJS, Flask, SQL, MySQL, Jenkins, Ansible, Git, GitHub, Linux.

Confidential

Python Developer

Responsibilities:

  • Responsible for analyzing various cross-functional, multi-platform applications systems enforcing Python best practices and provide guidance in making long term architectural design decisions.
  • Experience with ORM's such as Django and SQLAlchemy, database design and normalization.
  • Used the MVT pattern to encapsulate client/server interactions; it helps illustrate software-pattern roles as well as developer roles by separating objects, components, and services into multiple tiers with well-defined boundaries.
  • Worked on developing RESTful API services using the Python Flask framework, and used SOAP and RESTful APIs for information extraction (see the Flask sketch at the end of this list).
  • Used popular Node.js frameworks like Express and Restify to mock a Restful API.
  • Created numerous Django apps and extensively used Django Session and authentication management.
  • Created new connections through applications for better access to the MySQL database, and was involved in writing several SQL/PLSQL functions, sequences, stored procedures, triggers, cursors, and object types.
  • Site reliability engineering responsibilities for a Kafka platform that scales to 2 GB/sec and 20 million messages/sec.
  • Cleaned data and processed third party spending data into maneuverable deliverables within specific formats with Excel macros and Python libraries.
  • Develop consumer-based features and applications using Django, HTML, Python Behavior Driven Development (BDD) and pair-based programming.
  • Developed and tested many features in an agile environment using Python, Django, HTML5, CSS, JavaScript, Bootstrap and Spec.
  • Used numerous jQuery third-party libraries in the Django framework: feedback plugin, photo light, social-likes, jQuery tables, SlickGrid, and Google Charts.
  • Added support for Amazon AWS S3 and RDS to host static/media files and the database into Amazon Cloud.
  • Implemented LDAP authentication to integrate with an in-place WebSEAL/TAM infrastructure.
  • Extensive experience working with management and automation tools such as HP Quality Center and Selenium.
  • Implemented a product used to replace a large Nagios-based monitoring system that controlled "just in time" manufacturing of interior parts and exhaust systems.
  • The continuous integration stack consisted of Git, Jenkins, CI/CD, the Docker Trusted Registry, and OpenShift Enterprise.
  • Wrote script to generate IP address from CIDR and write into JSON files and assign them to virtual machine while cloning.
  • Developed Restful Microservices using Flask and Django and deployed on AWS servers using EBS and EC2.
  • Consumed REST based Microservices with Rest template based on RESTful APIs and designed, developed and tested HTML, CSS, jQuery and React.js that meets accessibility and web browser standards for car dealerships websites.
  • Develop Python microservices with Django/Flask framework for Confidential & Confidential internal Web Applications.
  • Used REST-based microservices with RESTful APIs, and designed and built the UI for customer sites using HTML, CSS, jQuery, and React.js.
  • Used Vagrant to implement environment for microservices deployments and testing in Docker images.
  • The project also used other technologies such as jQuery for JavaScript manipulation and Bootstrap for the front-end HTML layout.
  • Understood the project scope, identified activities/ tasks, task level estimates, schedule, dependencies, risks and provided inputs to Module Lead for review.
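
A minimal sketch of a RESTful Flask endpoint like those described in this list, using an in-memory store; the /policies resource and its fields are hypothetical rather than the production data model.

    # Minimal sketch: RESTful resource exposed with Flask.
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    policies = {}  # in-memory stand-in for the real database

    @app.route("/policies", methods=["POST"])
    def create_policy():
        payload = request.get_json()
        policies[payload["id"]] = payload
        return jsonify(payload), 201

    @app.route("/policies/<policy_id>", methods=["GET"])
    def get_policy(policy_id):
        policy = policies.get(policy_id)
        if policy is None:
            return jsonify({"error": "not found"}), 404
        return jsonify(policy), 200

    if __name__ == "__main__":
        app.run(debug=True)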

Environment: Python 3.4, Django 1.8, Linux, HTML5, CSS, Bootstrap, IBM Sterling, MySQL, SQL, PLSQL, XML, Heroku, JavaScript, jQuery, JSP, JSON, Restful API, MVC architecture, AWS EC2, GitHub, Spec, Cucumber, Swagger.

Confidential

Java Developer

Responsibilities:

  • Involved in development of Java concepts like Collections, Exception Handling, Multi-Threading.
  • Worked on MVC pattern, using various frameworks.
  • Worked on WebSphere as application deployment servers.
  • Designed the Database, written triggers, and stored procedures.
  • Developed screens based on jQuery to dynamically generate HTML and display data on the client side.
  • Worked on JavaScript framework to augment browser-based applications with MVC capability.
  • Involved in development of various controller classes as a part of MVC framework.
  • Worked in referential data service module to interface with various databases using JDBC.
  • Implemented REST and SOAP based web services and published using JAX-WS.
  • Worked on Oracle Database to store and retrieve information using SQL Developer.
  • Worked in the Eclipse IDE for building and debugging, and deployed using Apache Tomcat.
  • Developed build scripts using Maven.

Environment: Java 1.6, 1.7, Java EE 6, JavaScript, jQuery, Struts, Eclipse, Tomcat, SQL Developer, Oracle Database 11g, JDBC, Template, WebSphere, SOAP UI, Maven, REST, Windows.
