Python Developer Resume

Charlotte, NC

SUMMARY

  • 7+ years of IT experience developing and deploying projects with Python and related frameworks such as Django and Flask, integrating with MySQL, HTML, JavaScript, Node.js, Bash, and Linux. Used Python libraries such as NumPy, Pandas, Requests, PySpark, Tkinter, urllib, and Pytest.
  • Experience in building cloud systems and microservices. Developed and tested RESTful APIs that act as middleware between clients and third-party APIs, using Python and Postman.
  • Solid knowledge of object-oriented programming concepts (OOP), the Standard Template Library, smart pointers, data structures, and design patterns.
  • Good experience with AWS services such as EC2, Lambda, DynamoDB, Boto3, Glue, Textract, VPC, CloudWatch, SNS, SageMaker, API Gateway, Athena, IAM, EMR, S3, SQS.
  • Solid experience with statistical programming in Python, SAS, and MATLAB, plus Apache Spark and RDBMSs, and knowledge of Big Data technologies such as Hadoop, Hive, and Pig.
  • Hands-on experience developing applications using microservices and data engineering with PySpark, particularly data frames, data-processing techniques, and performance tuning.
  • Hands-on experience using software development tools such as Debuggers and Compilers.
  • Proficient with SQL; scripting in Perl, Go, and PowerShell.
  • Hands-on experience in Azure development; worked on Azure web applications, App Services, Azure Storage, and Azure SQL Database.
  • Extensively used Python and data science libraries: NumPy, Pandas, SciPy, PySpark, Pytest, PyExcel, Boto3, embedPy, and Beautiful Soup.
  • Experience working with NoSQL databases such as DynamoDB, MongoDB, Redis, and HBase, and SQL databases such as MySQL.
  • Deployed applications with AWS DevOps tools (CodeBuild, CodeDeploy, CodePipeline) as well as GitHub, Maven, SVN, and Jenkins.
  • Worked in fast-paced environments using Agile (Scrum) and Waterfall software development methodologies. Participated in daily scrum meetings and bi-weekly sprint meetings to review completed tasks and plan upcoming sprints, and performed demos for completed work.
  • Built ETL pipelines over batch and streaming data using PySpark and Spark SQL.
  • Worked with server-side technologies and databases, RESTful APIs, and MVC design patterns.
  • Designed, developed and implemented new classes and objects in C++ using web services.
  • Experience in database administration, development, design, maintenance, and production support of relational databases (RDBMS) and business applications, including MySQL Server installation, upgrade, and migration.
  • Experience with implementing web services using protocols such as SOAP, REST.
  • Hands-on experience developing Lambdas with Python for AWS infrastructure automation.
  • Designed the application's frontend using Python 2.7, 3.6, HTML, CSS, JSON, and jQuery.
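The Lambda-based AWS infrastructure automation mentioned above typically starts from an event handler. A minimal, hedged sketch (the event shape follows the standard S3 notification format; the handler name and response body are illustrative, and a real handler would go on to call Boto3 to act on each object):

```python
import json

def lambda_handler(event, context):
    """Collect (bucket, key) pairs from an S3 notification event."""
    records = event.get("Records", [])
    objects = [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in records
        if r.get("eventSource") == "aws:s3"
    ]
    # Return an API-Gateway-style response with the objects found.
    return {"statusCode": 200, "body": json.dumps({"objects": objects})}
```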

TECHNICAL SKILLS

Programming Languages: Python, C++, Java, JavaScript.

Python Libraries: NumPy, Pandas, Requests, PySpark, Tkinter, boto3.

Web Technologies: CSS, HTML, XML, JavaScript.

Frameworks: Django, Flask, Robot Framework.

Databases: DynamoDB, MongoDB, Oracle, Redis, MySQL.

Reporting Tools: Tableau, PowerBI, SAS.

IDEs: PyCharm, PyDev, Jupyter, Visual Studio Code.

Testing Frameworks: unittest, Pytest, JUnit, TestNG, PyUnit, Robot, JXL.

Bug Tracking: Siebel Help Desk, JIRA, ClearQuest.

Version Controls: GitHub, ClearCase, BitBucket.

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

Python Developer

Responsibilities:

  • Designed and developed applications using Python with Django framework. Implemented code in Python to retrieve and manipulate data.
  • Used AWS services such as EC2, Lambda, DynamoDB, Boto3, Glue, VPC, CloudWatch, SNS, SageMaker, API Gateway, Athena, IAM, EMR, S3, and SQS.
  • Developed automation and processes to enable teams to deploy, manage, configure, scale, monitor applications using Python in AWS Cloud.
  • Developed back-end web services using Python and the Flask REST framework with JWT authentication.
  • Managed cloud storage using Elastic Block Storage (EBS) and S3; created volumes and configured snapshots.
  • Developed Spark applications in Python using PySpark on a distributed environment to load large numbers of CSV files with differing schemas into Hive ORC tables.
  • Integrated Apache Kafka with Elasticsearch using the Kafka Elasticsearch connector to stream all messages.
  • Deployed applications with AWS DevOps tools (CodeBuild, CodeDeploy, CodePipeline) as well as GitHub, Maven, SVN, and Jenkins.
  • Experience working with GraphQL APIs from client-side JavaScript and server-side Node.js. Developed web applications using React.js, D3.js, and jQuery; used frameworks such as Bootstrap and Angular.
  • Built ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake, writing SQL queries against Snowflake.
  • Created Spark Streaming jobs using Python to read messages from Kafka & download JSON files from AWS S3 buckets.
  • Wrote Python scripts integrating Boto3 to supplement automation provided by Ansible and Terraform, for tasks such as scheduling Lambda functions for routine AWS jobs.
  • Involved in developing Python microservices interconnected in the AWS cloud, and in consuming and building both SOAP and RESTful web services.
  • Worked on data cleaning to ensure data quality, consistency, integrity using Pandas, NumPy.
  • Responsible for User Management, Plugin Management and End-to-End automation of Build and Deployment process using Jenkins.
  • Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON-formatted data into Snowflake tables.
  • Performed SQL queries using AWS Athena to analyse data in AWS S3 buckets.
  • Developed automated regression scripts for validating ETL processes between multiple databases such as AWS Redshift, Oracle, MongoDB, and SQL Server (T-SQL) using Python.
  • Generated graphical reports using the Python packages NumPy and Matplotlib.
  • Knowledge and usage of open-source machine learning frameworks such as TensorFlow and other Python ML libraries.
  • Utilized Apache Spark with Python to develop and execute Big Data analytics and machine learning applications; executed machine learning use cases with Spark ML and MLlib.
  • Created data frames in a defined schema from raw data stored in Amazon S3, using PySpark in Lambda.
  • Designed and maintained databases using Python and developed a Python-based RESTful API (web service) using Flask, SQLAlchemy, and PostgreSQL.
  • Developed Python microservices with the Flask framework for Confidential & Confidential internal web applications.
  • Automated the build and deployment (CI/CD) of front-end applications, middleware, and database components using Jenkins.
  • Added support for Amazon AWS S3 and RDS to host static/media files and the database into Amazon Cloud.
  • Experience working with Chef/Puppet as configuration management tools to automate repetitive tasks, quickly deploy critical applications, and proactively manage change.
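The JWT-secured Flask services above rest on token signing and verification. As a minimal, hedged sketch of that mechanism (stdlib only, HS256, no expiry claims; function names and the secret are illustrative — a real service would typically use a library such as PyJWT inside the Flask request layer):

```python
import base64
import hashlib
import hmac
import json

def _b64(data: bytes) -> str:
    # JWTs use URL-safe base64 with padding stripped.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: str) -> str:
    """Build a header.payload.signature token signed with HMAC-SHA256."""
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = _b64(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: str) -> bool:
    """Recompute the signature and compare in constant time."""
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = _b64(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)
```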

Environment: Python 3.6, 3.9, SQL, PowerShell, SOAP, REST, PySpark, PyCharm, AWS, Jira, Git, Jenkins, Pylint, PEP-8.

Confidential, Charlotte, NC

Python Developer

Responsibilities:

  • Develop automation and processes to enable teams to deploy, manage, configure, scale, monitor applications in Data Centres and in AWS Cloud.
  • Managed storage in AWS using Elastic Block Storage, S3, created Volumes and configured Snapshots.
  • Experience creating S3 buckets and managing their policies; utilized S3 buckets and Glacier for storage, backup, and archival in AWS.
  • Experience setting up and maintaining auto-scaling AWS stacks.
  • Strong experience in self-healing server infrastructure development on the AWS cloud, with extensive usage of AWS EC2, VPC, CLI, and S3.
  • Developed applications on Windows and deployed them to Linux servers.
  • Utilized Active Record eager loading to improve rendering time of index pages, incorporated up/down voting, reviewing, and several custom sorting methods for shows to provide smooth user experience.
  • Supported the development of BI portal with development in SQL.
  • Worked with AWS Lambda functions to automatically load data into S3 buckets and transfer it to DynamoDB.
  • Used the AWS Boto3 module to handle functions and events related to AWS.
  • Created and maintained user accounts, profiles, network security and security groups, using AWS-IAM.
  • Experience using AWS CloudWatch; created alerts for instances.
  • Created AWS Lambda functions to generate different events and permissions and maintain deployment packages in Dev, QA, and PROD using the JSON module.
  • Extensive experience implementing change request and defect management processes.
  • Experience developing and testing dynamic, database-driven applications with web interfaces.
  • Hands-on knowledge of system integration, functional, regression, and end-to-end testing.
  • Developed back-end web services using Python and the Flask REST framework with JWT authentication.
  • Responsible for User Management, Plugin Management and End-to-End automation of Build and Deployment process using Jenkins.
  • Used Boto3's 'client' and 'resource' interfaces to dynamically generate classes driven by JSON models that describe AWS APIs.
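Maintaining deployment packages across Dev, QA, and PROD with the JSON module amounts to selecting per-environment settings from a JSON document. A minimal sketch, with entirely hypothetical bucket names and keys:

```python
import json

# Hypothetical per-environment configuration a deployment Lambda might
# read (in practice this would come from S3 or a packaged file).
CONFIG_JSON = """
{
  "Dev":  {"bucket": "app-artifacts-dev",  "memory_mb": 128},
  "QA":   {"bucket": "app-artifacts-qa",   "memory_mb": 256},
  "PROD": {"bucket": "app-artifacts-prod", "memory_mb": 512}
}
"""

def settings_for(env: str) -> dict:
    """Return the settings block for one environment, failing loudly otherwise."""
    config = json.loads(CONFIG_JSON)
    if env not in config:
        raise KeyError(f"unknown environment: {env}")
    return config[env]
```

Failing loudly on an unknown environment name keeps a mistyped stage from silently deploying with another environment's settings.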

Environment: Python 2.7, 3.6, SOAP, REST, PyCharm, Azure, MongoDB, AWS, ECS, Jira, Git, Jenkins, Pylint.

Confidential

Python Developer

Responsibilities:

  • Designed architecture of real time processing microservices workflow considering the upstream and downstream system capabilities.
  • Developed microservices and created APIs with the Python Django framework, using Jenkins as a build tool and an enterprise-level database.
  • Developed tools using Python, C++, shell scripting, and XML to automate menial tasks, and ported existing Perl scripts to Python.
  • Designed, developed, and implemented ETL pipelines using the Python API (PySpark) of Apache Spark on AWS EMR.
  • Developed test scripts using Python and Robot framework for automation purposes.
  • Migrated quality monitoring tool from AWS EC2 to AWS lambda and built logical datasets to administer quality monitoring on data warehouses.
  • Solid experience developing applications on UNIX and familiarity with its commands.
  • Experienced with unit testing, functional testing, and regression testing of embedded software for powertrain modules, covering the application layer and core features.
  • Understanding of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Created database tables, functions, and stored procedures, and wrote prepared statements using PL/SQL.
  • Coded and tested embedded software in C++ for train control system to add functions and fix defects.
  • Optimized the PySpark jobs to run on Kubernetes Cluster for faster data processing.
  • Designed, built, and deployed applications using the AWS stack along with Docker and Kubernetes container orchestration, focusing on high availability, fault tolerance, and auto-scaling.
  • Designed applications that use the Redis caching database in front of the backend database.
  • Built, enhanced, and optimized data pipelines using reusable frameworks to support the data needs of the analytics and business teams, using Spark and Kafka.
  • Developed software drivers for multiplexer devices and embedded controllers using C++.
  • Developed multi-threaded standalone app in Python and PHP to view performance.
  • Developed the back-end web services using Python and Django REST framework.
  • Designed and maintained databases using Python and developed a Python-based RESTful API (web service) using Flask, SQLAlchemy, and PostgreSQL.
  • Used Python libraries to write a fully functioning test automation process that allowed simulation of embedded controllers.
  • Developed and tested many features for dashboard using Python, CSS, JavaScript and jQuery.
  • Used JIRA for requirements and test case management in the agile software methodology.
  • Experience with SQL/NoSQL databases to manage and analyze large data sets.
  • Created and managed fully automated CI/CD pipelines for code deployment using Jenkins.
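The multi-threaded standalone apps above follow a common fan-out pattern: submit independent tasks to a thread pool and collect results. A minimal sketch (the worker function and its inputs are illustrative stand-ins for real I/O-bound work such as performance probes):

```python
from concurrent.futures import ThreadPoolExecutor

def check_endpoint(name: str) -> tuple:
    # Stand-in for I/O-bound work, e.g. an HTTP health or latency probe.
    return (name, "ok")

def run_checks(names):
    """Run all checks concurrently and return {name: status}."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        # map() preserves input order; the pool is shut down on exit.
        return dict(pool.map(check_endpoint, names))
```

Threads suit this kind of workload because the Python GIL is released during blocking I/O, so probes overlap even though only one thread executes bytecode at a time.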

Environment: Python 3.4, Django, UNIX, Linux, C++, PowerShell, PySpark, MySQL, AJAX, SOAP, jQuery, JavaScript, Bootstrap, PyCharm, AWS (Lambda, DynamoDB, boto SDK, EC2, S3, RDS).
