
Python Developer Resume


Memphis, TN

SUMMARY

  • 5 years of IT experience developing and deploying projects with Python and related frameworks such as Django and Flask, integrating with MySQL, HTML, JavaScript, Node.js, Bash, and Linux. Used Python libraries like NumPy, Pandas, Requests, PySpark, Tkinter, urllib, and Pytest.
  • Solid knowledge of object-oriented programming concepts (OOP), the Standard Template Library, smart pointers, data structures, and design patterns.
  • Experience in database administration, development, design, maintenance, and production support of relational databases (RDBMS) and business applications; MySQL Server installation, upgrade, and migration.
  • Good experience with AWS services such as EC2, Lambda, DynamoDB, Boto3, Glue, Textract, VPC, CloudWatch, SNS, SageMaker, API Gateway, Athena, IAM, EMR, S3, SQS.
  • Experience in building cloud systems and microservices. Developed and tested RESTful APIs that work as middleware between clients and third-party APIs, using Python and Postman.
  • Solid experience in statistical programming languages like Python, SAS, Apache Spark, and MATLAB, plus RDBMS, and knowledge of Big Data technologies like Hadoop, Hive, and Pig.
  • Worked in fast-paced environments using Agile (Scrum) or Waterfall software development methodologies. Participated in daily scrum meetings and bi-weekly sprint meetings to review completed tasks and plan upcoming sprints; performed demos of completed tasks.
  • Hands on experience developing applications using microservices and data engineering with PySpark, especially on data frames, data processing techniques, performance improvement.
  • Proficient with SQL and scripted using Perl, Go, PowerShell.
  • Deployed applications using AWS DevOps tools (CodeBuild, CodeDeploy, CodePipeline) as well as GitHub, Maven, SVN, and Jenkins.
  • Extensively used Python and data science libraries NumPy, Pandas, SciPy, PySpark, Pytest, PyExcel, Boto3, embedPy, and Beautiful Soup.
  • Experience in working with NoSQL databases like DynamoDB, MongoDB, Redis, and HBase, and SQL databases such as MySQL.
  • Built ETL pipelines on batch and streaming data using PySpark and Spark SQL.
  • Worked with server-side technologies and databases, restful API and MVC design patterns.
  • Designed, developed and implemented new classes and objects in C++ using web services.
  • Experience with implementing web services using protocols such as SOAP, REST.
  • Hands-on experience developing Lambdas with Python for AWS infrastructure automation.
  • Designed the application's frontend using Python 2.7, 3.6, HTML, CSS, JSON, and jQuery.
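The Lambda-based infrastructure automation mentioned above can be sketched roughly as follows. The event shape and handler logic are illustrative assumptions, not code from any of the projects; the EC2 client is injectable so the logic can be exercised without an AWS account.

```python
def lambda_handler(event, context, ec2=None):
    """Stop the EC2 instances listed in the event (hypothetical event shape).

    `ec2` is injectable so the handler can be tested without AWS credentials;
    in a deployed Lambda it defaults to a real boto3 client.
    """
    if ec2 is None:
        import boto3  # only needed in the deployed Lambda
        ec2 = boto3.client("ec2")

    instance_ids = event.get("instance_ids", [])
    if instance_ids:
        # stop_instances is the standard boto3 EC2 call for this task
        ec2.stop_instances(InstanceIds=instance_ids)
    return {"stopped": instance_ids}
```

Wired to a CloudWatch schedule, a handler of this shape covers the routine off-hours automation described in the bullets.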

TECHNICAL SKILLS

Programming Languages: Python, C++, Java, JavaScript.

Python Libraries: NumPy, Pandas, Requests, PySpark, Tkinter, boto3.

Web Technologies: CSS, HTML, XML, JavaScript.

Frameworks: Django, Flask, Robot Framework; manual testing.

Databases: DynamoDB, MongoDB, Oracle, Redis, MySQL.

Reporting Tools: Tableau, PowerBI, SAS.

IDEs: PyCharm, PyDev, Jupyter, Visual Studio Code.

Testing Frameworks: unittest, Pytest, JUnit, TestNG, PyUnit, Robot Framework, JXL.

Bug Tracking: Siebel Help Desk, JIRA, ClearQuest.

Version Controls: GitHub, ClearCase, BitBucket.

PROFESSIONAL EXPERIENCE

Confidential

Python Developer

Responsibilities:

  • Designed and developed applications using Python with Django framework. Implemented code in Python to retrieve and manipulate data.
  • Knowledge and usage of open-source machine learning frameworks such as TensorFlow and other Python ML libraries.
  • Developed Spark applications in Python using PySpark in a distributed environment to load large numbers of CSV files with different schemas into Hive ORC tables.
  • Developed automation and processes to enable teams to deploy, manage, configure, scale, monitor applications using Python in AWS Cloud.
  • Developed the back-end web services using Python and the Flask REST framework with JWT.
  • Managed storage in the cloud using Elastic Block Store (EBS) and S3; created volumes and configured snapshots.
  • Integrated Apache Kafka with Elasticsearch using the Kafka Elasticsearch connector to stream all messages.
  • Experience working with GraphQL APIs via client-side JavaScript or server-side Node.js. Developed web applications using React.js, D3.js, and jQuery; used frameworks such as Bootstrap and Angular.
  • Utilized Apache Spark with Python to develop and execute Big Data Analytics and Machine learning applications, executed machine Learning use cases under Spark ML and MLlib.
  • Built ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake, writing SQL queries against Snowflake.
  • Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON-formatted data into Snowflake tables.
  • Created Spark Streaming jobs using Python to read messages from Kafka and download JSON files from AWS S3 buckets.
  • Wrote Python scripts integrating Boto3 to supplement automation provided by Ansible and Terraform, for tasks such as scheduling Lambda functions for routine AWS work.
  • Involved in developing Python microservices interconnected in the AWS cloud; consumed and built both SOAP and RESTful web services.
  • Worked on data cleaning to ensure data quality, consistency, integrity using Pandas, NumPy.
  • Responsible for User Management, Plugin Management and End-to-End automation of Build and Deployment process using Jenkins.
  • Performed SQL queries using AWS Athena to analyze data in AWS S3 buckets.
  • Developed automated regression scripts for validation of ETL processes between multiple databases such as AWS Redshift, Oracle, MongoDB, and SQL Server (T-SQL) using Python.
  • Generated graphical reports using the Python packages NumPy and Matplotlib.
  • Created data frames in a particular schema from raw data stored in Amazon S3 using PySpark and Lambda.
  • Designed and maintained databases using Python and developed a Python-based RESTful web service API using Flask, SQLAlchemy, and PostgreSQL.
  • Developed Python microservices with the Flask framework for Confidential & Confidential internal web applications.
  • Added support for Amazon AWS S3 and RDS to host static/media files and the database into Amazon Cloud.
  • Automated the build and deployment (CI/CD) of front-end applications, middleware, and database components using Jenkins.
  • Experience working with Chef/Puppet as configuration management tools to automate repetitive tasks, quickly deploy critical applications, and proactively manage change.
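The Pandas data-cleaning work described above (quality, consistency, integrity) typically follows a pattern like this minimal sketch; the raw feed, column names, and imputation choice are invented for illustration.

```python
import pandas as pd

# Hypothetical raw feed: duplicate rows, string-typed amounts, missing values.
raw = pd.DataFrame({
    "order_id": [101, 101, 102, 103],
    "amount":   ["19.99", "19.99", None, "7.50"],
    "region":   ["east", "east", "west", None],
})

clean = (
    raw.drop_duplicates(subset="order_id")  # integrity: one row per order
       # consistency: coerce amounts to numeric, invalid values become NaN
       .assign(amount=lambda d: pd.to_numeric(d["amount"], errors="coerce"))
       .fillna({"region": "unknown"})       # quality: no missing categories
)
# impute missing amounts with the (NumPy-backed) column mean
clean["amount"] = clean["amount"].fillna(clean["amount"].mean())
```

Each step is vectorized, so the same chain scales from a few rows to the large batch feeds the bullets describe.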

Environment: Python 3.6, 3.9, SQL, PowerShell, SOAP, REST, PySpark, PyCharm, AWS, Jira, Git, Jenkins, Pylint, PEP-8.

Confidential, Memphis, TN

Python Developer

Responsibilities:

  • Implemented business logic and worked on data exchange, processing XML and HTML using Python and the Django framework.
  • Wrote Python scripts to manage AWS resources from API calls using the Boto3 SDK and also worked with the AWS CLI.
  • Involved in development of web services using REST for sending and getting data from the external interface in XML and JSON formats.
  • Used the Pandas API to put the data into time series and tabular format for easy timestamp data manipulation and retrieval.
  • Involved in various phases of the project like Analysis, Design, Development, and Testing.
  • Developed Restful Microservices using Flask and deployed on AWS servers using EBS and EC2.
  • Used Python IDEs such as PyCharm and Sublime Text for developing code and performing unit tests and SIT.
  • Extensive experience in implementation of change request process and defect management process.
  • Proven strong experience with databases (NoSQL and RDBMS) and good SQL skills.
  • Integrated and managed services like MongoDB and MySQL servers with Azure Kubernetes Service.
  • Developed a Python project for handling data using OOP concepts, Pandas, NumPy, and Bitbucket.
  • Worked with Passport and JSON Web Tokens for authentication and authorization security configurations using Node.js.
  • Designing, implementing, and maintaining solutions for using Docker, Jenkins, Git, and Puppet for microservices and continuous deployment.
  • Designed and developed ETL processes in AWS Glue to migrate data from sources like S3 (ORC/Parquet/text files) into AWS Redshift.
  • Supported the development of services for Business users.
  • Configured auto-scalable and highly available microservices with monitoring and logging using AWS, Docker, and Jenkins.
  • Developed modules in Python that are used cross-environment on Windows.
  • Conducted ETL data integration, cleansing, and transformations using AWS Glue Spark scripts.
  • Performed day-to-day, after-hours, tactical, and strategic efforts associated with ServiceNow.
  • Created Proof-of-concept using responsive web design, Node.js, HTML and CSS.
  • Developed the back-end web services using Python REST framework.
  • Implementation exposure to service-based, SOAP, and RESTful technologies.
  • Wrote scripts in Python for extracting data from JSON and SIP text files.
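A JSON-extraction script of the kind described in the last bullet needs only the standard library; the payload shape and field names below are hypothetical stand-ins for the real feeds.

```python
import json

# Hypothetical payload shaped like the JSON feeds described above.
payload = '{"records": [{"id": 1, "status": "ok"}, {"id": 2, "status": "error"}]}'

def extract_ids(text, status="ok"):
    """Return the ids of records matching a given status."""
    data = json.loads(text)
    # .get with defaults keeps the script tolerant of missing keys
    return [r["id"] for r in data.get("records", []) if r.get("status") == status]
```

For example, `extract_ids(payload)` pulls the ids of healthy records, while `extract_ids(payload, "error")` isolates the failures.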

Environment: Python 2.7, 3.6, SOAP, REST, PyCharm, Azure, MongoDB, AWS, ECS, Jira, Git, Jenkins, Pylint.

Confidential, Phoenix, AZ

Python Developer

Responsibilities:

  • Designed architecture of real time processing microservices workflow considering the upstream and downstream system capabilities.
  • Created database tables, functions, and stored procedures, and wrote prepared statements using PL/SQL.
  • Developed microservices and created APIs with the Python Django framework, using Jenkins as a build tool and an enterprise-level database.
  • Coded and tested embedded software in C++ for train control system to add functions and fix defects.
  • Designed, built, and deployed applications using the AWS stack along with Docker and Kubernetes container orchestration, focusing on high availability, fault tolerance, and auto-scaling.
  • Designed, developed, and implemented ETL pipelines using the Python API (PySpark) of Apache Spark on AWS EMR.
  • Understanding of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Developed test scripts using Python and Robot framework for automation purposes.
  • Migrated a quality monitoring tool from AWS EC2 to AWS Lambda and built logical datasets to administer quality monitoring on data warehouses.
  • Developed and tested many features for dashboard using Python, CSS, JavaScript and jQuery.
  • Developed tools using Python, C++, shell scripting, and XML to automate menial tasks; ported existing Perl scripts to Python.
  • Designed and maintained databases using Python and developed Python based API (RESTful Web Service) using Flask, SQL Alchemy and PostgreSQL.
  • Built, enhanced, and optimized data pipelines using reusable frameworks to support data needs of the analytics and business teams using Spark and Kafka.
  • Used a Python library to write a fully functioning test automation process that allowed the simulation of embedded controllers.
  • Experienced with Unit Testing, Functional Testing and Regression Testing on the embedded software for Powertrain modules which includes Application layer and core Features.
  • Solid experience in UNIX developing applications and familiarity with its commands.
  • Optimized the PySpark jobs to run on Kubernetes Cluster for faster data processing.
  • Created and managed fully automated CI/CD pipelines for code deployment using Jenkins.
  • Designed applications that use the Redis caching database to access the backend database.
  • Developed software drivers for multiplexer devices and embedded controllers using C++.
  • Developed multi-threaded standalone app in Python and PHP to view performance.
  • Developed the back-end web services using Python and Django REST framework.
  • Used JIRA for requirements and test case management in the agile software methodology.
  • Experience with SQL/NoSQL databases to manage and analyze large data sets.
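The Redis-fronted database access described above is the classic cache-aside pattern; a rough sketch follows, with a plain dict standing in for the Redis client (with redis-py, the same reads and writes would map to `client.get(key)` and `client.set(key, value)`).

```python
def make_cached_lookup(backend):
    """Wrap a slow backend lookup (e.g. a database query) in a cache-aside layer.

    `cache` is an in-memory dict standing in for Redis; the surrounding
    logic is identical when a real Redis client replaces the dict ops.
    """
    cache = {}
    stats = {"backend_calls": 0}

    def lookup(key):
        if key in cache:            # cache hit: skip the backend entirely
            return cache[key]
        stats["backend_calls"] += 1
        value = backend(key)        # cache miss: fall through to the backend
        cache[key] = value          # populate the cache for the next caller
        return value

    lookup.stats = stats
    return lookup
```

Repeated lookups for the same key hit only the cache, which is exactly the load-shedding that putting Redis in front of the backend database buys.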

Environment: Python 3.4/2.7, Django, UNIX, Linux, C++, PowerShell, PySpark, MySQL, AJAX, SOAP, JQuery, JavaScript, Bootstrap, PyCharm, AWS (Lambda, DynamoDB, boto SDK, EC2, S3, RDS).
