Sr Python/api Developer Resume

Dearborn, MI

SUMMARY

  • Sr. Python Developer specializing in the design, development, testing, and implementation of stand-alone and client-server enterprise applications in Python, with strong experience in analytical programming using Python 3.6/3.4.6, Django 1.9/1.8, Flask, C++, XML, CSS3, HTML5, JavaScript, jQuery, AngularJS, and AWS in the cloud.
  • Experienced in developing Web Services with Python programming language - implementing JSON based RESTful and XML based SOAP webservices.
  • Experienced in NoSQL technologies like MongoDB, Cassandra, and DynamoDB, and relational databases like Oracle, SQLite, PostgreSQL, and MySQL.
  • Private Cloud Environment - Leveraging AWS and Puppet to rapidly provision internal computer systems for various clients.
  • Experienced in data processing tasks such as collecting, aggregating, and moving data from various sources using Apache Flume and Kafka.
  • Performed unit testing, integration testing, and generation of test cases for web applications using JUnit and the Python unit test framework.
  • Expertise in developing the presentation layer components using JSPs, JavaScript, Node.js, XML, CSS and HTML.
  • Good working experience in processing large datasets with Spark using Scala and PySpark; familiar with JSON-based REST web services.
  • Experienced in object-oriented Python, including hash tables (dictionaries), multithreading, exception handling, and collections, as well as Flask and MySQL.
  • Experience with Unit testing/ Test driven Development (TDD), Load Testing and worked on Celery Task queue and service broker using RabbitMQ.
  • Worked on standard python packages like boto and boto3 for AWS.
  • Good experience using various Python libraries to speed up development (Beautiful Soup, NumPy, SciPy, Matplotlib, Pandas DataFrames, MySQLdb for database connectivity, JSON libraries).
  • Good experience in Hadoop technologies like Apache Spark, Scala, and Spark SQL.
  • Experience with Amazon Web Services (AWS) cloud platform services such as EC2, Virtual Private Clouds (VPCs), storage models (EBS, S3, instance storage), Elastic Load Balancers (ELBs), RDS, Redshift, CloudFormation, CloudWatch, IAM, and Lambda.
  • Strong experience in DevOps Environment by enhancing Continuous Delivery and infrastructure change by using Chef, Ansible, Kubernetes and Docker to deploy code with GIT, Jenkins.
  • Strong Experience in Big data technologies including Apache Spark, Hadoop, HDFS, Hive, Cassandra & MongoDB.
  • Experienced in using Version Control Systems like GIT, SVN and CVS to keep the versions and configurations of the code organized.
  • Good hands-on experience with PySpark, using Spark libraries through Python scripting for data analysis.
  • Experienced in working with various Python IDEs, including PyCharm, PyScripter, Spyder, PyStudio, and PyDev.
  • Experienced in developing web-based applications using HTML/HTML5, DHTML, CSS/CSS3, JavaScript, AngularJS, AJAX, XML, and JSON.
  • Extensive experience in developing and maintaining build and deployment scripts for test, staging, and production environments using ANT, Maven, Shell, and Perl scripts.
  • Extensive experience with data modeling in Oracle, MS SQL Server, MySQL, and PostgreSQL, writing PL/SQL, triggers, and query optimizations, as well as in the NoSQL databases MongoDB and Cassandra.
  • Experience implementing server-side technologies with RESTful APIs and MVC design patterns using Node.js and the Flask framework.
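The object-oriented Python, dictionary, multithreading, and exception-handling experience listed above can be sketched with a small, self-contained stdlib example (the cache class and its names are illustrative, not from any specific project):

```python
import threading

class ThreadSafeCache:
    """Hash-table (dict) cache guarded by a lock for multithreaded access."""

    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def get_or_compute(self, key, compute):
        """Return the cached value for key, computing and storing it on a miss."""
        with self._lock:
            if key not in self._data:
                try:
                    self._data[key] = compute(key)
                except Exception as exc:
                    # failed computations are not cached; surface a clear error
                    raise RuntimeError(f"failed to compute {key!r}") from exc
            return self._data[key]

# Several worker threads populate the shared cache concurrently.
cache = ThreadSafeCache()
threads = [threading.Thread(target=cache.get_or_compute, args=(n, lambda k: k * k))
           for n in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Because every read and write goes through the lock, later lookups hit the cached value rather than recomputing.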

PROFESSIONAL EXPERIENCE

Confidential, Dearborn, MI

Sr Python/API Developer

Responsibilities:

  • Created APIs to consume data from REST endpoints, cleanse the data, and store it in the database.
  • Used Spark streaming APIs to perform transformations and actions on the fly, building the common learner data model that receives data from Kafka in near real time and persists it into Cassandra.
  • Developed ETL jobs as per the requirements to update data in the staging database (Postgres) from various data sources and REST APIs.
  • Designed and developed the UI of the website using HTML5, CSS3, JavaScript, AngularJS, and jQuery.
  • Wrote Python scripts integrating Boto3 to supplement automation provided by Ansible and Terraform for tasks such as encrypting EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks.
  • Used AWS Glue Studio to visually create, run, and monitor AWS Glue ETL jobs.
  • Containerized and deployed the ETL and REST services on AWS ECS through the CI/CD Jenkins pipeline.
  • Used Python for end-to-end document processing with Spark Streaming, Kafka, an RPC framework, and AWS.
  • Implemented a Continuous Delivery pipeline with Docker, Jenkins, GitHub, and AWS AMI's, also worked on creating the Docker containers and Docker consoles for managing the application life cycle.
  • Used JavaScript and JSON to update a portion of a webpage. Developed Python APIs to dump the array structures in the processor at the failure point for debugging.
  • Worked with AWS Lambda, which manages the infrastructure needed to run code on highly available, fault-tolerant systems, freeing the team to focus on building differentiated back-end services.
  • Wrote Lambda functions that aggregate data from incoming events and store the results in Amazon DynamoDB.
  • Managed Docker containers through pods and performed load balancing between pods through Kubernetes.
  • Used Kubernetes to deploy, scale, and load-balance applications, and worked with Docker Engine, Docker Hub, and Docker images.
  • Used NumPy along with Pandas for computing min, max and mean broadcasting signal levels. Developed server-based web traffic statistical analysis tool using Django, Pandas.
  • Worked on building the base infrastructure of our application instances on the cloud using Terraform & configuring s3 buckets, RDS, and EC2 instances using these templates.
  • Wrote validation rules for datasets per the given requirements using PySpark in the pipeline environment. Worked on conversion of use cases from SQL to Python using PySpark.
  • Worked on MongoDB database concepts such as locking, transactions, indexes, sharding, replication, and schema design.
  • Built a new CI pipeline and performed testing and deployment automation with Docker, Jenkins, and Puppet.
  • Wrote and executed various MySQL database queries from Python using the Python-MySQL connector and the MySQLdb package.
  • Utilized Kubernetes and Docker as the runtime environment of the CI/CD system to build, test, and deploy applications.
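The Lambda-based aggregation of incoming events into DynamoDB described above might look roughly like this sketch (the table name, record fields, and handler shape are assumptions, not details from the original project):

```python
from collections import defaultdict

def aggregate(records):
    """Sum the 'amount' field per 'source' across incoming event records."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["source"]] += rec.get("amount", 0)
    return dict(totals)

def handler(event, context):
    """Lambda entry point: aggregate the batch and persist totals to DynamoDB."""
    import boto3  # imported here so the pure aggregation above stays testable offline
    totals = aggregate(event.get("Records", []))
    table = boto3.resource("dynamodb").Table("event-aggregates")  # assumed table name
    for source, total in totals.items():
        table.put_item(Item={"source": source, "total": str(total)})
    return totals
```

Keeping the aggregation logic separate from the `boto3` call makes the handler easy to unit-test without AWS access.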

Confidential, Houston, TX

Python AWS Developer

Responsibilities:

  • Designed and managed API system deployment using a fast HTTP server and Amazon AWS architecture, and set up databases in AWS using RDS, configuring backups to an S3 bucket.
  • Designed RESTful web services using Flask, with an emphasis on improved security for the service using Flask-HTTPAuth over HTTPS.
  • Created a RESTful API in Node.js that communicates with a Clojure server over a protocol, using Backbone to generate templates.
  • Set up Elastic Load Balancers (ELBs) and Auto Scaling groups on production EC2 instances to build fault-tolerant, highly available applications.
  • Used the AWS Glue Data Catalog to quickly discover and search across multiple AWS datasets without moving the data.
  • Designed and maintained databases using Python and developed a Python-based RESTful API using Flask, SQLAlchemy, and PostgreSQL.
  • Wrote automation script to download/upload the data files from local machine to S3 and S3 to local machine using Python AWS API Boto.
  • Utilized AWS Lambda platform to upload data into AWS S3 buckets and to trigger other Lambda functions.
  • Worked on automated email notifications using Celery and RabbitMQ to send job statuses and pending-task lists to users and admins.
  • Developed Splunk queries and dashboards for debugging the logs generated by the ETL and REST services.
  • Used GitHub repository to submit code changes & for source code version control and deploying in AWS Lambda.
  • Used the Pandas API to put data into time-series and tabular formats for easy timestamp data manipulation and retrieval.
  • Used Python and Flask for creating graphics, XML processing, data exchange, and business logic implementation, and utilized PyUnit, the Python unit test framework, for all Python applications.
  • Used Apache Kafka (Message Queues) for reliable and asynchronous exchange of important information between multiple business applications.
  • Analyzed SQL scripts and designed solutions to implement them using PySpark. Also created custom new columns depending on the use case while ingesting data into the Hadoop data lake using PySpark.
  • Created S3 buckets, managed bucket policies, and used S3 for backup on AWS. Designed code to forward logs from AWS CloudWatch to Kibana.
  • Developed tools using Python, shell scripting, and XML to automate menial tasks. Interfaced with supervisors, artists, systems administrators, and production to ensure production deadlines were met.
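The PyUnit testing mentioned above follows the standard `unittest` pattern; a minimal sketch, assuming a hypothetical helper under test:

```python
import unittest

def normalize_status(raw):
    """Hypothetical helper: map raw job-status strings to canonical values."""
    return raw.strip().lower().replace(" ", "_")

class NormalizeStatusTest(unittest.TestCase):
    """PyUnit test case covering the helper's normalization rules."""

    def test_strips_and_lowercases(self):
        self.assertEqual(normalize_status("  Pending Review "), "pending_review")

    def test_case_insensitive(self):
        self.assertEqual(normalize_status("DONE"), normalize_status("done"))
```

Such cases are typically discovered and run with `python -m unittest`.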

Confidential, Irving, TX

Python Developer

Responsibilities:

  • Developed RESTful Web Services for getting and sending data from the external interface in the JSON format.
  • Created Jenkins build and deployment pipeline jobs to deploy the docker images into AWS ECR repositories and integrated with GITHUB.
  • Used AWS Lambda to create new back-end services for applications, triggered on demand via the Lambda API or custom API endpoints built with Amazon API Gateway.
  • Managed datasets using Pandas DataFrames and MySQL; queried the MySQL database from Python using the Python-MySQL connector and the MySQLdb package to retrieve information.
  • Implemented continuous integration using Jenkins and involved in the deployment of application with Ansible automation engine.
  • Implemented and modified various SQL queries and Functions, Cursors and Triggers as per the client requirements.
  • Cleaned and processed third-party spending data into maneuverable deliverables in the required format using Excel macros and Python libraries such as NumPy, Boto3, Pandas, and Matplotlib.
  • Used GitHub for Python source code version control and Jenkins for automating Docker container builds and deploying to Mesos.
  • Used PyUnit, the Python unit test framework, for all Python applications, and worked with the Jenkins continuous integration tool for project deployment.
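The MySQL query work described above follows Python's DB-API pattern; this self-contained sketch uses the stdlib `sqlite3` module in place of the MySQL connector, and its table and column names are assumptions:

```python
import sqlite3

# In-memory database stands in for the MySQL instance used in the project.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE spend (vendor TEXT, amount REAL)")

# Parameterized inserts, exactly as one would issue via a MySQL connector.
cur.executemany("INSERT INTO spend VALUES (?, ?)",
                [("acme", 120.0), ("acme", 30.0), ("globex", 55.5)])
conn.commit()

# Aggregate query retrieving per-vendor spending totals.
cur.execute("SELECT vendor, SUM(amount) FROM spend GROUP BY vendor ORDER BY vendor")
totals = dict(cur.fetchall())
```

With `mysql-connector-python` only the `connect()` call and the `%s` placeholder style differ; the cursor workflow is the same.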

Confidential, San Jose, CA

Software Developer

Responsibilities:

  • Designed and developed an application on the Django framework using the MVC design pattern. Developed REST APIs using Django to fetch data from a MongoDB database.
  • Used the AngularJS framework for single-page applications and for validating client-side user information against business rules.
  • Designed and managed API system deployment using a fast HTTP server and Amazon AWS architecture.
  • Responsible for configuring Kafka consumer and producer metrics to visualize and monitor Kafka system performance.
  • Used Django configuration to manage URLs and application parameters. Responsible for gathering requirements, system analysis, design, development, testing and deployment.
  • Implemented RESTful Web-Services for sending and receiving the data between multiple systems.
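The JSON-based exchange used by the RESTful services above can be sketched with the stdlib `json` module (the envelope field names here are illustrative assumptions):

```python
import json

def build_response(records):
    """Serialize records into the JSON envelope returned to external interfaces."""
    return json.dumps({"count": len(records), "results": records})

def parse_request(body):
    """Decode an incoming JSON payload and validate its expected shape."""
    payload = json.loads(body)
    if "results" not in payload:
        raise ValueError("missing 'results' field")
    return payload["results"]

# Round-trip: what one service sends, the other can parse back unchanged.
roundtrip = parse_request(build_response([{"id": 1}, {"id": 2}]))
```

The same serialize/validate/deserialize shape applies whether the transport is Django, Flask, or a raw HTTP client.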
