
Python Developer Resume

Pittsburgh, PA

SUMMARY

  • Around 6 years of experience in application development and design using Python.
  • Expertise in Object Oriented Programming (OOP) concepts, Object Oriented Design (OOD), Object Oriented Analysis (OOA), and programming.
  • Experience in object-oriented design and programming concepts in Python, OpenStack, AWS, PySpark, and Scala.
  • In-depth experience in Amazon Web Services (AWS) including EC2, VPC, Identity and Access Management (IAM), EC2 Container Service, Elastic Beanstalk, Lambda, S3, CloudFront, Glacier, RDS, DynamoDB, ElastiCache, Redshift, Direct Connect, Route 53, CloudWatch, CloudFormation, CloudTrail, OpsWorks, Amazon Elastic MapReduce (EMR), AWS IoT, SNS, API Gateway, SES, SQS, WorkSpaces, WorkDocs, and EFS.
  • Created functions and assigned roles in AWS Lambda to run Python scripts, and AWS Lambda functions in Java to perform event-driven processing.
  • Worked with MVC/MVW frameworks and front-end technologies such as Django, AngularJS, HTML, CSS, XML, JavaScript, jQuery, JSON, and Node.js.
  • Extensive experience in Java/J2EE technologies such as Core Java, Servlets, JSP, JSTL, JDBC, Hibernate, Spring, Struts, Web Services, JMS, multithreading, MVC architecture, and design patterns.
  • Designed, developed, and tested an ODG machine learning image classification project using ODG image big data, image preprocessing, NumPy, TensorFlow, Conda, CNN/convolutional models, Pandas, and hidden layers, performing image classification in collaboration with ODG scientists.
  • Strong programming skills in designing and implementing multi-tier applications using web-based technologies like Spring MVC and Spring Boot.
  • Performed Java web application development using J2EE and NetBeans.
  • Experience in server infrastructure development on API Gateway, ELB, Auto Scaling, DynamoDB, Elasticsearch, Virtual Private Cloud (VPC), Kinesis, CloudWatch, and ECS.
  • Strong experience and knowledge of real-time data analytics using Spark Streaming, Kafka, and Flume.
  • Good experience in writing Spark applications using Python and Scala.
  • Knowledge of automation using Selenium, JBehave (BDD), and Test Driven Development (TDD); involved in unit testing and sanity testing.
  • Expertise with different tools in the Hadoop environment including Pig, Hive, HDFS, MapReduce, Sqoop, Spark, Kafka, YARN, Oozie, and Zookeeper.
  • Knowledge of integrating different ecosystems such as Kafka - Spark - HDFS.
  • Implemented pre-defined operators in Spark such as map, flatMap, filter, reduceByKey, groupByKey, aggregateByKey, and combineByKey (see the PySpark sketch after this list).
  • Experienced in Agile methodologies, Scrum stories, and sprints in a Python and Scala based environment, along with data analytics, data wrangling, and Excel data extracts.
  • Vast experience with Python using most of its advanced features, including ORM, Django, Flask, Pyramid, and Tornado.
  • Good experience using various Python libraries to speed up development (libraries used: Beautiful Soup, NumPy, SciPy, Matplotlib, python-twitter, Pandas DataFrames, network, urllib2, MySQLdb for database connectivity, and JSON libraries).
  • Used Scala sbt to develop Scala-coded Spark projects and executed them using spark-submit.
  • Expertise in integrated development environments for Python and Scala such as Spyder, Atom, PyCharm, IDLE, and Anaconda.
  • Experience in developing and implementing web services using REST, SOAP, and WSDL, working with OpenStack, and developing Slack applications.
  • Test-driven programmer with thorough knowledge of unit testing with JUnit and Mockito, using SoapUI and Postman for web service testing, performance testing with JMeter, and automated testing with Test Driven Development (TDD) in an Extreme Programming model.
  • Proficient in developing websites and web applications using PHP, MySQL, AWS, Flask, Jinja, Redis, HTML, XML, JSON, CSS, JavaScript, and AJAX.
  • Deep understanding of microservice-based architecture.
  • Hands-on experience with UML-compliant high-level design, including data flow diagrams, class diagrams, sequence diagrams, activity diagrams, and use cases, with documentation for peer developers.
  • Developed and designed an API (RESTful web service) for the company's website.
  • Maintained customer relationship management databases (MySQL/PostgreSQL).
  • Developed a server-based web traffic statistical analysis tool using Flask and Pandas.
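
A minimal PySpark sketch, for illustration only, of the RDD operators named in the summary above (map, flatMap, filter, reduceByKey, groupByKey, aggregateByKey, combineByKey); the sample data and variable names are hypothetical.

    # Illustrative PySpark sketch of the RDD operators listed above; data and names are hypothetical.
    from pyspark import SparkContext

    sc = SparkContext(appName="operator-demo")

    lines = sc.parallelize(["a b a", "b c"])

    words = lines.flatMap(lambda line: line.split())           # flatMap: one line -> many words
    pairs = words.map(lambda w: (w, 1))                        # map: word -> (word, 1)
    long_words = words.filter(lambda w: len(w) > 1)            # filter: keep words longer than 1 char

    counts = pairs.reduceByKey(lambda a, b: a + b)             # reduceByKey: sum counts per word
    grouped = pairs.groupByKey().mapValues(list)               # groupByKey: collect values per key
    totals = pairs.aggregateByKey(0, lambda acc, v: acc + v,   # aggregateByKey: per-partition seqOp,
                                  lambda a, b: a + b)          # then cross-partition combine
    combined = pairs.combineByKey(lambda v: v,                 # combineByKey: create combiner,
                                  lambda acc, v: acc + v,      # merge a value into it,
                                  lambda a, b: a + b)          # and merge combiners

    print(counts.collect())
    sc.stop()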

TECHNICAL SKILLS

  • Python
  • PySpark
  • Django
  • Amazon Web Services (AWS)
  • Lambda
  • EMR
  • ETL
  • Java
  • Spring Boot
  • Kafka
  • Docker
  • MySQL
  • MongoDB
  • Apache Cassandra
  • PostgreSQL
  • Kubernetes
  • ReactJS
  • VueJS
  • NodeJS
  • Bootstrap
  • Ext JS
  • Ajax
  • jQuery
  • Spring
  • Hibernate
  • OpenStack
  • JDBC
  • C
  • C++

PROFESSIONAL EXPERIENCE

Python Developer

Confidential - Pittsburg, PA

Responsibilities:

  • Developed entire frontend and backend modules using Python with the Django and Flask web frameworks.
  • Worked on designing, coding, and developing the application in Python using Django MVC.
  • Experience in working with Python ORM libraries including Django ORM and SQLAlchemy.
  • Wrote and executed various MySQL database queries from Python using the Python MySQL connector and the MySQLdb package.
  • Automated setting up server infrastructure for the DevOps services using Ansible, shell, and Python scripts.
  • Developed Hadoop ETL solutions to move data to the data lake using big data tools such as Sqoop, Hive, Spark, HDFS, and Talend.
  • Utilized PyUnit, the Python unit test framework, for all Python applications.
  • Worked with Terraform to create AWS components like EC2, IAM, VPC, ELB, and security groups.
  • Utilized Python libraries such as Boto3 and NumPy for AWS work.
  • Used the AWS CLI to suspend AWS Lambda functions and to automate backups of ephemeral data stores to S3 buckets and EBS (see the Boto3 sketch after this list).
  • Used Amazon EMR for MapReduce jobs and tested locally using Jenkins.
  • Migrated an existing on-premises application to AWS. Used AWS services like EC2 and S3 for small data set processing and storage; maintained the Hadoop cluster on AWS EMR.
  • Experience in administering, deploying, and managing RedHat, Ubuntu, and CentOS servers.
  • Wrote Python and Scala scripts to parse XML documents and load the data into a database.
  • Used Python and Django to interface with the jQuery UI and manage the storage and deletion of content.
  • Worked with several Python packages such as NumPy, SciPy, and TensorFlow.
  • Proficient in developing Web Services (SOAP, RESTful) in Python using XML, JSON.
  • Implemented AWS solutions using EC2, S3, DynamoDB, EBS, Elastic Load Balancer, and Auto Scaling groups.
  • Implemented REST APIs using the Python Django and Flask frameworks; implemented various microservices using the Flask framework (see the Flask sketch after this list).
  • Used Pandas for data alignment and data manipulation.
  • Managed datasets using Pandas DataFrames and MySQL; queried the MySQL database from Python and used ETL/ELT tools such as SSIS.
  • Virtualized servers using Docker for test and development environment needs, and automated configuration using Docker containers.
  • Experience in creating Docker containers leveraging existing Red Hat Linux containers and AMIs in addition to creating Docker containers from scratch.
  • Utilized Kubernetes and Docker for the runtime environment of the CI/CD system to build, test, and deploy.
  • Used Jenkins pipelines to drive all microservice builds out to the Docker registry and then deployed them to Kubernetes; created and managed pods using Kubernetes.
  • Used Spark and Spark SQL to read the Parquet data and create the tables in Hive using the Scala API.
  • Created Oracle database tables, stored procedures, sequences, triggers, and views.
  • Worked on the MySQL migration project to make the system completely independent of the database being used; used Spring with iBATIS to implement this.
  • Developed web-based applications using Python, Django, Flask, XML, OpenStack, ReactJS, and jQuery.
  • Worked with AWS security, Route 53, Direct Connect, IAM, CloudFormation, AWS OpsWorks (operations automation), Elastic Beanstalk, S3, Glacier, and CloudWatch monitoring and management.
  • Worked with the search business and search team to implement dynamic rule updates to search using Elasticsearch.
  • Analyzed log data, filtered required columns through Logstash configuration, and sent the results to Elasticsearch.
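
A minimal sketch, for illustration only, of a Flask REST endpoint of the kind described in the REST API/microservices bullet above; the routes, resource name, and fields are hypothetical, not the actual application code.

    # Minimal Flask REST API sketch; endpoints and fields are illustrative only.
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    items = {}  # in-memory store standing in for a real database

    @app.route("/items/<int:item_id>", methods=["GET"])
    def get_item(item_id):
        item = items.get(item_id)
        if item is None:
            return jsonify({"error": "not found"}), 404
        return jsonify(item)

    @app.route("/items", methods=["POST"])
    def create_item():
        payload = request.get_json(force=True)
        item_id = len(items) + 1
        items[item_id] = {"id": item_id, "name": payload.get("name")}
        return jsonify(items[item_id]), 201

    if __name__ == "__main__":
        app.run(port=5000)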
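
A small Boto3 sketch of the kind of S3 backup automation described in the AWS CLI/Boto3 bullets above; the bucket name, paths, and helper function are hypothetical.

    # Illustrative Boto3 sketch for backing up local files to S3; bucket and paths are hypothetical.
    import os
    import boto3

    s3 = boto3.client("s3")

    def backup_directory(local_dir, bucket, prefix):
        """Upload every file under local_dir to s3://bucket/prefix/..."""
        for root, _dirs, files in os.walk(local_dir):
            for name in files:
                path = os.path.join(root, name)
                key = os.path.join(prefix, os.path.relpath(path, local_dir))
                s3.upload_file(path, bucket, key)

    backup_directory("/var/tmp/ephemeral", "example-backup-bucket", "nightly")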

Python Developer

Confidential - Omaha, NE

Responsibilities:

  • Developed web applications, RESTful web services, and APIs using Python, Django, and PHP.
  • Experience with Django, a high-level Python web framework.
  • Installed, configured, monitored, and maintained a Hadoop cluster on a big data platform.
  • Automated JIRA processes using Python and bash scripts.
  • Designed and developed the UI for the website with HTML, XHTML, CSS, JavaScript, and AJAX.
  • Wrote Python routines to log into websites and fetch data for selected options.
  • Automated AWS S3 data upload/download using Python scripts.
  • Developed data transition programs from DynamoDB to AWS Redshift (ETL process) using AWS Lambda, creating functions in Python for certain events based on use cases (see the Lambda sketch after this list).
  • Wrote Bash and Python scripts integrating Boto3 to supplement automation provided by Ansible and Terraform for tasks such as encrypting EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks.
  • Developed/Modified GUI using HTML, CSS and JavaScript (jQuery).
  • Loaded streaming data using Kafka and Flume and processed it in real time using Spark and Storm.
  • Worked on Spark Streaming using Apache Kafka for real-time data processing and implemented Oozie jobs for daily imports.
  • Worked on initializing the Spark context, consuming data from Kafka brokers, and processing live streaming information as RDDs.
  • Used Spring MVC with Hibernate framework to build the application on server side.
  • Created automated pipelines in AWS CodePipeline to deploy Docker containers in AWS ECS using services like CloudFormation, CodeBuild, CodeDeploy, S3, and Puppet.
  • Implemented a production-ready, load-balanced, highly available, fault-tolerant Kubernetes infrastructure.
  • Good knowledge of NoSQL databases such as Apache Cassandra 3.11, Couchbase, MongoDB 4.0, and OrientDB.
  • Created private cloud using Kubernetes that supports DEV, TEST, and PROD environments.
  • Developed merge jobs in Python to extract and load data into a MySQL database.
  • Created Python and Bash tools to increase the efficiency of the application system.
  • Worked with modules like MongoDB and Mongoose for database persistence, using NodeJS to interact with DynamoDB.
  • Built various graphs for business decision making using the Python matplotlib library; worked on Python OpenStack APIs and used NumPy for numerical analysis.
  • Developed a Spark job in Java which indexes data into ElasticSearch from external Hive tables which are in HDFS .
  • Experience in developing Spark applications using Spark tools like RDD transformations, Spark Core, and Spark SQL.
  • Implemented business logic using Python/Django; worked on server applications with Django framework programming.
  • Developed a fully functional prototype application using JavaScript (jQuery, Backbone.js, Vue.js) and Bootstrap, connecting to a REST service hosted on AWS using API Gateway and using DynamoDB.
  • Used DynamoDB to store data on AWS server.
  • Worked on the ElementTree XML API in Python to parse XML documents and load the data into a database (see the ElementTree sketch after this list).
  • Created the entire application using Python, Django, Flask, Java, Spring, Hibernate, MySQL, and Linux.
  • Worked on Python-based test frameworks and test-driven development with automation tools.
  • Developed a fully automated continuous integration system using Git, MySQL, Jenkins, and custom tools developed in Python.
  • Set up and built AWS infrastructure for various resources (VPC, EC2, S3, IAM, EBS, security groups, Auto Scaling, RDS) using CloudFormation JSON templates.
  • Utilized standard Python modules such as csv, itertools and pickle for development.
  • Implemented the application using the concrete principles laid down by several Java/J2EE Design patterns like Business Delegate, MVC, Session Façade, Factory Method, Service Locator, Singleton and Data Transfer Objects (DTO).
  • Worked with Git version control, Docker and Vagrant environments, Node.js and Gulp for compiling, and the JIRA ticketing system.
  • Built a RESTful API to save and retrieve geolocations using a remote server in Java using Spring, Cassandra DB, Apache CXF, and JAX-RS.
  • Worked on RDBMS implementations using SQL and PL/SQL on DB2, MySQL, and Oracle databases, writing complex queries, stored procedures, triggers, and events to generate responses needed by the application.
  • Performed unit testing and developed unit test classes using unittest and pytest.
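
A hedged sketch of an event-driven AWS Lambda handler in the spirit of the DynamoDB-to-Redshift ETL bullet above; the table layout, environment variables, and use of the psycopg2 driver (packaged with the deployment artifact) are assumptions, not the production code.

    # Hypothetical Lambda handler: reads DynamoDB stream records and inserts them into Redshift.
    import os
    import psycopg2  # assumed to be bundled with the Lambda deployment package

    def handler(event, context):
        conn = psycopg2.connect(
            host=os.environ["REDSHIFT_HOST"],       # hypothetical environment variables
            dbname=os.environ["REDSHIFT_DB"],
            user=os.environ["REDSHIFT_USER"],
            password=os.environ["REDSHIFT_PASSWORD"],
            port=5439,
        )
        with conn, conn.cursor() as cur:
            for record in event.get("Records", []):
                if record.get("eventName") != "INSERT":
                    continue
                item = record["dynamodb"]["NewImage"]  # typed DynamoDB attributes, e.g. {"S": "..."}
                cur.execute(
                    "INSERT INTO events (id, payload) VALUES (%s, %s)",
                    (item["id"]["S"], item["payload"]["S"]),
                )
        conn.close()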
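
A short sketch of the ElementTree-based XML loading described above, assuming a hypothetical document layout and using SQLite as a stand-in for the target database.

    # Illustrative ElementTree sketch: parse records from XML and load them into a database.
    import sqlite3
    import xml.etree.ElementTree as ET

    tree = ET.parse("records.xml")                 # hypothetical input file
    root = tree.getroot()

    conn = sqlite3.connect("records.db")           # SQLite used as a stand-in target database
    conn.execute("CREATE TABLE IF NOT EXISTS records (id TEXT, name TEXT)")

    for rec in root.findall("record"):             # assumes <record id="..."><name>...</name></record>
        conn.execute(
            "INSERT INTO records (id, name) VALUES (?, ?)",
            (rec.get("id"), rec.findtext("name")),
        )

    conn.commit()
    conn.close()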

Python Developer

Confidential

Responsibilities:

  • Managed datasets using data frames and MySQL; queried the MySQL database from Python using the Python-MySQL connector (MySQLdb package) to retrieve information.
  • Created data access modules in Python.
  • Experienced in working with the Spark ecosystem, using Spark SQL and Scala queries on different formats such as text and CSV files.
  • Expertise in implementing Spark using Scala and Spark SQL for faster testing and processing of data; responsible for managing data from different sources.
  • Utilized Apache Spark with Python to develop and execute big data analytics and machine learning applications; executed machine learning use cases under Spark ML and MLlib.
  • Identified areas of improvement in existing business by unearthing insights from vast amounts of data using machine learning techniques.
  • Interpreted problems and provided solutions to business problems using data analysis, data mining, optimization tools, machine learning techniques, and statistics.
  • Used Pandas, NumPy, seaborn, SciPy, Matplotlib, scikit-learn, and NLTK in Python to develop various machine learning algorithms, including linear regression and multivariate regression (see the scikit-learn sketch after this list).
  • Designed and developed NLP models for sentiment analysis.
  • Designed and developed components using Python with the Django framework; implemented code in Python to retrieve and manipulate data.
  • Developed backend services and created many APIs using the Python programming language and SQL.
  • Involved in developing a video calling application using Python WebSockets.
  • Performed research regarding Python Programming and its uses and efficiency.
  • Created a Node.js middleware application server to encapsulate a modern JS widget framework; worked intensively with JSON objects, JavaScript, and jQuery to create interactive web pages.
  • Integrated Oracle BPM with the Spring Framework in the enterprise layer.
  • Involved in packaging, deployment, and upgrades of different modules of SAS on the JBoss application server; analyzed VB code and converted Sybase stored procedures to SQL.
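
A brief scikit-learn sketch of the linear/multivariate regression workflow referenced above; the input file and column names are illustrative only.

    # Illustrative scikit-learn linear regression on a Pandas DataFrame; columns are hypothetical.
    import pandas as pd
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    df = pd.read_csv("sales.csv")                         # hypothetical input data
    X = df[["ad_spend", "store_visits"]]                  # feature columns (multivariate regression)
    y = df["revenue"]                                     # target column

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    model = LinearRegression().fit(X_train, y_train)
    print("R^2 on held-out data:", r2_score(y_test, model.predict(X_test)))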

Software Engineer

Confidential

Responsibilities:

  • Developed supplier portal using Python and Django Web Framework.
  • Performed EDI integration using Python, Java and C++.
  • Tested the database management software FastLoad, MultiLoad, TPump, and FastExport functions in Teradata for integrity and efficiency.
  • Identified the issues in the software and tracked them using the JIRA tracking tool.
  • Designed, developed, tested, deployed, and maintained the website.
  • Designed, configured, and deployed Amazon Web Services (AWS) for applications utilizing the AWS stack (including EC2, Route 53, S3, RDS, CloudFormation, CloudWatch, SQS, IAM), focusing on high availability, fault tolerance, and auto scaling.
  • Worked on AWS Lambda, Auto Scaling, CloudFront, RDS, Route 53, AWS SNS, SQS, and SES.
  • Designed and developed a data management system using MySQL and Oracle 11g.
  • Developed a module for predicting issues/risks that may occur during an enquiry using TensorFlow, so that the software automatically suggests a solution to the problem faced.
  • Rewrote existing Java/J2EE modules in Python.
  • Wrote Python scripts to parse XML documents and load the data into a database.
  • Created a unit test/regression test framework for existing and new code (see the unittest sketch after this list).
  • Used the Subversion version control tool to coordinate team development.
  • Responsible for debugging and troubleshooting the web application.
  • Excellent understanding of web applications - UI experience, security, logging, and backend services.
  • Solid experience working with the Django framework.
  • Created numerous Django apps and extensively used Django session management.
  • Implemented PEP8 coding standards across all projects.
  • Experienced in developing web-based applications using Python, Flask, PHP, Django, and XML.
  • Using Django Evolution and manual SQL modifications, was able to modify Django models while retaining all data while the site was in production mode.
  • Configured, automated, and maintained build and deployment CI/CD tooling with Git/GitLab.
  • Experience in writing subqueries, stored procedures, triggers, cursors, and functions on MySQL and Oracle databases.
  • Experience with Test Driven Development (TDD) using RSpec, factory_girl, and JUnit.
  • Used a Redis server for website optimization, reusing stored solutions for recurring types of problems.
  • Good experience working on very large databases.
  • Expertise in object-oriented design, coding, and OpenStack development.
  • Efficient in demonstrating all phases of software development life cycle.
  • Experienced in Agile methodologies, Scrum stories, and sprints in a Python-based environment.
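
A minimal unittest sketch of the kind of unit/regression test scaffolding described above; parse_order is a hypothetical function standing in for the code under test.

    # Minimal unittest sketch; parse_order is a hypothetical function under test.
    import unittest

    def parse_order(line):
        """Hypothetical helper: 'id,qty' -> (id, int(qty))."""
        order_id, qty = line.strip().split(",")
        return order_id, int(qty)

    class ParseOrderTest(unittest.TestCase):
        def test_parses_valid_line(self):
            self.assertEqual(parse_order("A123,4"), ("A123", 4))

        def test_rejects_malformed_line(self):
            with self.assertRaises(ValueError):
                parse_order("not-a-valid-line")

    if __name__ == "__main__":
        unittest.main()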
