
Python/AWS Developer Resume


Bloomfield, CT

SUMMARY

  • 7+ years of experience developing web-based applications using Python 3.x/2.7 and Django 1.8.
  • Good experience in software development with Python (libraries used: Beautiful Soup, NumPy, SciPy, pandas DataFrames, network, urllib2, MySQLdb for database connectivity) and IDEs such as Sublime Text, Spyder, PyCharm and Visual Studio Code.
  • Created AWS services using CloudFormation and boto3.
  • Experienced with Terraform for automating AWS job creation.
  • Hands-on experience with AWS services such as S3, Lambda, Glue, API Gateway, Cognito, Step Functions, Redshift and CloudWatch using the CDK.
  • Created network architectures on AWS: VPCs, subnets, Internet Gateways and route tables. Performed S3 bucket creation and configured S3 storage, bucket policies and IAM role-based policies.
  • Experienced in the design, development and support of data-warehousing solutions for Extraction, Transformation and Loading (ETL).
  • Expertise in handling logging and backend services.
  • Developed Azure DevOps CI/CD pipeline for deploying the code.
  • Developed Rest API and tested using Postman.
  • Good experience with big-data databases such as Hadoop, and in creating Python scripts for ETL.
  • Good experience developing web applications implementing MVT/MVC architectures using the Django, Flask and Spring web frameworks.
  • Worked closely with designers, tightly integrating Flash into the CMS using FlashVars stored in Django models.
  • Good experience in Linux Bash scripting and in following PEP 8 guidelines in Python.
  • Worked with the NoSQL databases MongoDB and Cassandra.
  • Expertise in establishing database connections from Python by configuring packages such as MySQL-Python.
  • Successfully migrated the Django database from SQLite to MySQL, and then to PostgreSQL, with complete data integrity.
  • Good knowledge of version control software: CVS and Git/GitHub.
  • Good experience in UNIX and Linux, and expertise in Python scripting with a focus on DevOps tools, CI/CD and AWS cloud architecture.
  • Worked on Docker container snapshots, attaching to running containers, removing images, and managing directory structures and containers.
  • Experienced in Agile methodologies, Scrum stories and sprints in a Python-based environment, along with data analytics, data wrangling and Excel data extracts.
  • Excellent experience with Python development under Linux (Debian, Ubuntu, SUSE Linux, Red Hat Linux, Fedora).
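
The ETL scripting experience above can be illustrated with a minimal, self-contained sketch (the CSV layout, column names and staging-table name are hypothetical examples, not taken from any actual project):

```python
import csv
import io
import sqlite3

# Hypothetical raw extract; in practice this would be read from a file or S3.
RAW_CSV = """id,name,amount
1,alice,10.5
2,bob,20.0
"""

def extract(csv_text):
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Cast fields to proper types and normalize names to upper case."""
    return [(int(r["id"]), r["name"].upper(), float(r["amount"])) for r in rows]

def load(rows, conn):
    """Load transformed rows into a SQLite staging table."""
    conn.execute("CREATE TABLE IF NOT EXISTS staging (id INTEGER, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO staging VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM staging").fetchone()[0]
```

In a production pipeline the SQLite target would be replaced by Redshift or another warehouse, but the extract/transform/load separation stays the same.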

TECHNICAL SKILLS

Primary Languages: Python 3.x/2.7, Core Java, C, C++

Python Libraries: Beautiful Soup, NumPy, SciPy, pandas, urllib2

Frameworks: Flask, Django, PyJamas, Pyramid

Databases: Aurora PostgreSQL, SQLite3, MySQL, DynamoDB, MongoDB, Oracle 11g, Hadoop, Redshift

IDEs: Visual Studio Code, PyCharm, Eclipse, Sublime Text

Servers: Apache Tomcat, WebSphere, JBoss, WebLogic, XAMPP

Deployment Tools: Amazon Web Services (EC2, S3, Lambda, RDS, API Gateway, Cognito, Step Functions, SES)

Web Technologies: HTML, CSS, DHTML, XML

Operating systems: Windows, Mac, Fedora Linux, Red hat Linux

SDLC Methods: SCRUM, Agile

Testing Frameworks: pytest, JUnit, JTest

Bug Tracking Tools: JIRA, Bugzilla, Rally

Version Control: GitHub, GitLab

PROFESSIONAL EXPERIENCE

Confidential, Bloomfield, CT

Python/AWS Developer

Responsibilities:

  • Responsible for gathering requirements, system analysis, design, development, testing and deployment.
  • Worked extensively on ETL to transfer and load data into Hadoop and Amazon Redshift.
  • Created AWS services using CloudFormation and boto3.
  • Worked on Glue job creation and automation with Terraform.
  • Hands-on experience with AWS services such as S3, Lambda, AWS Glue, Cognito, Step Functions, Redshift and CloudWatch using the CDK.
  • Created network architectures on AWS: VPCs, subnets, Internet Gateways and route tables. Performed S3 bucket creation and configured S3 storage, bucket policies and IAM role-based policies.
  • Used CloudWatch to monitor AWS cloud resources and the applications deployed on AWS, creating alarms and enabling notification services.
  • Expertise in production support and knowledge of deployment using AWS and Jenkins.
  • Developed RESTful microservices and deployed them on AWS API Gateway.
  • Utilized Python libraries such as boto3 and NumPy for AWS.
  • Hands-on experience with AWS services including Amazon EC2, Amazon S3, Amazon Redshift, Amazon EMR and Amazon SQS.
  • Exported/Imported data between various data sources.
  • Wrote Python modules to extract/load asset data from the Hadoop source database.
  • Integration of Git, Confluence, Jira.
  • Experienced in Agile methodologies, Scrum stories and sprints in a Python-based environment, along with data analytics, data wrangling and Excel data extracts.
  • Created multiple Python scripts for various application-level tasks.
  • Created unit-test, acceptance-test and integration-test frameworks for existing and new code.
  • Used Git as the version control tool to coordinate team development.
  • Responsible for debugging and troubleshooting the ETL load issues.
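
As a rough illustration of the CloudFormation-plus-boto3 workflow listed above, the sketch below builds a minimal template as a Python dict (the stack and bucket names are hypothetical; the actual `create_stack` call requires AWS credentials, so it is shown only as a comment):

```python
import json

def make_bucket_template(bucket_name):
    """Build a minimal CloudFormation template (as a dict) declaring one S3 bucket.
    The bucket name is a hypothetical example."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "DataBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {"BucketName": bucket_name},
            }
        },
    }

template = make_bucket_template("example-etl-landing-bucket")
template_body = json.dumps(template)

# Deploying the stack would use boto3 against a real AWS account, e.g.:
# import boto3
# cfn = boto3.client("cloudformation")
# cfn.create_stack(StackName="etl-landing", TemplateBody=template_body)
```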

ENVIRONMENT: Python 3.8, AWS, Linux, CA Workstation, Oracle, Hadoop, HTML, Git, IBIS, Rally, Salesforce, Redshift.

Confidential

Python Developer

Responsibilities:

  • Responsible for gathering requirements, system analysis, design, development, testing and deployment.
  • Participated in the complete SDLC process. Wrote many programs to parse Excel files and process user data with data validations.
  • Created AWS services using CloudFormation and boto3.
  • Hands-on experience with AWS services such as S3, Lambda, API Gateway, Cognito, Step Functions, RDS and CloudWatch using the CDK.
  • Created network architectures on AWS: VPCs, subnets, Internet Gateways and route tables. Performed S3 bucket creation and configured S3 storage, bucket policies and IAM role-based policies.
  • Used CloudWatch to monitor AWS cloud resources and the applications deployed on AWS, creating alarms and enabling notification services.
  • Developed RESTful microservices and deployed them on AWS API Gateway.
  • Worked with Git, REST APIs and Aurora PostgreSQL.
  • Developed Azure DevOps CI/CD pipeline for deploying the code.
  • Created mock data using pandas, PyArrow and other external Python libraries for testing in sandbox and non-production systems.
  • Managed and trained peers on the installation of MySQL and on writing queries for performance optimization.
  • Wrote Python modules to extract/load asset data from the MySQL source database.
  • Integration of Git, Confluence, Jira.
  • Experienced in Agile methodologies, Scrum stories and sprints in a Python-based environment, along with data analytics, data wrangling and Excel data extracts.
  • Created multiple Python scripts for various application-level tasks.
  • Created unit-test, acceptance-test and integration-test frameworks for existing and new code.
  • Used Git as the version control tool to coordinate team development.
  • Responsible for debugging and troubleshooting the web APIs.
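
The mock-data generation mentioned above could look roughly like the following pandas sketch (the column names, value ranges and seed are hypothetical examples; a seeded generator keeps sandbox runs reproducible):

```python
import numpy as np
import pandas as pd

def make_mock_customers(n, seed=0):
    """Generate a deterministic mock customer DataFrame for sandbox testing.
    Column names and ranges are hypothetical."""
    rng = np.random.default_rng(seed)
    return pd.DataFrame({
        "customer_id": range(1, n + 1),
        "balance": rng.uniform(0, 1000, n).round(2),
        "active": rng.integers(0, 2, n).astype(bool),
    })

df = make_mock_customers(100)
```

The same frame could then be written out with PyArrow (e.g. to Parquet) for loading into a non-production system.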

ENVIRONMENT: Python 3.8, AWS, Django, MySQL, Eclipse, HTML, Git, CSS, PHP, Jira, Aurora PostgreSQL.

Confidential, Springfield, MA

Python Developer

Responsibilities:

  • Developed the architecture for parsing applications that fetch data from different services and transform it for storage in different formats.
  • Developed parsers for extracting data from different web service sources and transforming it for storage in various formats (CSV, database files, HDFS storage, etc.) for subsequent analysis.
  • Wrote parsers in Python to extract useful data from the design database. Used the Parsekit (Enigma.io) framework for writing ETL extraction parsers.
  • Implemented Algorithms for Data Analysis from Cluster of Web services.
  • Worked with lxml to dynamically generate SOAP requests based on the services. Developed a custom hash-key (HMAC) based algorithm in Python for web service authentication.
  • Worked with ReportLab PDF library to dynamically generate the PDF documents with Images and data retrieved from various sources of Web services.
  • Built the web API on top of the Django framework to expose REST methods. Used MongoDB and MySQL databases in web API development. Developed database migrations using SQLAlchemy Migrate.
  • Generated graphical reports using the Python packages NumPy and matplotlib.
  • Used advanced features such as pickle/unpickle in Python to share information across applications.
  • Managed datasets using pandas DataFrames and MySQL; queried the MySQL database from Python using the Python-MySQL connector and the MySQLdb package to retrieve information.
  • Utilized the Python libraries wxPython, NumPy, Twisted and matplotlib.
  • Wrote Python scripts to parse XML documents and load the data in database.
  • Used Wireshark, Live HTTP Headers and the Fiddler2 debugging proxy to debug the Flash object and help the developer create a functional component. The PHP page for displaying the data uses AJAX to sort and display it, and also outputs data to .csv for viewing in Microsoft Excel.
  • Added support for Amazon AWS S3 and RDS to host static/media files and the database in the Amazon cloud.
  • Wrote Python scripts with CloudFormation templates to automate installation of Auto Scaling, EC2, VPC and other services.
  • Used Docker containers for development and deployment.
  • Familiar with UNIX/Linux internals and basic cryptography and security.
  • Developed multiple Spark batch jobs in Scala using Spark SQL, performed transformations using many APIs, and updated master data in the Cassandra database per business requirements.
  • Wrote Spark-Scala scripts, creating multiple UDFs, Spark contexts, Cassandra SQL contexts and API methods supporting DataFrames, RDDs, DataFrame joins and Cassandra table joins, finally writing/saving the DataFrames/RDDs to the Cassandra database.
  • As part of a POC, migrated data from source systems to another environment using Spark and Spark SQL.
  • Developed and implemented core API services using Python with Spark.
  • Created DataFrames in a particular schema from raw data stored in Amazon S3 using PySpark.
  • Used PySpark DataFrames for table creation and for performing analytics over them.
  • Used the Jenkins AWS CodeDeploy plugin to deploy to AWS.
  • Developed tools using Python and XML to automate some of the menial tasks. Interfaced with supervisors, artists, systems administrators and production to ensure production deadlines were met.
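
The "parse XML and load into a database" work described above can be sketched with the standard library alone (the XML layout, tag names and table schema are hypothetical examples):

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML payload standing in for the documents being parsed.
XML_DOC = """<assets>
  <asset id="1"><name>router</name><site>NYC</site></asset>
  <asset id="2"><name>switch</name><site>BOS</site></asset>
</assets>"""

def parse_assets(xml_text):
    """Extract (id, name, site) tuples from the XML document."""
    root = ET.fromstring(xml_text)
    return [
        (int(a.get("id")), a.findtext("name"), a.findtext("site"))
        for a in root.findall("asset")
    ]

def load_assets(rows, conn):
    """Load parsed rows into a SQLite table."""
    conn.execute("CREATE TABLE assets (id INTEGER, name TEXT, site TEXT)")
    conn.executemany("INSERT INTO assets VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load_assets(parse_assets(XML_DOC), conn)
names = [row[0] for row in conn.execute("SELECT name FROM assets ORDER BY id")]
```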

ENVIRONMENT: Python 3.x, Parsekit (Enigma.io), Django, Flask, lxml, SUDS, HMAC, pandas, NumPy, matplotlib, MongoDB, MySQL, SOAP, REST, PyCharm, Docker, AWS (EC2, S3).
