
Python Developer Resume


SUMMARY

  • Around 5 years of experience in IT services and consulting as a Python Developer, with expertise in Python, Django/Flask, design patterns, libraries, Amazon Web Services (AWS), Data Science, NoSQL (MongoDB, Cassandra) and relational databases (MySQL, SQL Server, Oracle, Postgres), working across the full Software Development Life Cycle (SDLC).
  • Hands-on experience with AWS services including S3, S3 Glacier, EC2, EMR, SNS, SQS, Lambda, Redshift, Data Pipeline, Athena, AWS Glue, CloudWatch, CloudFormation, IAM, AWS Single Sign-On, Key Management Service, AWS Transfer for SFTP, VPC, SES, CodeCommit, CodeBuild, Amazon RDS, Elastic Load Balancing, Amazon EBS and Amazon CloudFront.
  • Expertise in implementing object-oriented technologies, web-based client-server architecture, service-oriented architecture and Object-Relational Mapping (ORM).
  • Proficient in working with large data sets, performing data analysis and building visualizations in Tableau to solve business problems, and interacting with MySQL and Oracle databases.
  • Extensive experience in designing and developing data integration solutions using ETL tools such as Informatica PowerCenter and Teradata utilities for handling large volumes of data.
  • Experienced in developing web-based applications using Python, React, CSS, HTML, JavaScript, AngularJS, and jQuery.
  • Highly experienced with DevOps technologies including Jenkins, Splunk, Chef, RabbitMQ, ELK Stack, Apache ActiveMQ, Docker, Kubernetes, Ansible.
  • Experience with automation/configuration management using tools like Ansible, Puppet, Chef and SaltStack.
  • Experienced in developing API services in Python/Tornado and Node.js while leveraging AMQP and RabbitMQ for distributed architectures.
  • Experienced in unit testing of UI using Jasmine, Karma and JUnit, along with the Test-Driven Development (TDD) methodology.

PROFESSIONAL EXPERIENCE

Python Developer

Responsibilities:

  • Designed and developed data pipelines on AWS with Python and Spark, orchestrated using Airflow.
  • Designed and led a Snowflake-based consumer intelligence data warehouse.
  • Designed and led the framework and implementation for consumer privacy initiatives such as GDPR and CCPA.
  • Developed remote integrations with third-party platforms using RESTful web services, and successfully implemented Apache Spark and Spark Streaming applications for large-scale data.
  • Used Python REST modules and the Pandas library for statistical analysis and generating complex graphical data, with NumPy for numerical analysis.
  • Involved in requirements analysis and in modelling the attributes identified from different source systems in Oracle, Teradata and CSV files; data was staged, integrated, validated, and finally loaded into the Teradata warehouse using Informatica and Teradata utilities.
  • Developed data pipelines using Python libraries, with data staging, transformation, and aggregation in MongoDB.
  • Implemented data validation and cleansing rules to combine and prepare geospatial data from multiple sources.
  • Used the Pandas API to arrange data in time series and tabular formats for easy timestamp-based manipulation and retrieval, used the Pandas library for statistical analysis, and worked on Python OpenStack APIs.
  • Implemented a microservices architecture to decompose a heavyweight monolithic application into smaller services, building microservices and modules with reactive technologies; moved the microservices to AWS EC2 and deployed them using AWS Elastic Beanstalk.
  • Using Chef, deployed and configured Elasticsearch for log analytics, full-text search and application monitoring, integrated with AWS Lambda and CloudWatch.
  • Managed large datasets with the Pandas ecosystem to analyse customer segments by location.
  • Developed a full-stack Python web framework with an emphasis on simplicity, flexibility, and extensibility, built on existing components: WSGI, routing, templating, forms, data, plugins, config, events, SQLAlchemy, Storm, CouchDB, OpenID, App Engine, jQuery, etc.
  • Developed views and templates using Python and created a user-friendly website interface using Django’s view controller and template language.
  • Implemented advanced procedures such as text analytics and processing using the in-memory computing capabilities of Apache Spark.
  • Created a handler function in Python for AWS Lambda that is invoked when the service is executed (a Lambda handler sketch follows this list).
  • Added support for Amazon S3 and RDS to host static/media files and the database in the AWS cloud.
  • Developed Kafka producers and consumers, HBase clients, Spark/Shark jobs, streaming applications and Hadoop MapReduce jobs, along with components on HDFS and Hive.
  • Developed Spark code using Scala and Spark SQL for batch processing of data; utilized the in-memory processing capability of Apache Spark to process data with Spark SQL and Spark Streaming using PySpark and Scala scripts.
  • Created PySpark scripts to load data from source files into RDDs, create DataFrames from the RDDs, perform transformations and aggregations, and collect the output of the process (a PySpark sketch follows this list).
  • Consumed high volumes of data from multiple sources (such as Hive, MySQL, HBase and XLS files) and performed transformations using PySpark.
  • Leveraged AWS cloud services such as EC2, auto scaling and VPC to build secure, highly scalable and flexible systems that handled load on the servers.
  • Implemented TFS build archival to AWS Simple Storage Service (S3) and created lifecycle policies for managing the files in S3; implemented CloudWatch alarms to monitor the EC2 instances.
  • Worked on NoSQL database MongoDB and developed custom MongoDB applications as per the client specification.
  • Created a server-monitoring daemon with psutil, supported by a custom Elasticsearch analytics app; also researched big data solutions with the Cassandra database.
  • Worked on migration of Splunk to AWS cloud instances; involved in standardizing Splunk forwarder deployment, configuration and maintenance across UNIX and Windows platforms.
  • Developed the user interface using React JS and Flux for SPA development.
  • Implemented React JS code to handle cross-browser compatibility issues in Mozilla Firefox, IE 7/8/9 and Safari.
  • Used React Router to turn the application into a single-page application.
  • Worked with React JS components, forms, events, keys, Router, animations, and the Flux concept.
  • Used Web services (SOAP and RESTful) for transmission of large blocks of XML/JSON.
  • Worked on responsive design and developed a single isomorphic responsive website that could be served to desktop, tablet and mobile users using React JS.
  • Used React Autocomplete to create a Google Maps location search on the webpage; added Excel-Builder to download tabular data in Excel format using React JS.
  • Rewrote application code using the React framework and created high-level design documents.
  • Created and built new features using React for the UI and Spring Boot for the backend.
  • Worked on developing CRUD applications using MERN stack (MongoDB, ExpressJs, ReactJS and NodeJS) and REST based API.
  • Using Chef, deployed and configured Elasticsearch, Logstash and Kibana (ELK) for log analytics, full-text search and application monitoring, integrated with AWS Lambda and CloudWatch.
  • Built Elasticsearch, Logstash and Kibana (ELK) to store logs and metrics in an S3 bucket using a Lambda function.
  • Wrote Python scripts using Boto3 to automatically spin up instances in AWS EC2 and OpsWorks stacks, integrated with Auto Scaling to automatically provision servers with configured AMIs.
  • Developed Python programs using the Boto3 SDK to implement security with the AWS Cognito service.
  • Used Python with Boto3 to supplement automation provided by Ansible and Terraform for tasks such as encrypting EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks.
  • Worked on ETL migration services by developing and deploying AWS Lambda functions to generate a serverless data pipeline that writes to the Glue Data Catalog and can be queried from Athena.
  • Wrote to the Glue metadata catalog, which in turn enables querying the refined data from Athena, achieving a serverless querying environment.
  • Processed data in AWS Glue by reading from MySQL data stores and loading into the Redshift data warehouse.
  • Used React JS to create controllers that handle client-triggered events and send requests to the server; maintained state in stores and dispatched actions using Redux.
  • Monitored logs and generated visual representations of them using the ELK stack; implemented CI/CD tooling along with upgrade, backup, restore, DNS, LDAP and SSL setup.
  • Worked on data migration from SQLite3 to the Apache Cassandra database; designed, implemented, maintained and monitored the Cassandra data model using DSE, DevCenter and DataStax OpsCenter.
  • Built the Silent Circle Management System (SCMC) in Elasticsearch, Python and Node.js while integrating with infrastructure services.
  • Automated various infrastructure activities such as continuous deployment, application server setup and stack monitoring using Ansible playbooks, and integrated Ansible with Rundeck and Jenkins.
  • Used SQLAlchemy as the Object-Relational Mapper (ORM) for writing ORM queries.
  • Developed a module to build Django ORM queries that pre-load related data, greatly reducing the number of database queries needed to retrieve the same amount of data (an ORM sketch follows this list).
  • Created and configured virtual development environments with Chef and VirtualBox as part of the SOA (service-oriented architecture) team, enforcing best practices for REST and SOAP services.
  • Implemented discretization and binning, and data wrangling: cleaning, transforming, merging and reshaping DataFrames.
  • Determined optimal business logic implementations, applying best design patterns.
  • Increased speed and memory efficiency by migrating performance-critical Python code to C/C++.
  • Wrote and maintained automated Salt scripts for Elasticsearch, Logstash, Kibana, and Beats.
  • Involved in front-end work, utilizing Bootstrap and Angular.js for page design; created data tables using PyQt to display customer and policy information and to add, delete and update customer records.
  • Worked with the Jenkins continuous integration tool for project deployment; played a key role in a development-wide transition from Subversion to Git, which increased efficiency for the development community.
  • Used the Python library Beautiful Soup for web scraping; responsible for debugging and troubleshooting the web application.
  • Integrated all microservices with the CI/CD (continuous integration and continuous deployment) process and exposed them for external integrations.
  • Worked with Docker to containerize microservices and wrote Dockerfiles; used Jenkins for automated integration.
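
A minimal sketch of the kind of AWS Lambda handler referenced in the bullet above. The event shape and the "records" field are hypothetical; real payloads depend on the triggering service (S3, SQS, API Gateway, etc.).

    import json
    import logging

    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    def lambda_handler(event, context):
        # Entry point AWS Lambda calls when the function is invoked.
        # The "records" key is a hypothetical payload field used for illustration.
        records = event.get("records", [])
        logger.info("Received %d record(s)", len(records))
        return {
            "statusCode": 200,
            "body": json.dumps({"processed": len(records)}),
        }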
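
A minimal PySpark sketch of the load/transform/aggregate flow described in the bullets above; the S3 path, delimiter and column names are assumptions made for illustration.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("load-and-aggregate").getOrCreate()

    # Load raw source lines into an RDD (the path and comma delimiter are assumed).
    raw_rdd = spark.sparkContext.textFile("s3://example-bucket/input/sales.csv")
    parsed_rdd = raw_rdd.map(lambda line: line.split(","))

    # Create a DataFrame from the RDD with an assumed column list.
    df = parsed_rdd.toDF(["region", "product", "amount"])

    # Transform and aggregate, then collect the output of the process.
    totals = (
        df.withColumn("amount", F.col("amount").cast("double"))
          .groupBy("region")
          .agg(F.sum("amount").alias("total_amount"))
    )
    for row in totals.collect():
        print(row["region"], row["total_amount"])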
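
A minimal sketch of the query pre-loading approach mentioned above, using Django's select_related/prefetch_related; the Policy, Customer and Claim models and the myapp package are hypothetical names for illustration only.

    # Hypothetical models: Policy has a foreign key to Customer and a reverse
    # relation named "claims"; they exist only for this illustration.
    from django.db.models import Prefetch
    from myapp.models import Policy, Claim

    def policies_with_related():
        # Pre-load related rows so iterating over policies issues a handful of
        # queries instead of one query per policy (the N+1 problem).
        return (
            Policy.objects
            .select_related("customer")  # JOIN the foreign key in the same query
            .prefetch_related(
                Prefetch("claims", queryset=Claim.objects.only("id", "status"))
            )  # one additional query fetches all related claims
        )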

Python Developer

Confidential

Responsibilities:

  • Developed frontend and backend modules using Python on Django, including the Tastypie web framework, with Git for version control.
  • Implemented SQLAlchemy, a Python library that provides full access to SQL.
  • Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
  • Developed the presentation layer using HTML, CSS, JavaScript, jQuery and AJAX; utilized the Python libraries wxPython, NumPy, Pandas, Twisted and Matplotlib.
  • Developed cross-browser/cross-platform pages with React JS, Node.js, jQuery, AJAX and HTML5/CSS3 to the desired design specs for a single-page layout, following code standards; created the UI from scratch using React JS.
  • Involved in using React JS components, Forms, Events, Keys, Router, Animations, and Flux concept.
  • Involved in building stable React components and stand-alone functions to be added to any future pages.
  • Used React Autocomplete to create a Google Maps location search on the webpage.
  • Added Excel-Builder to download tabular data in Excel format using React.
  • Used Pandas, NumPy, Seaborn, SciPy, Matplotlib, scikit-learn and NLTK in Python for developing various machine learning algorithms, including linear regression.
  • Automated labelling of unlabelled image data by writing labels returned from existing NSFW classifiers into a CSV file, using Python and batch processing; classified and labelled 157,000+ images.
  • Parallelized the script across 4 CPU cores with a batch size of 5,000 images using Python and a concurrent-processing library, improving classification time by around 40-45% (a parallelization sketch follows this list).
  • Analyzed roughly labelled data using Python and Excel to better understand the direction of the project; the results helped identify 60-65% of false positives in the existing NSFW model's classifications.
  • Completed thorough research on using multimodal data for an efficient risk-flagging system to build a more effective NSFW classification system, using models such as ResNet with image-classification performance of 90-92%.
  • Used Django configuration to manage URLs and application parameters.
  • Designed and deployed a multitude of applications utilizing almost the entire AWS stack (EC2, S3, VPC, ELB, Auto Scaling Groups, SNS, SQS, IAM, CloudFormation, Lambda, Glue), focusing on high availability, fault tolerance and auto scaling.
  • Installed, configured and managed AWS servers; used AWS Data Pipeline for data extraction, transformation and loading from homogeneous and heterogeneous data sources.
  • Accessed database objects using Django database APIs; worked on Python-based test frameworks and test-driven development with automation tools; worked on real-time streaming and batch-style large-scale distributed computing applications using tools like Spark Streaming.
  • Implemented advanced procedures such as text analytics and processing using the in-memory computing capabilities of Apache Spark, written in Scala.
  • Responsible for debugging and troubleshooting the web application; managed the configurations of multiple servers using Ansible.
  • Deployed microservices, including provisioning AWS environments, using Ansible playbooks.
  • Provisioned the load balancer, auto-scaling group and launch configuration for microservices using Ansible.
  • Created cloud services using AWS; managed virtual machines and websites using EC2, ELB, Auto Scaling and Lambda.
  • Developed installer scripts using Python (Boto3) for various products hosted on application servers; wrote Python utilities and scripts to automate tasks in AWS using Boto3 and the AWS SDK; automated backups using the AWS SDK (Boto3) to transfer data into S3 buckets (a backup sketch follows this list).
  • Developed data mapping, transformation and cleansing rules for the Master Data Management architecture involving OLTP, ODS and OLAP.
  • Automated deployment, application server setup and stack monitoring using Ansible playbooks, and integrated Ansible with Rundeck and Jenkins.
  • Provisioned and patched servers regularly using Ansible. Implemented Ansible to manage all existing servers and automate the build/configuration of new servers.
  • Developed an Ansible role for the Zabbix agent that is integrated into the CI/CD pipeline.
  • Used Ansible to document all infrastructure and application dependencies in version control.
  • Responsible for designing and deploying new ELK clusters (Elasticsearch, Logstash, Kibana, Graphite, Beats, Kafka, ZooKeeper, etc.).
  • Designed, built and managed the ELK (Elasticsearch, Logstash, Graphite, Kibana) cluster for centralized logging and search functionality for the app.
  • Configured internode communication between Cassandra nodes and client using SSL encryption.
  • Deployed microservices in Docker containers and scaled the deployment using Kubernetes.
  • Developed ChatOps interfaces with Slack and Kubernetes on GKE.
  • Worked on the Spinnaker platform for multi-cloud continuous delivery (bake, test and deploy/container pipelines) using Packer, Terraform, Kubernetes, AWS and GCP.
  • Responsible for onboarding application teams to build and deploy their code using GitHub, Jenkins, Nexus and Ansible.
  • Migrated our core repository from Subversion to Git; managed GitHub projects and migrated from SVN to GitHub with history.
  • Created monitors, alarms and notifications for EC2 hosts using CloudWatch; performed web test automation using the Selenium API.
  • Worked on Redux, creating to-do-list reducers and reducer functions and implementing the store method, alongside API services, JavaScript, Bootstrap, Git and JSON.
  • Responsible for configuring Kafka consumer and producer metrics to visualize and monitor Kafka system performance; wrote and maintained automated Salt scripts for Elasticsearch, Logstash, Kibana, and Beats.
  • Performed parameterization of the automated Selenium test scripts to check how the application performs against multiple sets of data; contributed to developing an automation framework using Java and Selenium WebDriver.
  • Wrote automation test cases, fixed automation script bugs, and worked on migration to Amazon Web Services (AWS).
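
A minimal sketch of the 4-core, 5,000-image batch parallelization described above, assuming Python's concurrent.futures is the concurrent-processing library; classify_batch is a hypothetical stand-in for the real NSFW classifier call.

    from concurrent.futures import ProcessPoolExecutor

    BATCH_SIZE = 5000  # batch size referenced in the bullet above
    WORKERS = 4        # CPU cores referenced in the bullet above

    def classify_batch(paths):
        # Hypothetical placeholder for the real classifier; returns (path, label) pairs.
        return [(p, "sfw") for p in paths]

    def chunked(items, size):
        # Split the full list of image paths into fixed-size batches.
        for i in range(0, len(items), size):
            yield items[i:i + size]

    def classify_all(image_paths):
        results = []
        with ProcessPoolExecutor(max_workers=WORKERS) as pool:
            for batch_result in pool.map(classify_batch, chunked(image_paths, BATCH_SIZE)):
                results.extend(batch_result)
        return results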
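
A minimal sketch of the Boto3 backup automation mentioned above; the bucket name, prefix and local directory are placeholders for illustration only.

    import os
    import boto3

    s3 = boto3.client("s3")

    def backup_directory(local_dir, bucket, prefix):
        # Upload every file under local_dir to s3://bucket/prefix/...
        for root, _, files in os.walk(local_dir):
            for name in files:
                path = os.path.join(root, name)
                key = prefix + "/" + os.path.relpath(path, local_dir)
                s3.upload_file(path, bucket, key)

    if __name__ == "__main__":
        backup_directory("/var/backups/app", "example-backup-bucket", "nightly")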

Python Developer

Confidential

Responsibilities:

  • Developed front-end using HTML, CSS, JavaScript and jQuery.
  • Responsible for understanding the functional requirements, writing the technical design, and developing to those requirements.
  • Provided testing support for application server upgrades.
  • Served as client-engagement developer and implementation engineer for code installs to production.
  • Designed system requirements for a mobile search application and coded the back end and front end in Django/Python; authored encryption scripts.
  • Collaborated with cross-functional teams to scope project requirements, author documents, etc.
  • Interacted with the client on change requests, trouble reports and requirements collection.
  • Maintained the client-server environment and implemented the updates successfully.
  • Created and modified required views/tables/triggers using SQL and worked on database bug fixes.
  • Involved in developing a RESTful service using the Python Flask framework.
  • Involved in the design, development, testing, deployment and maintenance of the website.
  • Using Django Evolution and manual SQL modifications, modified Django models while retaining all data and keeping the site in production.
  • Worked with JSON based REST Web services and Created a Git repository and added the project to GitHub.
  • Helped create interactive prototypes and UI specifications, including screen layouts, color palettes, typography, and user-interface elements.
  • Wrote and executed various SQL queries from Python using the Python MySQL connector and MySQLdb packages (a query sketch follows this list).
  • Performed front-end development for web initiatives to ensure usability, using HTML, CSS, Bootstrap, and JavaScript.
  • Developed the required XML Schema documents and implemented the framework for parsing XML documents.
  • Queried the MySQL database from Python using the MySQL connector and MySQLdb packages to retrieve information.
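
A minimal sketch of running a parameterized query from Python with the MySQL connector referenced above; the connection details, table and column names are placeholders for illustration.

    import mysql.connector  # MySQL Connector/Python, as referenced above

    # Connection parameters are placeholders for illustration only.
    conn = mysql.connector.connect(
        host="localhost",
        user="app_user",
        password="app_password",
        database="webapp",
    )

    try:
        cursor = conn.cursor()
        # A parameterized query keeps values out of the SQL string itself.
        cursor.execute(
            "SELECT id, title FROM pages WHERE status = %s",
            ("published",),
        )
        for page_id, title in cursor.fetchall():
            print(page_id, title)
    finally:
        conn.close()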
