Sr. Python Developer Resume

Washington, DC

SUMMARY

  • Self-motivated team player with the ability to effectively prioritize work to meet deadlines. Full-stack developer with years of experience in system design, coding, testing, and maintenance, able to work collaboratively and independently on large-scale projects.
  • My goal is to build user-friendly, responsive web applications that adapt across multiple screen sizes, with an emphasis on functionality and accessibility.
  • Extensive experience with Python and Django, with expertise in several Python packages. Strong knowledge of RDBMS and NoSQL, with experience in Oracle, SQL Server, MySQL, Elasticsearch, AWS, Big Data, CI/CD, REST APIs, and JavaScript.
  • Full-stack developer with experience in the design, development, deployment, and maintenance of web apps built on the Spring framework in Java and the Django framework in Python.
  • Good knowledge of Python IDEs such as PyCharm and Sublime Text. Wrote Python functions for AWS Lambda, Kinesis, and Elasticsearch that invoke Python scripts to perform transformations and analytics on large data sets in Amazon EMR clusters.
  • Experience programming with object-oriented programming (OOP) concepts and the software development life cycle (SDLC), including architecting scalable platforms, database design, and Agile methodologies.
  • Proficient knowledge of cloud services such as Google Cloud and Amazon Web Services (AWS).
  • Good experience in shell scripting, SQL Server, UNIX and Linux, and OpenStack, with expertise in Python scripting focused on DevOps tools, CI/CD, and AWS cloud architecture.
  • Well versed in object-oriented technologies for client-server, multi-tier, and web applications, with experience in Java/J2EE technologies involving system analysis, technical architecture, design, development, testing, and implementation.
  • Experience building CI/CD pipelines to automate the code release process using integration tools such as Git, SVN, GitHub, and Jenkins.
  • Strong knowledge of data structures and algorithms, object-oriented analysis, machine learning, and software design patterns, with hands-on experience in data mining and data warehousing using ETL tools.
  • Experienced in design patterns such as MVC using Django and Flask, deploying applications on Apache Tomcat, and containerizing applications with Docker.

TECHNICAL SKILLS

Operating Systems: Windows, macOS, Linux

Languages: Python, Java, C++, SQL, PL/SQL, NoSQL

Frameworks: Django, Flask, AngularJS, React JS

Scripting languages: JavaScript, Shell Scripting.

Markup and data formats: HTML, XML, JSON

Databases: MySQL, MongoDB, NoSQL, PostgreSQL, Cassandra, Oracle

Servers: Apache Tomcat and IBM

Automation tools: Jenkins, Chef, Puppet.

Methodologies & tools: Agile Scrum, Waterfall Model

Tracking Tools: Jira

Version Control Systems: Git, SVN

Cloud Computing Platforms: AWS (EC2, S3, CloudFront, Elastic Beanstalk)

Python Libraries: Pandas, NumPy, SciPy, unittest

API Integration: JSON, REST, XML, SOAP

PROFESSIONAL EXPERIENCE

Confidential, Washington, DC

Sr. Python Developer

Responsibilities:

  • Involved in the development of Python-based RESTful web services for sending and receiving data from the external interface in JSON format, and in building sales tracking and sales analysis features using Django and PostgreSQL (see the Django sketch after this list).
  • Developed the user interface for these services using CSS, PHP, HTML5, JavaScript, and jQuery.
  • Improved automation efforts with relevant scripting and appropriate tools; worked closely with the testing team during User Acceptance Testing (UAT) and fixed bugs raised during UAT based on their priority.
  • Developed microservices by creating REST APIs used to access data from different suppliers and to gather network traffic data from servers.
  • Worked in an Agile environment and played a major role in designing and developing a data management system using PostgreSQL.
  • Worked on Swagger specs, creating APIs from a JSON or YAML schema that outlines the names, order, and other details of the API.
  • Used React to fetch real-time data and display it dynamically on the web without delaying the information.
  • Designed and Developed Real time Stream processing Application using Spark, Kafka, Scala and Hive to perform Streaming ETL and apply Machine Learning.
  • Involved in loading data from edge nodes to HDFS using shell scripting, and in managing Hadoop log files and Hadoop infrastructure with Cloudera Manager.
  • Used React Router to create a single-page application and applied route guards to deny unauthorized access.
  • Used Jenkins pipelines to push all microservice builds to the Docker registry and deploy them to Kubernetes; created and managed pods using Kubernetes.
  • Developed the notification service by posting JSON requests to AWS API Gateway, validating the request in Lambda against data retrieved from MongoDB, and sending the notification through AWS SNS (see the Lambda/SNS sketch after this list).
  • Automated frequently recurring tasks such as creating backups, checking for idle resources, and generating reports using the boto3 Python library, hosted in AWS Lambda.
  • Deployed Kubernetes as the runtime environment for the CI/CD system to build, test, and deploy applications on AWS EC2, S3, and EKS.
  • Worked on ETL tools and Web Services like REST API and SOAP API to integrate Salesforce with other applications within the organization.
  • Developed MapReduce programs to parse the raw data and create intermediate data, which was then loaded into partitioned Hive tables.
  • Implemented Node.js for web scraping and automation; Node.js also made data streaming easier because incoming data arrives as a stream and can be processed without interruption.
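The sales-tracking service above combined Django views with a PostgreSQL-backed model. Below is a minimal sketch of what such an endpoint can look like; the SaleRecord model and its region/amount fields are illustrative assumptions, not the project's actual schema.

```python
# Hypothetical Django view for the sales-tracking REST endpoint described above.
# SaleRecord, region, and amount are placeholder names, not the real schema.
import json

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from django.views.decorators.http import require_http_methods

from .models import SaleRecord  # assumed model backed by PostgreSQL


@csrf_exempt  # external clients post JSON without a CSRF token
@require_http_methods(["GET", "POST"])
def sales(request):
    if request.method == "POST":
        payload = json.loads(request.body)  # JSON from the external interface
        record = SaleRecord.objects.create(
            region=payload["region"],
            amount=payload["amount"],
        )
        return JsonResponse({"id": record.id}, status=201)

    # GET: simple per-region sales analysis
    totals = {}
    for rec in SaleRecord.objects.all():
        totals[rec.region] = totals.get(rec.region, 0) + float(rec.amount)
    return JsonResponse({"totals_by_region": totals})
```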
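The notification bullet above describes an API Gateway request validated in Lambda against MongoDB data and fanned out through SNS. A rough sketch of that handler follows; the environment variables, collection name, and message fields are placeholders rather than the project's real configuration.

```python
# Sketch of the API Gateway -> Lambda -> MongoDB -> SNS notification flow described above.
# MONGO_URI, TOPIC_ARN, and the "subscribers" collection are assumed placeholders.
import json
import os

import boto3
from pymongo import MongoClient

sns = boto3.client("sns")
mongo = MongoClient(os.environ["MONGO_URI"])          # reused across warm invocations
subscribers = mongo["notifications"]["subscribers"]   # hypothetical collection


def lambda_handler(event, context):
    request = json.loads(event["body"])                # JSON posted via API Gateway
    user = subscribers.find_one({"user_id": request.get("user_id")})
    if user is None:
        return {"statusCode": 404, "body": json.dumps({"error": "unknown user"})}

    # Publish the validated notification to the SNS topic.
    sns.publish(
        TopicArn=os.environ["TOPIC_ARN"],
        Subject="Notification",
        Message=json.dumps({"user_id": user["user_id"], "text": request["message"]}),
    )
    return {"statusCode": 200, "body": json.dumps({"status": "sent"})}
```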

Confidential, New York, NY

Python Developer

Responsibilities:

  • Designed and developed highly generic, efficient applications to extract financial information from unstructured data, including logs and CSV and XML documents, using Python and related technologies such as Flask, MySQL, EC2, S3, Glue, Athena, DynamoDB, and Redshift.
  • Deployed and configured Elasticsearch, Logstash, and Kibana (ELK) for log analytics, full-text search, and application monitoring, integrated with AWS Lambda and CloudWatch, and stored the logs and metrics in an S3 bucket using a Lambda function.
  • Built the Silent Circle Management System (SCMC) in Elasticsearch, Python, and Node.js while integrating with infrastructure services.
  • Created a Python/Elasticsearch-based web application using Python scripting for data processing, MySQL for the database, and HTML/CSS/Ruby with Highcharts for data visualization of the served pages.
  • Developed Python code to automate the ingestion of common formats such as JSON and CSV through Logstash into Elasticsearch, with Kibana dashboards for client viewing (see the bulk-ingest sketch after this list).
  • Designed RESTful web services using Flask, with an emphasis on improved security using Flask-HTTPAuth over HTTPS; also utilized the Hug library to develop HTTP REST APIs with validations and used the CherryPy framework for HTTP modeling and binding (see the Flask-HTTPAuth sketch after this list).
  • Involved in the design and maintenance of databases using Python, and built Python-based RESTful APIs using Flask, SQL, and PostgreSQL.
  • Documented company RESTful APIs using Swagger for internal and third-party use, and worked on unit testing and integration testing.
  • Involved in performance tuning using partitioning and bucketing of Hive tables, and in optimizing and tuning ETL processes and SQL queries for better performance.
  • Conducted root-cause analysis to resolve logic issues in event-based algorithmic engines designed for high-volume, low-latency automated order management, enabling them to execute millions of orders and scan multiple markets and exchanges in a matter of seconds.
  • Involved in the complete end-to-end continuous integration and continuous delivery (CI/CD) process, building and deploying the application on Apache servers using Jenkins deploy and release jobs.
  • Implemented Model-View-Controller (MVC) architecture using server-side frameworks such as Flask for developing web applications.
  • Built an end-to-end ETL pipeline from AWS S3 to the DynamoDB key-value store and the Snowflake data warehouse for analytical queries, specifically for cloud data.
  • Used PostgreSQL's multi-version concurrency control (MVCC) row-storage strategy to keep querying and storage in the database responsive.
  • Provided technical assistance for maintenance, integration, and testing of software solutions during development and release processes.
  • Developed microservice RESTful APIs that provide fast, efficient data exchange against SQL and NoSQL databases for the BI SaaS product.
  • Developed reporting dashboards that process large amounts of data stored in Elasticsearch and MySQL and display the generated reports.
  • Developed web components using JSP and Servlets and server-side components using EJB in a J2EE environment; worked on implementing a microservices architecture using Docker images deployed on AWS ECS.
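The ingestion bullet above relies on Logstash to move JSON/CSV into Elasticsearch for Kibana dashboards. As a rough Python-side equivalent (not the project's actual pipeline), the official elasticsearch client's bulk helper can index the same rows directly; the cluster URL, index name, and file path below are made up for illustration.

```python
# Python-side alternative to the Logstash CSV ingestion described above.
# The cluster URL, index name, and file path are illustrative assumptions.
import csv

from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")  # assumed local cluster


def csv_actions(path, index):
    # Yield one bulk action per CSV row.
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle):
            yield {"_index": index, "_source": row}


# Bulk-index every row so it becomes searchable from Kibana dashboards.
helpers.bulk(es, csv_actions("events.csv", "client-events"))
```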
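The Flask services above were secured with Flask-HTTPAuth over HTTPS. The sketch below shows the general pattern under stated assumptions: the route, user store, and credentials are placeholders, and HTTPS termination is assumed to happen at the web server or load balancer.

```python
# Minimal Flask REST service protected with Flask-HTTPAuth (basic auth).
# The user store, credentials, and /api/accounts route are placeholders.
from flask import Flask, jsonify
from flask_httpauth import HTTPBasicAuth
from werkzeug.security import check_password_hash, generate_password_hash

app = Flask(__name__)
auth = HTTPBasicAuth()

users = {"analyst": generate_password_hash("change-me")}  # placeholder credentials


@auth.verify_password
def verify_password(username, password):
    # Return the username on success so Flask-HTTPAuth treats the request as authenticated.
    if username in users and check_password_hash(users[username], password):
        return username


@app.route("/api/accounts", methods=["GET"])
@auth.login_required
def accounts():
    # In the real service this would query the SQL/PostgreSQL backend.
    return jsonify([{"id": 1, "name": "example account"}])


if __name__ == "__main__":
    app.run()
```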

Confidential

Software Engineer

Responsibilities:

  • Developed a data platform from scratch and took part in the requirement-gathering and analysis phase of the project, documenting the business requirements.
  • Designed tables in Hive and MySQL, used Sqoop to import and export databases to HDFS, and processed large datasets of different forms, including structured, semi-structured, and unstructured data.
  • Created an RFP microservice providing a RESTful API using Spring 5, and implemented CI/CD pipelines using Jenkins to build and deploy the applications.
  • Developed REST APIs in Python using the Flask and Django frameworks, and integrated various data sources including Java, JDBC, RDBMS, shell scripting, spreadsheets, and text files.
  • Worked with the Hadoop architecture and its daemons, including NameNode, DataNode, JobTracker, TaskTracker, and ResourceManager.
  • Used AWS Data Pipeline for data extraction, transformation, and loading from homogeneous or heterogeneous data sources, and built graphs for business decision-making using Python's Matplotlib library.
  • Developed scripts to load data into Hive from HDFS, and was involved in ingesting data into the data warehouse using various data loading techniques.
  • Worked with Kafka and built use cases relevant to our environment.
  • Implemented continuous integration and continuous delivery using GitLab along with Python and shell scripts to automate routine jobs, including synchronizing installers, configuration modules, packages, and requirements for the applications.
  • Integrated Salesforce.com with external systems such as Oracle and SAP using the SOAP and REST web services APIs.
  • Built Cassandra queries for CRUD operations (create, read, update, delete), and used Bootstrap to manage and organize the HTML page layout (see the Cassandra sketch after this list).
  • Developed the frontend and backend modules using Python on the Django web framework, and created the user interface (UI) using JavaScript, Bootstrap, and HTML5/CSS, backed by Cassandra and MySQL.
  • Utilized Kubernetes and Docker for the runtime environment for the CI/CD system to build, test, and deploy.
  • Built data import and export jobs to copy data to and from HDFS using Sqoop, and developed Spark and Spark SQL/Streaming code for faster testing and processing of data (see the PySpark sketch after this list).
  • Designed and developed horizontally scalable APIs using Python Flask, implemented monitoring, and established best practices around the use of Elasticsearch.
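The Cassandra bullet above covers basic CRUD queries. Below is a small sketch using the DataStax cassandra-driver; the keyspace, table, and columns are assumptions for illustration only.

```python
# Illustrative Cassandra CRUD operations with the DataStax cassandra-driver.
# app_keyspace, the users table, and its columns are placeholder names.
import uuid

from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])            # assumed local node
session = cluster.connect("app_keyspace")   # hypothetical keyspace

user_id = uuid.uuid4()

# Create
session.execute(
    "INSERT INTO users (id, name, email) VALUES (%s, %s, %s)",
    (user_id, "Alice", "alice@example.com"),
)

# Read
row = session.execute("SELECT name, email FROM users WHERE id = %s", (user_id,)).one()

# Update
session.execute("UPDATE users SET email = %s WHERE id = %s", ("alice@new.example", user_id))

# Delete
session.execute("DELETE FROM users WHERE id = %s", (user_id,))

cluster.shutdown()
```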
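The Sqoop/Spark bullet above mentions Spark SQL for faster testing and processing of data landed in HDFS. A rough PySpark sketch follows; the HDFS path and column names are illustrative assumptions, not the project's actual layout.

```python
# Rough PySpark sketch of the Spark SQL processing mentioned above.
# The HDFS path and customer_id column are illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("order-etl").getOrCreate()

# Read data previously landed in HDFS (for example by a Sqoop import).
orders = spark.read.option("header", True).csv("hdfs:///data/orders/")
orders.createOrReplaceTempView("orders")

# Spark SQL for a quick per-customer aggregation during testing.
summary = spark.sql(
    "SELECT customer_id, COUNT(*) AS order_count FROM orders GROUP BY customer_id"
)
summary.show()

spark.stop()
```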
