
Sr. Python Developer Resume

Dallas, TX

SUMMARY

  • 7+ years of experience as a Python developer, with proven expertise in using new tools and technical developments to drive improvements throughout the entire software development lifecycle.
  • Strong experience implementing data warehouse solutions in Confidential Redshift; worked on various projects to migrate data from on-premises databases to Confidential Redshift, RDS, and S3.
  • Involved in all stages of the Software Development Life Cycle, primarily in database architecture, logical and physical modeling, data warehouse/ETL development using MS SQL Server and Oracle 11g/10g, and ETL solutions/analytics application development.
  • Designed and developed APIs (RESTful and SOAP web services) for chatbot integration.
  • Hands-on experience architecting ETL transformation layers and writing Spark jobs to do the processing.
  • Worked with relational database systems like MySQL, MS SQL Server, Oracle, DB2, and PostgreSQL, and NoSQL database systems like Redis, MongoDB, CouchDB, Cassandra, and Wakanda.
  • Good at writing SQL queries, stored procedures, functions, packages, tables, views, and triggers.
  • Built CI/CD pipelines using Docker, Jenkins, and Marathon.
  • Proficient in developing RESTful web services in Python using XML and JSON.
  • Developed applications with a RESTful architecture using Node.js and Python as backend languages, and used NumPy for numerical analysis.
  • Designed and developed data warehouses using star and snowflake schemas, depending on business needs.
  • Implemented the Extract, Transform, and Load (ETL) strategy by creating SSIS packages to extract data from DB2, SQL Server, Excel files, other databases, and flat-file sources.
  • Experienced with big data platforms like Databricks, PySpark, AWS Glue, and EMR.
  • Worked on big data tools like HiveQL, PySpark, Jupyter notebooks, AWS Athena, and AWS Glue.
  • Enhanced configuration management using Puppet to enable automated, repeatable, and consistent configuration and application deployments.
  • Designed, implemented, and maintained solutions using Docker, Jenkins, and Git for microservices and continuous deployment.
  • Expertise with cloud platforms like Amazon AWS (S3, EC2); experience working with the pandas and NumPy libraries.
  • Hands-on in the design, development, testing, and implementation of web applications using Python, Django, HTML, XML, CSS, JavaScript, Bootstrap, jQuery, JSON, AngularJS, and Node.js.
  • Performed various parsing techniques using PySpark APIs to cleanse the data from Kafka.
  • Developed RESTful endpoints to expose application-specific data cached in in-memory data clusters like Redis.
  • Used SQLAlchemy as an Object-Relational Mapper (ORM) for writing database queries in Python (a minimal sketch follows this list).
  • Developed custom consumers and producers for Apache Kafka in Go for a car-monitoring system.
  • Designed a real-time analytics and ingestion platform using Storm and Kafka; wrote a Storm topology to accept events from the Kafka producer and emit them into Cassandra DB.
  • Experience developing web applications following the Model-View-Controller (MVC) architecture using server-side frameworks such as Django, Flask, web.py, Bottle, and Pyramid.
  • Built back-end applications with Python/Django; worked with Docker and Jenkins.
  • Hands-on experience implementing LDA and Naive Bayes; skilled in Random Forests, Decision Trees, Linear and Logistic Regression, SVMs, clustering, neural networks, and Principal Component Analysis.
  • Proficient in SQL databases (MS SQL Server, MySQL, Oracle) and NoSQL databases (MongoDB, Amazon DynamoDB).
  • Experienced in various types of testing, such as unit, integration, user acceptance, and functional testing.
  • Experience working with machine learning techniques such as segmentation networks, transfer learning, and deep learning.
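
As an illustration of the SQLAlchemy usage noted above, here is a minimal sketch; the User model, table, and column names are hypothetical and not taken from any project listed here.

# Minimal SQLAlchemy ORM sketch (hypothetical "users" table): map a class
# to a table and query it without hand-written SQL.
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    city = Column(String(50))

# An in-memory SQLite engine keeps the example self-contained; a real
# deployment would point at MySQL, PostgreSQL, Oracle, etc.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(User(name="Ada", city="Dallas"))
    session.commit()
    # ORM query instead of a raw SELECT statement
    dallas_users = session.query(User).filter_by(city="Dallas").all()
    print([u.name for u in dallas_users])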

TECHNICAL SKILLS

Languages: Python, Golang, Java, C, C++

Databases: MySQL, Oracle, Redshift, SQL Azure, SQL, MongoDB, Cassandra, PostgreSQL

Automation Tools: Terraform, CloudFormation, Chef, Puppet, Ansible, Docker, Kubernetes, Vagrant

Scripting Languages: Bash/shell scripting, PowerShell scripting, CSS, AJAX, PHP, JavaScript, jQuery

Servers: Tomcat, Apache 2.x/3.x, JBoss 4.x/5.x, WebLogic 8/9/10, WebSphere 4/5, TFS, Nginx, Azure, IIS, Red Hat Satellite

IDE/Tools: PyCharm, Jupyter, PySpark, Eclipse, Spyder

Project Management Tools: Jira, GitHub, Slack

MS Office: Excel, Word, PowerPoint, etc.

Cloud Services: AWS S3, EC2, Athena, Amazon EMR, Snowflake

PROFESSIONAL EXPERIENCE

Confidential - Dallas, TX

Sr. Python Developer

Responsibilities:

  • Developed views and templates for applications with MySQL, Python, and Django's view controller and templating language to create a user-friendly interface.
  • Performed backend application development with Django, Flask, JavaScript, Next.js, AngularJS, and MySQL.
  • Wrote Ansible playbooks with Python and SSH as the wrapper to manage configurations of AWS nodes, and tested the playbooks on AWS instances; used Python and Ansible scripts to provision development servers.
  • Built web applications using Python, Django, AWS, J2EE, PostgreSQL, MySQL, Oracle, and MongoDB.
  • Developed the backend of the application using the Flask framework.
  • Designed new features and improved existing features by fixing bugs in the code.
  • Extensively worked on data extraction from REST APIs, performed data munging and data modeling, and loaded the data into a Redshift database.
  • Used NumPy and pandas for data manipulation to check machine learning models.
  • Developed several REST APIs using Swagger and a microservices style of architecture, with Kafka as the message broker and MongoDB as the backend database.
  • Developed and tested many dashboard features using React, Next.js, Flask, CSS, and JavaScript.
  • Managed and supported Continuous Integration (CI) using Jenkins and Bitbucket.
  • Helped produce interactive API documentation for specific Python SDK methods to support custom requirements.
  • Developed a framework for converting existing PowerCenter mappings to PySpark (Python and Spark).
  • Created PySpark data frames to bring data from DB2 to Amazon S3.
  • Wrote Lambda functions in Python for AWS EMR and Lambda that invoke Python scripts to perform various transformations and analytics on large data sets, writing the results in various formats (a hedged sketch follows this list).
  • Validated the developed Lambda scripts and fixed the identified bugs.
  • Developed microservices using AWS EMR, Lambda, API Gateway, DynamoDB, and RDS according to the scenario.
  • Coded Snowflake data loaders using Python. Reorganized large volumes of data.
  • Created Python ETL pipelines for teardown and “trickle” data migration from different backends to Snowflake, SQL Server, and Vertica.
  • Provided guidance to the development team working on PySpark as the ETL platform.
  • Responsible for the ETL and orchestration process using Airflow and NiFi.
  • Added support for Amazon AWS S3 and RDS to host files and the database in the Amazon cloud.
  • Debugged the application by following messages in log files to isolate errors.
  • Worked with the decision-making team to find patterns for data processing; built an artificial neural network using TensorFlow in Python to identify a customer's probability of claiming insurance.
  • Developed an API modularizing existing Python modules with the help of the PyYAML library.
  • Utilized PyUnit, the Python unit test framework, for all Python applications.
  • Involved in various phases of testing, including installation, functional, regression, integration, non-functional, database, compatibility, and user acceptance testing.
  • Created a unit test/regression test framework for working code.
  • Developed test automation framework scripts using Python and Selenium WebDriver.
  • Assisted with writing effective user stories and dividing the stories into SCRUM tasks.
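
A hedged sketch of the Lambda pattern described above; the event shape matches a standard S3 trigger, but the bucket names, key handling, and CSV-to-JSON transform are illustrative assumptions rather than details of the actual engagement.

# Sketch of an AWS Lambda handler in Python: read a raw CSV object from S3,
# transform it, and write the result back as JSON. Names are hypothetical.
import csv
import io
import json

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # S3 put-event records carry the bucket and object key that fired us.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(body)))  # transform: CSV -> dicts

    out_key = key.rsplit(".", 1)[0] + ".json"
    s3.put_object(
        Bucket="analytics-output-bucket",  # hypothetical destination bucket
        Key=out_key,
        Body=json.dumps(rows).encode("utf-8"),
        ContentType="application/json",
    )
    return {"status": "ok", "rows": len(rows)}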

Environment: Python, Flask, AWS, Pyramid, Redis, Django, Docker, REST, GitHub, Swagger, Linux, NumPy, Node.js, AJAX, ReactJS, Angular 2, DevOps

Confidential - St. Louis, MO

Python Engineer

Responsibilities:

  • Participated in the complete SDLC process and used Python to develop website functionality.
  • Developed an intranet web application using J2EE architecture, with JSP for the user interfaces and Hibernate for database connectivity.
  • Designed and developed a data management system using MySQL, with application logic built on Python and Django APIs for database access.
  • Created documentation for all the components included in the React-Bootstrap page.
  • Used React JS for templating, faster compilation, and developing reusable components.
  • Established a microservices architecture using Docker and Kubernetes.
  • Worked with the OpenShift platform in managing Docker containers and Kubernetes clusters.
  • Participated in requirement gathering and worked closely with the architect in designing and modeling.
  • Worked on development of SQL and stored procedures on MySQL.
  • Designed and developed horizontally scalable APIs using Flask, with a Cassandra schema backing the APIs.
  • Designed a RESTful XML web service for handling AJAX requests.
  • Developed remote integrations with third-party platforms using RESTful web services.
  • Updated and maintained Jenkins for automated build jobs and deployments.
  • Implemented database access using the Django ORM.
  • Designed and created backend data access modules using PL/SQL stored procedures and Oracle.
  • Created RESTful API calls to the server and parsed output reports from Excel files.
  • Extensively used Python modules such as requests, urllib, and urllib2 for web crawling.
  • Used the pandas API to put the data into time-series and tabular form for easy timestamp data manipulation and retrieval (see the sketch after this list).
  • Used a test-driven approach for developing the application and implemented unit tests using the Python unit test framework.
  • Maintained existing Power BI reports and dashboards and made changes to them as per requirements.
  • Implemented code to perform CRUD operations on MySQL using Toad.
  • Deployed the web application on a Linux server using Bash scripts.
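
The pandas time-series bullet above refers to the pattern sketched below; the column names and readings are made up for illustration.

# Index tabular data by timestamp so it can be sliced and resampled easily.
import pandas as pd

df = pd.DataFrame(
    {
        "timestamp": pd.date_range("2021-01-01", periods=6, freq="h"),
        "reading": [10, 12, 9, 14, 11, 13],
    }
).set_index("timestamp")

daily_mean = df.resample("D").mean()                    # daily aggregation
window = df.loc["2021-01-01 01:00":"2021-01-01 03:00"]  # timestamp slicing
print(daily_mean, window, sep="\n")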

Environment: Python, Django Framework, React JS, CSS, PyCharm, SQL, MySQL, LAMP, jQuery, Kubernetes, Adobe Dreamweaver, Apache web server

Confidential - San Rafael, CA

Python Engineer

Responsibilities:

  • Worked on designing and developing the Real-Time Tax Computation Engine using Oracle, StreamSets, Kafka, Spark Structured Streaming, and MySQL.
  • Generated Python Django forms to record data from online users.
  • Created a PHP/MySQL back-end for data entry from Flash.
  • Worked with a marketing company to build several Django, Pyramid, Flask, and CherryPy applications.
  • Performed advanced procedures like text analytics and processing, using the in-memory computing capabilities of Spark with Scala.
  • Designed and built the reporting application that uses Spark SQL to fetch and generate reports on HBase table data.
  • Implemented Spark using Scala, utilizing DataFrames and the Spark SQL API for faster processing of data.
  • Involved in the ingestion, transformation, manipulation, and computation of data using StreamSets, Kafka, MySQL, and Spark.
  • Involved in data ingestion into MySQL using a Kafka-MySQL pipeline, with full and incremental loads from a variety of sources like web servers, RDBMS, and data APIs.
  • Worked on Spark data sources, Spark DataFrames, Spark SQL, and streaming using Scala.
  • Worked extensively with AWS components such as Elastic MapReduce (EMR), Elastic Compute Cloud (EC2), Simple Storage Service (S3), and Amazon SQS.
  • Experience developing Spark applications using Scala and SBT.
  • Experience integrating the Spark-MySQL connector and JDBC connector to save data processed in Spark to MySQL.
  • Responsible for creating tables and MySQL pipelines that are automated to load data into the tables from Kafka topics.
  • Performed a POC to compare the time taken for Change Data Capture (CDC) of Oracle data across Stream, StreamSets, and DB Visit.
  • Developed a data warehouse model in Snowflake for many datasets using WhereScape.
  • Heavily involved in testing Snowflake to understand the best possible way to use the cloud resources.
  • Scheduled different Snowflake jobs using NiFi.
  • Used NiFi to ping Snowflake to keep the client session alive.
  • Expertise in using different file formats like text files, CSV, Parquet, and JSON.
  • Experience writing custom compute functions using Spark SQL and performing interactive querying.
  • Responsible for masking and encrypting sensitive data on the fly.
  • Responsible for creating multiple applications to read data from different Oracle instances into Kafka topics using Stream.
  • Responsible for setting up a MySQL cluster on AWS EC2 instances.
  • Experience in real-time data streaming using Spark with Kafka (a sketch follows this list).
  • Responsible for creating a Kafka cluster using multiple brokers.
  • Experience working with Vagrant boxes to set up local Kafka and StreamSets pipelines.
  • Wrote scripts and an indexing strategy for a migration to Confidential Redshift from SQL Server and MySQL databases.
  • Worked on AWS Data Pipeline to configure data loads from S3 into Redshift.
  • Used JSON schema to define table and column mapping from S3 data to Redshift.
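
The Spark-with-Kafka streaming work above follows the shape sketched below; the broker address, topic name, and event schema are illustrative assumptions, and a real job would write to MySQL (for example via foreachBatch) rather than the console.

# Spark Structured Streaming: consume JSON events from Kafka and parse them.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# Hypothetical schema for the JSON payload carried in the Kafka value.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "tax-events")                 # hypothetical topic
    .load()
    # Kafka delivers raw bytes; decode the value and parse the JSON payload.
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = events.writeStream.outputMode("append").format("console").start()
query.awaitTermination()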

Environment: Oracle, StreamSets, Kafka, Spark Structured Streaming, MySQL, AWS, JDBC, text files, CSV, Spark-MySQL.

Confidential

Python Developer

Responsibilities:

  • Built an invoice application in the data platform to create, analyze, visualize, and predict in commercial real estate.
  • Built Python scripts to read and write JSON files (a minimal sketch follows this list).
  • Observed the data behavior and made reusable scripts to help other teams handle similar cases.
  • Administered the application, related databases, and the hosting environments.
  • Architected and detailed the Position, PnL, and FICC data migration process from Sybase IQ, SQL Server, and Oracle into Snowflake and Vertica.
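
A minimal sketch of the JSON read/write scripts mentioned above; the file name and record shape are illustrative.

# Write a list of records to a JSON file, then read it back and summarize.
import json
from pathlib import Path

path = Path("invoices.json")  # hypothetical data file

records = [{"invoice_id": 1, "amount": 2500.0, "property": "Office A"}]
path.write_text(json.dumps(records, indent=2))

loaded = json.loads(path.read_text())
total = sum(r["amount"] for r in loaded)
print(f"{len(loaded)} invoice(s), total {total}")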

Environment: Apache Spark, HBase, Kibana, AWS, Cassandra, Flume, Oozie, ETL tools, Hive/Pig.
