
Senior Python Developer Resume


Houston, TX

SUMMARY

  • 7+ years of experience as a Python Developer; proficient in multiple languages and environments, including Python, C, C++, HTML, JavaScript, and SQL.
  • Worked with several standard Python packages, including Pandas, Matplotlib, Beautiful Soup, httplib2, Jinja2, NumPy, PySide, SciPy, wxPython, PyTables, Requests, Urllib, SQLAlchemy, MySQLDB, and XMLDocx.
  • Developed various Python scripts for report generation, FIX message simulation (FIX Simulator), SOAP requests, TCP/IP programming, and multiprocessing jobs.
  • Good experience developing web applications and implementing Model View Controller (MVC) architecture using server-side frameworks such as Django, Django REST, Flask, and Pyramid.
  • Extensive experience in front-end development using HTML/HTML5, XML, DHTML, CSS/CSS3, SASS, LESS, JavaScript, ReactJS, Redux, AngularJS (1.x), jQuery, JSON, Node.js, Ajax, and Bootstrap.
  • Strong experience using REST web services for data communication between remote systems; designed, developed, and tested REST interfaces in Java.
  • Expertise in working with relational databases such as Microsoft SQL Server, Oracle, MySQL, and PostgreSQL, and good knowledge of NoSQL databases such as MongoDB, HBase, and Cassandra.
  • Proficient in developing complex SQL queries, stored procedures, functions, and packages, along with performing DDL and DML operations on the database.
  • Hands-on experience in data mining and data warehousing using ETL tools; proficient in building reports and dashboards in Pentaho (BI tool).
  • Excellent working knowledge of UNIX and Linux shell environments and command-line utilities.
  • Created and stress-tested standalone and web applications and generated graph reports.
  • Experience working on Big Data Platform as a Service (BD PaaS).
  • Experience analyzing data using big data tools such as HiveQL, along with HBase and Oozie.
  • Extensively used SQL, NumPy, Pandas, scikit-learn, Spark, and Hive for data analysis and model building.
  • Extensively worked with Hadoop, Hive, Spark, and Cassandra to build ETL and data processing systems spanning various data sources, targets, and formats.
  • Excellent knowledge of machine learning, mathematical modeling, and operations research; comfortable with R, Python, and relational databases; deep understanding of and exposure to the big data ecosystem.
  • Experienced in implementing cloud solutions in AWS (EC2, EMR, S3, CloudWatch, Lambda, CloudTrail, SNS, SES, EBS, CLI, VPC, ELB, IAM, Redshift, RDS, Route 53), Google Cloud, and Microsoft Azure.
  • Strong knowledge of Data Structures and Algorithms, Object Oriented Analysis, Machine learning and software design patterns.
  • Expertise in implementing Spark using Scala and Spark SQL for faster testing and processing of data.
  • Hands-on experience with Spark/Scala programming and good knowledge of Spark architecture and its in-memory processing; experienced with unit testing, test-driven development (TDD), and load testing.
  • Hands-on experience working with WAMP (Windows, Apache, MySQL, Python/PHP) and LAMP (Linux, Apache, MySQL, Python/PHP) architectures.
  • Expertise in production support; provided first-, second-, and third-level support to different organizations; used Jira, GitLab, pdb, gdb, and other debugging tools and deployed production hotfixes.
  • Experience with Agile, Scrum, and Waterfall methodologies; used ticketing systems such as Jira and Bugzilla.
  • Knowledge of testing and deployment tools including Heroku, Jenkins, pylint, Cppcheck, and Coverity.
  • Expert at version control systems like Git, GitHub, SVN and CVS. Migrated repos from SVN to GitHub.
  • Familiarity with development best practices such as code reviews, unit testing, system integration testing (SIT) and user acceptance testing (UAT).
  • Experience in writing test scripts, test cases, and test specifications, and in measuring test coverage.
  • Experience in test-driven development for functional and integration testing, using PyUnit for unit testing and the Robot and Selenium frameworks for test automation (a minimal sketch follows this list).
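
Illustrative of the PyUnit-style unit testing noted in the last bullet, the sketch below is a minimal, hypothetical example; the build_heartbeat function and its FIX-style output are placeholders, not code from the projects described above.

    import unittest


    def build_heartbeat(seq_num):
        """Toy stand-in for a FIX message builder (hypothetical example)."""
        return "35=0|34={}|".format(seq_num)


    class TestFixMessageBuilder(unittest.TestCase):
        def test_heartbeat_contains_sequence_number(self):
            # The sequence number passed in should appear in tag 34.
            self.assertIn("34=42", build_heartbeat(42))

        def test_heartbeat_is_admin_message(self):
            # Heartbeats are identified by MsgType 35=0.
            self.assertTrue(build_heartbeat(1).startswith("35=0"))


    if __name__ == "__main__":
        unittest.main()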

TECHNICAL SKILLS

Programming Languages: Python, R, C, C++, C# and Core Java

Python Frameworks and Libraries: Django, Flask, Pandas, Matplotlib, Beautiful Soup, httplib2, Jinja2, NumPy, PySide, SciPy, wxPython, PyTables, Requests, Urllib, SQLAlchemy, MySQLDB, XMLDocx.

Web Technologies: HTML, CSS, JavaScript, Bootstrap, jQuery, AJAX, XML, AngularJS, Node.js

Development/Deployment Tools: Sublime Text, Eclipse, Emacs, Jenkins, Coverity, pylint, PyCharm, Spyder, Docker, Kubernetes

Databases: Microsoft SQL Server, Oracle, MySQL, MS Access, PostgreSQL

NoSQL Databases: MongoDB, Cassandra, HBase.

Cloud Technologies: AWS, Amazon EC2, S3, EMR, Lambda, Redshift, Heroku, MS Azure.

Operating Systems: Linux 4.x/5.x/6.x, Ubuntu, Red Hat Linux, Windows Server 2008/2012, IBM AIX.

Version Controls: CVS, SVN, Git, GitHub.

Testing, Issue Tracking and Debugging Tools: Jira, pdb, gdb, Bugzilla, GitLab.

Automation Testing: Selenium, Robot, PyTest.

Development Methodologies: Agile, SCRUM and Waterfall.

PROFESSIONAL EXPERIENCE

Confidential, Houston, TX

Senior Python Developer

Responsibilities:

  • Used Python and Django for backend development, Bootstrap and Angular for the frontend, and MongoDB as the database.
  • Designed and developed the complete admin module, resolved issues, and enhanced it with additional features.
  • Developed test automation framework scripts using Python and Selenium WebDriver.
  • Tested application compatibility for dynamic and static content across browsers such as Chrome, IE, Edge, Firefox, and Safari.
  • Set up Selenium Grid to run automation scripts on different browsers.
  • Implemented automation using Jenkins and Ansible on UNIX/Linux hosts and in Docker containers.
  • Automated MySQL container deployment in Docker using Python and monitored these containers.
  • Automated routine tasks using Python scripting, Raspberry Pi with Raspbian OS, and UNIX/Linux shell scripting.
  • Developed Django ORM queries that pre-load related data, reducing the number of database queries needed to retrieve the same data (see the sketch after this list).
  • Developed RESTful microservices using Flask and Django and deployed them on AWS servers using EBS and EC2.
  • Developed views and templates with Django's view controller and templating language to create a user-friendly website interface.
  • Identified and created issues and bugs based on the User Stories in JIRA.
  • Worked on REST web services for data communication between remote systems; designed, developed, and tested REST interfaces in Java.
  • Used JavaScript for data validation and designed validation modules.
  • Developed the project's web page as a Single Page Application (SPA) using AngularJS and JavaScript APIs, and built the delivery driver application.
  • Used the Pandas API to organize data in time-series and tabular formats for local timestamp manipulation and retrieval, and stored it in MongoDB.
  • Deployed and tested different modules in Docker containers and managed the code in Git.
  • Used the Python library Beautiful Soup for web scraping to extract data for building graphs.
  • Uploaded the admin module to Elastic Beanstalk (EBS) and EC2 and stored static files in S3 on the Amazon cloud.
  • Tracked automation results daily to improve testing performance.
  • Followed Agile testing methodology, participated in daily Scrum meetings and tested each Sprint deliverables.
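
As referenced in the Django ORM bullet above, the following is a minimal sketch of the query pre-loading pattern, assuming hypothetical Customer and Order models inside a configured Django app; it is an illustration, not an excerpt from the project.

    from django.db import models


    class Customer(models.Model):
        name = models.CharField(max_length=100)


    class Order(models.Model):
        customer = models.ForeignKey(Customer, on_delete=models.CASCADE)
        total = models.DecimalField(max_digits=10, decimal_places=2)


    def orders_with_customers():
        # select_related issues one JOINed query instead of an extra
        # query per order when order.customer is accessed later.
        return Order.objects.select_related("customer").all()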

Environment: Python, Django, Django REST, Flask, AngularJS, JavaScript, Jenkins, Docker containers, GitHub, Ansible, Java, HTML5/CSS, Bootstrap, Selenium & Robot Framework, Jira, MS SQL Server 2013, AWS, S3, EC2, EBS, MySQL, PyCharm, Linux, Shell Scripting.

Confidential, Framingham, MA

Python Full Stack Engineer

Responsibilities:

  • Developed Django forms to record data and built the login module page for users.
  • Designed email marketing campaigns and created interactive forms that saved data into the database using the Django framework.
  • Developed an automated process for handling payer acknowledgements that resubmitted medical claims automatically using Perl and DBI modules.
  • Worked with technologies including Qt, QML, C++, QNX, UML, JavaScript, JSON, and unit testing.
  • Used Python collections for manipulating and looping through user-defined objects, and practiced test-driven development with Behave in Python.
  • Used UNIX/Linux shell scripting for job scheduling, batch-job automation, and forking and cloning jobs.
  • Wrote Python routines to log into the websites and fetch data for selected options.
  • Improved data processing performance by modifying functions, queries, cursors, triggers, and stored procedures in the MySQL database.
  • Extracted data from XML files using Perl XML modules.
  • Developed a consumer-friendly front end with an easy-to-use UI backed by fast RESTful API access.
  • Worked with JSON based REST Web services.
  • Used OpenNLP and Stanford NLP APIs for natural language processing and sentiment analysis.
  • Embedded AJAX in UI to update small portions of the web page avoiding the need to reload the entire page.
  • Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
  • Used Amazon Web Services (AWS) for improved efficiency of storage and fast access.
  • Added support for Amazon AWS S3 and RDS to host static/media files and the database into Amazon Cloud.
  • Participated in the complete SDLC process and used PHP to develop website functionality.
  • Designed and developed the UI of the website using HTML, XHTML, AJAX, unittest, GCP, CSS, and JavaScript.
  • Developed entire frontend and backend modules using Python with the Django web framework.
  • Designed and developed data management system using MySQL.
  • Used Django APIs for database access and built application logic using Python.
  • Provided GUI utilizing PyQt for the end user to create, modify and view reports based on client data.
  • Used Python to extract information from XML files.
  • Expertise in Service-Oriented Architecture (SOA) and related technologies such as web services, BPEL, WSDLs, AWS, SOAP, APIs, XML, XSD, and XSLT.
  • Developed SQL and stored procedures on MySQL and designed the Cassandra schema for the APIs.
  • Designed and developed horizontally scalable APIs using Python Flask, implemented monitoring, and established best practices around using Elasticsearch (see the sketch after this list).
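
The following is a minimal sketch of a Flask REST endpoint of the kind described in the last bullet; the /reports resource and its in-memory store are hypothetical stand-ins for the real service and database.

    from flask import Flask, jsonify, request

    app = Flask(__name__)
    _reports = {}  # toy in-memory store standing in for a real database


    @app.route("/reports/<int:report_id>", methods=["GET"])
    def get_report(report_id):
        # Return the stored report or a 404 if the id is unknown.
        report = _reports.get(report_id)
        if report is None:
            return jsonify({"error": "not found"}), 404
        return jsonify(report)


    @app.route("/reports", methods=["POST"])
    def create_report():
        # Accept a JSON payload and assign it the next sequential id.
        payload = request.get_json(force=True)
        report_id = len(_reports) + 1
        _reports[report_id] = payload
        return jsonify({"id": report_id}), 201


    if __name__ == "__main__":
        app.run(debug=True)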

Confidential, Tampa, FL

Python Data Engineer

Responsibilities:

  • Utilized Apache Spark with Python to develop and execute big data analytics and machine learning applications; executed machine learning use cases with Spark ML and MLlib (a minimal sketch appears after this list).
  • Identified areas of improvement in the existing business by unearthing insights from vast amounts of data using machine learning techniques.
  • Interpreted and solved business problems using data analysis, data mining, optimization tools, machine learning techniques, and statistics.
  • Designed and developed NLP models for sentiment analysis.
  • Led discussions with users to gather business process and data requirements and developed a variety of conceptual, logical, and physical data models; expert in business intelligence and data visualization tools such as Tableau and MicroStrategy.
  • Created data pipelines using Apache Spark, a big-data processing and computing framework, and updated and maintained Jenkins for automated build and deployment jobs.
  • Worked on machine learning over large-scale data using Spark and MapReduce.
  • Led the implementation of new statistical algorithms and operators on big data and SQL platforms, utilizing optimization techniques, linear regression, K-means clustering, Naive Bayes, and other approaches.
  • Developed Spark/Scala and Python code for a regular expression (regex) project in the big data Hive environment on Linux/Windows.
  • Extracted, transformed, and loaded data sources to generate CSV data files using Python and SQL queries.
  • Stored and retrieved data from data-warehouses using Amazon Redshift.
  • Worked on Teradata SQL queries, Teradata indexes, and utilities such as MLoad, TPump, FastLoad, and FastExport.
  • Used data warehousing concepts such as the Ralph Kimball and Bill Inmon methodologies, OLAP, OLTP, star schema, snowflake schema, fact tables, and dimension tables.
  • Refined time-series data and validated mathematical models using analytical tools like R and SPSS to reduce forecasting errors.
  • Performed data pre-processing and cleaning for feature engineering, and applied data imputation techniques for missing values in the dataset using Python.
  • Familiar with TCP/IP, IPv4, and IPv6 protocols in an environment providing multithreading, multi-tenancy, and high-availability support at the network layer.
  • Created data quality scripts using SQL and Hive to validate successful data loads and data quality, and created various types of data visualizations using Python and Tableau.
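
A minimal sketch of a Spark ML K-means use case of the kind referenced in the first bullet above; the toy two-column dataset and feature names are hypothetical and stand in for the real input tables.

    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.clustering import KMeans

    spark = SparkSession.builder.appName("kmeans-sketch").getOrCreate()

    # Toy two-feature dataset standing in for the real source tables.
    df = spark.createDataFrame(
        [(1.0, 1.1), (0.9, 1.0), (8.0, 8.2), (8.1, 7.9)],
        ["x", "y"],
    )

    # Spark ML estimators expect a single vector column of features.
    features = VectorAssembler(inputCols=["x", "y"], outputCol="features").transform(df)

    # Fit K-means with two clusters and attach a cluster id to each row.
    model = KMeans(k=2, seed=42, featuresCol="features").fit(features)
    model.transform(features).select("x", "y", "prediction").show()

    spark.stop()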

Environment: Python, Django, Big Data, MapReduce, Spark, Spark MLlib, Tableau, SQL, Excel, VBA, SAS, MATLAB, AWS, SPSS, Cassandra, Oracle, MongoDB, SQL Server 2012, DB2, T-SQL, PL/SQL, XML
