
Python Developer Resume


San Jose, CA

SUMMARY:

  • Over 6 years of experience as a Web/Application Developer, with analytical programming in Python, Django, and Java.
  • 2+ years of experience as a Data Scientist, with extensive experience in Data Mining, Statistical Data Analysis, Exploratory Data Analysis, and Machine Learning on various forms of data.
  • Experienced with the full software development life cycle, architecting scalable platforms, object-oriented programming, database design, and agile methodologies.
  • Experienced in MVW frameworks such as Django and AngularJS, along with JavaScript, jQuery, and Node.js.
  • Expert knowledge of and experience in object-oriented design and programming (OOP) concepts, applied in Python and C++.
  • Experience with the OpenStack Python APIs.
  • Experienced in developing web-based applications using Python, Django, C++, XML, CSS, HTML, DHTML, JavaScript, and jQuery.
  • Experienced in working with various Python integrated development environments, including NetBeans, PyCharm, PyScripter, Spyder, PyStudio, PyDev, and Sublime Text.
  • Well versed with design and development of presentation layer for web applications using technologies like HTML, CSS and JavaScript.
  • Familiar with JSON-based REST web services and Amazon Web Services.
  • Experience in writing subqueries, stored procedures, triggers, cursors, and functions on MySQL and PostgreSQL databases.
  • Experienced in Agile and Waterfall methodologies, with high-quality deliverables delivered on time.
  • Expertise in querying RDBMSs such as MySQL and SQL Server using SQL to ensure data integrity.
  • Very strong full life cycle application development experience.
  • Strong knowledge of DevExpress controls.
  • Experience with continuous integration and automation using Jenkins.
  • Experience with unit testing, test-driven development (TDD), and load testing.
  • Experience in developing ColdFusion components, custom tags, and modified CF objects. Worked with an AJAX framework to transform DataSets and DataTables into HTTP-serializable JSON strings.
  • Developed the required XML Schema documents and implemented the framework for parsing XML documents.
  • Have the ability to understand complex systems and be in command of the details to provide solutions. Maintained detailed documentation and architectural solutions in IT infrastructure and sales systems.
  • Ability to learn and adapt quickly to the emerging new technologies and paradigms.
  • Excellent communication, interpersonal and analytical skills and a highly motivated team player with the ability to work independently.
  • Experienced in Machine Learning regression algorithms such as simple, multiple, and polynomial regression, Support Vector Regression (SVR), Decision Tree Regression, and Random Forest Regression.
  • Experienced in Advanced Statistical Analysis and Predictive Modeling in structured and unstructured data environment.
  • Strong expertise in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Lineage, Data Integration, Master Data Management (MDM), Metadata Management Services, and Reference Data Management (RDM).
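Purely as an illustration of the simple-regression item above (not project code), ordinary least squares for a single feature can be sketched in a few lines of Python; the sample data is made up, and real project work relied on dedicated libraries:

```python
# Simple linear regression via ordinary least squares -- a minimal
# pure-Python sketch (production work would use scikit-learn or R).
def fit_simple_regression(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical sample data lying exactly on y = 2x + 1
slope, intercept = fit_simple_regression([1, 2, 3, 4], [3, 5, 7, 9])
print(slope, intercept)  # 2.0 1.0
```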

TECHNICAL SKILLS:

Programming Languages: Python 2.x/3.x, Go, Scala, C, C++, Java, SQL

Python Libraries: Requests, Scrapy, wxPython, Pillow, SQLAlchemy, BeautifulSoup, Twisted, NumPy, SciPy, Matplotlib, Pygame, Pyglet, PyQt, PyGTK, Scapy, pywin32, NLTK, nose, SymPy, IPython

Web Frameworks: Django, Flask, Pyramid, TurboGears, Muffin, CherryPy

GUI Frameworks: PyJamas, GnomePython, gui2py, PyFltk, PyForms, PyGtk, PySide, TkInter

Version Control Tools: Concurrent Versions System (CVS), Subversion (SVN), Git, Mercurial

Automation Tools: doit, Buildbot, Chef, Puppet, Ansible, Docker

Testing Tools: Unittest, pytest, pythoscope, PyMock, Mocker, antiparser, webunit, webtest, PAMIE, Selenium, Splinter, PyFIT, PyUseCase, Automa, PyChecker

IDE: NetBeans, Thonny, Komodo, PyCharm, PyDev, PyScripter, Pyshield, Spyder, PyStudio

Databases: MySQL, PostgreSQL; NoSQL: MongoDB, Cassandra

Bug Tracking Tools: Bugzilla, Jira, HP ALM/Quality Center, IBM Rational ClearQuest

Operating Systems: IBM OS/2 Warp, Windows 98/NT/2000/XP/Vista/7/8, Unix/Linux, Sun Solaris

Reporting & Visualization: Tableau, Matplotlib, Seaborn, ggplot, SAP Business Objects, Crystal Reports, SSRS, Cognos, Shiny

Data Modeling Tools: Erwin Data Modeler, Erwin Model Manager, ER Studio, Power Designer

PROFESSIONAL EXPERIENCE:

Confidential, San Jose, CA

Python Developer

Responsibilities:

  • Responsible for gathering requirements, system analysis, design, development, testing and deployment.
  • Developed tools using Python, shell scripting, and XML to automate menial tasks. Interfaced with supervisors, artists, systems administrators, and production staff to ensure production deadlines were met.
  • Developed Business Logic using Python on Django Web Framework.
  • Designed and developed the UI of the website using HTML, AJAX, CSS and JavaScript.
  • Designed and developed data management system using MySQL.
  • Wrote scripts using Python modules and libraries to develop programs that improve the processing of access requests.
  • Utilized PyUnit, the Python unit test framework, for all Python applications, and used Django Database APIs to access database objects.
  • Used jQuery and AJAX calls to transmit JSON data objects between the frontend and controllers.
  • Involved in building database Model, APIs and Views utilizing Python, in order to build an interactive web-based solution.
  • Used Python-based GUI components for front-end functionality such as selection criteria, and created a test harness to enable comprehensive testing using Python.
  • Used Amazon Web Services (AWS) for improved efficiency of storage and fast access.
  • Added support for Amazon AWS S3 and RDS to host static/media files and the database in the Amazon cloud. Involved in front-end work, utilizing Bootstrap and Angular.js for page design.
  • Created data tables utilizing PyQt to display customer and policy information and to add, delete, and update customer records.
  • Used PyQuery for selecting particular DOM elements when parsing HTML.
  • Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
  • Used Wireshark, Live HTTP Headers, and the Fiddler debugging proxy to debug the Flash object and help the developer create a functional component.
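The jQuery/AJAX-to-controller JSON exchange described in the bullets above can be sketched roughly as follows; the field names and helper functions are hypothetical, and the actual project served this data from Django views:

```python
import json

# Controller-side sketch: turn a customer record (hypothetical fields)
# into an HTTP-serializable JSON string for a jQuery/AJAX frontend.
def customer_to_json(customer):
    payload = {
        "id": customer["id"],
        "name": customer["name"],
        "policies": customer.get("policies", []),
    }
    return json.dumps(payload)

# Frontend-side sketch: parse the response body back into a dict.
def parse_response(body):
    return json.loads(body)

record = {"id": 7, "name": "A. Customer", "policies": ["auto"]}
body = customer_to_json(record)
print(parse_response(body)["name"])  # A. Customer
```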

Environment: Python, Django, Shell Scripting, AWS, Pandas, PyQt, PyQuery, Wireshark, Flash, DOM, JSON, PHP, HTML5, CSS3, AJAX, JavaScript, Angular.js, Bootstrap, Apache Web Server, MySQL, GitHub, Linux.

Confidential, San Jose, CA

Data Scientist / Python Developer

Responsibilities:

  • Responsible for predictive analysis of credit scoring, predicting whether credit extended to a new or existing applicant is likely to result in profit or loss.
  • Improved classification of bank authentication protocols by 20% by applying clustering methods to transaction data, using Python Scikit-learn locally and Spark MLlib at the production level.
  • Extracted data extensively using SQL queries, and used R and Python packages for data mining tasks.
  • Performed Exploratory Data Analysis, Data Wrangling, and development of algorithms in R and Python for data mining and analysis.
  • Implemented Natural Language Processing (NLP) methods and pre-trained word2vec models for the improvement of in-app search functionality.
  • Involved in transforming data from legacy tables to HDFS and HBase tables using Sqoop. Researched reinforcement learning and control (TensorFlow, Torch) and machine learning models (Scikit-learn).
  • Used Python based data manipulation and visualization tools such as Pandas, Matplotlib, Seaborn to clean corrupted data before generating business requested reports.
  • Developed models relying on, but not limited to, Random Forest, logistic and linear regression, stepwise regression, Support Vector Machines, Naive Bayes classifiers, ARIMA/ETS models, and k-centroid clustering.
  • Built various Machine Learning classification models (Random Forest, Decision Trees, Naive Bayes).
  • Extracted data from HDFS and prepared data for exploratory analysis using Data Munging.
  • Extensively used R packages such as ggplot2, ggvis, caret, and dplyr on huge data sets.
  • Used the R and Python programming languages to graphically analyze the data and perform data mining.
  • Did extensive data mining to find relevant features in an anonymized dataset using R and Python. Used an ensemble of XGBoost models (tuned using random search) to make predictions.
  • Explored 5 supervised Machine Learning algorithms (regression, Random Forest, SVM, Decision Tree, Neural Network) and used metrics such as precision, adjusted R-squared, and residual splits to select the winning model of the 5.
  • Developed Tableau-based dashboards from Oracle and SQL Server databases to present data visualizations to the business team.
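The clustering work above relied on Scikit-learn locally and Spark MLlib in production; purely as an illustration of the underlying idea, a bare-bones one-dimensional k-means loop (with made-up data) looks like this:

```python
# Bare-bones 1-D k-means -- an illustrative sketch only; the real work
# used scikit-learn's KMeans locally and Spark MLlib in production.
def kmeans_1d(points, centers, iters=10):
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in centers]
        for p in points:
            idx = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[idx].append(p)
        # Update step: move each center to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

# Hypothetical transaction amounts forming two obvious groups
data = [1.0, 1.2, 0.8, 10.0, 10.4, 9.6]
print(kmeans_1d(data, [0.0, 5.0]))  # roughly [1.0, 10.0]
```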

Environment: R (dplyr, caret, ggplot2), Python (NumPy, Pandas, PySpark, Scikit-learn, Matplotlib, NLTK), T-SQL, MS SQL Server, RStudio, Spyder, Jupyter Notebook, TensorFlow, Theano, Caffe, MATLAB, ETL, HDFS, Scala, Shiny, H2O, Oracle, Teradata, Java, Tableau, Supervised & Unsupervised Learning

Confidential, Fort Lauderdale, FL

Data Analyst

Responsibilities:

  • Worked on Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect as ETL tools on Oracle, DB2, and SQL Server databases.
  • Created data transformations from internal and third-party data sources into data suitable for handheld devices, including XML.
  • Implemented SDLC methodologies including RUP, RAD, Waterfall, and Agile.
  • Conducted gap analysis to assess the variance between system capabilities and business requirements.
  • Involved in defining source-to-target data mappings, business rules, and data definitions.
  • Created logical/physical data models in ERwin and worked on loading tables into the Data Warehouse. Documented various Data Quality mapping documents.
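The source-to-target mappings above were defined in Informatica rather than code; purely as a rough illustration, the kind of rename-and-cleanse rule they encode can be sketched in Python (all field names and rules here are hypothetical):

```python
# Illustrative source-to-target mapping sketch (field names hypothetical);
# the actual mappings were defined in Informatica PowerCenter.
MAPPING = {
    "cust_nm": "customer_name",   # rename
    "dob": "date_of_birth",       # rename
    "bal": "account_balance",     # rename + cast below
}

def transform(source_row):
    """Apply renames and simple data-quality rules to one source row."""
    target = {MAPPING[k]: v for k, v in source_row.items() if k in MAPPING}
    # Example business rules: trim/normalize names, cast balances to float.
    target["customer_name"] = target["customer_name"].strip().title()
    target["account_balance"] = float(target["account_balance"])
    return target

row = {"cust_nm": "  jane doe ", "dob": "1990-01-01", "bal": "120.50"}
print(transform(row))
```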

Environment: Informatica Analyst 9.6.1, ETL, Agile, Waterfall, XML, Teradata, SharePoint, SSIS, SSRS, MicroStrategy, UML, SQL Server, MS Visio, Machine Learning, SQL, Oracle, Erwin
