
Python Developer Resume


Mountain View, CA

SUMMARY:

  • Around 4 years of experience in Web Application Development and Design. Expertise in Object-Oriented Concepts, Object-Oriented Design (OOD), and Object-Oriented Analysis (OOA) programming.
  • Experienced in Agile methodologies, Scrum stories, and sprints in a Python-based environment, along with data analytics, data wrangling, and Excel data extracts.
  • Proficient in developing Web Services (SOAP, RESTful) in Python using XML and JSON. Developed web-based applications using the Django framework.
  • Excellent knowledge and experience in Oracle Database, JDBC, DB2, PL/SQL, MS SQL Server, MySQL, and MongoDB.
  • Developed tools to automate routine tasks using Python, shell scripting, and XML.
  • Experience working in various Software Development Methodologies like Waterfall, Agile SCRUM and TDD.
  • Hands-on UML-compliant high-level design with data flow diagrams, class diagrams, sequence diagrams, activity diagrams, and use cases, documented for peer developers.
  • Solid understanding of Design Patterns, MVC, and Python algorithms and data structures.
  • Professionally qualified Data Scientist/Data Analyst with over 2 years of experience in Data Science and Analytics, including Machine Learning, Data Mining, and Statistical Analysis, along with IT experience as a data analyst. Expertise in statistical data analysis: transforming business requirements into analytical models, designing algorithms, and building strategic solutions that scale across massive volumes of data.
  • Proficient in advising on the use of data for compiling personnel and statistical reports and preparing personnel action documents, as well as identifying patterns within data, analyzing data, and interpreting results.
  • Skilled Data Scientist with experience in the full software development life cycle (SDLC) using Agile and Scrum methodologies.
  • Involved in all the phases including data extraction, data cleaning, statistical modeling and data visualization with large data sets of structured and unstructured data.
  • Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Integration and Metadata Management Services and Configuration Management.
  • Strong skills in statistical methodologies such as A/B testing, experiment design, hypothesis testing, and ANOVA (see the sketch after this list).
  • Solid ability to write and optimize diverse SQL queries; working knowledge of RDBMSs like SQL Server 2008 and NoSQL databases like MongoDB 3.2.
  • Strong experience in Big Data technologies like Spark 1.6, Spark SQL, PySpark, Hadoop 2.x, HDFS, and Hive 1.x.
  • Experience in visualization tools like Tableau 9.x and 10.x for creating dashboards. Excellent understanding of Agile and Scrum development methodology.
  • Performed database work on the Confidential platform, with hands-on experience in EC2, S3, Redshift, Snowflake, and Databricks. Worked with NoSQL databases including HBase, Cassandra, and MongoDB.
  • Experienced in Big Data with Hadoop, HDFS, MapReduce, and Spark.
  • Experienced in data integration validation and data quality controls for ETL processes and Data Warehousing using MS Visual Studio (SSIS, SSAS, SSRS).
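A minimal sketch of the A/B testing and ANOVA workflow referenced above, using SciPy; the conversion data here is simulated purely for illustration and does not reflect any real campaign.

```python
import numpy as np
from scipy import stats

# Simulated conversion outcomes for two campaign variants (1 = converted) -- illustrative only.
rng = np.random.default_rng(42)
variant_a = rng.binomial(1, 0.11, size=5000)
variant_b = rng.binomial(1, 0.13, size=5000)

# Two-sample t-test: is the difference in conversion rate statistically significant?
t_stat, p_value = stats.ttest_ind(variant_a, variant_b)
print(f"A rate: {variant_a.mean():.3f}, B rate: {variant_b.mean():.3f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# One-way ANOVA generalizes the comparison to three or more variants.
variant_c = rng.binomial(1, 0.12, size=5000)
f_stat, p_anova = stats.f_oneway(variant_a, variant_b, variant_c)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")
```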

TECHNICAL SKILLS:

Core Tech Skills: Python, Django, JavaScript, Angular.js, Node.js, React.js, Backbone.js, Flask, NumPy, PyDev, PyQt, Matplotlib, wxPython, PostgreSQL, PyCharm, web servers (Apache), D3.js, HTML, CSS, Bootstrap.js, Ext JS, Ajax, jQuery, Dojo, Java, Spring, Hibernate, JDBC, C, C++.

Languages: SQL, Python, PL/SQL, T-SQL.

Packages: ggplot2, caret, dplyr, RWeka, gmodels, RCurl, tm, C50, twitteR, NLP, reshape2, rjson, plyr, pandas, NumPy, seaborn, SciPy, matplotlib, scikit-learn, Beautiful Soup, rpy2.

Web Technologies: JDBC, HTML5, DHTML, XML, CSS3, Web Services, WSDL.

Data Modelling Tools: Erwin r9.6/9.5/9.1/8.x, Rational Rose, ER/Studio, MS Visio, SAP PowerDesigner.

Big Data Technologies: Hadoop, Hive, HDFS, MapReduce, Pig, Kafka.

Databases: SQL, Hive, Impala, Pig, Spark SQL, SQL Server, MySQL, MS Access, HDFS, HBase, Confidential, Netezza, MongoDB, Cassandra.

Reporting Tools: MS Office (Word/Excel/PowerPoint/Visio), Tableau, Crystal Reports XI, Business Intelligence, SSRS, Business Objects 5.x/6.x, Cognos 7.0/6.0.

ETL Tools: Informatica PowerCenter, SSIS, Oracle Data Integrator (ODI), Cognos Data Manager, Pentaho Data Integration, QlikView Expressor, Ab Initio.

BI Tools: Tableau, Tableau Server, Tableau Reader, SAP BusinessObjects, OBIEE, QlikView, SAP Business Intelligence, Amazon Redshift, Azure Data Warehouse.

Additional Tech Skills: SQL Server, Oracle, MySQL, DB2, JBoss, WebSphere, Tomcat, BEA WebLogic, Eclipse, Flex Builder, NetBeans, RSA, MS Visio, Windows, Linux, Unix.

PROFESSIONAL EXPERIENCE:

Python Developer

Confidential, Mountain View, CA

Responsibilities:

  • Developed web applications, RESTful web services, and APIs using Python, Django, and PHP, gaining experience with Django, a high-level Python web framework.
  • Automated Confidential S3 data uploads/downloads using Python scripts. Automated JIRA processes using Python and Bash scripts.
  • Implemented CI/CD for the Forge microservices to containerize them and push them to a private Docker registry.
  • Developed a Python framework using Django to perform scan software unit monitoring.
  • Created Python and Bash tools to increase the efficiency of the application system. Developed merge jobs in Python to extract and load data into the MySQL database.
  • Analyzed client needs and developed software tools to assist dynamic site content creation (Python, wxPython). Responsible for setting up the Python REST API framework using Django.
  • Involved in Confidential Data Migration Services and Schema Conversion Tool along with the Talend ETL tool. Created, modified, and executed DDL for Confidential Redshift tables to load data.
  • Performed data migration from Oracle to Redshift using SCT and DMS. Built various graphs for business decision making using the Python matplotlib library.
  • Used OOP in PHP to extend and update functionality; the project used MySQL as its database.
  • Implemented code in Python to retrieve and manipulate data. Created the entire application using Python, Django, MySQL, and Linux.
  • Worked with the ElementTree XML API in Python to parse XML documents and load the data into the database (see the sketch after this list).
  • Created REST web services for data management using Apache CXF (JAX-RS).
  • Worked on Python-based test frameworks and test-driven development with automation tools.
  • Developed a fully automated continuous integration system using Git, MySQL, Jenkins, and custom tools developed in Python.
  • Worked on RDBMS implementation using SQL, PL/SQL, DB2, and MySQL on the Oracle database. Worked on Python OpenStack APIs and used NumPy for numerical analysis.
  • Worked on server-side applications with Django using Python programming. Utilized standard Python modules such as csv, itertools and pickle for development.
  • Worked with the Oracle RDBMS, writing complex queries and SQL/PL/SQL for stored procedures, triggers, and events to generate important responses needed by the application.
  • Used analytical Python libraries such as pandas and NumPy for data manipulation. Used PyCharm's integrated debugger to debug source code for better analysis.
  • Used XML web services over SOAP to send transfer amounts to a remote, global transfer application serving different financial institutions.
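A minimal sketch of the ElementTree parse-and-load step mentioned in the XML bullet above; the records.xml layout, the customer table, and the connection settings are hypothetical stand-ins, not the actual project schema.

```python
import xml.etree.ElementTree as ET
import mysql.connector  # MySQL Connector/Python

# Hypothetical XML export: <records><record><name>...</name><email>...</email></record>...</records>
tree = ET.parse("records.xml")
root = tree.getroot()

# Hypothetical connection details and table; adjust to the real environment.
conn = mysql.connector.connect(
    host="localhost", user="appuser", password="secret", database="appdb"
)
cursor = conn.cursor()

for record in root.findall("record"):
    name = record.findtext("name")
    email = record.findtext("email")
    # Parameterized insert so the driver handles quoting and escaping.
    cursor.execute(
        "INSERT INTO customer (name, email) VALUES (%s, %s)",
        (name, email),
    )

conn.commit()
cursor.close()
conn.close()
```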

Python Developer

Confidential, Houston, TX

Responsibilities:

  • Worked with Python ORM libraries including the Django ORM. Consumed RESTful web services where data is transmitted in JSON format.
  • Designed and developed REST web services to interact with various business sectors and used the SOAP protocol for web service communication.
  • Worked on designing, coding and developing the application in Python using Django MVC.
  • Wrote and executed various MySQL database queries from Python using the Python MySQL connector and MySQLdb packages.
  • Implemented CI/CD to deploy the microservices to a Kubernetes cluster in the Azure cloud (a Jenkins job pulls the images from a private Docker registry and deploys the services in the cloud).
  • Performed web development, including standardizing the toolsets used, from Eclipse to Git for source control. Utilized PyUnit, the Python unit test framework, for all Python applications.
  • Developed data analytics tools using Python pandas, and visualizations using Matplotlib and Bokeh (see the sketch after this list). Created new PL/SQL stored procedures for new Oracle Forms and Reports development.
  • Troubleshot, fixed, and deployed many Python bug fixes for the two main applications that were a primary source of data for both customers and the internal customer service team.
  • Wrote Python scripts to parse XML documents and load the data into the database. Developed web services (SOAP, RESTful) in Python using XML and JSON.
  • Used Confidential Redshift, S3, Spectrum, and Athena services to query large amounts of data stored on S3 and create a virtual data lake without going through an ETL process.
  • Implemented and maintained Python code invoked through shell scripting.
  • Created Oracle database tables, stored procedures, sequences, triggers, and views. Used different Confidential Data Migration Services and Schema Conversion Tool along with the Matillion ETL tool.
  • Developed an application in Flash Builder and deployed it on the Tomcat application server, proxied through a secured HTTP web server.
  • Worked on WAMP (Windows, Apache, MySQL, Python/PHP) and LAMP (Linux, Apache, MySQL, Python/PHP) architectures.
  • Extensively worked on Application servers like WebLogic and Apache Tomcat.
  • Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
  • Used REST web services for creating rate summaries, used WSDL and SOAP messages for getting insurance plans from different modules, and used XML parsers for data retrieval.
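A minimal sketch of the pandas/Matplotlib reporting mentioned in the analytics bullet above; the daily_orders.csv export and its column names are hypothetical.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical export from the application database: one row per day of order counts.
df = pd.read_csv("daily_orders.csv", parse_dates=["order_date"])

# Aggregate daily counts into monthly totals for a simple trend view.
monthly = df.groupby(df["order_date"].dt.to_period("M"))["order_count"].sum()

# Plot the trend and save it for business decision making.
monthly.plot(kind="bar", title="Monthly order volume")
plt.xlabel("Month")
plt.ylabel("Orders")
plt.tight_layout()
plt.savefig("monthly_orders.png")
```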

Python Developer

Confidential, Houston, TX

Responsibilities:

  • Managed datasets using data frames and MySQL; queried the MySQL database from Python using the Python MySQL connector and MySQLdb packages to retrieve information. Developed real-time multitasking systems using Python.
  • Designed and developed components using Python with the Django framework. Implemented code in Python to retrieve and manipulate data.
  • Developed backend services using Python, SQL, and Linux, and created many APIs for a Scrum project that involves creating and maintaining projects in an organization.
  • Involved in requirements analysis, design, and implementation applying the Waterfall model. Wrote servlets and JSP scripts for communication between the web browser and the server. Integrated Oracle BPM with the Spring Framework in the enterprise layer.
  • Involved in packaging, deployment, and upgrades of different SAS modules on the JBoss application server. Analyzed VB code and converted Sybase stored procedures into SQL.
  • Worked on the MySQL migration project to make the system completely independent of the database being used; used Spring iBatis to implement this.
  • Provided seamless connectivity between BI tools like Tableau and Qlik and Redshift endpoints. Reviewed the explain plans for SQL queries in Redshift.
  • Experienced in working with various Python IDEs such as PyCharm, PyScripter, Spyder, PyStudio, PyDev, Eclipse, NetBeans, and Sublime Text.
  • Utilized Spark, Scala, Hadoop, HBase, Cassandra, MongoDB, Kafka, Spark Streaming, MLlib, and Python with a broad variety of machine learning methods including classification, regression, and dimensionality reduction. Automated the existing scripts for performance calculations using NumPy and SQLAlchemy (see the sketch after this list).
  • Worked in various integrated development environments such as PyCharm and Anaconda Spyder. Experience with application servers and web servers including WebSphere, Tomcat, and Dropwizard.
  • Created data tables using PyQt to display customer and policy information and to add, delete, and update customer records.
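A minimal sketch of automating a performance calculation with SQLAlchemy and NumPy, as referenced in the bullet above; the connection string, request_log table, and column names are hypothetical.

```python
import numpy as np
from sqlalchemy import create_engine, text

# Hypothetical connection string and table; adjust to the real environment.
engine = create_engine("mysql+mysqlconnector://appuser:secret@localhost/metrics")

with engine.connect() as conn:
    rows = conn.execute(
        text("SELECT response_ms FROM request_log WHERE service = :svc"),
        {"svc": "billing"},
    ).fetchall()

# Vectorized summary statistics with NumPy instead of per-row Python loops.
latencies = np.array([row[0] for row in rows], dtype=float)
print("requests:", latencies.size)
print("mean latency (ms):", latencies.mean())
print("95th percentile (ms):", np.percentile(latencies, 95))
```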

Data Scientist / Analyst

Confidential

Responsibilities:

  • Involved in Design, Development and Support phases of Software Development Life Cycle (SDLC). Performed data ETL by collecting, exporting, merging and massaging data from multiple sources and platforms including SSIS (SQL Server Integration Services) in SQL Server.
  • Applied various machine learning algorithms and statistical models such as decision trees, regression models, neural networks, SVMs, and clustering to identify volume, using the scikit-learn package in Python and MATLAB.
  • Developed Spark/Scala and Python code for a regular expression (regex) project in the Hadoop/Hive environment on Linux/Windows for big data resources. Used the K-Means clustering technique to identify outliers and to classify unlabeled data.
  • Used Python, R, and SQL to create statistical algorithms involving multivariate regression, linear regression, logistic regression, PCA, random forest models, decision trees, and support vector machines for estimating the risks of welfare dependency. Used graphical entity-relationship diagramming to create new database designs via an easy-to-use graphical interface.
  • Used MLlib, Spark's machine learning library, to build and evaluate different models. Implemented a rule-based expert system from the results of exploratory analysis and information gathered from people in different departments.
  • Performed multinomial logistic regression, random forest, decision tree, and SVM modeling to classify whether a package will be delivered on time for a new route. Performed data analysis using Hive to retrieve data from the Hadoop cluster and SQL to retrieve data from the Oracle database.
  • Evaluated models using cross-validation, the log-loss function, and ROC curves, and used AUC for feature selection (see the sketch after this list). Analyzed traffic patterns by calculating autocorrelation with different time lags.
  • Wrote stored procedures in Oracle. Optimized database queries to improve performance. Designed and developed a data management system using Oracle.
  • Used Python to perform an ANOVA test to analyze the differences among hotel clusters. Applied various machine learning algorithms and statistical models such as decision trees, naive Bayes, and linear regression in Python to determine the accuracy rate of each model.
  • Performed data cleaning, feature scaling, and feature engineering using the pandas and NumPy packages in Python. Developed a MapReduce pipeline for feature extraction using Hive.
  • Explored and extracted data from source XML in HDFS, preparing data for exploratory analysis through data munging. Responsible for various data mapping activities from source systems to Confidential.
  • Designed Tableau bar graphs, scatter plots, and geographical maps to create detail-level summary reports and dashboards. Developed a hybrid model to improve the accuracy rate.
  • Delivered the results to the operations team for better decisions and feedback. Participated in all phases of research including data collection, data cleaning, data mining, developing models, and visualizations.
  • Performed data analysis using the ggplot2 library in R to create data visualizations for a better understanding of customer behavior. Plotted data visually using Tableau for dashboards and reports.
  • Implemented statistical modeling with the XGBoost machine learning package in R to determine the predicted probabilities of each model, and shared the results with the operations team to support better decisions.
  • Created multiple custom SQL queries in Confidential SQL Workbench to prepare the right data sets for Tableau dashboards. Used R and Python for exploratory data analysis, A/B testing, ANOVA tests, and hypothesis tests to compare and identify the effectiveness of creative campaigns.
  • Performed analyses such as regression analysis, logistic regression, discriminant analysis, and cluster analysis using SAS programming. Used a metadata tool for importing metadata from the repository, creating new job categories, and creating new data elements.
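A minimal sketch of the cross-validation and ROC/AUC evaluation referenced in the on-time delivery bullet above, using scikit-learn; the deliveries.csv dataset and its feature columns are hypothetical.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical dataset: one row per shipment, on_time = 1 if delivered on time.
df = pd.read_csv("deliveries.csv")
X = df[["distance_km", "package_weight", "num_stops"]]
y = df["on_time"]

# Scale features, then fit a logistic regression classifier.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# 5-fold cross-validation scored with ROC AUC.
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print("ROC AUC per fold:", scores)
print("Mean ROC AUC:", scores.mean())
```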
