Python Developer Resume
Mason, OH
SUMMARY
- Around 5 years of experience as a Python Web/Application Developer with a strong focus on analytical programming, including data analysis and data visualization.
- Experienced with the full software development life cycle, architecting scalable platforms, database design, and agile methodologies.
- Experience with object-oriented programming (OOP) in Python, including concepts such as multi-threading, exception handling, and collections.
- Experienced in design patterns such as MVC using Django and Flask, deploying applications on Apache Tomcat, and containerizing applications with Docker.
- Good experience in Python software development using libraries and modules such as NumPy, Pandas, Pickle, Jupyter, SciPy, Python-twitter, Matplotlib, and urllib2 for data analytics and rapid development.
- Proficient in the Python OpenStack APIs and the Pyjamas GUI framework for the web.
- Experienced in delivering business applications on the AWS platform.
- Experience working with Amazon Web Services (AWS), including EC2 instances, load balancing, Amazon Aurora, and DynamoDB.
- Implemented Apache Spark jobs in Scala to compute analytical queries and monitored their processing performance.
- Followed the PEP 8 coding standard and tested programs across test cases to ensure code validity and effectiveness using PyChecker and Pylint.
- Involved in Unit testing and Integration testing of the code using PyTest.
- Proficient in the MySQL, PostgreSQL, and Oracle SQL databases, as well as MongoDB.
- Experienced in writing subqueries, stored procedures, triggers, cursors, and functions on MySQL, PL/SQL, and PostgreSQL databases, with ETL and Teradata experience.
- Experience in continuous integration (CI) with Gradle, Maven, Ant, and Hudson/Jenkins, and good knowledge of maintaining version control systems such as Git, SVN, and CVS.
- Experience in UNIX/Linux shell scripting for job and batch scheduling, automating batch programs, and forking and cloning jobs.
- Experience with third-party tools such as collectd, SNMP, Karaf, YANG, YAML, and OSGi.
- Worked with Java libraries: ZooKeeper Curator, Guava, logging (SLF4J, Logback, Log4j), and JUnit.
- Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB, and deployed AWS Lambda code from Amazon S3 buckets.
- Created a Lambda deployment function and configured it to receive events from an S3 bucket (a minimal sketch of this pattern follows this summary).
- Designed data models for data-intensive AWS Lambda applications performing complex analysis and producing analytical reports for end-to-end traceability, lineage, and the definition of key business elements from Aurora, over around 4 years.
- Used AWS for application deployment and configuration.
- Utilized Apache Spark with Python for around 3 years to develop and execute big data analytics and machine learning applications, executing machine learning use cases with Spark ML and MLlib.
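A minimal sketch of the serverless pattern referenced above (a Lambda handler consuming S3 object-created events and writing to DynamoDB), assuming a hypothetical table name and key schema; it is illustrative only, not the project's actual code.

```python
import json
import boto3

# Hypothetical table name; the real resource names are not given in the resume.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("uploaded-objects")

def handler(event, context):
    """Triggered by S3 ObjectCreated events; records each uploaded object in DynamoDB."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        table.put_item(Item={"object_key": key, "bucket": bucket})
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}
```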
TECHNICAL SKILLS
Programming Languages: Python, C, C++, Go, Shell scripting, Java, PySpark
Web Technologies: HTML, XHTML, CSS, JavaScript, jQuery, AJAX, XML
Frameworks: Django, Flask, Pyramid, Pyjamas, Web2py, Bootstrap
Python Libraries: NumPy, SciPy, Pandas, Jupyter, Matplotlib, Urllib2, Python-twitter
Databases: MySQL, Oracle, PostgreSQL, DB2, NoSQL - MongoDB and Cassandra
Web Services: AWS, SOAP, RESTful
Servers: IBM WebSphere, WebLogic, JBoss, Apache Tomcat
Version Control: Git, GitHub, SVN, CVS
Deployment/DevOps Tools: Heroku, Jenkins, Ansible, Docker
Operating Systems: UNIX, Linux, Windows, Mac OS
Testing Tools: Selenium, HP QC, HP QTP
Methodologies: Agile, Scrum and Waterfall
PROFESSIONAL EXPERIENCE
Confidential, Mason, OH
Python Developer
Responsibilities:
- Participated in the requirement gathering and analysis phase of the project, documenting business requirements by conducting workshops and meetings with various business users.
- Developed a Python/Django application for Google Analytics aggregation and reporting.
- Used Django configuration to manage URLs and application parameters.
- Worked on Python OpenStack APIs.
- Used Python scripts to update content in the database and manipulate files.
- Generated Django forms to record data from online users.
- Used the Pandas API to put data into time-series and tabular formats for easy timestamp-based data manipulation and retrieval (see the sketch at the end of this role).
- Used the Pandas library for statistical analysis.
- Developed tools using Python, shell scripting, and XML to automate routine tasks, interfacing with supervisors, artists, systems administrators, and production staff to ensure production deadlines were met.
- Utilized Apache Spark with Python to develop and execute big data analytics and machine learning applications, executing machine learning use cases with Spark ML and MLlib.
- Excellent knowledge of machine learning, mathematical modeling, and operations research; comfortable with Python, with a deep understanding of and exposure to the big data ecosystem.
- Experienced in data architecture, including data ingestion pipeline design, Hadoop information architecture, data modeling, data mining, machine learning, and advanced data processing.
- Established Hadoop/Hive database connectivity and performed regex-based processing and data analysis.
- Troubleshot, fixed, and deployed many Python bug fixes for the two main applications that were a primary source of data for both customers and the internal customer service team.
Environment: Linux, Python, Django, JIRA, XML, JavaScript, Pandas, JSON, MongoDB, Hadoop, Shell scripting.
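A minimal sketch of the Pandas time-series handling mentioned in this role; the column names and values are illustrative assumptions, not data from the actual application.

```python
import pandas as pd

# Illustrative sample data; the real records came from the application's databases.
raw = pd.DataFrame({
    "timestamp": ["2020-01-01 09:00", "2020-01-01 10:00", "2020-01-02 09:30"],
    "page_views": [120, 95, 143],
})

# Parse the timestamps and index by them so rows can be sliced and resampled by time.
df = raw.assign(timestamp=pd.to_datetime(raw["timestamp"])).set_index("timestamp")

print(df.loc["2020-01-01"])                  # timestamp-based retrieval for a single day
print(df["page_views"].resample("D").sum())  # daily totals
print(df["page_views"].describe())           # quick statistical summary
```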
Confidential, Atlanta, GA
Python Developer
Responsibilities:
- Designed and developed customer preferences portal in Python using Django framework.
- Automated the deployment of applications as portable, self-sufficient Docker containers that can run in the cloud or on premises.
- Managed Docker containerization and orchestration, using Kubernetes to handle the deployment, scaling, and management of Docker containers.
- Customized user registration as a two-step flow (inactive user creation followed by email activation).
- Designed the user interface leveraging HTML, XHTML, AJAX, CSS, and JavaScript.
- Implemented AJAX calls to REST API endpoints (GET, POST, DELETE) for uploading and deleting files.
- Implemented web scraping using Python's Beautiful Soup library (a minimal sketch follows this role's environment list).
- Worked with Pandas for automatic and explicit data alignment and easy handling of missing data, and performed data framing, data analysis, and data representation.
- Worked on writing data to and reading data from CSV and Excel file formats.
- Worked on data analysis and data mining algorithms using Teradata.
- Used the NumPy, SciPy, and Matplotlib libraries for n-dimensional representation of data and plotting graphs.
- Achieved business process automation via applications developed using Git, Gerrit, Jenkins, MySQL and custom tools developed in Python and Bash.
- Used Amazon Web Services (AWS) for improved efficiency of storage and fast access.
- Added support for Amazon S3 and RDS to host static/media files and the database in the Amazon cloud.
- Managed builds, reporting, and documentation using Maven.
- Built an ETL process for continuously bulk-importing catalog data from MongoDB into Elasticsearch.
- Performed preliminary data analysis using descriptive statistics and handled anomalies such as removing duplicates and imputing missing values.
- Applied various machine learning algorithms and statistical modeling techniques, such as decision trees, text analytics, natural language processing (NLP), supervised and unsupervised learning, regression models, social network analysis, neural networks, deep learning, SVM, and clustering, to identify volume using the scikit-learn package in Python.
- Developed Python code for a regular expression (regex) project in the Hadoop/Hive environment on Linux/Windows for big data resources.
- Handled Machine Learning model development and data engineering using Spark and Python.
- Applied statistical techniques and big data technologies using Spark to solve business challenges.
- Wrote Python scripts to perform CRUD operations on the MySQL database.
- Experience in automation via Bash/shell scripting and Python programming.
- Involved in Unit testing and Integration testing of the code using PyTest.
- Involved in sprint planning sessions, participated in daily Agile Scrum meetings, and tracked work in JIRA.
Environment: Python, Django, Django REST Framework, Bootstrap, Node.js, Jenkins, Git, Zeus, JBoss, Ruby, Cassandra, JIRA, Microsoft Azure.
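A minimal sketch of the Beautiful Soup scraping approach mentioned in this role, assuming a hypothetical URL and a hypothetical `<h2 class="title">` element; the actual sources and selectors are not given in the resume.

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical page to scrape; the real sources are not named in the resume.
URL = "https://example.com/products"

response = requests.get(URL, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# Collect the text of each product title, assuming titles are rendered as <h2 class="title"> elements.
titles = [h2.get_text(strip=True) for h2 in soup.find_all("h2", class_="title")]
print(titles)
```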