Python Developer Resume
Palo Alto, CA
SUMMARY:
- 6 years of IT experience as a Python Developer, with analytical programming in Python and Spark. Participates actively in all phases of the software development lifecycle (SDLC), from requirements and design through development, testing, debugging, and support.
- Built ETL solutions using Python, AWS EMR, and Cloudera.
- Extensive knowledge of Python data manipulation libraries such as pandas, NumPy, and petl.
- Experience with the AWS cloud platform and its features, including EC2, AMI, EBS, CloudWatch, AWS Config, Auto Scaling, IAM user management, and S3.
- Good experience with AWS services such as EMR, S3, and EC2, which provide fast and efficient processing of big data.
- Expert in using SQL databases as well as the NoSQL database MongoDB.
- Knowledge of building data warehouse systems.
- Experience in developing web-based applications using Python 3.6/2.7 and Django.
- Experienced in MVT frameworks such as Django.
- Developed RESTful web services and APIs using Python with Flask and Django (illustrative sketch after this summary).
- Hands-on experience working with WAMP (Windows, Apache, MySQL, Python) and LAMP (Linux, Apache, MySQL, Python) architectures.
- Developed applications used by global users in the pharma and finance domains.
- Experienced with Python modules such as NumPy, Matplotlib, pickle, pandas, PySide, SciPy, PyQt, and scikit-learn for generating complex graphs, creating histograms, and similar tasks.
- Knowledge of cloud service models including Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS), such as Amazon Web Services (AWS).
- Proficient in Hive, MySQL, PostgreSQL, and Oracle, as well as MongoDB.
- Wrote and developed scripts for automating tasks using Jenkins and UNIX shell programming.
- Designed and developed SQL database schemas, including stored procedures, triggers, and views.
- Deep exposure to programming with Python, Java, R, SQL, Hive, Sqoop, and PySpark.
- Experience developing visualizations using D3.js, Plotly, and Tableau.
- Experience using version control systems and platforms such as Git, GitHub, and Bitbucket.
- Recognized for providing outstanding customer service and for understanding and meeting business needs.
- Well-versed in Visio for creating process flows and data models.
- Self-starter with drive, initiative, and a positive attitude.
- Ability to interact with individuals at all levels.
- Trilingual: English, Spanish, and French.
- Excellent analytical and communication skills.
- Permanent Resident of the USA (Green Card).
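As an illustration of the RESTful services listed above, here is a minimal Flask sketch; the /api/orders route, payload fields, and port are illustrative assumptions rather than details from any specific project.

    # Minimal Flask REST sketch; the /api/orders route and fields are illustrative.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # In-memory store standing in for a real database table.
    ORDERS = []

    @app.route("/api/orders", methods=["GET"])
    def list_orders():
        # Return all stored orders as JSON.
        return jsonify(ORDERS)

    @app.route("/api/orders", methods=["POST"])
    def create_order():
        # Accept a JSON payload and store it with a generated id.
        payload = request.get_json(force=True)
        order = {"id": len(ORDERS) + 1, **payload}
        ORDERS.append(order)
        return jsonify(order), 201

    if __name__ == "__main__":
        app.run(port=5000)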
TECHNICAL SKILLS:
Languages: Python, PySpark, Scala
Databases: MySQL, Oracle DB, SQL Server, PostgreSQL, MongoDB
IDEs: PyCharm, PL/SQL Developer, TOAD
Integration Tools: Jenkins, IBM Integration
Web/App Servers: Apache HTTP Server, Tomcat, IIS
Configuration Management Tools: SVN, Git
Defect Tracking: JIRA, Bugzilla
Cloud Services: AWS, VMware, Microsoft Azure, Bitnami, Cloudera
Operating Systems: Unix, Linux, Windows, and macOS
EXPERIENCE:
Confidential, Palo Alto, CA
Python Developer
Responsibilities:
- Designed an automated pipeline triggered by merchant requests.
- Built a Python daemon that polls for incoming requests, inserts them into a MySQL database, and submits the Python pipeline to Hadoop (see sketch below).
- Split the output files to reduce their size.
- Encrypted the output using Confidential MLE encryption.
- Developed a Python daemon that submits non-blocking Spark jobs and polls YARN for their status.
- Developed a Python program that randomly generates test data sets, stored as Hive tables, used for stress testing of the pipeline.
- Developed Python code using the SQLAlchemy ORM to connect to MySQL.
- Developed a Python program that performs log compression and log rotation of the log files.
- Developed a Python program that performs health checks of the system.
- Improved the performance of Spark jobs by tuning executors and memory.
- Processed data on HDFS using PySpark.
- Coordinated with team members in India, Dallas, and Palo Alto.
- Rewrote existing Python modules to deliver data in specific formats.
- Maintained and edited Python scripts for application deployment automation.
- Created Python scripts used to generate reports.
- Built a Python program to encrypt credentials.
Environment: Python, PySpark, Scala, Hive, Hadoop, MySQL, REST Web Services, Shell Scripting, Bitbucket, Jenkins
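A minimal sketch of the polling daemon and SQLAlchemy ORM usage described in this role; the merchant_requests table, connection string, and submit_pipeline() helper are hypothetical placeholders.

    # Hedged sketch of a MySQL polling daemon using the SQLAlchemy ORM.
    # The table name, connection string, and submit_pipeline() are assumptions.
    import time
    from sqlalchemy import create_engine, Column, Integer, String
    from sqlalchemy.orm import declarative_base, sessionmaker

    Base = declarative_base()

    class MerchantRequest(Base):
        __tablename__ = "merchant_requests"  # assumed table name
        id = Column(Integer, primary_key=True)
        payload = Column(String(1024))
        status = Column(String(32), default="NEW")

    # Assumed MySQL DSN; real credentials would come from configuration.
    engine = create_engine("mysql+pymysql://user:password@localhost/pipeline_db")
    Session = sessionmaker(bind=engine)

    def submit_pipeline(request):
        # Placeholder for handing the request off to the Hadoop pipeline.
        print(f"Submitting pipeline for request {request.id}")

    def poll_forever(interval_seconds=30):
        # Poll for new requests, mark them in progress, and submit the pipeline.
        while True:
            session = Session()
            for req in session.query(MerchantRequest).filter_by(status="NEW"):
                req.status = "IN_PROGRESS"
                submit_pipeline(req)
            session.commit()
            session.close()
            time.sleep(interval_seconds)

    if __name__ == "__main__":
        poll_forever()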
Confidential, Englewood, CO
Python Developer
Responsibilities:
- Designed the analytical application using Python, Spark, HDFS, and AWS EMR.
- Extracted data from multiple systems and sources using Python and loaded it into AWS EMR.
- Developed Python code that transforms data from existing SQL Server and Oracle databases.
- Used Python libraries such as pandas, NumPy, and petl for data transformation and loading.
- Processed data on HDFS using PySpark.
- Hands-on experience migrating the business from a physical data center environment to AWS.
- Rewrote existing Python modules to deliver data in specific formats.
- Used the NoSQL database MongoDB to store unstructured data.
- Worked in a team of developers building and deploying applications with Flask on Linux and AWS.
- Automated AWS S3 data uploads and downloads using Python scripts (see sketch below).
- Installed, configured, and managed AWS servers and an AWS Data Pipeline for data extraction.
- Created tables on top of data in AWS S3 obtained from different data sources.
- Migrated data from the cluster into AWS.
- Maintained and edited Python scripts for application deployment automation.
- Worked with various integrated development environments such as PyCharm and Anaconda Spyder.
- Created, activated, and programmed in Anaconda environments; wrote programs for performance calculations using NumPy.
- Wrote Python scripts to parse XML documents and load the data into a database.
- Created Python scripts used to generate reports.
- Rewrote an existing Java application in Python.
- Built various graphs for business decision making using the Python Matplotlib library.
- Transformed data using PySpark and loaded it into HDFS; well versed in RDDs and Cloudera technologies.
- Participated in system redesign, performance improvement, development, and implementation of highly data-centric applications.
- Designed and developed subqueries, stored procedures, triggers, cursors, and functions on SQL Server.
Environment: Python, AWS EMR, PySpark, Hive SQL, MongoDB, MySQL, REST Web Services, Shell Scripting, Git, Jenkins, AWS, Azure.
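A minimal sketch of the S3 upload/download automation mentioned above, assuming boto3; the bucket name and file paths are illustrative placeholders.

    # Hedged sketch of automating S3 transfers with boto3; bucket and paths are examples.
    import boto3

    s3 = boto3.client("s3")

    def upload_file(local_path, bucket, key):
        # Upload a local file to the given S3 bucket and key.
        s3.upload_file(local_path, bucket, key)

    def download_file(bucket, key, local_path):
        # Download an S3 object to a local path.
        s3.download_file(bucket, key, local_path)

    if __name__ == "__main__":
        upload_file("reports/daily.csv", "example-etl-bucket", "incoming/daily.csv")
        download_file("example-etl-bucket", "exports/summary.csv", "downloads/summary.csv")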
Confidential
Python Developer
Responsibilities:
- Designed and developed a web application using Python and Node.js, hosted on EC2 in AWS; troubleshot various issues in the Python code and fixed them with code enhancements.
- Built a secure FTP connection between servers and created a data flow by scheduling Python jobs.
- Created roles and groups in IAM and created new policies over S3 storage buckets in AWS (see sketch below).
- Independently wrote Python scripts interacting with Google Cloud Storage, Datastore, AWS RDS, Oracle, S3, and Redshift.
- Designed the project setup and was fully involved in implementing the project artifacts (Datastore, storage buckets, and service accounts) in Google Cloud Platform.
- Consumed authentication and authorization services such as Okta and OAuth 2.0.
- Worked on the backend of the application, monitored the FTP server, maintained logs, and contributed to release plans.
- Managed permission authorization within IAM and admin services in AWS.
- Managed code versioning with Git on Bitbucket and deployments to staging and production servers.
- Actively analyzed system requirements specifications and interacted with clients during requirements gathering.
- Implemented various mathematical operations for calculation purposes using Python NumPy and pandas.
Environment: Python, Cloudera, HDFS, Oracle, SQL Server, HTML, CSS, MySQL, REST Web Services, JavaScript, Eclipse, Linux, jQuery, GitHub, JIRA, Git, AWS.
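A minimal sketch of creating an IAM policy scoped to an S3 bucket, as mentioned in this role, assuming boto3; the policy name and bucket ARN are hypothetical examples.

    # Hedged sketch of creating an S3-scoped IAM policy with boto3.
    import json
    import boto3

    iam = boto3.client("iam")

    # Allow read/write access to a single (assumed) bucket.
    policy_document = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject"],
                "Resource": "arn:aws:s3:::example-app-bucket/*",  # assumed bucket
            }
        ],
    }

    response = iam.create_policy(
        PolicyName="ExampleAppBucketReadWrite",  # assumed policy name
        PolicyDocument=json.dumps(policy_document),
    )
    print(response["Policy"]["Arn"])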
Confidential, Dallas, TX
Python Developer
Responsibilities:
- Built database models, APIs, and views using Python to create an interactive web-based solution.
- Created a Django dashboard with a custom look and feel for end users after a careful study of the Django admin site and dashboard.
- Developed views and templates with Python and Django's view controller and template language to create a user-friendly website interface.
- Used Django forms to capture vehicle entry and information.
- Used Matplotlib and D3 for viewing trends as part of the financial analysis module.
- Worked extensively with Django models, as various modules in the DMS require only specific fields from the master customer database.
- Created RESTful web services exposing vehicle and customer information for various banks and insurance companies, implemented with the Django REST Framework (see sketch below).
- Worked with ViewSets in the Django REST Framework for providing web services and consumed web services performing CRUD operations.
- Used the Python library Beautiful Soup 4 for web scraping to extract data for building graphs.
- Used PyQt to implement a GUI for vendors to create, modify, and view reports based on their sales.
- Automated AWS S3 data uploads and downloads using Python scripts.
- Installed, configured, and managed AWS servers and an AWS Data Pipeline for data extraction.
- Created tables on top of data in AWS S3 obtained from different data sources.
- Migrated data from the cluster into the AWS environment.
- Implemented AWS Lambda functions with Python scripting.
- Designed and managed API system deployment using a fast HTTP server and Amazon AWS architecture.
- Implemented business logic, data exchange, XML processing, XML Schema handling, and graphics creation using Python and Django.
- Developed scripts for build, deployment, maintenance, and related tasks using Jenkins, Docker, Maven, Python, and Bash.
Environment: Python, Django Web Framework, HTML, CSS, JavaScript, jQuery, Sublime Text, JIRA, Git, Web Services, UNIX, AWS
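A minimal sketch of the Django REST Framework web services described in this role; the Vehicle model, its fields, and the route prefix are hypothetical, and in a real project these pieces would live in an app's models.py, serializers.py, views.py, and urls.py.

    # Hedged sketch of CRUD web services with a DRF ModelViewSet; model and fields are examples.
    from django.db import models
    from rest_framework import serializers, viewsets, routers

    class Vehicle(models.Model):
        vin = models.CharField(max_length=17)
        make = models.CharField(max_length=64)
        model = models.CharField(max_length=64)

    class VehicleSerializer(serializers.ModelSerializer):
        class Meta:
            model = Vehicle
            fields = ["id", "vin", "make", "model"]

    class VehicleViewSet(viewsets.ModelViewSet):
        # Provides list/retrieve/create/update/destroy endpoints.
        queryset = Vehicle.objects.all()
        serializer_class = VehicleSerializer

    # Router wiring that would typically live in urls.py.
    router = routers.DefaultRouter()
    router.register(r"vehicles", VehicleViewSet)
    urlpatterns = router.urls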