- 7+ years of experience as a Python/AWS developer, with hands-on experience in AWS services such as Amazon EC2, Amazon S3, Amazon Redshift, Amazon EMR, and Amazon SQS.
- Over 5 years of IT experience across all phases of the SDLC, including application design and development using Python, client-server applications, testing, and big-data ecosystem technologies such as Hadoop HDFS, Hive, Spark, and AWS.
- Published dashboards and data sources to Tableau Server, and managed access, user security, and refreshing of Tableau extracts.
- Hands-on experience writing Python scripts for data extraction and data transfer from various data sources.
- Worked with various distributions, including Cloudera, Hortonworks, and Amazon AWS.
- Extensively worked with Spark SQL, DataFrames, and RDDs to improve application performance.
- Designed and implemented Hive and Pig scripts for evaluating, filtering, loading, and storing data.
- Developed web applications and RESTful web services and APIs using Python and Django.
- Good experience with object-oriented programming concepts in Python, as well as with Django and Linux.
- Worked on various applications using Python IDEs and editors, including Eclipse, PyCharm, and Sublime Text.
- Extensively worked on analyzing data using HiveQL and Pig Latin.
- Experience with JSON-based REST web services and SOAP for sending and receiving data in JSON format.
- Implemented continuous integration using Jenkins.
- Experience in end-to-end design and deployment of rich graphical visualizations in Tableau, with drill-down, drop-down menu, and parameterized options.
- Worked with pandas DataFrames and MySQL; queried MySQL databases from Python using the Python-MySQL connector and the MySQLdb package to retrieve information.
- Worked with several Python libraries, such as NumPy, pandas, and Matplotlib.
- Good communication, organization and interpersonal skills.
- Worked in Agile/Scrum processes, with high-quality deliverables delivered on time.
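The JSON-based REST work above can be sketched minimally with Python's standard library; the payload schema and field names here are hypothetical, not from any specific project.

```python
import json

def build_payload(user_id, fields):
    """Serialize a request body for a JSON-based REST call (hypothetical schema)."""
    return json.dumps({"userId": user_id, "fields": fields})

def parse_response(body):
    """Deserialize a JSON response body into a Python dict."""
    return json.loads(body)

payload = build_payload(42, ["name", "email"])
response = parse_response('{"status": "ok", "userId": 42}')
```

In practice the payload would be sent with an HTTP client and the response body read from the reply; only the JSON encoding and decoding step is shown here.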
Frameworks: Django, ReactJS, Django Rest Framework, Flask, Pyramid, NumPy, SciPy, Anaconda, PyCharm, TDD
AWS: S3, EC2, Elastic Beanstalk, CloudWatch, CloudTrail, VPC, ECS, IAM
Web Servers: WebSphere, WebLogic, Apache Tomcat.
Web Technologies: JSP, XML, XSD, XPath, XSLT, HTML5, UI Ajax, Web Services, REST APIs.
Tools: Selenium, Visual Studio, Docker, Kubernetes.
Programming Languages: Python, C#, C, C++.
SDLC: Waterfall, Agile, Scrum.
Databases: Oracle, MS Access, MySQL, MongoDB, DynamoDB, Cassandra.
Big Data Technologies: Spark, Hadoop, Hive, Pig
Version Control: CVS, SVN, Git
Sr. Python Developer
Confidential, Columbus, IN
- Developed classes in the business layer and data access layer in C#. Worked with a marketing company to build several Django, Pyramid, and Flask applications using NumPy, SciPy, Anaconda, and PyCharm.
- Used AWS IAM to grant users permissions to resources and to manage user roles and policies.
- Advanced Python development using Git, Bitbucket, Mercurial, Jenkins, iDRAC, Docker, Confluence (for documentation), and PyCharm.
- Validated features such as SF cluster upgrades (SFS and DFS), VLAN refresh, CRUX SDS, snap replication, and remote replication, along with iSCSI volumes, teaming, IO (VDBench), and iDRAC and switch-port configuration.
- Refactored feature and system tests to optimize existing tests within a new framework built in Python.
- Validated various features of the SolidFire software-defined storage application.
- Diagnosed, debugged, and performed root-cause analysis on defects.
- Documented current regression-suite behavior as test cases and determined coverage of specific metrics.
- Developed Python code from the documented test cases and converted the regression suites to Python.
- Developed automation code using regular expressions.
- Developed Python code following PEP 8 coding standards.
- Used tools such as PyCharm and Pylint to maintain coding standards and make debugging within the code easier.
- Worked on data pre-processing and cleaning to support feature engineering, and performed data-imputation techniques for missing values in the dataset using Python.
- Created data-quality scripts using SQL and Hive to validate successful data loads and data quality. Created various types of data visualizations using Python and Tableau.
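One common imputation technique for missing values, mean imputation, can be sketched in plain Python; the function name and sample data are illustrative only.

```python
from statistics import mean

def impute_missing(values):
    """Replace None entries with the mean of the observed values (mean imputation)."""
    observed = [v for v in values if v is not None]
    fill = mean(observed)
    return [fill if v is None else v for v in values]

# mean of the observed values 10.0, 14.0, 12.0 is 12.0
impute_missing([10.0, None, 14.0, None, 12.0])  # → [10.0, 12.0, 14.0, 12.0, 12.0]
```

With pandas, the equivalent one-liner is `df.fillna(df.mean())`.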
Sr. Python Developer
- Used pandas and NumPy for statistical and numerical analysis of insurance premiums. Worked with the AngularJS framework to develop interactive websites based on client needs.
- Used Flask to connect the front end to the back end, CherryPy to build the server, and word clouds to visualize movies.
- Wrote and maintained data extracts in C#: read in flat files and XML, and reformatted the data to generate spreadsheets using batch processing.
- Wrote Python modules to extract/load asset data from the MySQL source database.
- Used the Python subprocess module to call UNIX shell commands to check whether directories or files exist.
- Conducted data analysis using Python libraries such as pandas, NumPy, and Matplotlib.
- Worked with Python Django Forms to record data of online users.
- Managed large datasets using pandas DataFrames and MySQL.
- Developed Spark applications in Scala using DataFrames and the Spark SQL API for faster data processing.
- Designed and developed HiveQL scripts to load data into the Hive database.
- Used Git for code submissions and review process.
- Wrote Scala and Python transformations using various Spark actions and transformations, creating RDDs from the required files in HDFS.
- Used Django configuration to manage URLs and application parameters.
- Responsible for writing Hive queries to analyze data in the Hive warehouse using Hive Query Language (HQL).
Environment: Python, Dropwizard, Spring Boot, Lagom, Kafka, JSON, GitHub, Linux, Django, Flask, Varnish, Nginx, SOA, RESTful.
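The subprocess-based existence check described above can be sketched as follows; the helper name is illustrative, not from any actual project.

```python
import subprocess

def path_exists(path, kind="f"):
    """Check a path with the UNIX `test` command via subprocess.

    kind="f" checks for a regular file, kind="d" for a directory.
    """
    result = subprocess.run(["test", f"-{kind}", path])
    # `test` exits with status 0 when the check succeeds
    return result.returncode == 0

path_exists("/tmp", kind="d")  # True on typical UNIX systems
```

For new code, `os.path.isfile` and `os.path.isdir` perform the same checks without spawning an external command.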
Python Developer/Cloud Engineer
- Developed a Python-based microservice to extract data from systems of record into the enterprise data warehouse.
- Developed another microservice to extract AML data from the enterprise data warehouse and push it to external systems in JSON format.
- These batch microservices are written in Python and use distributed message passing via a Kafka message broker, with JSON as the data-exchange format.
- All four microservices are deployed in a Mesos cluster on AWS using Jenkins, Marathon, and Chronos.
- Debugged the application by following messages in log files to identify any errors.
- Developed a monitoring application that captures error-related data and stores it in a database.
- Involved in storing binary data using Couchbase and CouchDB Server.
- Involved in tokenizing sensitive data before archiving it in AWS S3 using a REST-based enterprise tokenization service, and encrypting the data before sending it over the wire to external systems.
- Assisted with development of web applications using Flask, Pyramid, Django, and Plone.
- Developed views and templates in Python using Django's view controllers and template language.
- Wrote wrapper scripts to automate deployment of cookbooks on nodes and running the Chef client on them in a Chef-Solo environment.
- Converted production-support scripts to Chef and tested cookbooks with ChefSpec.
- Used Puppet server and workstation to manage and configure nodes.
- Responsible for large-scale Puppet implementation and maintenance, including the creation, testing, and implementation of Puppet manifests.
- Used SVN for branching, tagging, and merging.
- Set up the Puppet master and clients, and wrote scripts to deploy applications to Dev, QA, and production environments.
- Built interfaces between Django and Salesforce, and between Django and REST APIs.
- Refactored existing batch jobs and migrated existing legacy extracts from Informatica to Python based micro services and deployed in AWS with minimal downtime.
- Involved in setting up microservices in HA (High Availability) mode for resiliency.
- Used REST and SOAP to test web services.
- Developed Database Models in PostgreSQL.
- Developed Stored Procedures in PostgreSQL.
- Used GitHub for Python source-code version control and Jenkins for automating Docker container builds and deployment in Mesos.
- Performed unit testing using Python unit-testing frameworks such as unittest and nose.
- Created applications for software packages, software frameworks, and hardware platforms using SDKs.
- Worked with service-based RESTful technologies.
- Created a web service, provided its information to the service registry, and made that information available to any potential requester using SOA.
- Assisted with development of internal APIs using PHP 7, Laravel, and MySQL.
- Performed page caching using Nginx and Varnish.
- Assisted with writing effective user stories and dividing them into Scrum tasks.
Environment: Python, Boto3, Flask, Pyramid, Django, Plone, Docker, Mesos, SOA, REST, Chronos, Kafka, JSON, GitHub, Nginx, Varnish, Linux.
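The unit-testing approach mentioned above (unittest, nose) can be sketched as follows; `mask_token` is a hypothetical helper echoing the data-tokenization work, not code from any actual project.

```python
import unittest

def mask_token(value):
    """Hypothetical helper: mask all but the last four characters of a sensitive value."""
    return "*" * (len(value) - 4) + value[-4:]

class MaskTokenTest(unittest.TestCase):
    def test_masks_all_but_last_four(self):
        self.assertEqual(mask_token("123456789"), "*****6789")

    def test_four_chars_left_unmasked(self):
        self.assertEqual(mask_token("6789"), "6789")

if __name__ == "__main__":
    # exit=False keeps the interpreter alive after the test run
    unittest.main(exit=False)
```

nose (and its successors, nose2 and pytest) can discover and run the same `unittest.TestCase` classes without modification.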