
Python Developer Resume

North Bergen, NJ

SUMMARY:

  • 6+ years of IT experience in analysis, design, development, implementation, and testing of stand-alone and client-server enterprise applications in Python across various domains.
  • Expertise in implementing object-oriented technologies, web-based client-server architecture, service-oriented architecture, and Object-Relational Mapping (ORM).
  • Experienced in developing web-based applications using Python, CSS, HTML, JavaScript, AngularJS, and jQuery.
  • 3+ years of experience with the SnapLogic platform.
  • Strong experience in data warehousing concepts such as data warehouses, data marts, star and snowflake schemas, facts, factless fact tables, dimensions, slowly changing dimension (SCD) techniques, and dimensional modeling.
  • Skilled in applying new Python tools and libraries (Beautiful Soup, Jasy, NumPy, SciPy, matplotlib, Pickle, PySide, pandas, SQLAlchemy, NetworkX, urllib2, PyChart, Highcharts, PyTables) to drive improvements throughout the entire SDLC.
  • Experience working with the WAMP (Windows, Apache, MySQL, and Python/PHP) and LAMP (Linux, Apache, MySQL, and Python/PHP) architectures.
  • Experience with OpenVMS 6.2, 7.1, and 7.2, Unix (Solaris), Linux, and NT systems; performed system tuning and installed software on NT, Unix, and Alpha servers.
  • Expertise in working with server-side technologies including databases, RESTful APIs, and MVC design patterns.
  • Good experience with T-SQL and SQL Server.
  • Experience in creating OpenStack services for Identity (Keystone), Compute, Image Service, Block Storage, and Networking (Neutron).
  • Superior troubleshooting and technical-support abilities covering migrations, network connectivity, security, and database applications.
  • Good experience in shell scripting, SQL Server, UNIX and Linux, and OpenStack; involved in unit, integration, user-acceptance, and functional testing.
  • Experience working with different operating systems: Windows 98/NT/2000/XP/Vista/7/8, UNIX, and Mac OS X.
  • Experience in AWS cloud services such as EC2, S3, EBS, VPC, ELB, Route 53, CloudWatch, Security Groups, CloudTrail, IAM, CloudFront, Snowball, RDS, and Glacier.
  • Experienced in developing API services in Python/Tornado, NodeJS while leveraging AMQP and RabbitMQ for distributed architectures.
  • Expertise in pip (the most widely used Python package manager) and PyVows (a BDD tool for Python).
  • Expertise in working with GUI frameworks (Pyjamas, Jython, guidata, PyGUI, PyQt, PyWebKitGtk); experienced with Elasticsearch, Logstash, and Kibana (ELK).

WORK EXPERIENCE:

Confidential, North Bergen, NJ

Python Developer

Responsibilities:

  • Developed Kafka producers and consumers, HBase clients, Spark, Shark, and Streams jobs, and Hadoop MapReduce jobs, along with components on HDFS and Hive.
  • Used Spark and Snowflake to calculate various analytics and served them in Postgres for our Metrics Dashboard.
  • Developed Python based API (RESTful Web Service) to track the events and perform analysis using Django.
  • Designed 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas, following an Enterprise Data Warehouse (EDW) architecture.
  • Helped IT reduce the cost of maintaining the on-campus Informatica PowerCenter servers by migrating the code to Informatica Cloud Services.
  • Involved in importing the existing PowerCenter workflows as Informatica Cloud Service tasks using Informatica Cloud Integration.
  • Involved in error checking and testing of the ETL procedures and programs using Informatica session logs.
  • Developed and managed data pipelines in SnapLogic integrated with the AWS cloud platform, processing large volumes of data consumed from an on-prem cluster and transforming Avro input files (staged in S3) into JSON RDF output.
  • Leveraged AWS cloud services such as EC2, auto scaling and VPC to build secure, highly scalable and flexible systems that handled load on the servers.
  • Deployed the Django application on NGINX along with CI/CD tools and Docker.
  • Implemented TFS build archival to AWS Simple Storage Service (S3) and created lifecycle policies for managing the files in S3; implemented CloudWatch alarms for monitoring the EC2 instances.
  • Used Kubernetes to deploy, scale, and load-balance services, and worked with Docker Engine, Docker Hub, Docker images, and Docker Compose to handle images for installations and domain configurations.
  • Extracted data from the database using SAS/ACCESS and SAS SQL procedures and created SAS data sets; responsible for data backups, restores, and recovery using tape.sys software.
  • Used the Python packages cx_Oracle, pyodbc, and MySQLdb to work with Oracle, SQL Server, and MySQL databases respectively; pushed code conventions toward PEP 8 and Pylint compliance; analyzed data in the existing PIP schema, modifying it using SAS/BASE and SAS macros.
  • Created a server-monitoring daemon with psutil, backed by an Elasticsearch analytics app I built; also researched big-data solutions with the Cassandra database.
  • Configured Elastic Load Balancer and Auto Scaling to design cost-effective, fault-tolerant, and highly available systems.
  • Using Chef, deployed and configured Elasticsearch, Logstash, and Kibana (ELK) for log analytics, full-text search, and application monitoring, integrated with AWS Lambda and CloudWatch.
  • Worked on developing CRUD applications using MERN stack (MongoDB, ExpressJS, ReactJS and NodeJS) and REST based API.
  • Wrote Python scripts using Boto3 to automatically spin up instances in AWS EC2 and OpsWorks stacks, integrated with Auto Scaling to automatically spin up servers with configured AMIs.
  • Used ReactJS to create controllers that handle client-triggered events and send requests to the server; maintained state in the stores and dispatched actions using Redux.
  • Developed Docker images to support development and testing teams and their pipelines, distributed images such as Jenkins, Selenium, JMeter, and the Elasticsearch, Logstash, and Kibana (ELK) stack, and handled containerized deployment using Kubernetes.
  • Worked on data migration from SQLite3 to the Apache Cassandra database; designed, implemented, maintained, and monitored the Cassandra data model using DSE, DevCenter, and DataStax OpsCenter.
  • Automated various infrastructure activities such as continuous deployment, application server setup, and stack monitoring using Ansible playbooks, and integrated Ansible with Rundeck and Jenkins.
  • Used SQLAlchemy as an Object-Relational Mapper (ORM) for writing ORM queries.
  • Developed a module to build Django ORM queries that pre-load related data, greatly reducing the number of database queries needed to retrieve the same amount of data.
  • Created and configured virtual development environments with Chef and VirtualBox as part of the SOA (Service-Oriented Architecture) team, enforcing best practices for REST and SOAP services.
  • Built application interfaces and web-scraping scripts using OO design, UML modeling, and dynamic data structures.
  • Increased speed and memory efficiency by migrating Python code to C/C++ using Cython.
  • Developed a full-stack Python web framework with an emphasis on simplicity, flexibility, and extensibility, built atop existing components: WSGI, routing, templating, forms, data, plugins, config, events, SQLAlchemy, Storm, CouchDB, OpenID, App Engine, jQuery, etc.
  • Developed a fully automated continuous integration system using Git, Gerrit, Jenkins, MySQL and custom tools developed in Python and Bash.
  • Worked on the front end, utilizing Bootstrap and AngularJS for page design; created data tables with PyQt to display customer and policy information and to add, delete, and update customer records.
  • Worked with Python editors and IDEs such as PyCharm, PyScripter, PyStudio, PyDev, Wing IDE, and Spyder.
  • Developed tools using Python, Shell scripting, XML to automate some of the menial tasks.
  • Worked on the Jenkins continuous-integration tool for project deployment; played a key role in a development-wide transition from Subversion to Git, which increased efficiency for the development community.
  • Worked on cloud platform engineering with Kubernetes, Spinnaker, Docker, Terraform, Consul, Drone, Jenkins, Chef, and Kitchen; scheduled, deployed, and managed container replicas on a node cluster using Kubernetes.
  • Wrote Ansible playbooks, using Python over SSH as the wrapper, to manage configurations of AWS nodes, and tested the playbooks on AWS instances using Python; ran Ansible scripts to provision dev servers.
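One bullet above describes a module that pre-loads related data to cut the number of database queries. A minimal sketch of that idea, with stdlib sqlite3 standing in for the Django ORM (the schema, table, and column names are illustrative only, not from any real project):

```python
import sqlite3
from collections import defaultdict

# In-memory demo schema; names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE policy (id INTEGER PRIMARY KEY, customer_id INTEGER, premium REAL);
    INSERT INTO customer VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO policy VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 2, 200.0);
""")

def policies_naive(conn):
    """N+1 pattern: one extra query per customer."""
    out = {}
    for cid, name in conn.execute("SELECT id, name FROM customer"):
        rows = conn.execute(
            "SELECT premium FROM policy WHERE customer_id = ?", (cid,)
        ).fetchall()
        out[name] = [r[0] for r in rows]
    return out

def policies_preloaded(conn):
    """Pre-load pattern: one query for customers, one batched query for policies."""
    customers = conn.execute("SELECT id, name FROM customer").fetchall()
    ids = [cid for cid, _ in customers]
    placeholders = ",".join("?" * len(ids))
    by_customer = defaultdict(list)
    for cid, premium in conn.execute(
        f"SELECT customer_id, premium FROM policy "
        f"WHERE customer_id IN ({placeholders})", ids
    ):
        by_customer[cid].append(premium)
    return {name: by_customer[cid] for cid, name in customers}

# Both strategies return the same data; the second issues 2 queries instead of N+1.
assert policies_naive(conn) == policies_preloaded(conn)
```

In Django itself this is what `select_related`/`prefetch_related` do; the batched `IN` query is the core trick.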

Confidential, Richmond, VA

Python Developer

Responsibilities:

  • Developed the presentation layer using HTML, CSS, JavaScript, jQuery, and AJAX; utilized the Python libraries wxPython, NumPy, pandas, Twisted, and matplotlib.
  • Worked with Python libraries: requests, python-ldap, suds, pexpect, pip, subprocess.
  • Developed frontend and backend modules using Python on Django, including the Tastypie web framework, using Git.
  • Implemented SQLAlchemy, a Python library providing complete access to SQL.
  • Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
  • Used the pandas library for statistical analysis and for flexible reshaping and pivoting of data sets.
  • Developed cross-browser/cross-platform pages with ReactJS, NodeJS, jQuery, AJAX, and HTML5/CSS3 to meet design specs for a single-page layout using code standards; created the UI from scratch using ReactJS.
  • Used pandas, NumPy, seaborn, SciPy, Matplotlib, scikit-learn, and NLTK in Python to develop machine learning algorithms such as linear regression.
  • Used Django configuration to manage URLs and application parameters.
  • Installed, configured, and managed AWS servers, and used AWS Data Pipeline for data extraction, transformation, and loading from homogeneous and heterogeneous data sources.
  • Accessed database objects using the Django database API; worked on Python-based test frameworks and test-driven development with automation tools; worked with real-time streaming and batch-style large-scale distributed computing applications using tools like Spark Streaming.
  • Implemented advanced procedures like text analytics and processing using the in-memory computing capabilities like Apache Spark written in Scala.
  • Responsible for debugging and troubleshooting the web application; managed the configurations of multiple servers using Ansible.
  • Deployed microservices, including provisioning AWS environments, using Ansible playbooks.
  • Provisioned load balancer, auto-scaling group and launch configuration for Microservices using Ansible.
  • Used Ansible playbooks to setup Continuous Delivery pipeline. This primarily consists of a Jenkins and Sonar server, the infrastructure to run these packages and various supporting software components such as Maven, etc.
  • Created cloud service using AWS, managed Virtual machines and websites using AWS-EC2, ELB, Autoscaling, Lambda.
  • Developed installer scripts in Python (Boto3) for various products hosted on application servers; wrote Python utilities and scripts to automate tasks in AWS using Boto3 and the AWS SDK, and automated backups to S3 buckets using the AWS SDK (Boto3).
  • Experienced in writing playbooks for Ansible and deploying applications using Ansible; automated various infrastructure activities such as continuous deployment, application server setup, and stack monitoring using Ansible playbooks, and integrated Ansible with Rundeck and Jenkins.
  • Provisioned and patched servers regularly using Ansible. Implemented Ansible to manage all existing servers and automate the build/configuration of new servers.
  • Developed an Ansible role for the Zabbix agent, integrated into the CI/CD pipeline.
  • Used Ansible to document all infrastructure and application dependencies into version control.
  • Developed Python code to automate the ingestion of common formats such as JSON and CSV, shipped via Logstash into Elasticsearch and viewed by clients in Kibana dashboards.
  • Responsible for designing and deploying new ELK clusters (Elasticsearch, Logstash, Kibana, Graphite, Beats, Kafka, ZooKeeper, etc.).
  • Designed, built, and managed the ELK (Elasticsearch, Logstash, Graphite, Kibana) cluster for centralized logging and search functionality for the app.
  • Designed and automated the installation and configuration of a secure DataStax Enterprise Cassandra cluster using Puppet.
  • Configured internode communication between Cassandra nodes and client using SSL encryption.
  • Deployed microservices in Docker containers and scaled the deployment using Kubernetes.
  • Developed ChatOps interfaces with Slack and Kubernetes on GKE.
  • Worked on the Spinnaker platform for multi-cloud continuous delivery (bake, test, and deploy/container pipelines) using Packer, Terraform, Kubernetes, AWS, and GCP.
  • Responsible for onboarding application teams to build and deploy their code using GitHub, Jenkins, Nexus, and Ansible.
  • Migrated our core repository from Subversion to Git; managed GitHub projects and migrated from SVN to GitHub with history.
  • Used CloudTrail, TESSA, CloudPassage, Checkmarx, and Qualys scanning tools for AWS security and scanning.
  • Automated various service and application deployments with Ansible on CentOS and RHEL in AWS.
  • Wrote Ansible playbooks with Python and SSH as the wrapper to manage configurations of AWS nodes, and tested playbooks on AWS instances using Python; ran Ansible scripts to provision dev servers.
  • Created monitors, alarms, and notifications for EC2 hosts using CloudWatch; performed web testing automation using the Selenium API.
  • Worked on Redux, writing to-do-list reducers and reducer functions and implementing the store, alongside API services, JavaScript, Bootstrap, Git, and JSON.
  • Responsible for configuring Kafka consumer and producer metrics to visualize and monitor Kafka system performance; wrote and maintained automated Salt scripts for Elasticsearch, Logstash, Kibana, and Beats.
  • Parameterized automated Selenium test scripts to check how the application performs against multiple sets of data.
  • Contributed to developing an automation framework using Java, Selenium WebDriver, and TestNG; wrote automation test cases and fixed automation script bugs. Experience with migration to Amazon Web Services (AWS).
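The ingestion bullet above (JSON and CSV normalized before landing in Elasticsearch/Kibana) can be sketched with the standard library alone; the record fields and helper names here are illustrative assumptions, not from the actual pipeline:

```python
import csv
import io
import json

def normalize_json(text):
    """Map one JSON log record to a flat common shape (hypothetical fields)."""
    rec = json.loads(text)
    return {"ts": rec["timestamp"], "level": rec["level"], "msg": rec["message"]}

def normalize_csv(text):
    """Map CSV log rows to the same flat shape used by normalize_json."""
    reader = csv.DictReader(io.StringIO(text))
    return [
        {"ts": r["timestamp"], "level": r["level"], "msg": r["message"]}
        for r in reader
    ]

json_line = '{"timestamp": "2019-01-01T00:00:00Z", "level": "ERROR", "message": "boom"}'
csv_block = "timestamp,level,message\n2019-01-01T00:00:00Z,INFO,started\n"

# Both formats collapse into one record shape, ready to ship (e.g. via Logstash).
records = [normalize_json(json_line)] + normalize_csv(csv_block)
```

Normalizing to a single shape up front is what lets one Kibana dashboard serve records that arrived in different formats.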

Confidential, New York

Python Developer

Responsibilities:

  • Worked on developing web services using SOAP and WSDL, and developed DTDs and XSD schemas for XML parsing, processing, and design.
  • Implemented a logging mechanism to capture exceptions and errors using Log4j; used TortoiseSVN as a version-control client for Subversion.
  • Responsible for the entire data migration from Sybase ASE server to Oracle, including migration of API code written for Sybase to Oracle.
  • Debugged the application using Firebug, traversing documents and manipulating nodes with DOM functions, using Firefox and the IE Developer Toolbar for IE.
  • Developed web applications in the Django framework's model-view-controller (MVC) architecture.
  • Created REST web services for data management using Apache CXF.
  • Worked in Python to place data into JSON files for testing Django websites; created scripts for data modeling and data import/export.
  • Worked with ReactJS for its code reusability and integrated Bootstrap; used the Redux architecture throughout to connect actions.
  • Designed and developed the input/output data formats in XSD for the WSDL files, implemented the services accordingly using Apache Axis2, and used the NetBeans IDE to develop the application.
  • Implemented Docker containers to create images of the applications and dynamically provision slaves to Jenkins CI/CD pipelines.
  • Developed Hadoop integrations for data ingestion, data mapping, and data processing; used XML for dynamic display of options in select boxes and descriptions on web pages.
  • Implemented the MVC architecture using the Apache Struts framework; experienced with Python OpenStack APIs. Worked with Rational ClearCase for version control, workspace management, and parallel development support. Built and maintained a Selenium regression test suite.
  • Involved in building database Model, APIs and Views utilizing Python, to build an interactive web-based solution.
  • Built various graphs using Matplotlib package which helped in taking business decisions.
  • Created RESTful APIs to integrate and enhance application functionality, and utilized RESTful APIs to communicate with third parties.
  • Worked on automation using the Python scripting language, Git on Cygwin32, and XML.
  • Worked on monitoring tools like Nagios, Zabbix, and AWS CloudWatch to health-check the various deployed resources and services.
  • Involved and played a leading role in database migration projects from Oracle to MongoDB.
  • Designed and managed API system deployment using a fast HTTP server and Amazon AWS architecture.
  • Designed and developed use-case, class, and object diagrams in UML using Rational Rose for object-oriented analysis and design.
  • Involved in writing application level code to interact with APIs, Web Services using JSON and involved in AJAX driven application by invoking web services/API and parsing the JSON response.
  • Executed asynchronous tasks with the help of Celery and RabbitMQ.
  • Vast experience with Core Java and J2EE using most of the advanced features of Java including JDBC, Spring, Struts, EJB, Servlets, Hibernate.
  • Developed views and templates with Python and Django's view controller and templating language to create a user-friendly interface using the MVC architecture; worked on the application's resulting Tableau reports and modified data using SAS/BASE and SAS macros.
  • Involved in installing software with the pip command for Python libraries such as Beautiful Soup, NumPy, SciPy, python-twitter, RabbitMQ, Celery, matplotlib, and pandas DataFrame, and used the PEP 8 coding convention.
  • Migrated API code written for Sybase to Oracle and oversaw the migration of the PL/SQL programs.
  • Built back-end applications with Python/Django; worked on Docker, RabbitMQ, Celery, and Jenkins. Involved in migrating libraries written using the Sybase APIs to the Oracle OCCI API.
  • Used the Python libraries pandas and NumPy, plus SQL and Tableau, to procure, clean, and aggregate data from a relational database and generate status reports and dashboards.
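The clean-and-aggregate step behind the status reports above can be sketched in plain Python (stdlib only for portability; in practice pandas/NumPy would do this, and the record fields and values here are invented for illustration):

```python
from collections import defaultdict
from statistics import mean

# Illustrative records; in the project these came from a relational database.
rows = [
    {"region": "NE", "status": "closed", "amount": 120.0},
    {"region": "NE", "status": "open",   "amount": None},   # dirty value
    {"region": "SE", "status": "closed", "amount": 200.0},
    {"region": "NE", "status": "closed", "amount": 80.0},
]

# Clean: drop rows with missing amounts.
clean = [r for r in rows if r["amount"] is not None]

# Aggregate: per-region count and mean amount for a status report.
by_region = defaultdict(list)
for r in clean:
    by_region[r["region"]].append(r["amount"])

report = {
    region: {"count": len(vals), "avg": round(mean(vals), 2)}
    for region, vals in by_region.items()
}
```

With pandas the same pipeline collapses to a `dropna` followed by a `groupby().agg()`, which is the shape the Tableau dashboards would consume.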
