Python Developer Resume
Chicago, IL
SUMMARY
- 7+ years of experience as a Web/Application Developer doing analytical programming with Python, Django, and Java.
- Experience in all phases of the Software Development Life Cycle (SDLC) under Waterfall and Agile processes (requirement study, analysis, design, coding, testing, deployment, and maintenance) in web and client/server application development.
- Extensive experience in developing web applications using Python with the Django and Flask frameworks.
- Experience working with several Python libraries, including Beautiful Soup, NumPy, Matplotlib, SciPy, and PyQt.
- Good experience in Python software development (libraries used: PySpark, Matplotlib, asyncio, python-twitter, and pandas DataFrames, with PostgreSQL for database connectivity).
- Hands-on experience with industry-standard IDEs such as PyCharm and Jupyter Notebook.
- Expertise in full life cycle application development, with good experience in unit testing, Test-Driven Development (TDD), and Behavior-Driven Development (BDD).
- Proficient in writing SQL queries, stored procedures, functions, packages, tables, views, and triggers in relational databases such as PostgreSQL.
- Good experience in shell scripting, SQL Server, UNIX and Linux, and visualization tools such as Power BI and Tableau.
- Strong design and development experience in J2EE and web technologies, including JSP, Servlets, JDBC, JNDI, EJB, JMS, AMQP, RabbitMQ, SOA, OSGi bundles, ServiceMix, Apache Camel routing, SOAP and REST, JAX-WS web services and REST with the Metro stack and Apache CXF, XML SAX/DOM/StAX parsers, GWT, CSS, HTML, and Dojo.
- Experience migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Databricks, and Azure SQL Data Warehouse; controlling and granting database access; and migrating on-premises databases to Azure Data Lake Store using Azure Data Factory.
- Installed and configured web hosting services and administered HTTP, FTP, SSH, and RSH.
- Experience developing Spark applications using Spark SQL in Databricks to extract, transform, and aggregate data from multiple file formats, uncovering insights into customer usage patterns.
- Expert in setting up SSH, SCP, and SFTP connectivity between UNIX hosts.
- Migrated users from insecure FTP-based servers to more secure SFTP-, FTPS-, and HTTPS-based servers.
- In-depth understanding of the strategy and practical implementation of AWS cloud technologies, including EC2, EBS, S3, VPC, RDS, SES, ELB, EMR, CloudFront, CloudFormation, ElastiCache, CloudWatch, CloudTrail, Redshift, Lambda, SNS, DynamoDB, and AWS Import/Export.
- Experience with database migrations on AWS cloud infrastructure, converting existing Oracle and MS SQL Server databases to PostgreSQL, MySQL, and Aurora.
- Experience building S3 buckets, managing S3 bucket policies, and using S3 and Glacier for storage and backup on AWS.
- Played a key role in Migrating Teradata objects into the Snowflake environment.
- Experience with Snowflake Virtual Warehouses.
- Experience in using XML, SOAP, and REST web services for interoperable software applications.
- Experience in Agile development processes ensuring rapid and high-quality software delivery.
- Proficient in statistical modelling and machine learning techniques (linear and logistic regression, decision trees, random forests, SVM, k-nearest neighbours, Bayesian methods, XGBoost).
- Knowledge of the Hadoop ecosystem, HDFS, and MapReduce; processed large data sets using the PySpark library in Python applications (see the sketch at the end of this summary).
- Well versed with Agile, SCRUM and Test-driven development methodologies.
- Experience in handling errors/exceptions and debugging issues in large-scale applications.
- Highly motivated, dedicated quick learner with a proven ability to work both independently and in a team.
- Excellent written and oral communication skills with results-oriented attitude.
- Experience with general system delivery, DevOps, and automation frameworks, including Docker, Kubernetes, and Ansible.
- Well versed in design and development of presentation layer for web applications using technologies like HTML, CSS, and JavaScript.
- Expert knowledge in front-end and server-side development using Python 3.6, Django, AngularJS, Angular 2, and Node.js.
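A minimal sketch of the PySpark usage described above (processing large data sets from Python); the input path and column names are hypothetical:

    # Aggregate a large event data set with PySpark; all paths/columns are illustrative.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("usage-aggregation").getOrCreate()

    # Read a large CSV data set from HDFS (hypothetical path).
    df = spark.read.csv("hdfs:///data/events.csv", header=True, inferSchema=True)

    # Count events and average duration per customer (hypothetical columns).
    summary = (
        df.groupBy("customer_id")
          .agg(F.count("*").alias("events"), F.avg("duration").alias("avg_duration"))
    )
    summary.show(10)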
TECHNICAL SKILLS
Programming Languages: Python, Java, C#, C++, SQL, COBOL, JCL
Query Languages: SQL, PL/SQL
Operating Systems: Windows Vista/XP/7/8/10, Linux, Unix, OS X
Deployment Tools: AWS (EC2, S3, ELB, RDS, Glue), Heroku, Jenkins, Azure
Web Development: CSS, HTML, DHTML, XML, JavaScript, AngularJS, jQuery, and AJAX
Web Servers: WebSphere, WebLogic, Apache, Gunicorn
Python Frameworks & Related Tools: Django, Flask, Web2py, Bottle, Pyramid, Swagger, RabbitMQ
Bug Tracking & Testing Tools: Jira, Bugzilla, JUnit, gdb
Databases: Oracle 11g/10g/9i, Cassandra 2.0, MySQL, SQL Server 2008 R2, Data Warehousing
Cloud Computing: Amazon EC2/S3, Heroku, Google App Engine
Methodologies: Agile, Scrum and Waterfall
IDEs: Sublime Text, PyCharm, Eclipse, NetBeans, JDeveloper, WebLogic Workshop, RAD
PROFESSIONAL EXPERIENCE
Confidential, Chicago, IL
Python Developer
Responsibilities:
- Implemented user interface guidelines and standards throughout development and maintenance of website.
- Worked with various Python libraries such as Six, Click, Pandas and Matplotlib for analysis and manipulation of data.
- Designed a Docker proof-of-concept using HashiCorp's Nomad and Consul.
- Used Consul for service discovery and key-value storage.
- Worked with SAS and Python for data analysis and visualization to gain a better understanding of the data.
- Migrated SAS code to Python to make better use of Python's visualization and analytical models.
- Created custom algorithms to predict taxpayer behavior for future use.
- Involved in Python development and bug fixing.
- Developed web-based applications using Python (3.x), Django (1.11), XML, CSS, HTML, and DHTML.
- Developed entire frontend and backend modules using Python on Django Web Framework.
- Proficient in AWS services like VPC, EC2, S3, ELB, Auto Scaling Groups (ASG), EBS, RDS, IAM, CloudFormation, Route 53, CloudWatch, CloudFront, CloudTrail.
- Managed and administered the AWS services CLI, EC2, VPC, S3, ELB, Glacier, Route 53, CloudTrail, IAM, and Trusted Advisor.
- Development of real-time multi-tasking systems using Python.
- The combination of these elements (future prediction and multi-dimensional segmentation) enables the department to design a better web platform and plan for the future of taxation.
- Used Python 3.x (NumPy, SciPy, pandas, seaborn) and Spark 2.0 (PySpark, MLlib) to develop a variety of models and algorithms for analytic purposes.
- Deployed GUI pages by using JSP, JSTL, HTML, DHTML, XHTML, CSS, JavaScript, and AJAX.
- Configured the project on WebSphere 6.1 application servers.
- Developed a Machine Learning testbed with 24 different model learning and feature learning algorithms.
- Used SAX and DOM parsers to parse the raw XML documents.
- Used RAD as Development IDE for web applications.
- Developed and planned analytic projects in response to business needs.
- In conjunction with data owners and department managers, contributed to the development of data models and protocols for mining production databases.
- Designed and developed a database management system using MySQL; built application logic using Python.
- Managed relational database applications with the Django ORM framework and MySQL.
- Developed new analytical methods and tools as required.
- Contributed to data mining architectures, modeling standards, reporting, and data analysis methodologies.
- Worked with security protocols such as SSH and TLS.
- Conducted research and made recommendations on data mining products, services, protocols, and standards in support of procurement and development efforts.
- Worked with application developers to extract data relevant for analysis.
- Provided and applied quality assurance best practices for data mining/analysis services.
- Utilized Apache Spark with Python to develop and execute big data analytics and machine learning applications; executed machine learning use cases under Spark ML and MLlib (a minimal sketch follows this position's environment line).
- Developed Spark/Scala and Python code for a regular expression (regex) project in the Hadoop/Hive environment on Linux/Windows for big data resources.
Environment: Python 3.4/2.7, Django (1.11), CherryPy (v14.0), HTML5 (5.1), CSS (v3), Bootstrap (4.x), JSON (2.1.0), JavaScript (5.1), AJAX (1.1.0), RESTful web services, SQLite, Cassandra (2.2), AWS (EC2, S3), PySpark, MySQL
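A minimal sketch of the Spark ML usage described above; the input path, feature columns, and label are hypothetical:

    # Train a logistic regression model with a Spark ML pipeline.
    from pyspark.sql import SparkSession
    from pyspark.ml import Pipeline
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("taxpayer-model").getOrCreate()

    # Hypothetical input: per-taxpayer features with a binary "label" column.
    df = spark.read.parquet("s3://example-bucket/taxpayer_features/")

    assembler = VectorAssembler(
        inputCols=["filing_count", "avg_payment_delay", "audit_flags"],  # hypothetical columns
        outputCol="features",
    )
    lr = LogisticRegression(featuresCol="features", labelCol="label")

    train, test = df.randomSplit([0.8, 0.2], seed=42)
    model = Pipeline(stages=[assembler, lr]).fit(train)
    model.transform(test).select("label", "prediction").show(5)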
Confidential, St Louis, MO
Sr. Python Developer
Responsibilities:
- Involved in building database models, APIs, and views using Python to create an interactive web-based solution.
- Experience in handling, configuration, and administration of databases like MySQL and NoSQL databases like MongoDB and Cassandra.
- Created a PySpark framework to bring data from DB2 to Amazon S3.
- Optimized the PySpark jobs to run on a Kubernetes cluster for faster data processing.
- Involved in Migrating Objects from Teradata to Snowflake.
- Installed, configured and maintained Active Directory, DNS, DHCP, WINS, Firewall, VPN, SSH.
- Experience in MongoDB installation, patching, troubleshooting, performance tracking/tuning, and backup and recovery in dynamic environments.
- Used Django configuration to manage URLs and application parameters.
- Utilized PyQt to provide GUI for the user to create, modify and view reports based on client data. Created PyUnit test scripts and used for unit testing.
- Developed ETL pipelines in and out of data warehouses using a combination of Python and Snowflake's SnowSQL, writing SQL queries against Snowflake (see the sketch at the end of this section).
- Heavily involved in testing Snowflake to understand the best possible way to use its cloud resources.
- Used Scala sbt to develop Scala-coded Spark projects and executed them using spark-submit.
- Worked closely with the application development team to integrate Blaze rules with the business application.
- Involved in developing a linear regression model, built using Spark with the Scala API, to predict a continuous measurement and improve observation of wind turbine data.
- Implemented Spring Boot microservices to process messages into the Kafka cluster setup.
- Developed a data warehouse model in Snowflake for over 100 datasets using WhereScape.
- Experienced with advanced JavaScript, including prototype-based inheritance, AJAX, and JSON, and familiar with JavaScript frameworks such as jQuery and jQuery UI.
- Experience developing applications using Python 3.6/3.7 and the Flask web framework, backed by MS SQL/PostgreSQL databases with SQLAlchemy as the Object Relational Mapper (ORM).
- Designed and managed API system deployment using a fast HTTP server and Amazon AWS architecture.
- Hands-on experience in using message brokers such as ActiveMQ and RabbitMQ.
- Knowledge of data mining and data warehousing using ETL tools, and proficient in building reports and dashboards in Tableau (a BI tool).
- Designed and developed ETL processes in AWS Glue to migrate campaign data from external sources such as S3 (ORC/Parquet/text files) into AWS Redshift.
- Implemented AWS CodePipeline and created CloudFormation and JSON templates in Terraform for infrastructure as code.
- Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics); ingested data into Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed it in Azure Databricks.
- Spun up HDInsight clusters and used Hadoop ecosystem tools such as Kafka, Spark, and Databricks for real-time streaming analytics, and Sqoop, Pig, Hive, and Cosmos DB for batch jobs.
- Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB; deployed AWS Lambda code from Amazon S3 buckets and created a Lambda deployment function configured to receive events from an S3 bucket.
- Created and configured a new batch job in the Denodo scheduler with email notification capabilities.
- Guided and migrated PostgreSQL and MySQL databases to AWS Aurora.
- Analysed the SQL scripts and redesigned them using PySpark SQL for faster performance.
- Knowledgeable in AWS Services including EC2, S3, RDS, Redshift, Glue, Athena, IAM, QuickSight.
- Hands-on experience in creating Docker containers and Docker consoles for managing the application life cycle.
- In-depth knowledge of Snowflake database, schema, and table structures.
- Very good hands-on experience working with large datasets and deep learning algorithms using Apache Spark and TensorFlow.
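A minimal sketch of the Python-to-Snowflake ETL pattern described above, using the snowflake-connector-python package; the account, credentials, stage, and table names are all placeholders:

    import snowflake.connector  # pip install snowflake-connector-python

    # All connection parameters below are placeholders.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="...",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    cur = conn.cursor()
    try:
        # Load staged files into a target table; the stage, table, and file
        # format are hypothetical, and the table shape must match the files.
        cur.execute("COPY INTO campaign_data FROM @s3_stage FILE_FORMAT = (TYPE = PARQUET)")
        cur.execute("SELECT COUNT(*) FROM campaign_data")
        print("rows loaded:", cur.fetchone()[0])
    finally:
        cur.close()
        conn.close()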
Confidential, TX
Python Developer
Responsibilities:
- Involved in requirement gathering, analysis, design, estimation, and testing of assigned tasks in OpenStack.
- Implemented the Rally OpenStack benchmarking tool across the entire cloud environment.
- Wrote Nova, Glance, Neutron, Cinder, Keystone, Horizon, and Swift Python client API calls to integrate with the existing application (see the sketch after this position's environment line).
- Created a strategic architectural design of the platform covering networking (VLANs, firewalls, load balancers), hypervisors (KVM and VMware), workflow and orchestration (OpenStack APIs, Smart Cloud Orchestrator), security (Keystone, LDAP), inventory and monitoring, licensing, and backup/restore.
- Reviewed Python files in the OpenStack environment and made changes where needed.
- Used Cinder to enable persistent storage for applications such as databases deployed in OpenStack.
- Involved in automated OpenStack and AWS deployment using CloudFormation, Heat, and Puppet.
- Worked with several Python packages such as NumPy, Beautiful Soup, SQLAlchemy, and PyTables.
- Involved in developing the application using Python 3.3, HTML5, CSS3, AJAX, JSON, and jQuery.
- Maintained and managed Puppet modules responsible for deploying OpenStack and other cloud tools.
- Developed cloud infrastructure (compute, storage, and platform) RESTful services to implement the OpenStack API.
- Created a custom dashboard using JSF to replace the existing Horizon dashboard, using the RESTful API provided by OpenStack.
- Used the Python packages cx_Oracle, pyodbc, and MySQLdb to work with Oracle, SQL Server, and MySQL databases respectively.
- Designed the front-end UI using HTML, Bootstrap, Angular, CSS, and JavaScript.
- Developed entire frontend and backend modules using Python on Django, including the Tastypie web framework, using Git.
- Used Python and Django for graphics creation, XML processing, data exchange, and business logic implementation.
- Designed and developed a data management system using PostgreSQL.
- Wrote Python object-oriented design code for manufacturing quality, monitoring, logging, debugging, and code optimization.
- Built back-end support for the application from the ground up using Python, shell scripts, and Perl.
- Used the Model-View-Controller (MVC) framework to build modular and maintainable applications.
- Used a test-driven approach to develop the application and implemented unit tests using the Python unittest framework.
Environment: Python, Django web framework, HTML5, CSS3, Bootstrap, MongoDB, Linux, JavaScript, jQuery, AJAX, JSON, Sublime Text, Jira, Git, Django CMS and plugins, SSO, database access, GitHub, JUnit, Agile, UML, JSP, XML, SOA.
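A minimal sketch of calling the OpenStack compute API from Python, assuming the keystoneauth1 and python-novaclient packages; the endpoint, credentials, and project values are placeholders:

    from keystoneauth1.identity import v3
    from keystoneauth1 import session
    from novaclient import client as nova_client

    # Authenticate against Keystone v3 (all values illustrative).
    auth = v3.Password(
        auth_url="https://openstack.example.com:5000/v3",
        username="demo",
        password="...",
        project_name="demo",
        user_domain_id="default",
        project_domain_id="default",
    )
    sess = session.Session(auth=auth)

    # Drive Nova through the authenticated session.
    nova = nova_client.Client("2.1", session=sess)
    for server in nova.servers.list():
        print(server.id, server.name, server.status)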
Confidential
Python Developer
Responsibilities:
- Designed front end and backend of the application utilizing Python on Django Web Framework.
- Developed the website's user interface using HTML, CSS, and JavaScript.
- Experience in developing views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
- Created REST APIs using Django REST Framework, building custom middleware and generating tokens (see the sketch after this list).
- Customized Django forms on the frontend using django-crispy-forms and the Django template engine.
- Queried and inserted data into the database using Django's ORM.
- Utilized CSS and Bootstrap for developing the web applications.
- Utilized an SQLite database for development and SQL Server for production.
- Skilled in using collections in Python for manipulating and looping through user-defined objects.
- Analyzed requirements and developed code to run the application.
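A minimal sketch of a token-authenticated REST endpoint in Django REST Framework; the Report model and its fields are hypothetical, and the module is assumed to live inside an installed Django app with rest_framework and rest_framework.authtoken enabled:

    from django.db import models
    from rest_framework import serializers, viewsets
    from rest_framework.authentication import TokenAuthentication
    from rest_framework.permissions import IsAuthenticated

    class Report(models.Model):  # hypothetical model
        title = models.CharField(max_length=200)
        created = models.DateTimeField(auto_now_add=True)

    class ReportSerializer(serializers.ModelSerializer):
        class Meta:
            model = Report
            fields = ["id", "title", "created"]

    class ReportViewSet(viewsets.ModelViewSet):
        queryset = Report.objects.all()
        serializer_class = ReportSerializer
        authentication_classes = [TokenAuthentication]  # clients send "Authorization: Token <key>"
        permission_classes = [IsAuthenticated]

    # URL wiring (in the app's urls.py, also hypothetical):
    # router = routers.DefaultRouter(); router.register(r"reports", ReportViewSet)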