Sr. Python Developer Resume
San Jose, CA
SUMMARY
- Around 8 years of experience across Python, Big Data, Apache Spark, Scala, Java, and SQL technologies.
- Hands-on experience fetching live stream data from DB2 to HDFS using Spark Streaming and Apache Kafka.
- Experience ingesting real-time data from various sources through Kafka data pipelines and applying transformations to normalize the data stored in an HDFS data lake.
- Expertise with different tools in the Hadoop environment, including Pig, Hive, HDFS, MapReduce, Sqoop, Spark, Kafka, YARN, Oozie, and ZooKeeper.
- Extensive work on ETL processes consisting of data sourcing, mapping, transformation, conversion, and loading using Informatica.
- Implemented workflow actions to drive troubleshooting across multiple event types in Splunk.
- Expertise in developing data-driven applications using Python 2.7 and Python 3 in the PyCharm and Anaconda Spyder IDEs.
- Hands-on experience in configuring and working with Flume to load data from multiple sources.
- Proficient in designing and querying NoSQL databases such as HBase, Cassandra, and MongoDB, as well as Impala.
- Experience with Web Development, Amazon Web Services, Python, and the Django framework.
- Experienced in MVC architecture with RESTful and SOAP web services (SoapUI) and high-level Python web frameworks such as Django and Flask. Experience with object-oriented programming (OOP) concepts using Python, Django, and Linux.
- Experience in building isomorphic applications using React.js and Redux with GraphQL on the server side.
- Experienced in WAMP (Windows, Apache, MySQL, and PHP) and LAMP (Linux, Apache, MySQL, and PHP) architectures.
- Experienced in MVW frameworks such as Django and Angular.js, and in JavaScript, Backbone.js, jQuery, and Node.js.
- Experience using Kubernetes and Docker as the runtime environment to build, test, and deploy systems.
- Good experience working with Amazon Web Services such as EC2, Virtual Private Clouds (VPCs), storage models (EBS, S3, instance storage), and Elastic Load Balancers (ELBs).
- Familiar with JSON-based REST web services and Amazon Web Services.
- Knowledge of integrating different ecosystems, such as Kafka, Spark, and HDFS.
- Good knowledge of Apache Spark and Spark SQL.
- Expertise in software development with machine learning algorithms and deep learning frameworks such as TensorFlow and PyTorch, including convolutional and recurrent neural networks.
- Experience in running Spark streaming applications in cluster mode and Spark log debugging.
- Skilled in migrating data from different databases to Hadoop HDFS and Hive using Sqoop.
- Good Experience in the core concepts of MapReduce Framework and Hadoop ecosystem.
- Experience in optimizing volumes and EC2 instances; created multiple VPC instances and set up alarms and notifications for EC2 instances using CloudWatch.
- Extensive knowledge of creating managed tables and external tables in the Hive ecosystem.
- Worked extensively in design and development of business processes using Sqoop, Pig, Hive, and HBase.
- Expertise in working on data encryption (client-side and server-side) and securing data at rest and in transit in S3, EBS, RDS, EMR, and Redshift using Key Management Service (KMS).
- Good knowledge of Amazon AWS concepts such as the EMR and EC2 web services, which provide fast and efficient processing of Big Data.
- Extensive experience in deploying, configuring, and administering Splunk clusters.
- Complete understanding of Lambda architectures.
- Experienced in developing web services in Python, with good working experience processing large datasets with Spark using Scala and PySpark.
- Knowledge of the Spark framework for batch and real-time data processing.
- Knowledge of the Scala programming language. Good experience with Talend Open Studio for designing ETL jobs for data processing.
- Hands on experience in MVC architecture and Java EE frameworks like Struts2, Spring MVC, and Hibernate.
- Experienced in WAMP (Windows, Apache, MySQL, and Python) and LAMP (Linux, Apache, MySQL, and Python) architectures, and wrote automation test cases using Selenium WebDriver, JUnit, Maven, and Spring.
- Good knowledge in Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
- Worked in agile and waterfall methodologies with high-quality deliverables delivered on time.
- Experience with Test Driven Development (TDD), Agile, Scrum and Waterfall methodologies. Used ticketing systems like JIRA, Bugzilla, and other proprietary tools.
- Excellent communication and interpersonal skills; detail-oriented, analytical, and a responsible team player with a high degree of self-motivation, strong coordination in a team environment, and the ability to learn quickly.
TECHNICAL SKILLS
Languages: Python, Java, C, C++, PHP
Web technologies: React.js, Angular.js, HTML5, JavaScript, ES6, jQuery, Ajax, CSS3, Bootstrap, XML
Web frameworks: Flask, Django, Node.js, Pyramid, Spring, CSS Bootstrap
Testing Tools: Selenium, Pytest, NoseTest, Robot Framework
Database: Oracle, MySQL, SQLite, PostgreSQL, MS SQL Server, MongoDB, Cassandra
IDEs / Development Tools: NetBeans, Eclipse, PyCharm, Sublime Text, SoapUI
UML Modelling: Rational Rose, UML, StarUML
Operating system: Mac OS X, Ubuntu, CentOS, Red Hat, Windows
Virtualization tools: VMware Workstation 10/11
CI/CD: Jira, Git, SVN, Confluence, Jenkins, Docker
Version Control: SVN, CVS, Git, GitHub
Cloud Computing: AWS EC2, S3, RDS (MySQL), SQS
Web Servers: Apache Tomcat, WebLogic
PROFESSIONAL EXPERIENCE
Sr. Python Developer
Confidential, San Jose, CA
Responsibilities:
- Automated different workflows that were initiated manually, using Python scripts and Unix shell scripting.
- Created, activated, and programmed in Anaconda environments.
- Used Python unit and functional testing modules such as unittest, unittest2, mock, and custom frameworks in line with Agile software development methodologies.
- Helped teams configure webhooks in Bitbucket to trigger automated builds in Jenkins.
- Extensively worked with Avro and Parquet files, converting data between the two formats; parsed semi-structured JSON data and converted it to Parquet using DataFrames in PySpark.
- Designed and implemented models with open-source AI frameworks such as PyTorch, TensorFlow, and scikit-learn.
- Developed Splunk infrastructure and related solution for application toolsets.
- Installed Hadoop, MapReduce, HDFS, and AWS components, and developed multiple MapReduce jobs in Pig and Hive for data cleaning and pre-processing.
- Consumed data from Kafka using Apache Spark.
- Developed backend web services using Node.js and stored dependencies using Node Package Manager (NPM).
- Designed SPAs (single-page applications) in a Flux architecture using React.js.
- Developed an analytical component using Scala and Kafka.
- Developed REST microservices used as APIs for home automation; they also keep data synchronized between two database services.
- Managed datasets using Pandas data frames and MySQL; queried the MySQL database from Python using the Python MySQL connector and the MySQLdb package to retrieve information.
- Developed the front-end application using HTML, CSS, JavaScript, and Angular.js (dependency injection, Rx and Http modules) with a Node.js server for future evolutions.
- Developed a Golang API and chatbot using TDD to automate software deployments and rollbacks using GitHub webhooks.
- Performed Kafka analysis, feature selection, and feature extraction using Apache Spark's machine learning library.
- Involved in web/application development using Python 3.5, HTML5, CSS3, AJAX, JSON, and jQuery.
- Developed and tested many dashboard features using Python, Java, Bootstrap, Sass, CSS, JavaScript, and jQuery.
- Generated Python Django forms to record data of online users and used pytest for writing test cases.
- Implemented and modified various SQL queries, functions, cursors, and triggers as per client requirements.
- Cleaned and processed third-party spending data into manageable deliverables in a specific format with Excel macros and Python libraries such as NumPy, SQLAlchemy, and matplotlib.
- Built a TensorFlow Object Detection transfer learning model for custom dataset of client.
- Helped the team on-board data, create various knowledge objects, and install and maintain Splunk apps.
- Used Pandas as an API to put the data into time series and tabular format for manipulation and retrieval.
- Helped with the migration from the old server to the Jira database (matching fields) with Python scripts for transferring and verifying the information.
- Managed Splunk configuration files such as inputs, props, transforms, and lookups.
- Designed and developed a web crawler in Python using the Scrapy framework, with RabbitMQ as a messaging server between the microservices.
- Developed SOA (SaaS) service documents for enterprise applications.
- Built single-page apps, modules, graphics, and reusable components using Angular.js, TensorFlow, React.js, Bootstrap.js, Node.js, and Backbone.
- Support production and development on AWS Cloud SaaS Linux environments.
- Created a full web stack using AWS Infrastructure (Beanstalk, multiple lambdas, Amazon Aurora, API Gateway etc.) to create a fully functioning API using GraphQL technology with multiple data sources.
- Analysed formatted data using machine learning algorithms with Python scikit-learn.
- Used PySpark to access the Spark API from Python.
- Experience in Python, Jupyter, and the scientific computing stack (NumPy, SciPy, pandas, and matplotlib).
- Performed troubleshooting and fixed and deployed many Python bug fixes for the two main applications that were a main source of data for both customers and the internal customer service team.
- Wrote Python scripts to parse JSON documents and load the data into a database.
- Generated various capacity planning reports (graphical) using Python packages such as NumPy and matplotlib.
- Developed a single-page application using Angular.js backed by MongoDB and Node.js.
- Managed code versioning with GitHub and Bitbucket, with deployment to staging and production servers, and implemented MVC architecture in the web application with the help of the Django framework.
- Used Celery as a task queue and RabbitMQ and Redis as message brokers to execute asynchronous tasks.
- Designed and managed API system deployment using a fast HTTP server and Amazon AWS architecture.
- Developed remote integrations with third-party platforms using RESTful web services, and successfully implemented Apache Spark and Spark Streaming applications for large-scale data.
- Built various graphs for business decision making using the Python matplotlib library.
- Integrated Zoom messaging webhooks with ServiceNow through a scripted REST API to provide an improved incident module experience, with conference room tracking and reporting for rapid incident responders.
- Good exposure to interacting with RESTful web services across SaaS, PaaS, and IaaS.
- Exported test case scripts, modified the Selenium scripts, and executed them in the Selenium environment.
- Developed entire frontend and backend modules using Python on the Django web framework.
- Scraped websites using Python Beautiful Soup, then parsed the results as XML.
- Output the parsed data as JSON/BSON and stored it in MongoDB.
- Queried data from MongoDB and used it as input for the machine learning models.
- Used AWS for application deployment and configuration.
- Wrote UNIX shell scripts for automation.
- Managed and reviewed Hadoop log files; also analysed SQL scripts and designed the solution for the process using PySpark.
- Developed views and templates with the Django view controller and template language to create a user-friendly website interface.
- Implemented Data Quality framework using AWS Athena, Snowflake, Airflow and Python.
- Used JavaScript and JSON to update a portion of a webpage.
- Developed consumer-based features using Django and HTML with Test-Driven Development (TDD).
- Developed Python web services for processing JSON and interfacing with the data layer.
- Increased the speed of pre-existing search indexes through Django ORM optimizations.
- Developed a module to build Django ORM queries that pre-load data, greatly reducing the number of database queries needed to retrieve the same amount of data.
Environment: Python 2.7, Splunk 4.3/5.0, Django, webhooks, HTML5/CSS, Sass, TensorFlow, Splunk Enterprise, PostgreSQL, Node.js, MS SQL Server, MySQL, React.js, JavaScript, Jupyter Notebook, VIM, PyCharm, Shell Scripting, PySpark, Angular.js, JIRA.
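A minimal sketch of the JSON-parsing-and-database-loading work described above, using the standard library's json and sqlite3 modules; the table name and fields are hypothetical.

```python
import json
import sqlite3

def load_users(json_text: str, conn: sqlite3.Connection) -> int:
    """Parse a JSON document and load its records into a users table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)"
    )
    records = json.loads(json_text)
    # Tolerate a missing optional field rather than failing the whole load.
    rows = [(r["id"], r["name"], r.get("email")) for r in records]
    conn.executemany("INSERT OR REPLACE INTO users VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

# Usage: load two records into an in-memory database.
conn = sqlite3.connect(":memory:")
doc = '[{"id": 1, "name": "Ada"}, {"id": 2, "name": "Lin", "email": "lin@example.com"}]'
count = load_users(doc, conn)
```

Parameterized queries (the `?` placeholders) keep the load safe against malformed or malicious field values.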
Sr. Python Developer
Confidential, Carson, CA
Responsibilities:
- Worked on development of an internal testing tool framework written in Python.
- Developed a GUI using Python and Django for dynamically displaying block documentation and other features of Python code in a web browser.
- Wrote scripts in Python for extracting data from HTML files.
- Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
- Optimized PySpark jobs to run on a Kubernetes cluster for faster data processing.
- Designed machine learning and deep learning models using TensorFlow and PyTorch.
- Writing production-grade JavaScript, Node.js and Ext JS applications.
- Provided regular support and guidance to Splunk project teams on complex solutions and issue resolution.
- Developed the front end using Angular.js, React.js, Node.js, Bootstrap.js, Backbone.js, and JavaScript, with a Java back end exposing REST web services.
- Worked with Requests, NumPy, SciPy, TensorFlow, Matplotlib, and Pandas python libraries during development lifecycle.
- Experienced in developing web-based applications using SaaS, Python, Django, Kafka, JavaScript, jQuery, and Angular.js, backed by Ansible automation scripts.
- Analysed the SQL scripts and redesigned them using PySpark SQL for faster performance.
- Applied best practices for integrating microservices into an existing system architecture.
- Experience in designing and deploying AWS Solutions using EC2, S3, AWS EMR, Athena, and AWS Redshift.
- Used JavaScript and JSON to update a portion of a webpage.
- Performed troubleshooting and fixed and deployed many bug fixes for applications that were a main source of data for both customers and the internal customer service team.
- Created the application using React.js and Node.js libraries (NPM, gulp directories) to generate the desired views, and Flux to route the URLs properly.
- Handled potential points of failure (database, communication points and file system errors) through error handling and communication of failure.
- Responsible for debugging the project, tracked in JIRA (Agile).
- Wrote Python scripts to parse JSON documents and load the data into a database.
- Developed microservices, worked with Docker and Spring Boot, and deployed applications on Pivotal Cloud Foundry.
- Worked on Restful APIs to access the third-party data for the internal analysis and statistical representation of data.
- Used Python and Django to interface with the jQuery UI and manage the storage and deletion of content.
- Built SQL queries for performing various CRUD operations: create, read, update, and delete.
- Developed entire frontend and backend modules using Python on Django, including the Tastypie web framework, using Git.
- Developed Restful Microservices using Flask and Django and deployed on AWS servers using EBS and EC2.
- Worked on front-end frameworks such as CSS Bootstrap for development of web applications.
- Created a database using MySQL and wrote several queries to extract data from the database.
- Worked in a NoSQL database on simple queries and wrote stored procedures for normalization and denormalization.
- Set up automated cron jobs to upload data into the database, generate graphs and bar charts, upload these charts to the wiki, and back up the database.
- Designed and developed API-based microservices that integrate with Slack and Zoom.
- Developed merge jobs in Python to extract and load data into a MySQL database.
- Successfully migrated the Django database from SQLite to MySQL to PostgreSQL with complete data integrity.
- Designed the front end using HTML, Bootstrap, Node.js, Underscore.js, Angular.js, CSS, and JavaScript.
- Designed and modified user interfaces using JSP, Angular.js, Node.js, JavaScript, CSS, and jQuery.
- Followed the Agile development methodology to develop the application.
- Involved in database design and normalization and denormalization techniques.
- Used AWS Redshift, S3, Spectrum, and Athena services to query large amounts of data stored on S3, creating a virtual data lake without having to go through an ETL process.
- Deployed SaaS with PowerShell.
- Involved in User Acceptance Testing and prepared UAT Test Scripts.
- Effectively communicated with the external vendors to resolve queries.
- Used and customized the Apache server for checking our developed project.
- Used a test-driven approach (TDD) for developing services required for the application.
- Implemented integration test cases.
- Used Git for version control of the Python and portlet work.
Environment: Python 2.7, Django 1.4, HTML5, CSS, Sass, XML, MySQL, Node.js, React.js, TensorFlow, JavaScript, Angular.js, jQuery, CSS Bootstrap, PySpark, Eclipse, Git, GitHub, AWS, Linux, Shell Scripting.
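A minimal sketch of the failure-point handling described above (database, communication, and file-system errors): a retry wrapper that absorbs transient errors and re-raises once attempts are exhausted. The retried exception types and delays are assumptions.

```python
import time

def with_retries(fn, attempts=3, delay=0.01, retry_on=(OSError,)):
    """Call fn, retrying on transient errors such as file-system or
    communication failures; re-raise the last error if all attempts fail."""
    last = None
    for _ in range(attempts):
        try:
            return fn()
        except retry_on as exc:
            last = exc
            time.sleep(delay)  # brief back-off before the next attempt
    raise last

# Usage: a simulated flaky call that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise OSError("transient failure")
    return "ok"

result = with_retries(flaky)
```

In a real service the same wrapper would surround the database or file-system call, with logging added on each failed attempt.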
Python Developer
Confidential, NY
Responsibilities:
- Worked in comprehending and examining the client's business requirements.
- Used Django frameworks and Python to build dynamic webpages.
- Developed tools for monitoring and notification using Python.
- Enhanced the application by using HTML and JavaScript for design and development.
- Used data structures such as dictionaries and tuples and object-oriented class-based inheritance features to implement complex network algorithms.
- Implemented a microservices architecture in developing the web application with the help of the Flask framework.
- Responsible for maintaining and expanding AWS (Cloud Services) infrastructure using AWS Stack especially worked with database setup and maintenance on AWS EC2.
- Created a PHP/MySQL back end for data entry from Flash, and worked in tandem with the Flash developer to obtain the correct data through query strings.
- Involved in designing the database model, APIs, and views using Python to build an interactive web-based solution.
- Generated Python Django forms to record data of online users.
- Implemented data tables to add, delete, update, and display patient records and policy information using PyQt.
- Implemented a module to connect to and view the status of an Apache Cassandra instance using Python.
- Developed an MVC prototype replacement of the current product with Django.
- Improved data security and generated reports efficiently by caching and reusing data.
- Created the UI using JavaScript and HTML5/CSS3.
- Managed datasets using Pandas data frames and MySQL; queried the database using the Python MySQL connector and retrieved information using MySQLdb.
- Recorded online users' data using Python Django forms and implemented test cases using pytest.
- Developed the application using the test-driven methodology and designed the unit tests using the Python unittest framework.
- Implemented CI/CD to deploy the Microservices in Kubernetes Cluster in Azure Cloud (Jenkins job that pulls the images from Private Docker Registry and deploy services in the cloud).
- Created a web application prototype using jQuery and Angular.js.
- Deployed the project to Heroku using the Git version control system.
- Maintained and updated the application in accordance with the client's requirements.
Environment: Python 3, Django 1.6, Tableau 8.2, Beautiful Soup, HTML5, CSS/CSS3, Bootstrap, XML, JSON, JavaScript, jQuery, Angular.js, Backbone.js, RESTful web services, Apache Spark, Linux, Git, Amazon S3, Jenkins, MySQL, MongoDB, T-SQL, Eclipse.
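A minimal sketch of the test-driven workflow described above, using Python's unittest framework; the function under test is a hypothetical helper, shown only to illustrate the test structure.

```python
import io
import unittest

def normalize_username(raw: str) -> str:
    """Trim surrounding whitespace and lowercase a username
    (hypothetical helper under test)."""
    return raw.strip().lower()

class NormalizeUsernameTest(unittest.TestCase):
    """Unit tests in the unittest style: one behavior per test method."""

    def test_strips_whitespace(self):
        self.assertEqual(normalize_username("  Ada "), "ada")

    def test_lowercases(self):
        self.assertEqual(normalize_username("LIN"), "lin")

# Run the suite programmatically instead of via the CLI runner,
# writing the runner's report to an in-memory stream.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeUsernameTest)
result = unittest.TextTestRunner(stream=io.StringIO(), verbosity=0).run(suite)
```

In TDD these tests would be written first and fail until the helper is implemented.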
Python Consultant
Confidential, Bellevue, WA
Responsibilities:
- Created or identified state-of-the-art computational mechanics algorithms that extend modelling capabilities, implemented them as software suitable for high-performance computers, and documented their verification.
- Wrote scripts and programs to automate analysis tools, streamlining the evaluation of initial data for integrity.
- Using a Python package, created an interface to automate importing and exporting drawings and extracting standard data for manufacturing purposes.
- Built applications and programs to streamline the cavity development process and generate documentation from the design, and vice versa.
- Used the Python Flask framework to build modular and maintainable applications.
- Automated data movements using Python scripts; involved in splitting, validating, and processing of files.
- Created a core Python API used across multiple modules.
- Participated in developing the web application UI using HTML, CSS, and JavaScript.
- Uploaded statistics to MySQL for analysis and logging.
- Developed complex SQL queries for testing database functionality.
- Used a UNIX server for application deployment and configuration.
- Wrote shell scripts for automation.
- Designed the database schema and defined relationships between tables.
- Provided technical assistance for maintenance, integration, and testing of software solutions during development and release processes.
- Created a unit test/regression test framework for working and new code.
- Controlled backend logic using Python.
- Provided the design/CAD team with strong scripting capabilities in SKILL, Perl, Ocean, Linux shell, etc.
- Interfaced with CAD tool vendors to prove out releases and flows, solve bugs, and improve usability.
- Performed 3D modelling and analysis using CAD tools.
- Wrote scripts and macros for the analysis tool for dynamic analysis.
- Installed the different Unix systems, CAD tools, and databases required to perform the work.
- Developed solutions for existing machinery and newly developed machines based on analysis and simulation.
- Created complex 3D surface models and performed various static, dynamic, and fluid analyses.
- Prepared supporting documentation needed for manufacturing engineering change requests, and communicated clearly and frequently with all functional areas.
Environment: Python, CAD, UNIX, MySQL, Pandas, Flask, OpenNLP, StanfordNLP, CSS, JavaScript, XML, MATLAB
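A minimal sketch of the file splitting-and-validation step described above, using the standard library's csv module; the column names and validation rules are hypothetical.

```python
import csv
import io

def validate_rows(csv_text: str, required=("part_id", "length_mm")):
    """Split a CSV export into valid and rejected rows, checking that
    required fields are present and that the measurement parses as a number."""
    valid, rejected = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        try:
            if any(not row.get(field) for field in required):
                raise ValueError("missing required field")
            row["length_mm"] = float(row["length_mm"])  # raises on bad data
            valid.append(row)
        except ValueError:
            rejected.append(row)  # keep rejects for a review report
    return valid, rejected

# Usage: one good row, one with a non-numeric measurement.
data = "part_id,length_mm\nA-100,12.5\nA-101,oops\n"
valid, rejected = validate_rows(data)
```

Collecting rejects instead of aborting lets a batch job finish the clean portion of a file and report the rest.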
