
Python Developer Resume


Austin, Texas

SUMMARY

  • Mastered and led the development of applications and tools using Python for 6 years. Excellent hands-on experience programming in Python, C++, Java, and C.
  • Experience in Big Data Technologies like Hadoop, Map Reduce, HDFS, Zookeeper, Kafka, Hive, Pig, Sqoop, Oozie, Flume, Yarn, HBase, Spark with Scala.
  • Experience in Amazon AWS Technologies like S3, EMR, EC2, RedShift, AWS Glue, Athena
  • Experience in Azure Cloud Technologies like Azure Data Factory, Azure SQL Managed Instance, Azure HDInsight, Azure Blob Storage, Azure Data Lake Gen2, and Azure Synapse.
  • Good experience with deployment automation and containerization using Docker and Kubernetes.
  • Excellent experience using Apache NiFi as an ETL tool for batch processing and real-time processing.
  • Good experience in Hadoop big data processing. Expertise in developing queries in Spark SQL and Hive.
  • Experience in creating Kafka producer and Kafka consumer for Spark streaming.
  • Experienced in object-oriented concepts with C++, Java, and Python.
  • Proficient with JavaScript frameworks such as Angular (8.x-11) and React
  • Experience using ES2015 (ES6) and TypeScript 2.x
  • Excellent knowledge of the various phases of the SDLC: requirement analysis, design, development, and testing
  • Experience in developing multi-threaded applications using C in a Linux environment. Good experience in error and exception handling.
  • Excellent experience in debugging issues using debuggers such as gdb and pdb.
  • Good understanding of development best practices such as code reviews and unit testing.
  • Experience in designing automation frameworks using Perl and shell scripting.
  • Experience in writing test plans, test cases, test specifications, and test coverage.
  • Experience with version control systems such as SVN, CVS, and Git.
  • Proficiency with fundamental front-end technologies such as the DOM, HTML5, CSS3, and JavaScript (ES5 and ES2015+)
  • Experience with front-end tooling workflows: Node.js (NPM), WebPack, Angular-CLI
  • Built cross-platform desktop applications based on Electron.js and native languages such as C++, with deep system integrations and fast, reliable performance requirements. Experience with Responsive Web Design (RWD) patterns.
  • Proficiency with server-side languages such as Python and Java
  • Familiarity with database technology such as MySQL, Oracle, and MongoDB.
  • Experienced in writing UNIX shell scripts.
  • Flexible in learning and using new technologies, and able to work in a team environment as well as independently to complete projects.

TECHNICAL SKILLS

Programming: Python, C++, C, Java, JavaScript

Tools and Technologies: Node.js (NPM), WebPack, Angular-CLI, React, Linux, NoSQL, MySQL Database, HTML, CSS, Operating Systems, Cloud Services, Algorithm Design, Natural Language Processing, Automation, Machine Learning, TCP/IP, RESTful and SOAP Architecture, MQTT protocol, IoT environment, Thingworx

Industry Knowledge: AWS Cloud, Google Cloud Platform, Apache NiFi, HDFS, Apache Spark, Apache Kafka

Platforms: Eclipse, PyCharm, Spyder, NetBeans, Sublime Text

Methodologies: Agile (Scrum, JIRA), Waterfall, SDLC, TDD

Version Controls: Git

Software: Cadence Virtuoso, MATLAB, SPICE, OrCAD, NI Multisim

Operating Systems: Windows, UNIX, Linux, Mac OS X

Relevant Courses: Cloud Computing, Database Systems, Computer Networks, Machine Learning, Algorithms, Software Engineering

PROFESSIONAL EXPERIENCE

Python Developer

Confidential, Austin, Texas

Responsibilities:

  • Work on learning tools and techniques to design and develop an AI Voice Assistant application for electronic medical records.
  • Working experience with Python database connectivity using the PyMySQL, SQLAlchemy, cx_Oracle (with Instant Client), and pymssql packages (a minimal connection sketch follows this list).
  • Design and develop a rich single-page application that combines consumer-grade usability and design sensibility with enterprise-grade performance, scalability, and reliability, implementing MVC architecture.
  • Using Fetch to request JSON data through Python Flask REST APIs to acquire data from the server and display it in sorted order.
  • Develop Node.js native C++ add-ons and cross-platform JavaScript/TypeScript modules using Angular 10, running on Windows, macOS, and GNU/Linux.
  • Build cross-platform desktop applications based on Electron.js and native languages such as C++, with deep system integrations and fast, reliable performance requirements.
  • Followed user interface guidelines and standards throughout the development and maintenance phases of the application using CSS, HTML, and JavaScript.
  • Followed Agile/Scrum and all phases of the Software Development Life Cycle (SDLC), such as planning, analysis, design, implementation, testing, and maintenance, on large-scale business and enterprise portal applications.
  • Designed different components using Python/Perl with extensive use of object-oriented programming techniques such as classes, polymorphism, method overloading, and method overriding.
  • Utilized Angular design patterns (components, models, services) to build user-friendly, cross-platform-compatible single-page web applications.
  • Experience writing Angular TypeScript components using NgModules, RxJS, Animations, Router, HttpClient, and Forms.
  • Ability to perform component interactions through services, invoking input and output bindings.
  • Communicated with back-end services using HTTP and handled errors using HttpErrorResponse and timeout responses.
  • Ability to interact between different Angular applications; wrote accessibility-level code using ARIA attributes.
  • Built RESTful APIs using Python Flask and SQLAlchemy data models; increased code quality by 90% by writing unit tests using pytest (a minimal endpoint-and-test sketch follows this list).
  • Worked on application development, especially in a Docker environment; familiar with setting up Docker images and containers.
  • Built web-based HTTP request connections to check asynchronous service statuses using WebSockets.
  • Implemented server-side validation using JavaScript (ES6), Angular CLI, and JSON.
  • Tested the application using Selenium automation and Karma, applying TDD for back-end test performance.
  • Wrote targeted unit tests for the front-end application using Angular and TypeScript.
  • Familiar with the akv2k8s (Azure Key Vault to Kubernetes) deployment process; involved in building deployment, service, and injection YAML files. Familiar with creating secret key vaults, repositories, and pipelines using Azure DevOps.
  • Performed in-line code reviews inside the JIRA issue workflow using Atlassian tooling to stay focused on the code under review.
  • Adhere to scrum methodology and routinely deliver high quality working software on deadline.
  • Followed the JIRA Kanban board process through the development, code review, testing, and ready-to-release phases.
  • Support desktop distribution efforts such as application installers and beta-based automatic updates
  • Work cross-functionally with various Intuit teams, including product management, QA/QE, various product lines, and business units, to drive results.
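
A minimal sketch of the database connectivity described above, assuming a local MySQL instance; the connection string and the patients table are hypothetical placeholders, not project values:

    # Minimal sketch: connecting to MySQL through SQLAlchemy with the PyMySQL
    # driver. The DSN and the `patients` table are hypothetical placeholders.
    from sqlalchemy import create_engine, text

    engine = create_engine(
        "mysql+pymysql://app_user:app_pass@localhost:3306/emr",  # hypothetical DSN
        pool_pre_ping=True,   # test connections before use
        pool_recycle=3600,    # avoid "MySQL server has gone away" on idle pools
    )

    def fetch_recent_records(limit: int = 10):
        """Return the most recent rows from a hypothetical `patients` table."""
        with engine.connect() as conn:
            rows = conn.execute(
                text("SELECT id, name, updated_at FROM patients "
                     "ORDER BY updated_at DESC LIMIT :limit"),
                {"limit": limit},
            )
            return [dict(row._mapping) for row in rows]

    if __name__ == "__main__":
        for record in fetch_recent_records(5):
            print(record)

The same engine object can be shared across a Flask application, with the pool settings handling connection reuse.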
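
A minimal sketch of a Flask REST endpoint that serves JSON in sorted order, paired with a pytest unit test using Flask's test client; the route name and sample records are illustrative, not the project's actual API:

    # app.py -- minimal Flask endpoint returning JSON sorted by name.
    # The /api/records route and in-memory data are illustrative placeholders.
    from flask import Flask, jsonify

    app = Flask(__name__)

    RECORDS = [
        {"id": 3, "name": "Charlie"},
        {"id": 1, "name": "Alice"},
        {"id": 2, "name": "Bob"},
    ]

    @app.get("/api/records")
    def list_records():
        # Sort server-side so a front-end Fetch call can render the list directly.
        return jsonify(sorted(RECORDS, key=lambda r: r["name"]))

    # test_app.py -- pytest unit test exercising the endpoint.
    def test_records_are_sorted():
        client = app.test_client()
        response = client.get("/api/records")
        assert response.status_code == 200
        names = [r["name"] for r in response.get_json()]
        assert names == sorted(names)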

Software Developer

Confidential, Santa Clara, CA

Responsibilities:

  • Designed and built 5 custom dashboards to turn millions of data points into actionable information.
  • Built an Apache NiFi pipeline as an ETL tool to interact with the LinkedIn, Twitter, HubSpot, Google Analytics, and BrightTALK APIs to drive end-to-end data initiatives, including working with data engineers to automate tracking, measurement, and models.
  • Worked on an Apache NiFi data pipeline to process large data sets for designing and evaluating experiments to improve the efficiency of marketing actions.
  • Designed and developed scalable and cost-effective architecture in AWS Big Data services for data life cycle of collection, ingestion, storage, processing, and visualization.
  • Involved in creating End-to-End data pipeline within distributed environment using the Big data tools, Spark framework and Tableau for data visualization.
  • Ensure that application continues to function normally through software maintenance and testing in production environment.
  • Leveraged Spark features such as in-memory processing, distributed cache, broadcast variables, accumulators, and map-side joins to implement data preprocessing pipelines with minimal latency (see the broadcast-join sketch after this list).
  • Implemented real-time solutions for money movement and transactional data using Kafka, Spark Streaming, and HBase (a minimal streaming sketch follows this list).
  • The project also included a range of big data tools and programming languages such as Sqoop, Python, and Oozie.
  • Worked on scheduling Oozie workflow engine to run multiple jobs.
  • Experience in creating a Python topology script to generate a CloudFormation template for creating the EMR cluster in AWS.
  • Good knowledge of AWS services like EC2, EMR, S3, Service Catalog, and CloudWatch.
  • Experience in using Spark SQL to handle structured data from Hive on the AWS EMR platform (m4.xlarge and m5.12xlarge clusters).
  • Explored Spark to improve performance and optimize existing algorithms in Hadoop using Spark Context, Spark SQL, DataFrames, and pair RDDs.
  • Experienced in handling large datasets using partitions, Spark in-memory capabilities, broadcasts, and effective and efficient joins and transformations during the ingestion process itself.
  • Experienced in optimizing Hive queries, joins to handle different data sets.
  • Involved in creating Hive tables (Managed tables and External tables), loading and analyzing data using hive queries.
  • Actively involved in code review and bug fixing for improving the performance.
  • Good experience in handling data manipulation using Python Scripts.
  • Involved in development, building, testing, and deploy to Hadoop cluster in distributed mode.
  • Created Splunk dashboard to capture the logs for end-to-end process of data ingestion.
  • Wrote unit test cases for PySpark code for the CI/CD process.
  • Worked on configuring lookups for data validation and integrity.
  • Created automated systems that can scale with increase in expectations and complexity.
  • Developed visualizations, reports, and key performance indicators that enable tracking of marketing effectiveness and support marketing decisions.
  • Worked on various predictive and explanatory models and machine learning algorithms.
  • Proactively identified opportunities for marketing and sales and gathered data to drive revenue.
  • Collaborated with business teams to identify critical business problems and areas of opportunity, translating business questions into advanced analytics solutions for those problems.
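
A minimal Spark Structured Streaming sketch for consuming transactional events from Kafka; the broker address, topic name, and schema are hypothetical placeholders, and the project's actual HBase sink would need a separate connector:

    # Minimal sketch: read a Kafka topic with Spark Structured Streaming,
    # parse JSON payloads, and write the results out continuously.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("txn-stream").getOrCreate()

    txn_schema = StructType([
        StructField("account_id", StringType()),
        StructField("amount", DoubleType()),
        StructField("status", StringType()),
    ])

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
           .option("subscribe", "transactions")                # placeholder topic
           .load())

    parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
                 .select(from_json(col("json"), txn_schema).alias("txn"))
                 .select("txn.*"))

    query = (parsed.writeStream
             .outputMode("append")
             .format("parquet")                                # stand-in sink for the sketch
             .option("path", "/data/transactions")             # placeholder paths
             .option("checkpointLocation", "/data/checkpoints/transactions")
             .start())

    query.awaitTermination()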
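
A minimal sketch of the broadcast (map-side) join pattern mentioned above; the dataset paths and column names are hypothetical:

    # Minimal sketch: broadcast the small dimension table so the large fact
    # table is joined without a shuffle. Paths and columns are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import broadcast

    spark = SparkSession.builder.appName("preprocess").getOrCreate()

    events = spark.read.parquet("/data/raw/events")          # large fact table
    campaigns = spark.read.parquet("/data/dim/campaigns")    # small dimension table

    enriched = (events
                .join(broadcast(campaigns), on="campaign_id", how="left")
                .filter("event_type IS NOT NULL"))

    # Cache only if several downstream aggregations reuse the enriched frame.
    enriched.cache()
    enriched.write.mode("overwrite").parquet("/data/curated/events_enriched")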

Software Engineer

Confidential, Boise, Idaho

Responsibilities:

  • Spearheaded and deployed an independent solo project to build new IoT pipeline for Dry Etch process at Fab plant.
  • Created MQTT communication and established low-latency, high-rate data transfer between the tool and the sensor (a minimal publish/subscribe sketch follows this list).
  • Built the entire real-time Apache NiFi pipeline to ingest vibration data into HDFS and perform data analytics using Python at the same time.
  • Designed the automation framework using shell scripting and Python.
  • Automated the entire project and achieved 10% faster data transfer compared to the other fab.
  • Transferred the analytics results to ThingWorx using a REST API and visualized them on a ThingWorx mashup.
  • Worked on development of SQL and stored procedures on MYSQL.
  • Managed source control and version control using GIT and Project status tracking using JIRA.
  • Worked in agile environment using techniques such as TDD, Pair Programming, Refactoring, Continuous Integration
  • Converted legacy shell scripts to MapReduce jobs run in a distributed manner, without performing any processing on the edge node, to eliminate that burden.
  • Created Apache Spark applications for data preprocessing for greater performance.
  • Developed Spark code and Spark-SQL/streaming for faster testing and processing of data.
  • Experience in creating Spark applications using RDDs and DataFrames.
  • Worked extensively with Hive to analyze the data and create reports for data quality.
  • Implemented partitioning, dynamic partitions, and buckets in Hive for performance benefits and to help organize data in a logical fashion (a dynamic-partitioning sketch follows this list).
  • Written Hive queries for data analysis to meet the business requirements and designed and developed User Defined Function (UDF) for Hive.
  • Involved in creating Hive tables (Managed tables and External tables), loading and analyzing data using hive queries.
  • Good knowledge of configuration management tools such as SVN, CVS, and GitHub.
  • Experience in configuring Event Engine nodes to import and export the data from Teradata to HDFS and vice-versa.
  • Worked with FDP team to create a secured flow to get the data from KAFKA Queue to CS3.0.
  • Expert in creating SFTP connections to internal and external sources to transfer data securely without any breakage.
  • Handled production incidents assigned to our workgroup promptly, fixed bugs or routed them to the respective teams, and optimized SLAs.
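
A minimal publish/subscribe sketch using the paho-mqtt client (1.x API); the broker host, topic, and payload fields are hypothetical placeholders, not the fab's actual configuration:

    # Minimal sketch: publish sensor readings and subscribe to them over MQTT
    # with paho-mqtt (1.x API). Broker, topic, and payload are placeholders.
    import json
    import time

    import paho.mqtt.client as mqtt

    BROKER = "broker.local"          # placeholder broker host
    TOPIC = "fab/etch/vibration"     # placeholder topic

    def on_message(client, userdata, message):
        # Called for every message received on the subscribed topic.
        reading = json.loads(message.payload.decode("utf-8"))
        print("received", reading)

    client = mqtt.Client()
    client.on_message = on_message
    client.connect(BROKER, 1883, keepalive=60)
    client.subscribe(TOPIC, qos=1)
    client.loop_start()

    # Publish a few sample readings; a real sensor loop would replace this.
    for i in range(3):
        payload = json.dumps({"sensor_id": "etch-01", "vibration": 0.42 + i})
        client.publish(TOPIC, payload, qos=1)
        time.sleep(1)

    client.loop_stop()
    client.disconnect()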
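
A minimal sketch of dynamic partitioning into a Hive table from PySpark; the database, table, and staging-table names are hypothetical, and insertInto assumes the staging columns are ordered with the partition column last:

    # Minimal sketch: create a partitioned Hive table and load it with dynamic
    # partitioning enabled. All table and column names are placeholders.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("hive-partitioning")
             .enableHiveSupport()
             .getOrCreate())

    spark.sql("SET hive.exec.dynamic.partition = true")
    spark.sql("SET hive.exec.dynamic.partition.mode = nonstrict")

    spark.sql("""
        CREATE TABLE IF NOT EXISTS etch.vibration_readings (
            sensor_id STRING,
            vibration DOUBLE,
            reading_ts TIMESTAMP
        )
        PARTITIONED BY (reading_date STRING)
        STORED AS PARQUET
    """)

    readings = spark.table("etch.vibration_readings_staging")   # placeholder staging table

    (readings
     .write
     .mode("append")
     .insertInto("etch.vibration_readings"))   # partitions resolved dynamically from the last column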

Python Developer

Confidential

Responsibilities:

  • Developed parsers written in Python to extract useful data from the design database.
  • Developed XML parsing and data structures using Python (a minimal parsing sketch follows this list).
  • Built Python APIs that work on Linux and Windows and maintained them using Git.
  • Analyzed the data in the design database using various Python packages such as pandas, SciPy, and NumPy.
  • Worked on infrastructure with Docker containerization.
  • Implemented Kubernetes to orchestrate the deployment, scaling, and management of Docker containers.
  • Worked on Micro Services deployments on AWS ECS and EC2 instances
  • Worked on defining the components and subcomponents using Python to represent the system in hierarchical form.
  • Developed a set of library functions over the system based on the user needs.
  • Used Django configuration to manage URLs and the application (a minimal URL-configuration sketch follows this list).
  • Finding out any bugs during testing phase of application, spinning up a ticket and working on the fix.
  • Continuous Integration and Deployment using Jenkins.
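
A minimal sketch of the XML parsing approach, using xml.etree.ElementTree and pandas; the file name and element/attribute names are hypothetical, not the actual design-database schema:

    # Minimal sketch: parse a design-database XML export with ElementTree and
    # load the extracted fields into pandas for analysis.
    import xml.etree.ElementTree as ET

    import pandas as pd

    def parse_components(xml_path: str) -> pd.DataFrame:
        """Extract one row per <component> element from the XML export."""
        tree = ET.parse(xml_path)
        rows = []
        for comp in tree.getroot().iter("component"):
            rows.append({
                "name": comp.get("name"),
                "type": comp.get("type"),
                "pin_count": int(comp.findtext("pins", default="0")),
            })
        return pd.DataFrame(rows)

    if __name__ == "__main__":
        df = parse_components("design_export.xml")   # placeholder file name
        print(df.describe(include="all"))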
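
A minimal sketch of Django URL configuration; the components app and its views are hypothetical placeholders:

    # urls.py -- minimal Django URL configuration wiring routes to views.
    # The `components` app and its views are hypothetical placeholders.
    from django.urls import include, path

    from components import views

    urlpatterns = [
        path("components/", views.component_list, name="component-list"),
        path("components/<int:pk>/", views.component_detail, name="component-detail"),
        path("api/", include("components.api_urls")),   # nested routes for the API
    ]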

Python Developer

Confidential

Responsibilities:

  • Experience in writing SQL queries for performing various CRUD operations: create, read, update, and delete.
  • Designed and created the ‘Apple Academy’ website using React, Java, CSS3, and HTML5.
  • The site can create custom graphics for the skill sets of incoming employees and their training progress, which are then used by managers to assess their performance.
  • Worked on a responsive web design approach. Integrated front-end applications with REST APIs.
  • Analyzed the functional specifications and user interface specification documents.
  • Transitioned software development efforts to a test-driven development (TDD) process for gains in code quality, software functionality, and programmer productivity. Developed the backend of the application using the Flask framework.
  • Designed and maintained databases using Python. Developed a Python-based RESTful web service API using Flask, SQLAlchemy, and PostgreSQL (a minimal sketch follows this list).
  • Maintained and monitored frameworks such as Django and Flask.
  • Involved in various phases of Software Development Life Cycle (SDLC) such as requirement gathering, modeling, analysis, design and development.
  • Collaborated with Product Management and User Experience experts for product definition, schedule and project related decisions.
  • Skilled in using Python collections for manipulating different user defined objects.
  • Found critical bugs in the firmware which helped the teams to make stable software.
  • PerformedRoot Cause Analysis (RCA) of the application as per the client requirements to deliver enhancement.
  • Executed DML, DDL Operations as per the Business requirement.
  • Developed front end reports using Java objects like JSP and Servlets.
  • Participated in design, test case reviews and production support readiness reviews for new releases. Provided inputs for Ready / NOT Ready decision.
  • Implemented web applications in the Flask and Spring frameworks following MVC architecture.
  • Worked with PIP in managing software packages written in Python.
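
A minimal sketch of a Flask REST API backed by Flask-SQLAlchemy and PostgreSQL; the connection string, Course model, and routes are hypothetical placeholders, not the actual Apple Academy schema:

    # Minimal sketch: Flask resource backed by Flask-SQLAlchemy and PostgreSQL.
    # Connection string, model, and routes are hypothetical placeholders.
    from flask import Flask, jsonify, request
    from flask_sqlalchemy import SQLAlchemy

    app = Flask(__name__)
    app.config["SQLALCHEMY_DATABASE_URI"] = "postgresql://app:app@localhost/academy"
    db = SQLAlchemy(app)

    class Course(db.Model):
        id = db.Column(db.Integer, primary_key=True)
        title = db.Column(db.String(120), nullable=False)
        progress = db.Column(db.Float, default=0.0)

    @app.get("/api/courses")
    def list_courses():
        return jsonify([
            {"id": c.id, "title": c.title, "progress": c.progress}
            for c in Course.query.all()
        ])

    @app.post("/api/courses")
    def create_course():
        payload = request.get_json()
        course = Course(title=payload["title"])
        db.session.add(course)
        db.session.commit()
        return jsonify({"id": course.id}), 201

    if __name__ == "__main__":
        with app.app_context():
            db.create_all()          # create tables for the sketch only
        app.run(debug=True)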

Software Engineer Intern 

Confidential

Responsibilities:

  • Developed architecture to automate and streamline the entire process of data collection and analysis to effectively analyze code defects in Python.
  • Worked with several distributed systems, networking, data structures & algorithms, and operating systems.
  • Experience in writing queries to get results from a MySQL database. Created a Python/Django-based web application that uses Python for data analysis, a MySQL database, and HTML, CSS, jQuery, Highcharts, and matplotlib for data visualization of sales, trend identification, and progress tracking (a minimal visualization sketch follows this list).
  • Worked on continuous integration/delivery solutions. Developed and tested many features for the dashboard using Flask, CSS, and JavaScript.
  • Implemented the Flask framework for application backend development.
  • Added support for Amazon AWS S3 to host static/media files and moved the database into the Amazon cloud.
  • Performed extensive code reviews using GitHub pull requests. Improved code quality and conducted team meetings from time to time.
  • Developed server-based web traffic handling with RESTful APIs, Flask, and pandas.
  • Implemented PL/SQL views and wrote stored procedures and database triggers.
  • Involved in the project life cycle, including design, development, implementation, verification, and validation.
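
A minimal sketch of the visualization flow, reading sales data from MySQL into pandas and plotting a monthly trend with matplotlib; the connection string, table, and column names are hypothetical:

    # Minimal sketch: pull sales rows from MySQL into pandas and plot a monthly
    # revenue trend with matplotlib. DSN, table, and columns are placeholders.
    import matplotlib.pyplot as plt
    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("mysql+pymysql://app:app@localhost/sales")   # placeholder DSN

    df = pd.read_sql(
        "SELECT order_date, amount FROM orders",   # placeholder table/columns
        engine,
        parse_dates=["order_date"],
    )

    monthly = df.set_index("order_date")["amount"].resample("M").sum()

    monthly.plot(kind="line", marker="o", title="Monthly sales")
    plt.ylabel("Revenue")
    plt.tight_layout()
    plt.savefig("monthly_sales.png")   # figure used by the dashboard page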
