Sr Software Engineer Resume
Sunnyvale, California
SUMMARY
- 7+ years of experience in the analysis, design, development, management, and implementation of various stand-alone and client-server enterprise applications, refining and scaling data management and analytics procedures, workflows, and best practices.
- Experienced in extracting real-time feeds using Spark Streaming, converting them to RDDs, processing the data as DataFrames, and saving it in Parquet format on HDFS (see the sketch after this list).
- Experienced in handling large datasets using partitioning, Spark in-memory capabilities, broadcast variables, and effective, efficient joins and transformations during the ingestion process itself.
- Worked on migrating legacy SAS programs into Spark transformations using Spark and Python.
- Expertise in database platforms, as well as hands-on work with Hadoop, Spark, AWS EC2, and AWS S3, including monitoring resources for a better understanding of how the systems function.
- Hands-on experience in loading data from the UNIX file system to HDFS.
- Good understanding of Object-Oriented Technologies and Relational Database Systems.
- Good Knowledge of Agile Methodologies (Scrum).
- Expertise in the infrastructure and configuration management tools Terraform and Ansible for automation.
- Experience in setting up build and deployment automation for Terraform scripts using Jenkins.
- Provisioned highly available EC2 instances using Terraform and CloudFormation, and wrote new plugins to support new functionality in Terraform.
- Strong experience in Shell Scripting, SQL Server, Linux, and OpenStack.
- Experienced in writing SQL Queries, Stored procedures, functions, packages, tables, views, triggers.
- Built a data quality framework consisting of a common set of model components and patterns that can be extended to implement complex process controls and data quality measurements using Hadoop.
- Experience in creating Splunk dashboards and writing SPL for all the search commands, functions, and arguments.
- Strong experience in working with Python ORM libraries, including Django ORM and SQLAlchemy.
- Experienced in software development in Python (libraries used: Beautiful Soup, NumPy, SciPy, Matplotlib, python-twitter, Pandas DataFrames, network, urllib2, and MySQLdb for database connectivity) and IDEs such as Sublime Text, Spyder, and PyCharm.
- Experienced in developing web-based applications using Python, Django, PHP, XML, CSS, HTML, DHTML, JavaScript, jQuery, Ruby, and AJAX.
- Expertise in working with Python GUI frameworks such as Pyjamas.
- Good knowledge of web services with the SOAP and REST protocols.
- Experience in working with server-side technologies including databases, RESTful APIs, and MVC design patterns.
- Experienced in NoSQL technologies like MongoDB, Cassandra and relational databases like Oracle, SQLite, PostgreSQL and MySQL databases.
- Very strong experience writing APIs and web services in PHP and Python.
- Superior troubleshooting and technical support abilities with migrations, network connectivity, security, and database applications.
- Expert level skills in HTML, CSS, and JavaScript including familiarity with common libraries like jQuery, Foundation, Bootstrap and Backbone.
- Skilled in debugging/troubleshooting issues in complex applications.
- Experience in working with different operating systems: Windows, Linux, and iOS.
- Expert in maintaining technical documentation for projects.
- Good analytical and problem-solving skills, with the ability to work independently while being a valuable, contributing team player.
- Excellent Interpersonal and communication skills, efficient time management and organization skills, ability to handle multiple tasks and work well in a team environment.
- Hands-on experiences in writing and reviewing requirements, architecture documents, test plans, design documents, quality analysis and audits.
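Below is a minimal PySpark sketch of the Spark Streaming flow summarized above (real-time feed to RDD to DataFrame to Parquet on HDFS). The socket source, column names, and HDFS path are illustrative placeholders, not actual project values.

```python
from pyspark.sql import SparkSession
from pyspark.streaming import StreamingContext

spark = SparkSession.builder.appName("feed-to-parquet").getOrCreate()
ssc = StreamingContext(spark.sparkContext, 30)  # 30-second micro-batches

# Placeholder real-time source; in practice this could be Kafka, Kinesis, etc.
lines = ssc.socketTextStream("feed-host", 9999)

def save_batch(batch_time, rdd):
    """Convert each micro-batch RDD into a DataFrame and append it as Parquet on HDFS."""
    if not rdd.isEmpty():
        parsed = rdd.map(lambda line: line.split(",", 1))  # [event_id, payload]
        df = spark.createDataFrame(parsed, ["event_id", "payload"])
        df.write.mode("append").parquet("hdfs:///data/feed/parquet")

lines.foreachRDD(save_batch)
ssc.start()
ssc.awaitTermination()
```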
TECHNICAL SKILLS
Languages: Python, C, Ruby, shell scripting.
Web Design: HTML5, XHTML, CSS3, JSP, AJAX
Databases: Microsoft SQL Server, SQLite, MySQL, PostgreSQL, DB2, MongoDB, Cassandra, Redis
Frameworks: Django, Flask, Pyramid, Pyjamas, Jython, AngularJS, Node.js, Spring, Hibernate
Python Libraries: ReportLab, NumPy, SciPy, Matplotlib, httplib2, urllib2, Beautiful Soup, Pickle, Pandas
Application and Web Servers: Apache Tomcat, JBoss, WEBrick, Phusion Passenger
BigData Ecosystems: HDFS, Apache Spark, AWS EMR, PySpark
Version Control Systems: CVS, SVN, Git and GitHub.
Deployment tools: Amazon EC2, Heroku
Operating Systems: Windows, Linux, Unix
Protocols: HTTP/HTTPS, TCP/IP, SOAP, SMTP
Other Tools: MS Office (MS-Excel, MS-PowerPoint, MS-Project 2013), Visio 2013
Cloud (Amazon Web Services): Terraform, CloudFormation
PROFESSIONAL EXPERIENCE
Confidential, Sunnyvale, California
Sr Software Engineer
Responsibilities:
- Developed a CLI tool and underlying backend services for data scientists to package, upload, optimize, encrypt, and deploy AI models for Inferencing Engine.
- Designed and developed containerized applications using Docker, Kubernetes, and AWS EKS clusters.
- Developed the chatbot's knowledge base (engine) and client application using Python 3.
- Led a group of engineers in an Agile software development process that included code reviews and mentoring.
- Played a major role in migrating data from old systems as part of the cost reduction project. Maintained, enhanced, and added new features to the existing applications based on the business needs.
- Built a real-time data pipeline to process images from multiple data sources by utilizing AWS services such as Step Functions, CloudWatch, and Lambda, and exposed these Lambda services through API Gateway.
- Implemented thumbnail generation and image cropping for different image types such as DICOM, JPEG, and PNG using the Python libraries pydicom and Pillow.
- Worked on Artificial Intelligence model training and testing using Amazon SageMaker.
- Created an internal Python library that performs AWS S3 operations and a few data manipulations (see the sketch after this list).
- Built multiple microservices using Python 3.6 and wrote unit tests and component tests using pytest and behave, respectively.
- Used Terraform scripts to automate instances that had previously been launched manually.
- Extensively involved in infrastructure as code, execution plans, resource graph and change automation using Terraform.
- Automated cloud deployments using Terraform templates.
- Managed AWS infrastructure with the AWS CLI and API.
- Performed infrastructure automation and configuration management using Terraform.
- Created Terraform templates for dev, test, staging, and production environments; provided tier 1 and tier 2 support for the platform.
- Provisioned Kubernetes Infrastructure using Terraform Enterprise.
- Managed and maintained smooth operation of network environments while working with systems, network, software, and hardware engineers who were remote and on site.
- Provided specifications and detailed schematics for network architecture.
- Maintained technical expertise in all areas of network and computer hardware and software interconnection and interfacing, such as routers, firewalls, hubs, IDS/IPS, deep packet inspection analysis, application packet analysis, switches, and proxies.
- Set up and built various AWS infrastructure resources (VPC, EC2, S3, IAM, EBS, Security Groups, Auto Scaling, and RDS) in CloudFormation JSON templates.
- Created methods (GET, POST, PUT, DELETE) to make requests to the API server and tested the RESTful API using Postman. Also loaded CloudWatch Logs to S3 and then into Kinesis Streams for data processing.
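A minimal sketch of the kind of internal S3 helper library mentioned above, assuming boto3; the bucket name, keys, and local paths are hypothetical placeholders.

```python
import boto3

class S3Client:
    """Thin wrapper over boto3 for common S3 operations (illustrative only)."""

    def __init__(self, bucket):
        self.bucket = bucket
        self.s3 = boto3.client("s3")

    def upload(self, local_path, key):
        """Upload a local file (e.g. a packaged model artifact) to S3."""
        self.s3.upload_file(local_path, self.bucket, key)

    def download(self, key, local_path):
        """Download an object from S3 to a local path."""
        self.s3.download_file(self.bucket, key, local_path)

    def list_keys(self, prefix=""):
        """Yield object keys under a prefix, handling pagination."""
        paginator = self.s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=self.bucket, Prefix=prefix):
            for obj in page.get("Contents", []):
                yield obj["Key"]

# Example usage (hypothetical bucket/key):
# client = S3Client("my-models-bucket")
# client.upload("model.tar.gz", "models/v1/model.tar.gz")
```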
Environment: Python 3.6, Docker, Kubernetes, EC2, IAM, EBS, S3, Jenkins, CloudWatch, SciPy, Pandas, MySQL, Linux, Shell Scripting, pydicom, SageMaker.
Confidential, Sunnyvale, California
Sr. Python/Data Engineer
Responsibilities:
- Analysed the SQL and SAS scripts and designed the solution to implement using Pandas and PySpark.
- Built web pages using HTML, CSS, JavaScript, and jQuery for QC Reports on data.
- Developed a Python package to connect to Teradata from Spark and generate QC reports (see the sketch after this list).
- Worked on a high configuration EMR cluster to run Spark jobs.
- Developed a web application using Python and Django.
- Worked in Agile methodology attending the daily stand up and completing tasks in sprints.
- Analysed and resolved data load issues by working with business and technical teams.
- Built new connections from Spark and Python to databases such as ORE, Snowflake, Redshift, and Teradata.
- Analysed voice, video, and/or data communications networks, including planning, designing, evaluating, selecting, and upgrading operating systems and protocol suites and configuring communication media with concentrators, bridges, and other devices.
- Planned network layouts and configured systems for user environments.
- Analysed network topologies and traffic and capacity requirements.
- Rewrote legacy Teradata SQL queries for Snowflake as part of the database migration.
- Analysed requirements in business meetings and strategized the impact of requirements on different applications.
- Installed and configured Pivotal Cloud Foundry (PCF) Application Manager, configured LDAP for authorization, and configured log generation for PCF logs in Splunk.
- Developed and designed an API (RESTful Web Service) for the chatbot integration.
- Worked on chatbot development for providing relevant product information to the customers.
- The chatbot also makes use of the Wikipedia web API or the Wolfram Alpha web API to support responses.
- Developed the chatbot's front end using the Flask framework and Angular.js.
- Analysed, designed, and migrated systems using technologies such as Python and Spark, with cloud infrastructure on Amazon Web Services, in order to provide long-term supportability and sustainability.
- Based on new or updated business requirements, designed and implemented rules for processing workflows using the latest Python and Spark framework versions.
- Involved in Business requirements, Data analysis and System design meetings.
- Created an entire web application using Python, Django, and MySQL.
- Used HTML, CSS and JavaScript to create front-end pages using Django Templates and wrote Django Views to implement application functions and business logic.
- Extracted data from multiple sources, integrated data into a common data model, and integrated data into a target database, application, or file using efficient programming processes.
- Designed and developed a data management system using MySQL and optimized the database queries to improve the performance.
- Developed new Splunk apps to monitor the application log volume (Event count), Indexing volume, missing events, missing hosts/source/source type from Splunk monitoring.
- Has experience in Splunk operational intelligence tools, creating complex searches, dashboards and alerts.
- Wrote PowerShell scripts for archiving and moving older log files to Azure Storage, and automation scripts using Python boto3.
- Added support for Amazon AWS S3 and RDS to host static/media files and the database into Amazon Cloud.
- Created Splunk (SPL) dashboard for searching, monitoring and analyzing data.
- Tuned the code with performance and consistency as the main factors of consideration.
- Developed entire frontend and backend modules using Python on Django Web Framework.
- Designed and developed data management system using MySQL.
- Wrote Python scripts to parse XML documents and load the data into the database.
- Used GitHub as the version control tool to coordinate team development.
- Responsible for debugging and troubleshooting the web application.
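A minimal sketch of the Spark-to-Teradata QC flow described above, assuming the Teradata JDBC driver is available on the cluster; the host, credentials, table names, and target path are hypothetical placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("teradata-qc").getOrCreate()

def read_teradata(table):
    """Read a Teradata table into a Spark DataFrame over JDBC."""
    return (spark.read.format("jdbc")
            .option("url", "jdbc:teradata://td-host/DATABASE=edw")   # placeholder host
            .option("driver", "com.teradata.jdbc.TeraDriver")
            .option("dbtable", table)
            .option("user", "svc_user")
            .option("password", "****")
            .load())

def qc_report(source_df, target_df, key_col):
    """Compare row counts and null counts on the key column between source and target."""
    return {
        "source_rows": source_df.count(),
        "target_rows": target_df.count(),
        "source_null_keys": source_df.filter(F.col(key_col).isNull()).count(),
        "target_null_keys": target_df.filter(F.col(key_col).isNull()).count(),
    }

# Example: compare a legacy Teradata table against its migrated copy (paths are illustrative).
# report = qc_report(read_teradata("edw.customers"),
#                    spark.read.parquet("s3://bucket/customers"),
#                    "customer_id")
```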
Environment: Python 2.7, SQL, Spark 2.1.0, Snowflake, Amazon S3, Elastic MapReduce, Django 1.9, JavaScript, HTML, XHTML, jQuery, JSON, XML, CSS, MySQL, Bootstrap, Git, Linux.
Confidential, Chicago, IL
Sr. Python/Django developer
Responsibilities:
- Participated in all the stages of the software development lifecycle like design, development, and implementation and testing.
- Used the Django framework to develop web applications implementing the model-view-controller architecture.
- Used Django's APIs for database access (see the sketch after this list).
- Implemented business logic, data exchange, XML processing, and graphics creation using Python and Django.
- Developed views and templates with Python, using Django's view controller and template language to create a user-friendly website interface.
- Developed UI using CSS, HTML, JavaScript, AngularJS, jQuery and JSON.
- Designed and developed DB2 SQL procedures and UNIX shell scripts for data import/export and conversions.
- Provided training on PaaS tools, including the OSE console, XL Release, Git, Jenkins, Splunk, and AppDynamics, to application teams new to the PaaS platform.
- Created a Django dashboard with a custom look and feel for end users after a careful study of the Django admin site and dashboard.
- Used the Python unittest library for testing many Python programs and other code.
- Used JIRA to build an environment for development.
- Performed different testing methodologies such as unit testing, integration testing, and web application testing.
- Created a database using MySQL and wrote various queries to extract data from it.
- Performed search engine optimization by replacing existing databases with MongoDB (a NoSQL database).
- Collaborated with the QA team to build and populate the database and to ensure standards were met.
- Used AJAX in the UI to update small portions of a page without reloading the entire web page.
- Implemented RESTful web services for sending and receiving data between multiple systems.
- Developed and tested dashboard features using CSS, JavaScript, Django, and Bootstrap.
- Created a Git repository and added it to the GitHub project.
- Developed the application in a Linux environment and worked with all of its commands.
- Tuned the code with performance and consistency as the main factors of consideration.
- Actively worked as a part of a team with managers and other staff to meet the goals of the project in the stipulated time.
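A minimal sketch of the Django model/view pattern used for the database access and dashboard work described above; the model fields, template path, and URL are hypothetical placeholders rather than the actual application code.

```python
from django.db import models
from django.shortcuts import render

class Order(models.Model):
    """Example model mapped to a MySQL table through Django's ORM (illustrative only)."""
    customer = models.CharField(max_length=100)
    total = models.DecimalField(max_digits=10, decimal_places=2)
    created = models.DateTimeField(auto_now_add=True)

def order_list(request):
    """View that queries the ORM and renders the results through a Django template."""
    orders = Order.objects.order_by("-created")[:50]
    return render(request, "orders/order_list.html", {"orders": orders})

# urls.py (hypothetical):
# urlpatterns = [path("orders/", order_list, name="order-list")]
```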
Environment: Python 2.7, Django 1.6, JavaScript, HTML, XHTML, AngularJS, jQuery, JSON, XML, CSS, MySQL, Bootstrap, Git, Linux, Pharms, requests.
Confidential
Python developer
Responsibilities:
- Used Django framework for application development.
- Booted up nodes using prebuilt images on Amazon EC2.
- Bootstrapped Amazon EC2 nodes with the software required to be running when the nodes boot up.
- Uploaded, copied, downloaded, and deleted files using Amazon S3.
- Assisted in reducing costs and optimizing supplier selection for the CRM applications.
- Used several Python libraries, such as NumPy and Matplotlib.
- Design, develop, test, deploy and maintain the website.
- Designed and developed the UI of the website using HTML, XHTML, AJAX, CSS and JavaScript.
- Developed entire frontend and backend modules using Python on the Django web framework.
- Wrote Python scripts to parse XML documents and load the data into the database.
- Generated the property list for every application dynamically using Python.
- Used the Subversion version control tool to coordinate team development.
- Responsible for debugging and troubleshooting the web application.
- Created a server-monitoring daemon with psutil, supported by a Django app for analytics that I created. Also researched big data solutions with Cassandra databases.
- Fetched Twitter feeds for certain important keywords using the python-twitter library.
- Experienced in Agile Methodologies and SCRUM Process.
- Worked in development of applications especially in the UNIX environment and familiar with all of its commands.
- Collaborated with internal teams to convert end user feedback into meaningful and improved solutions.
- Built all database mapping classes using Django models and Cassandra.
- Used the Pandas API to put the data in time series and tabular format for easy timestamp-based data manipulation and retrieval (see the sketch after this list).
- Designed and developed a data management system using MySQL.
- Built various graphs for business decision-making using the Python Matplotlib library.
- Created Python and Bash tools to increase the efficiency of the call centre application system and operations: data conversion scripts, AMQP/RabbitMQ, REST, JSON, and CRUD scripts for API integration.
- Resolved ongoing problems and accurately documented progress of a project.
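A minimal Pandas sketch of the time-series handling mentioned above; the CSV file, column names, and date window are hypothetical placeholders.

```python
import pandas as pd

# Load tabular data and index it by timestamp for easy time-based slicing.
df = pd.read_csv("call_center_events.csv", parse_dates=["event_time"])
df = df.set_index("event_time").sort_index()

# Example manipulations: select a date window and resample to hourly counts.
window = df.loc["2015-01-01":"2015-01-07"]
hourly_counts = window.resample("H")["call_id"].count()
print(hourly_counts.head())
```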
Environment: Python 2.7, Django 1.4, SciPy, Pandas, Bugzilla, SVN, C++, Java, jQuery, MySQL, Linux, Eclipse, Shell Scripting, HTML5/CSS, Red Hat Linux, Apache, Cassandra.