Scala Developer/Spark Developer Resume
San Antonio, TX
OBJECTIVE:
8+ years of experience in backend development and related services using Scala along with its integrated libraries and frameworks. A hardworking individual with a proven track record of success, actively seeking opportunities in a similar field where I can apply my skills and industry experience. Overall, a positive, "hands-on" performer who enjoys working in diversified teams towards company goals and objectives.
PROFESSIONAL SUMMARY:
- Experienced in software development including design, development, testing and deployment
- Experienced in designing, developing and implementing web-based applications and component-based solutions using Scala and the Play framework.
- Experienced with Scala collections such as lists, sequences, maps, pairs and tuples, as well as Scala generics.
- Experienced in designing, developing and implementing interfacing distributed applications.
- Experienced in the Software Development Life Cycle: requirements specification documents, detailed design documents, test cases and deployment manuals.
- Worked with open source data handling and mapping libraries such as MongoDB (ReactiveMongo), Leaflet, etc.
- Worked with Spark-based applications involving data processing, manipulation and filtering.
- Experience in developing REST APIs for the different methods and processes performed by the data and business layers.
- Experience in writing unit test cases and performance test cases using FunSuite, Gatling and JMeter.
- Good communication, customer-interface and interpersonal skills.
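The collection types mentioned above can be illustrated with a small sketch (the function and data are made up for illustration):

```scala
// Illustrative only: lists, maps, pairs/tuples and generics, with made-up data.
// firstByKey keeps the first value seen for each key, generic over K and V.
def firstByKey[K, V](pairs: List[(K, V)]): Map[K, V] =
  pairs.groupBy(_._1).map { case (k, vs) => k -> vs.head._2 }

val scores: List[(String, Int)] = List(("alice", 1), ("bob", 2), ("alice", 3))
val byKey: Map[String, Int] = firstByKey(scores)  // Map(alice -> 1, bob -> 2)
```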
SOFTWARE SKILLS:
Programming Languages/Technologies: C, C++, HTML5, Java, Scala, Python, SQL, Play Framework, MongoDB, Apache Spark, Gatling, Apache JMeter.
Software/IDEs: MATLAB, AutoCAD, IntelliJ IDEA, Eclipse, Mongo Shell, MongoDB Compass, SolidWorks, ANSYS, Gambit, Fluent.
Operating Systems: Windows, Linux, Mac OS.
PROFESSIONAL EXPERIENCE:
Confidential, San Antonio, TX
Scala Developer/Spark Developer
Responsibilities:
- Extracted data from CSV files for different years and created a parallelised RDD in a local Spark session.
- Converted the raw data from the extracted CSV files into readable, processable content in the RDDs.
- Created a visualisation by colour-coding the temperatures for the different years using a linear interpolation method and extracting the resulting image.
- Generated tiles to produce images at varying zoom levels on the map projection and integrated them with the web application.
- Computed the average temperatures and their deviations from normal for different years from the processed data over a user-defined integer grid, and generated new images for these values.
- Compiled all the processed data and integrated it with a user interface using signals and Leaflet, an open source mapping library.
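The linear-interpolation colour coding described above could be sketched as follows (the `Color` type and reference points are assumptions for illustration, not the original code):

```scala
// Hypothetical sketch: colour-code a temperature by linearly interpolating
// between two reference (temperature, colour) points.
case class Color(r: Int, g: Int, b: Int)

def interpolate(p0: (Double, Color), p1: (Double, Color), t: Double): Color = {
  val ((t0, c0), (t1, c1)) = (p0, p1)
  val f = ((t - t0) / (t1 - t0)).max(0.0).min(1.0)  // clamp fraction to [0, 1]
  def lerp(a: Int, b: Int): Int = math.round(a + (b - a) * f).toInt
  Color(lerp(c0.r, c1.r), lerp(c0.g, c1.g), lerp(c0.b, c1.b))
}
```

Each pixel of the output image would then be this colour for the temperature at that grid point.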
Environment: Scala, Spark, IntelliJ, Eclipse, Leaflet, SQL.
Confidential, Piscataway, NJ
Scala Developer
Responsibilities:
- Implemented Spark RDD transformations for business analysis and applied actions on top of those transformations.
- Stored and retrieved NoSQL data in MongoDB using DAOs.
- Implemented test scripts to support test-driven development and continuous integration.
- Performed data analytics and loaded data into Amazon S3, the data lake and the Spark cluster.
- Developed Spark SQL tables and queries to perform ad hoc data analytics for the analyst team.
- Deployed components using the Scala Build Tool (sbt), Docker images and Hadoop.
- Played an important role in migrating jobs from Spark 0.9 to 1.4 to 1.6.
- Consumed data from Kafka using Apache Spark.
- Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs and Scala.
- Analysed the SQL scripts and designed solutions implemented in Scala.
- Developed analytical components using Scala, Spark and Spark Streaming.
- Collected and aggregated large amounts of log data using Apache Flume and staged the data in HDFS for further analysis.
- Extensively involved in developing RESTful APIs using the JSON library of the Play framework.
- Used the Scala collection framework to store and process complex consumer information.
- Used Scala functional programming concepts to develop business logic.
- Designed and implemented an Apache Spark application on Cloudera.
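Converting a Hive/SQL query into Spark RDD transformations, as described above, typically follows a pattern like this sketch (file paths, field layout and app name are assumptions, not the production job):

```scala
// Illustrative only: a Hive-style aggregation rewritten as RDD transformations.
import org.apache.spark.{SparkConf, SparkContext}

object SalesByRegion {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("sales-by-region"))
    // Equivalent of: SELECT region, SUM(amount) FROM sales GROUP BY region
    sc.textFile("hdfs:///data/sales.csv")
      .map(_.split(","))                         // raw line -> columns
      .map(cols => (cols(0), cols(1).toDouble))  // (region, amount)
      .reduceByKey(_ + _)                        // per-region totals
      .saveAsTextFile("hdfs:///out/sales_by_region")
    sc.stop()
  }
}
```

`map` and `reduceByKey` are lazy transformations; the `saveAsTextFile` action triggers the actual computation.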
Environment: Hadoop, Hadoop YARN, Spark (Core, Streaming, SQL), Scala, Python, Java, Hive, SQL, PL/SQL, MySQL, Teradata, Cloudera Manager, Kafka, Windows
Confidential, Piscataway, NJ
Software Engineer
Responsibilities:
- Involved in the development of an online discourse platform, similar to Coursera, where students and teachers can connect online and set up live sessions as an alternative to the recorded sessions available on other similar websites.
- The main backend was developed using the Play framework, integrated with Akka to set up the live sessions over WebSockets.
- Added a payment API to the website to handle the integrated PayPal payment method.
- Added logic for payment resolution, rolling back transactions if payments got stuck in transit.
- Implemented a signalling server for connecting calls and video calls over WebRTC (real-time communication) to compensate for the fluctuations observed while establishing WebSocket connections.
- Implemented support for the admin team to join any ongoing live class from the admin panel for monitoring and quality assurance.
- Made use of SQL to retrieve user metadata faster where row indexes were available.
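The Akka/WebSocket wiring described above commonly uses Play's `ActorFlow` pattern, roughly as in this sketch (actor and message names are illustrative, not the original code):

```scala
// Illustrative sketch: one actor per WebSocket connection, created via
// Play's ActorFlow; `out` is the actor that writes back to the client.
import akka.actor.{Actor, ActorRef, Props}

object SessionActor {
  def props(out: ActorRef): Props = Props(new SessionActor(out))
}

class SessionActor(out: ActorRef) extends Actor {
  def receive: Receive = {
    case msg: String =>
      out ! s"ack: $msg"  // process the incoming message, reply to the client
  }
  override def postStop(): Unit = {
    // the socket closed: release any per-session resources here
  }
}

// In a Play controller:
//   WebSocket.accept[String, String] { _ =>
//     ActorFlow.actorRef(out => SessionActor.props(out))
//   }
```

The actor's lifecycle tracks the connection: it starts when the socket opens and `postStop` runs when the socket closes.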
Environment: Scala, Play Framework, Akka, MongoDB (ReactiveMongo), Play JSON (Jackson), Google FCM, AWS S3, Gatling, FunSuite.
Confidential, West Lafayette, IN
Scala Developer
Responsibilities:
- Involved in gathering the exact client requirements and their new modifications for the server under development, and designed a basic skeletal plan for the future work structure.
- Developed the backend of a dating app similar to Tinder or Happn, but with more leniency towards female users in the form of like notifications, messaging and privacy. The frontend, developed in native Android, provided the interface to the backend.
- Implemented the business logic for user login and signup using Facebook or a mobile number.
- Implemented proxies for third-party APIs such as movie and song databases.
- Used the Play framework to develop the endpoints (routes) and performed all data processing, including other business logic, using Scala.
- The main data layer and extraction were built using the open source ReactiveMongo driver for MongoDB.
- Made use of WebSockets for real-time messaging through Akka, using Akka actors to read messages, process data and maintain the WebSocket connections.
- Implemented background services to observe user behaviour and create reports from it for the marketing team.
- Implemented a data pipeline for the ML algorithms used by the data science team to cluster users and their behaviours.
- Involved in testing, bug fixing and creating data pipelines for the marketing teams.
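A Play endpoint of the kind described above (a route mapped to a controller action returning JSON) might look like this sketch; the controller, route and JSON shape are assumptions:

```scala
// Illustrative Play controller; the matching line in conf/routes would be:
//   GET  /profiles/:id   controllers.ProfileController.show(id: String)
import javax.inject.Inject
import play.api.libs.json.Json
import play.api.mvc.{AbstractController, ControllerComponents}

class ProfileController @Inject() (cc: ControllerComponents)
    extends AbstractController(cc) {

  def show(id: String) = Action {
    // data-layer lookup omitted; return a JSON body for the endpoint
    Ok(Json.obj("id" -> id, "status" -> "active"))
  }
}
```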
Environment: Scala, Play Framework, Akka, MongoDB (ReactiveMongo), Play JSON (Jackson), Google FCM, AWS S3, Gatling, FunSuite.
Confidential
Scala Developer
Responsibilities:
- Responsible for developing the backend of election management software in which party officials handed out tasks to their subordinates and tracked completion using GPS coordinates.
- Primary role included setting up the endpoints in the routes file after the data was extracted and processed by the data layer.
- Batch insertions of data in the form of voter lists and IDs were done using MongoDB, and access to the extracted data for each role in the hierarchy was set up according to user type.
- Implemented a complex Access Control List (ACL) system giving different sets of data to different types of users such as booth worker, sub-district head, constituency head, district head, state head, admin and super admin.
- Implemented support for PDF and CSV exports of data for review.
- Used Mongo aggregates to extract data for complex graphs plotted on the frontend in the admin panel.
- Gained a good understanding of the different operations (validation, authorization, data filtering, etc.) performed by the different layers of the project in order to expose the correct public endpoints.
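The role-based data access described above can be sketched as a pure filter; the role ranks and the record shape here are assumptions, not the original ACL:

```scala
// Hypothetical sketch of the ACL idea: each role has a rank, and a record is
// visible only to roles of sufficient rank in the right district.
sealed trait Role { def rank: Int }
case object BoothWorker  extends Role { val rank = 0 }
case object DistrictHead extends Role { val rank = 1 }
case object StateHead    extends Role { val rank = 2 }
case object Admin        extends Role { val rank = 3 }

case class Record(district: String, minRank: Int)

def visible(role: Role, district: String, records: Seq[Record]): Seq[Record] =
  records.filter { r =>
    role.rank >= r.minRank &&
      (role.rank >= StateHead.rank || r.district == district)  // state level and above see all districts
  }
```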
Environment: Scala, Play Framework, Akka, MongoDB (ReactiveMongo), Play JSON (Jackson), Google FCM, AWS S3, MySQL, Gatling, FunSuite.