
Software Engineer Resume


Jacksonville, FL

SUMMARY:

  • Overall 11 years of IT experience.
  • 7 years of software development experience in Spark, Scala, Core Java, J2EE, REST and SOAP web services, and related technologies.
  • Experienced in web development using HTML5, CSS3, JavaScript, and Angular 2.
  • Experienced in J2EE for business application development, including JSP, Servlets, JDBC, JavaMail, XML, web services, and JavaScript.
  • Experienced in J2EE frameworks including Spring, Scalatra, and Spring JPA, with extensive experience in their configuration, integration, and implementation.
  • Solid knowledge of SQL and stored procedures; experienced with SQL Server and Oracle databases.
  • Proficient in writing use cases for requirement analysis, design, development, testing, and documentation across the Software Development Life Cycle (SDLC) using Object-Oriented Design (OOD) and Programming (OOP).
  • Experienced with Docker, Jenkins, and AWS.
  • Strong knowledge of Agile methodology (Scrum).
  • Experienced in using Git, SVN, Maven, Eclipse, SQL Developer, and JIRA.
  • Strong verbal and written communication skills and good problem-solving skills. Works well individually or in a team environment and shares responsibility for the team's success or failure. Willing to work non-standard hours to see a project finished.

TECHNICAL SKILLS:

Big Data: Cloudera distribution (Hive, Spark)

J2EE APIs: JSP, JDBC, Web Services, JPA, XML

Programming Languages: Scala, Java, C, C++, SQL

Web Technologies: Angular 2, HTML5, CSS3, TypeScript, JavaScript, JSON

J2EE Frameworks: Spring, Scalatra

XML Technologies: XML, DOM, SAX

IDE Tools: Eclipse

Application/Web Servers: Nginx, Apache Tomcat

Databases: Oracle, SQL Server, MySQL

Methodologies: Agile, Waterfall, SDLC

Operating Systems: Windows, Linux

Source Control Management: SVN, Git

Build Tools: Gradle, Maven, Jenkins, Docker

PROFESSIONAL EXPERIENCE:

Confidential, Jacksonville, FL

Technologies: Scala, Scalatra, SQL, REST web services

Roles and Responsibilities:

  • Worked on the Babbage microservice used to calculate the maximum price of an improvement.
  • Improved performance by migrating costly SQL queries to utility classes.
  • Improved code coverage by writing JUnit test cases.
  • Fixed production bugs. Assisted in code migration by performing code analysis, providing code walkthroughs, and collaborating with the offshore team. Clarified questions related to the service being migrated, authentication, and processes.
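
One plausible shape of the query-to-utility migration above, sketched in Java (all names are hypothetical, not from the actual codebase): a per-key lookup that previously ran a SQL query on every call is wrapped in a memoizing utility class, so repeated calls hit an in-memory cache instead of the database.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Hypothetical utility class: caches the result of an expensive per-key
// lookup (e.g. a SQL query) so it runs at most once per key.
public class LookupCache {
    private final Map<String, Integer> cache = new ConcurrentHashMap<>();
    private final Function<String, Integer> expensiveLookup;

    public LookupCache(Function<String, Integer> expensiveLookup) {
        this.expensiveLookup = expensiveLookup;
    }

    // First call for a key runs the lookup; later calls return the cached value.
    public int get(String key) {
        return cache.computeIfAbsent(key, expensiveLookup);
    }
}
```

This is a sketch of one common pattern, not the actual service code; the real migration may instead have moved SQL logic into plain computation in the utility class.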

Confidential, Jacksonville, FL

Responsibilities:

  • Worked on the highly visible Provider Vista process, which is responsible for closing critical Care Gaps for the enterprise on strict timelines. Owned the entire process: enhancements, bug fixes using Java programming paradigms, optimization of selected code sections, and historical loads. Because multiple teams were involved, coordinated with other teams and with nurses during requirement gathering and validation, and provided necessary clarifications.
  • Built a weekly report for Provider Vista that gives information on the source and target data with load status.
  • Loaded ORU lab data from Sailfish into Hive tables for consumption by the SDS process; all necessary classes and functions were coded in Scala. Used the salting/unsalting technique to optimize some SQL queries.
  • Optimized code to enable changing the job schedule from weekly to daily.
  • Loaded data for different providers and performed the necessary transformations using Scala and Spark SQL.
  • Developed a report to find all rejected records by comparing data before and after the QA validation process, using Scala and SQL.
  • Built a Java module to retrieve files scheduled to run, used to determine whether a received file had been processed.
  • Built a module to compute the incremental data to be sent to downstream processes, using Scala and Spark SQL.
  • Worked on a POC for a Kafka producer publishing generic records with an Avro schema.
  • Worked on compression techniques for large CCDA messages using the Kafka producer.
  • Wrote complex Spark SQL queries for sections such as Immunization and Procedure to retrieve data from parsed CCDA data stored in sparse format.
  • Analyzed duplicate records and fixed bugs in provider details for CCDA. Supported query changes related to provider codes and retry logic for web services.
  • Automated data profiling for parsed CCDA data using Java to build new tables, which helped other projects get results faster.
  • Automated validation that checks schema field names against the data frame, using Java.
  • Made changes to the Data Quality Rules that are applied to CCDA messages.
  • Published rejected messages to Kafka topics using Scala.
  • Retrieved thousands of PDFs that are sent to the ECMS team; optimized the Scala and shell script code that generated the meta XML and PDFs.
  • Resolved count discrepancies between meta XML and PDF files using Java and generated the necessary PDFs on strict timelines.
  • Optimized shell script code to pull and push data to HDFS. The initial code made the process run slowly and affected other processes running in the cluster; after the changes, the job ran much faster.
  • Developed a process to load historical data received from the ECMS team using shell scripting and Spark SQL.
  • Analyzed the existing Java code used to send messages via Kafka, including its serialization/deserialization dependencies and how to access it from our code.
  • Developed a Kafka consumer in Scala to consume and deserialize data into specific records.
  • Parsed deserialized data for 6 schemas and pushed it to HDFS using Scala generics.
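
The salting/unsalting idea mentioned above can be sketched outside Spark. In this minimal Java form (names are illustrative), a skewed "hot" key gets a random suffix so its rows hash to several partitions instead of one, and the suffix is stripped after the per-bucket aggregation:

```java
import java.util.Random;

// Minimal sketch of salting/unsalting a skewed key. In the real pipeline this
// would be applied to Spark SQL join/group keys; here only the key transform
// itself is shown. Bucket count and key names are assumptions.
public class SaltDemo {
    static final int SALT_BUCKETS = 8;
    static final Random RAND = new Random(42);

    // Salt: append a random bucket suffix so the key spreads across partitions.
    static String salt(String key) {
        return key + "_" + RAND.nextInt(SALT_BUCKETS);
    }

    // Unsalt: strip the suffix to recover the original key after aggregation.
    static String unsalt(String saltedKey) {
        return saltedKey.substring(0, saltedKey.lastIndexOf('_'));
    }

    public static void main(String[] args) {
        String salted = salt("provider_123");  // e.g. "provider_123_0" .. "provider_123_7"
        System.out.println(unsalt(salted));    // prints provider_123
    }
}
```

In a Spark join, the small side of the join would additionally be exploded across all bucket values so every salted key still finds its match.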

Confidential

Responsibilities:

  • Designed web pages for monitoring tech prod support for tenant parameters and EIP Workflow.
  • Wrote code to display details for tenant configurations, with functionality to edit and delete the contents.
  • Built functionality that allows a user to enter details to place an app in the queue if it was not placed before.
  • Generated different kinds of documents for the business, such as the rate grid proposal.
  • Enhanced document generation and fixed and debugged related issues.

Confidential, Gainesville, FL

Technologies: Java, REST, Spring MVC, Spring JPA, Spring Boot, Apache Kafka, UML

Responsibilities:

  • Wrote queries in Spring JPA to access a SQL Server 2008 database.
  • Implemented REST web services using Spring Boot. Used JUnit to test the web service APIs.
  • Implemented an Apache Kafka consumer to receive requests in JSON format.
  • Used embedded Kafka to unit test the Kafka consumer; also used a console-based producer to test the consumer.
  • Wrote Angular 2 components to display data, and used HTML5 and CSS3 to design the front end, such as bid period details based on the selected bid period. The components displayed the data sorted in ascending or descending order.
  • Containerized the application using Docker. Wrote the Jenkins job to execute the build script that builds the Angular 2 and Spring Boot code. Served the Angular 2 code from an nginx server within a Docker container.
  • Used two AWS Linux instances. One instance ran a Jenkins job that pulls the latest code from GitHub, builds the Angular 2 and Spring Boot code, creates the images, and pushes them to Docker Hub. Used the Publish over SSH plugin to send the Docker Compose file to the other instance, and wrote build script commands that pull and run the images on the second instance.
  • Used port forwarding in Windows to run Docker containers.
  • Deployed an in-house project using MongoDB on Docker.
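
Assuming the two-container layout described above, the Docker Compose file sent to the second instance might look roughly like this (service and image names are illustrative, not the actual ones):

```yaml
# Hypothetical docker-compose.yml: an nginx container serving the built
# Angular bundle, and a Spring Boot API container, both pulled from Docker Hub.
version: "3"
services:
  web:
    image: myrepo/angular-frontend:latest   # illustrative image name
    ports:
      - "80:80"
    depends_on:
      - api
  api:
    image: myrepo/springboot-api:latest     # illustrative image name
    ports:
      - "8080:8080"
```

With a file like this on the second instance, `docker-compose up -d` pulls the pushed images and starts both containers, which matches the pull-and-run build script commands described above.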

Confidential, Gainesville, FL

Technologies: Java, Spring Integration, XML, XSL, Apache CXF, Jenkins

Responsibilities:

  • Collaborated on an agile team to develop, test, and deliver the assigned stories with the utmost quality.
  • Transformed, validated, and processed incoming SOAP messages for partner requests across various parts of the reservation process, including pricing, booking, and modification, using Apache CXF, XSL, and Spring Integration.
  • Wrote Spring Integration flows that pass messages through components such as chains, channels, routers, gateways, and service activators. Service activators invoke validators to validate the response received. Composite and service layer calls are made as part of the chain; the service layers call the external Amadeus web service to get the response.
  • Worked on merger XSLs to merge the request sent with the response received, in order to maintain all relevant data throughout the flow.
  • Wrote validators in Java using XPath to check for conditions in the flow.
  • Followed TDD, writing unit tests with JUnit and Mockito. To unit test an XSL, used it to transform the input XML and compared the result against the expected XML.
  • Fixed critical bugs in the code by coordinating closely with the QA team.
  • Wrote a distributor service to migrate code from the legacy platform to the new platform; wrote routers to invoke the new or legacy code based on release dates.
  • Wrote a message enricher that computes the Ticket Time Limit for partners and adds it to the request in the flow.
  • Wrote code to identify whether an itinerary was international by invoking an external REST service.
  • Migrated the synchronous booking process to an asynchronous JMS queue.
  • Confirmed that Jenkins jobs passed as part of continuous integration.
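
The XSL unit-testing approach described above (transform the input XML, compare against the expected XML) can be sketched with only the JDK's built-in JAXP transformer. The stylesheet and input here are invented examples, not the actual reservation XSLs:

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.StringReader;
import java.io.StringWriter;

public class XslTestSketch {
    // Apply a stylesheet to an input document and return the result as a
    // string, so a unit test can compare it with the expected XML.
    static String transform(String xsl, String xml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(xsl)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Toy stylesheet: wrap the booking id in a <request> element.
        String xsl =
            "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
          + "<xsl:output method='xml' omit-xml-declaration='yes'/>"
          + "<xsl:template match='/booking'><request><xsl:value-of select='id'/></request></xsl:template>"
          + "</xsl:stylesheet>";
        String xml = "<booking><id>42</id></booking>";
        System.out.println(transform(xsl, xml)); // prints <request>42</request>
    }
}
```

In a JUnit test the returned string would simply be asserted against the expected XML, which is the comparison step the bullet above describes.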

Confidential

Technologies: Oracle SQL, Java OLAP API, SQL Server 2008. Tools: Universe Design Tool, Information Design Tool

Responsibilities:

  • Developed the required fact tables, dimension tables, and business objects using SQL to build the backend.
  • Developed reports for divisions such as Plastics, IPD, and PPD.
  • Worked with the OLAP Java API to access data from OLAP; the OLAP Java API allows Java applications to access data that resides in an Oracle data warehouse. Worked on a Spring REST back end.
  • Analyzed requirements and made changes to the data model (star schema).

Confidential

Responsibilities:

  • Supervised Research: Worked on a NoSQL data store, Oracle Berkeley DB, with a web interface for updating, inserting, and deleting the stored metadata.
  • Applied Cryptography and Computer Security: Developed a browser extension using an algorithm to protect against phishing attacks.
  • Data Intensive Computing: Built a client-server system consuming FAO web services to find the minimum-distance capital city using the Google Maps API, and all possible neighboring cities for a place; deployed the project on the cloud (GAE). Implemented a word co-occurrence matrix using both the pairs and stripes approaches in Hadoop on VMware.
  • Large-Scale Distributed Systems: Displayed weather forecasts for cities using Java RMI and web services, with Google Maps to display the weather details. Implemented Dijkstra's shortest path algorithm using the MapReduce programming model on VMware.
  • Search Engine: Developed a search engine in C++ to search unformatted text.
  • Computer Vision and Image Processing: Worked with RGB images and grayscale transformations to find hidden text. Implemented the chamfer algorithm to find the geodesic distance between two pixels in an image.
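
As a rough illustration of the pairs approach to the word co-occurrence matrix, here is an in-memory Java sketch in which a HashMap stands in for MapReduce's shuffle-and-sum stage (the Hadoop version would emit ((w1, w2), 1) pairs from the mapper instead). Treating each whole line as the co-occurrence window is a simplifying assumption:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Pairs approach, in memory: count every ordered pair of distinct words
// appearing in the same line. The HashMap plays the role of the reducer.
public class CooccurrencePairs {
    static Map<String, Integer> cooccurrences(List<String> lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            String[] words = line.toLowerCase().split("\\s+");
            for (int i = 0; i < words.length; i++)
                for (int j = 0; j < words.length; j++)
                    if (i != j)
                        counts.merge(words[i] + "," + words[j], 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> m = cooccurrences(List.of("big data big deal"));
        System.out.println(m.get("big,data")); // prints 2: both "big" tokens pair with "data"
    }
}
```

The stripes variant would instead accumulate, for each word, a map from neighbor word to count, trading more memory per key for fewer emitted pairs.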

Software Engineer

Confidential

Responsibilities:

  • When a customer submits a JSP form with all details, the information is checked against the database to verify that it is valid; if there are any new records, the corresponding row in the database is updated.
  • When a service order is created for a particular component, the database is checked to see whether enough materials are present to repair it. If not, the database is checked for substitutes; if the available quantity of a substitute is greater than required, then the status of the service order is updated accordingly.
  • Worked on Web Dynpro web services to send email messages.
  • Worked on SAP data transfer technologies including LSMW, BDC, and Business Application Programming Interface (BAPI), and on enhancements in modules such as CS, MM, and SD.
  • Developed code to validate customer details in a BAdI. The data is then placed in an IDoc.
  • Used ALV tree reporting to display header and item data from EKKO and EKPO.
