Big Data, Java Developer/Architect Resume

WA

SUMMARY:

  • Over 14 years of experience in architecting, designing, prototyping, coding, testing and deploying Java/JEE-based web applications, web services and Big Data applications. Worked in business domains including Finance, Publishing, Telecom, Aviation and Unemployment Insurance.
  • 3+ years of work experience in Big Data, with hands-on experience in Hadoop ecosystem components such as MapReduce, HBase, Sqoop, Oozie, Spark, Phoenix, Hive, Avro, Zeppelin, Ambari, Beeline, NiFi and Kafka.
  • Worked on cloud native platform technologies such as Pivotal Cloud Foundry, AWS SDK for S3 and Docker.
  • Hands-on experience using databases such as Oracle with PL/SQL, MySQL, MongoDB and MS SQL Server.
  • Have experience developing applications using frameworks like Spring, Spring MVC, Struts 1, Struts 2, Spring Security, Spring Boot, and Hibernate.
  • Skilled in implementing SOAP and RESTful web services.
  • Basic working knowledge of Python and Scala.
  • Well versed in the entire development life cycle and in facilitating agile practices, including automated builds and continuous integration/deployment using Maven and Jenkins.
  • Good understanding of Hadoop architecture and its various components, such as HDFS, YARN, Job Tracker, Task Tracker, Name Node, Data Node and MapReduce concepts.
  • Profound knowledge and understanding of UML diagrams, using tools such as MS Visio and StarUML.
  • Good knowledge of OOP, OOA&D, OSGi, SOA, JMS, OWASP and microservices.
  • Experience working with source code security analysis tools such as Veracode and SonarQube, and rectifying the vulnerabilities they report in application source code.
  • Evaluated, troubleshot and monitored the performance of web applications; performed tuning and program optimization to achieve maximum performance.
  • Ability to work independently with a high degree of personal initiative.
  • Highly motivated, with excellent analytical and communication skills and an ability and enthusiasm to learn new systems and technologies. Worked with end users to formulate and document business requirements. Experience working closely with customers.
  • Hold an H1B work permit valid through 2020.

TECHNICAL SKILLS:

Programming: Java

Big Data Ecosystem: HBase, Oozie, MapReduce, HDFS, Spark, Sqoop, Phoenix, Solr, YARN, Zookeeper, Knox, Ambari, Hive, Beeline, Kafka, Avro, Zeppelin

Web Services Frameworks: CXF, RESTEasy, Jersey, Axis2, WS-Security, Membrane Service Proxy

Cloud Platforms: Amazon EC2, Dell EMC S3, PCF 2.0, Docker 2.0

Web Technologies: HTML5, CSS3, jQuery, Ajax, JavaScript, JSON

Web/Application Server: Apache Tomcat 7, JBoss 7, WebSphere Application Server 8.5.5.9, WebLogic Server 11gR1 (10.3.6), JBoss WildFly

JEE Technologies: EJB, JDBC, Servlet, JSP, JSTL, Apache Active MQ, Mule 3

Web Frameworks: Struts 1.2, Struts 2, Spring, Spring Boot, AngularJS

Methodologies: OOD, TDD, Agile, Scrum, Lean

ORM: Hibernate, JPA

Operating Systems: Windows, Linux (Red Hat, Ubuntu & CentOS)

Databases: Oracle 11g, MySQL, MS SQL Server, SQLite, PL/SQL, MongoDB

Testing Frameworks: JUnit, Mockito, Selenium

IDE: Eclipse, IntelliJ IDEA

Build/CI/CD Tools: Ant, Maven, Jenkins

Version Control Tools: VSS, SVN, Perforce, GitHub, Team Foundation Server 2015, GitLab

Modelling Tools: MS Visio, Enterprise Architect, Magic Draw, StarUML

Other Tools: JVisualVM, LDAP, jBPM, Rally, JIRA, Bugzilla, JMeter, JProfiler, SOAP Sonar, SOAP UI, PMsmart, MS Project Plan, SonarQube, Veracode, Tableau

PROFESSIONAL EXPERIENCE:

Confidential, WA

Big Data, Java Developer/Architect

Responsibilities:

  • Implemented Oozie workflows for ingestion of the Confidential data into HBase and HDFS.
  • Implemented a Sqoop job to import data from the external MySQL database into Hive.
  • Implemented a module to ingest JSON files into a Hive table, then mapped and loaded the Hive table into the corresponding HBase table.
  • Implemented an AmazonS3Client to access ECS object storage using the AWS S3 SDK (see the client sketch after this list).
  • Developed multiple MapReduce jobs in Java for Confidential JSON data processing.
  • Wrote Selenium IDE scripts for automated testing of the Confidential web application.
  • Deployed the Confidential application to Pivotal Cloud Foundry.
  • Participated in multiple big data POCs to evaluate different architectures, big data analytics tools and vendor products, and designed technical solutions based on the client's needs and system architecture.
  • Debugged and resolved various technical problems on the Development, QA and Production Confidential Hadoop environments by collaborating with Hadoop admins and Hortonworks’ team.
  • Worked on POCs using Spark and Kafka.
  • Responsible for maintaining the code quality using SonarQube and Veracode scans and fixing the code issues reported by these tools before every release to the customer.
  • Implemented Diagnostic utility to test the health of the Hadoop environment.
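
A minimal sketch of the kind of S3 access described above, built with the AWS SDK for Java v1 against an ECS S3-compatible endpoint; the endpoint URL, region, bucket name, credentials and file path are illustrative placeholders, not values from the project.

```java
import java.io.File;

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class EcsS3ClientSketch {

    public static void main(String[] args) {
        // Placeholder endpoint and credentials for an ECS S3-compatible store.
        String endpoint = "https://ecs.example.com:9021";
        BasicAWSCredentials credentials = new BasicAWSCredentials("accessKey", "secretKey");

        // Build a standard AmazonS3 client, pointing it at the ECS endpoint
        // instead of AWS; path-style access is typically needed for ECS.
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withEndpointConfiguration(
                        new AwsClientBuilder.EndpointConfiguration(endpoint, "us-east-1"))
                .withCredentials(new AWSStaticCredentialsProvider(credentials))
                .withPathStyleAccessEnabled(true)
                .build();

        // Upload a local file and read back the object listing.
        s3.putObject("ingest-bucket", "raw/data.json", new File("/tmp/data.json"));
        s3.listObjects("ingest-bucket").getObjectSummaries()
                .forEach(o -> System.out.println(o.getKey() + " : " + o.getSize()));
    }
}
```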

Environment: Java, Spring, JBoss WildFly, Red Hat Linux, Oozie, HBase, Phoenix, WebHDFS, PCF, Hive, Kafka, Sqoop, Team Foundation Server (TFS), GitLab, Maven, MongoDB, MSSQL, AngularJS, Jenkins, Hibernate.

Confidential, Colorado, Arizona, North Dakota

Java Developer/Lead

Responsibilities:

  • Optimized performance by refactoring the code of the wage upload functionality in the ESS module.
  • Implemented Drools to externalize the application's rules, addressing business logic that changed frequently with government tax rule changes.
  • Implemented a Mule ESB flow using VM connectors with 100 in-memory asynchronous queues for faster processing of the wage upload functionality.
  • Designed and implemented RESTful web service clients and servers for the various registration and common functionalities in the product, for use by its modules (see the JAX-RS sketch after this list).
  • Wrote code following three-tier/three-layer application standards (presentation, business and database layers) for a new rating module in the Tax part of the application.
  • Played a pivotal role in bug fixing, writing JUnit test cases for unit testing, and identifying potential issues during code reviews.
  • Responsible for release management on the QA and staging servers.
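
A hedged sketch of the registration-style REST endpoints described above, written as a plain JAX-RS resource (the environment lists RESTEasy as the runtime); the resource path, DTO fields and responses are illustrative placeholders rather than the product's actual API.

```java
import javax.ws.rs.Consumes;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

// Hypothetical resource exposing common registration functionality to other modules.
@Path("/registrations")
@Produces(MediaType.APPLICATION_JSON)
@Consumes(MediaType.APPLICATION_JSON)
public class RegistrationResource {

    // Simple DTO; the real payload would mirror the product's registration model.
    public static class Registration {
        public String id;
        public String employerName;
    }

    @GET
    @Path("/{id}")
    public Registration get(@PathParam("id") String id) {
        Registration r = new Registration();
        r.id = id;
        r.employerName = "Example Employer"; // placeholder lookup
        return r;
    }

    @POST
    public Response create(Registration registration) {
        // Persistence is out of scope here; return 201 with the created entity.
        return Response.status(Response.Status.CREATED).entity(registration).build();
    }
}
```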

Environment: Java, Mule, Drools, JBoss, Oracle, Hibernate, Spring MVC, ActiveMQ, Maven, RESTEasy, Red Hat Linux, SVN.

Confidential

Java Developer

Responsibilities:

  • Fixed defects in various application features and participated in prototype solution and UI design development.
  • Involved in understanding business requirements and in the implementation and support of the application.
  • Implemented the instant messaging history module, which stores chat history on the device so it can be reviewed later; history records can be filtered by the user for easy lookup (see the sketch after this list).
  • Implemented a POC for the integration of Confidential with GDrive (Google Drive).
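
A rough sketch, assuming the Smack 3.x API listed in the environment, of how incoming messages could be captured for a local history store; the host, credentials and the HistoryStore interface are assumptions for illustration (the actual module stored history on the device, with Web SQL appearing in the environment list).

```java
import org.jivesoftware.smack.Chat;
import org.jivesoftware.smack.ChatManagerListener;
import org.jivesoftware.smack.ConnectionConfiguration;
import org.jivesoftware.smack.MessageListener;
import org.jivesoftware.smack.XMPPConnection;
import org.jivesoftware.smack.XMPPException;
import org.jivesoftware.smack.packet.Message;

public class ChatHistorySketch {

    // Placeholder for whatever local store backs the history.
    interface HistoryStore {
        void save(String fromJid, String body, long timestampMillis);
    }

    public static void main(String[] args) throws XMPPException {
        // Placeholder Ejabberd host and credentials.
        XMPPConnection connection =
                new XMPPConnection(new ConnectionConfiguration("xmpp.example.com", 5222));
        connection.connect();
        connection.login("alice", "secret");

        HistoryStore store = (from, body, ts) ->
                System.out.println(ts + " " + from + ": " + body); // stand-in persistence

        // Record every incoming message so it can be reviewed (and filtered) later.
        connection.getChatManager().addChatListener(new ChatManagerListener() {
            @Override
            public void chatCreated(Chat chat, boolean createdLocally) {
                chat.addMessageListener(new MessageListener() {
                    @Override
                    public void processMessage(Chat c, Message message) {
                        if (message.getBody() != null) {
                            store.save(message.getFrom(), message.getBody(),
                                    System.currentTimeMillis());
                        }
                    }
                });
            }
        });
    }
}
```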

Environment: HTML5, CSS3, jQuery Mobile, Google Drive, XMPP, Smack API, Web SQL, Ejabberd, PhoneGap, Android, SVN.

Confidential

Java Developer/Lead

Responsibilities:

  • Designed and implemented an enterprise application for the management of Confidential.
  • Analysed, designed and implemented a SOAP-based web service server and client based on the ONVIF device management specification.
  • Analysed, designed and implemented a RESTful web service to provide user/agent-related information to another application written in C.
  • Designed and implemented various modules, including auto-registration of cameras, fault management, camera health monitoring and diagnostic management, camera management, and user management.
  • Implemented the reporting module in the system.
  • Developed an extensible Java framework based on Selenium to test the web application (see the WebDriver sketch after this list).
  • Played an instrumental role in code reviews, unit/module testing and automation testing.
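
A minimal example of the kind of WebDriver test such a Selenium-based framework would wrap; the URL, element locators and browser choice are placeholders.

```java
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class CameraLoginTestSketch {

    private WebDriver driver;

    @Before
    public void setUp() {
        // Any WebDriver implementation would do; Firefox is just an example.
        driver = new FirefoxDriver();
    }

    @Test
    public void loginShowsDashboardForValidCredentials() {
        driver.get("http://vms.example.com/login");            // placeholder URL
        driver.findElement(By.id("username")).sendKeys("admin");
        driver.findElement(By.id("password")).sendKeys("secret");
        driver.findElement(By.id("loginButton")).click();
        Assert.assertTrue(driver.getTitle().contains("Dashboard"));
    }

    @After
    public void tearDown() {
        driver.quit();
    }
}
```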

Environment: Spring, Hibernate, Struts 2, Maven, CXF, JSP, JFreeChart, Selenium, AJAX, MySQL 5.1, Tomcat 7, Linux

Confidential

Java Developer/Lead

Responsibilities:

  • Led the team through SDLC iterations and mentored fellow developers on various aspects of application architecture and development.
  • Designed and developed an OSGi-based framework for provisioning and dynamic configuration updates (see the ManagedService sketch after this list).
  • Drove requirements-gathering sessions with the client as needed.
  • Developed a prototype model, based on the Confidential node standard, built around ActiveMQ.
  • Implemented the node health monitoring web service application.
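
A hedged sketch of how dynamic configuration updates can be received in an OSGi-based framework like the one above, using the standard Configuration Admin ManagedService contract; the service PID and property names are illustrative assumptions.

```java
import java.util.Dictionary;
import java.util.Hashtable;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.Constants;
import org.osgi.service.cm.ConfigurationException;
import org.osgi.service.cm.ManagedService;

public class ProvisioningActivator implements BundleActivator, ManagedService {

    @Override
    public void start(BundleContext context) {
        // Register under a PID so Configuration Admin pushes config changes to us.
        Dictionary<String, Object> props = new Hashtable<>();
        props.put(Constants.SERVICE_PID, "com.example.provisioning"); // illustrative PID
        context.registerService(ManagedService.class.getName(), this, props);
    }

    @Override
    public void stop(BundleContext context) {
        // Service registrations are cleaned up automatically when the bundle stops.
    }

    @Override
    public void updated(Dictionary<String, ?> properties) throws ConfigurationException {
        if (properties == null) {
            return; // no configuration bound yet
        }
        // React to a changed setting without restarting the bundle.
        Object interval = properties.get("poll.interval");
        System.out.println("Reconfigured poll.interval = " + interval);
    }
}
```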

Environment: Spring, Hibernate, Maven, Pax, Spring DM, Apache Karaf, JSP, ActiveMQ, MySQL 5.0, Tomcat 5, Linux

Confidential

Java Developer

Responsibilities:

  • Improved user satisfaction and adoption rates by analysing, designing and implementing a SOAP web service based on the SUSHI standard WSDL provided by NISO for automated report downloading. This web service enabled the platform's integration with various SUSHI partners.
  • Implemented a web service client for the SUSHI web service to download reports for testing (see the CXF client sketch after this list).
  • Developed server-side code for various functionalities used by publishers and libraries in the application.
  • Responsible for the development and maintenance of the report-generation part of the project.
  • Involved in production bug fixing and code review.
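
A hedged sketch of a CXF-based test client along the lines described above; SushiService stands in for the service endpoint interface that would normally be generated from the NISO SUSHI WSDL, so the interface, operation and endpoint URL are placeholders.

```java
import javax.jws.WebService;

import org.apache.cxf.jaxws.JaxWsProxyFactoryBean;

public class SushiClientSketch {

    // Placeholder for the SEI that wsdl2java would generate from the SUSHI WSDL.
    @WebService
    public interface SushiService {
        String getReport(String requestorId, String customerId, String reportName);
    }

    public static void main(String[] args) {
        // Point a CXF JAX-WS proxy at the report-harvesting endpoint (placeholder URL).
        JaxWsProxyFactoryBean factory = new JaxWsProxyFactoryBean();
        factory.setServiceClass(SushiService.class);
        factory.setAddress("https://sushi.example.com/services/SushiService");
        SushiService client = (SushiService) factory.create();

        // Request a report for a test customer and print the raw response.
        String report = client.getReport("tester", "customer-42", "JR1");
        System.out.println(report);
    }
}
```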

Environment: Spring, Hibernate, Maven, JSP, Ajax, CXF, Apache server, JasperReports, Struts 2, MySQL, Tomcat 5, Linux

Confidential

Java Developer

Responsibilities:

  • Troubleshot and fixed production bugs and implemented feature enhancements for the product.
  • Designed and implemented the online lease insertion interface for the customer asset management module.
  • Implemented an online reporting tool with various filter options and search facilities to narrow results (see the JDBC sketch after this list).
  • Responsible for the build and deployment of the application on the staging and production servers.
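
A small sketch of how a filtered report query could be issued over JDBC, in the spirit of the reporting tool above; the table, columns, connection URL and credentials are assumptions for illustration.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class LeaseReportSketch {

    public static void main(String[] args) throws SQLException {
        // Placeholder JDBC URL and credentials for the SQL Server database.
        String url = "jdbc:sqlserver://db.example.com;databaseName=assets";

        // Bind user-supplied filters as parameters rather than concatenating SQL.
        String sql = "SELECT lease_id, customer_name, start_date "
                   + "FROM leases WHERE status = ? AND start_date >= ? "
                   + "ORDER BY start_date";

        try (Connection conn = DriverManager.getConnection(url, "reportUser", "secret");
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, "ACTIVE");
            ps.setDate(2, java.sql.Date.valueOf("2003-01-01"));
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("%s | %s | %s%n",
                            rs.getString("lease_id"),
                            rs.getString("customer_name"),
                            rs.getDate("start_date"));
                }
            }
        }
    }
}
```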

Environment: Java, JSP, Apache server, Servlets, JDBC, SQL Server 2000, JRun
