
Sr. Java Developer / Sr. Data Analyst Resume


Bellevue, WA

SUMMARY

  • 9+ years of experience in the web/IT industry using Java and Big Data, mainly in design, development, and operations.
  • 4 years of experience in Big Data technologies such as Apache Hadoop, Spark, and Hive (the Hadoop ecosystem).
  • 2 years of experience in Python.
  • 3+ years of experience in SOA.
  • 2+ years of experience in Spring Boot and microservices.
  • 2+ years of experience in AWS (EC2, S3, DynamoDB, Lambda, CloudWatch).
  • 8+ years of experience in Java/J2EE.
  • 4+ years of experience in Spring and REST-based web services.
  • 3+ years of experience in data analytics using Splunk, AppDynamics, and Tableau.
  • 3+ years of experience in front-end UI development using HTML5, CSS3, Bootstrap, JSON, JSP, and AngularJS.
  • 2 years in DevOps (Chef and Puppet).
  • 1 year in SharePoint development.
  • Experience with the Model-View-Controller (MVC) pattern in developing Single-Page Applications (SPAs) using AngularJS.
  • Expert in Git and SVN.
  • Strong experience in web development using JEE, Struts 1.x and 2.x, Spring, Hibernate, Servlets, JDBC, JSP, XML/XSL/XSLT, HTML, DHTML, EJB, JavaScript, JSTL, AJAX, IBM WebSphere, BEA WebLogic Application Server, JBoss Application Server, and Tomcat.
  • Experience in installing, configuring, supporting, and monitoring Hadoop clusters using Apache and Cloudera distributions and AWS.
  • Excellent understanding of Hadoop architecture and components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, NodeManager, and the MapReduce programming paradigm.
  • Experience in managing and reviewing Hadoop log files.
  • Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
  • Experience in building and managing Hadoop clusters in Amazon Web Services using EMR on EC2 instances with load balancing.
  • In-depth understanding of the Lucene Java libraries for indexing and fast retrieval of data.
  • Proficient in Agile methodology with tools such as Serena and Jira.
  • Extensive experience with Hadoop ecosystem components such as MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, NoSQL, Falcon, Pig, Storm, Kafka, Accumulo, and Lucene.
  • In-depth understanding of Hive scripting, including writing UDFs and custom input/output formats.
  • Proficient in the SDLC and OOP concepts.
  • Experienced in designing and executing test cases based on business requirements and functional specifications.
  • Extensively involved in manual application testing.

TECHNICAL SKILLS

Languages: Java, C, SQL, PL/SQL, XML, XHTML, HTML, CSS, JavaScript, AngularJS

Java Technologies: J2EE, JDBC, Servlets, JSP

Big Data Technologies: Hadoop, MapReduce, Pig, Hive, Spark

Frameworks: Struts, Hibernate, Spring

Development Tools: Eclipse, WebSphere, WebLogic, JBoss, ANT 1.7

Design and Modeling: UML, Rational Rose

Web Services: SOAP, REST, WSDL, UDDI

Databases: Oracle 10g/9i/8i, SQL Server, MS Access

Scripting Languages: JavaScript, AngularJS

XML Technologies: DTD, XSD, XML, XSL, XSLT, SAX, DOM, JAXP

Environments: UNIX, Red Hat Linux, Windows 2000, Windows XP

Methodologies/Processes: Agile, Waterfall

Visualization Tools: Tableau 9 and 10.0

Monitoring Tools: Splunk, Nagios

PROFESSIONAL EXPERIENCE

Confidential, Bellevue, WA

Sr. Java Developer / Sr. Data Analyst

Responsibilities:

  • Reviewed the BRDs and collaborated on the Rebellion wireframe design.
  • Expertise with the Spring Framework, using components such as MVC, Transactions, ORM, and JDBC.
  • Designed and implemented business logic with the Spring 4 framework, using IoC to isolate business logic from data presentation and AOP for cross-cutting concerns, and integrated Spring MVC to define controllers, action mappings, and services.
  • Used Redis as a caching mechanism.
  • Developed consumer-based features and applications using Python.
  • Used Docker to create images that are deployed on AWS as microservices.
  • Developed a system of microservices to replace a legacy, monolithic application, using Java, Spring Boot, and Cloud Foundry.
  • Developed the core service as a Maven JAR that other microservices include as a Maven dependency.
  • Created server instances on AWS; installed Tomcat and Swagger for deploying microservices.
  • Developed the service layer using Spring MVC and Spring Boot.
  • Created Docker images for SOA projects developed as microservices.
  • Implemented a microservices architecture using Spring Boot to break the application into smaller, independent services.
  • Worked with the architecture team on the new Gen 3 SOA, web, and ESB architecture for a large e-commerce site.
  • Implemented logging using Log4j and Spring AOP.
  • Secured our RESTful web services using OAuth 2.0 with Spring Security to authenticate incoming third-party requests.
  • Developed REST-based services as a communication channel between application modules.
  • Developed and exposed JAX-RS REST web services for customer profiles (getCIDetails), reviewCustomerNotification, updateOrders, and billing.
  • Configured and built the Spring MVC application on the Tomcat web server.
  • Worked on the REMO app to enhance stability and performance.
  • Managed the integration of Splunk for logging REST API transactions.
  • Created dashboards and various other visualizations using Splunk, AppDynamics, and Tableau.
  • Created alerts in Splunk and AppDynamics based on thresholds set by stakeholders and operations.
  • Extensive knowledge of the ExtraHop platform, Splunk, and AppDynamics.
  • Performed advanced procedures such as text analytics and processing using the in-memory computing capabilities of Spark with Scala.
  • Designed and deployed AWS solutions using EC2, S3, EBS, CloudWatch, Elastic Load Balancer (ELB), and Auto Scaling groups.
  • Created S3 buckets and managed their policies; utilized S3 and Glacier for archival storage and backup on AWS.
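The alerting described above was configured in Splunk and AppDynamics; as a minimal stdlib Python sketch (illustrative only, with a hypothetical `breaches` helper, not the actual alert configuration), the threshold logic behind such an alert looks like this:

```python
# Illustrative sketch: fire an alert when a metric exceeds a
# stakeholder-set threshold for several consecutive samples,
# mirroring threshold-based alerting in Splunk/AppDynamics.

def breaches(samples, threshold, min_consecutive=3):
    """Return True if `samples` exceed `threshold` for at least
    `min_consecutive` consecutive readings."""
    run = 0
    for value in samples:
        run = run + 1 if value > threshold else 0
        if run >= min_consecutive:
            return True
    return False
```

A single spike does not trigger the alert; only a sustained breach does, which cuts noise for the operations team.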

Confidential

Sr. Java/Hadoop Developer

Responsibilities:

  • Reviewed the business requirements and user stories.
  • Maintained high-quality RESTful services; implemented REST web services using Spring MVC and JAX-RS.
  • Generated REST (JAX-RS) based web services using a service-oriented architecture.
  • Queried the database using SQL to analyze, validate, and test application data.
  • Worked closely with application teams to create new Splunk dashboards for operations teams.
  • Used Splunk to access production logs and analyzed them to identify trends in known issues.
  • Extensively used analytical features in Tableau, such as statistical functions and calculations, trend lines, and forecasting, in many Tableau reports.
  • Extensive experience with cloud technologies, including Amazon Web Services (AWS) and Pivotal Cloud Foundry (PCF).
  • Designed and implemented a series of SOA-compliant web services on the JBoss and WebLogic platforms.
  • Handled service modeling, WSDL/XML data models, and data mapping in support of service specifications and realizations using the UML service profile for SOA.
  • Developed and implemented architectural solutions involving multiple Pivotal Cloud Foundry (PCF) foundations on VMware virtual infrastructure (on-premises).
  • Implemented microservices on the Pivotal Cloud Foundry platform built upon Spring Boot services.
  • Created a load balancer on AWS EC2 for the unstable cluster.
  • Performed a test run of the Sqoop tool to pull data from various databases and verify that the functionality worked as expected.
  • Created Nagios shell scripts for cluster monitoring and various other alerts.
  • Worked with the Spark ecosystem, using Spark SQL and Scala queries on different data formats such as text and CSV files.
  • Implemented Spark jobs using Scala and Spark SQL for faster testing and processing of data.
  • Developed the UI using AngularJS, CSS, and HTML.
  • Developed the organization's website and custom web applications using jQuery, JavaScript, HTML, CSS, XML, and AJAX.
  • Implemented a POC with Hadoop; extracted data with Spark into HDFS.
  • Hands-on experience writing Scala code.
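The Spark SQL work above queried text and CSV data at scale; purely as a local illustration (stdlib Python with made-up sample data, not the actual Scala/Spark code), the same filter-and-aggregate shape of query can be sketched as:

```python
import csv
import io

# Hypothetical sample data standing in for a CSV file on HDFS.
RAW = """user,region,amount
a,west,10
b,east,25
c,west,5
"""

def total_by_region(raw_csv, region):
    """Sum the `amount` column for rows matching `region` --
    the local analogue of: SELECT SUM(amount) ... WHERE region = ?"""
    reader = csv.DictReader(io.StringIO(raw_csv))
    return sum(int(row["amount"]) for row in reader if row["region"] == region)
```

In Spark the same query runs distributed over HDFS partitions; the per-row filter and aggregation logic is unchanged.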

Environment: Core Java, Hadoop, Accumulo, Hive, Pig, Spark, Scala, AWS (EC2), Maven 2.1.1, Servlets, HTML, CSS, AngularJS, JSON, Jetty, HDFS, Sqoop, Shell Scripting, Ubuntu, Red Hat Linux, HBase, Oozie, Falcon, MapReduce, Jira, Bitbucket, Bamboo, J2EE, JMockit, Lucene, Storm, Ruby, Unix, SQL.

Confidential

Sr. Java Developer

Responsibilities:

  • Single-handedly designed and developed a demo application of EAN's RESTful web services for EAN affiliates.
  • Created REST web services according to requirements.
  • Developed RESTful APIs using the Spring Framework, Java, and JavaScript, and published them on Swagger UI.
  • Used Spring MVC and designed controllers and handlers using annotations to implement the application's business logic.
  • Wrote code to marshal Java objects to XML and unmarshal XML back to objects using JAXB.
  • Wrote ANT build scripts to automate build processes and eliminate manual updates of XSD-generated Java files.
  • Involved in API documentation using Swagger.
  • Developed the organization's website and custom web applications using jQuery, JavaScript, HTML, CSS, XML, and AJAX.
  • Involved in writing application-level code to interact with RESTful web APIs and web services using AJAX, JSON, XML, and jQuery.
  • Extensively used Mockito and various other mocking utilities to mock web services, database calls, and static and private methods.
  • Experience with Hadoop/HDFS commands, writing Java MapReduce programs, and verifying, managing, and reviewing Hadoop log files.
  • Experience in setting up load balancers for S3 buckets and EC2 clusters in AWS.
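The marshalling work above used JAXB to bind Java classes to XML; a stdlib Python sketch of the same object-to-XML round-trip idea (with a hypothetical customer record, not the actual JAXB-bound classes) looks like this:

```python
import xml.etree.ElementTree as ET

def to_xml(customer):
    """Marshal a flat dict into a <customer> XML element string."""
    root = ET.Element("customer")
    for key, value in customer.items():
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

def from_xml(text):
    """Unmarshal the XML string back into a flat dict."""
    root = ET.fromstring(text)
    return {child.tag: child.text for child in root}
```

JAXB generates this mapping from annotated classes or an XSD; the hand-rolled version just makes the marshal/unmarshal symmetry explicit.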

Environment: Core Java, Web Services, Hadoop, ANT, Maven 2.1.1, JSP, Servlets, HTML, CSS, JavaScript/jQuery, REST, JSON, Apache Tomcat 6, MySQL, HDFS, Pig, Sqoop, HBase, Shell Scripting, Ubuntu, Red Hat Linux.
