
Senior Analytical Hadoop / Spark Developer Resume

SUMMARY:

  • 10+ years of professional experience in IT with Java and JEE, including 2+ years of hands-on experience with Big Data and Hadoop ecosystem components.
  • Expertise in developing applications in the financial domain, using enterprise technologies including Core Java 1.8, JEE, Servlets 2.2/2.3, JSP 2.0, Struts 2.0, Hibernate 3.0, Spring IOC, Spring MVC, Spring Boot, JMS, XML, JDBC 2.0, JNDI, JAXP, JAXB, WebLogic, WebSphere and Tomcat.
  • Experience in Web Services using XML, HTML, SOAP and REST APIs.
  • Solid background in Object-Oriented Analysis & Design and in the development and implementation of client-server/web/enterprise applications using n-tier architecture; knowledge of AngularJS practices, including creating custom, general-use modules and components that extend the elements and modules of core AngularJS.
  • Strong experience creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, and Spark Streaming.
  • In-depth knowledge of Hadoop architecture and YARN.
  • Experience in writing MapReduce programs using Apache Hadoop for analyzing Big Data.
  • Hands-on experience in writing ad-hoc queries for moving data from HDFS to Hive and analyzing the data using HiveQL.
  • Experience in importing and exporting data using Sqoop from Relational Database Systems to HDFS.
  • Experience in writing Hadoop Jobs for analyzing data using Pig Latin.
  • Working knowledge of NoSQL databases like HBase.
  • Integrated Apache Kafka for data ingestion.
  • Experience in using Apache Flume for collecting, aggregating and moving large amounts of data from application servers.
  • Experience in using Zookeeper and Oozie Operational Services for coordinating the cluster and scheduling workflows.
  • Extensive experience with SQL, PL/SQL, Shell Scripting and database concepts.
  • Experience in using version control management tools like CVS, SVN and Rational ClearCase.
  • Highly motivated, self-starter with a positive attitude, willingness to learn new concepts and acceptance of challenges.
  • Ability to work independently and with a group of peers in a results-driven environment. Strong analytical and problem-solving skills. Ability to take initiative and learn emerging technologies and programming languages.
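
As a minimal illustration of the MapReduce pattern referenced above, the map and reduce phases of a word-count job can be sketched in plain Java (no Hadoop cluster involved; the sample data is invented for illustration):

```java
import java.util.*;
import java.util.stream.*;

// Hedged sketch: the map/shuffle/reduce phases of a Hadoop word-count job,
// simulated with plain Java collections. Real jobs implement Mapper/Reducer
// classes and run on a cluster; this only shows the core logic.
public class WordCountSketch {
    public static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                // "map" phase: emit one lower-cased token per word
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                // "shuffle + reduce" phase: group by key and sum the counts
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = wordCount(Arrays.asList("big data big wins", "data wins"));
        System.out.println(counts.get("big"));  // 2
        System.out.println(counts.get("data")); // 2
    }
}
```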

TECHNICAL SKILLS:

Hadoop Core Services: HDFS, MapReduce, Hadoop YARN.

Hadoop Distribution: Cloudera.

Hadoop Data Services: Apache Hive, Pig, Sqoop, Flume.

Hadoop Operational Services: Apache Zookeeper, Oozie.

NoSQL Databases: Apache HBase, Cassandra.

Virtualization Software: VMware, Oracle Virtual Box.

Java & JEE Technologies: Core Java 1.7 & 1.8, Servlets 3.2, JSP 2.0, JDBC, JavaBeans.

IDE Tools: Eclipse, NetBeans, RAD.

Programming Languages: C, Java, Unix shell scripting, Scala.

Data Bases: Oracle 11g/10g/9i, DB2, MS-SQL Server, MySQL, MS-Access.

Web Servers: WebLogic 11, WebSphere 6.1, Apache Tomcat 5.5/6.0.

Environment Tools: SQL Developer, WinSCP, PuTTY.

Frameworks: Struts 2.0, UML, Hibernate 3.0, Spring 2.5.

Version Control Systems: CVS, Tortoise SVN, Git.

Operating Systems: Windows, Linux.

PROFESSIONAL EXPERIENCE:

Confidential

Senior Analytical Hadoop / Spark Developer

Responsibilities:

  • Developed Spark scripts using Scala shell commands as per requirements.
  • Used the Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
  • Developed Scala scripts and UDFs, using both DataFrames/SQL and RDD/MapReduce in Spark, for data aggregation and queries, writing data back into the OLTP system through Sqoop.
  • Performed performance tuning of Spark applications: setting the right batch interval time, the correct level of parallelism, and memory tuning.
  • Loaded data into Spark RDDs and performed in-memory computation to generate the output response.
  • Optimized existing Hadoop algorithms using Spark Context, Spark SQL, DataFrames and pair RDDs.
  • Involved in loading data from Oracle database into HDFS using Sqoop queries.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Developed MapReduce pipeline jobs to process the data and create the necessary HFiles.
  • Developed Pig Latin scripts for data cleansing.
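
The Spark aggregation work above relies on the pair-RDD reduceByKey pattern; a minimal plain-Java sketch of that logic (local collections standing in for a cluster, with invented keys and values) might look like:

```java
import java.util.*;
import java.util.stream.*;

// Hedged sketch: the pair-RDD reduceByKey aggregation pattern used in the
// Spark jobs above, simulated on local collections. On a cluster this would
// be rdd.reduceByKey(_ + _); here a merge function sums values per key.
public class ReduceByKeySketch {
    public static Map<String, Integer> reduceByKey(List<Map.Entry<String, Integer>> pairs) {
        return pairs.stream().collect(Collectors.toMap(
                Map.Entry::getKey, Map.Entry::getValue, Integer::sum));
    }

    public static void main(String[] args) {
        // Hypothetical (region, amount) pairs
        List<Map.Entry<String, Integer>> sales = Arrays.asList(
                Map.entry("NY", 10), Map.entry("CA", 5), Map.entry("NY", 7));
        System.out.println(reduceByKey(sales).get("NY")); // 17
    }
}
```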

Environment: HDFS, Hive, HBase, Spark Core, Spark SQL, Spark Streaming, Scala, SBT 0.13, Oozie, Kafka 0.9.0, Apache NiFi, Java (JDK 1.7 & 1.8), UNIX, SVN, Zookeeper, JEE, JSP, JSTL, Spring 2.5, Oracle 11g/10g, Maven, RESTful Web Services, Apache Axis2, Linux, Tomcat 7, Git, Jenkins.

Confidential

Senior Java/Hadoop Developer

Responsibilities:

  • Involved in writing a Maven project; primary responsibilities included development of a web application using Spring MVC and Hibernate to pull queries from Oracle.
  • Involved in loading data from Oracle database into HDFS using Sqoop queries.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Developed MapReduce pipeline jobs to process the data and create the necessary HFiles.
  • Developed Pig Latin scripts for data cleansing.
  • Worked with different file formats, like TextFile and Avro, for Hive querying and processing.
  • Developed Pig UDFs for manipulating data according to business requirements, and worked on developing custom Pig loaders.
  • Wrote a Spark Streaming application to read streaming Twitter data and analyze Twitter records in real time, using the Yardstick framework to measure the performance of Apache Ignite Streaming against Apache Spark Streaming.
  • Used Apache NiFi to stream data feeds to Kafka.
  • Implemented test cases for Spark using Scala.
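
Spark Streaming, used above for the Twitter analysis, processes data in fixed micro-batches; the batching logic can be sketched locally in plain Java (the interval length and timestamps are illustrative, not taken from any actual job):

```java
import java.util.*;
import java.util.stream.*;

// Hedged sketch: Spark Streaming's micro-batch model, simulated locally.
// Events carry a timestamp in milliseconds; each event is assigned to a
// fixed-length batch interval and counted per batch.
public class MicroBatchSketch {
    public static Map<Long, Long> countPerBatch(List<Long> eventTimesMs, long batchMs) {
        return eventTimesMs.stream()
                // integer division assigns each timestamp to its batch index
                .collect(Collectors.groupingBy(t -> t / batchMs, Collectors.counting()));
    }

    public static void main(String[] args) {
        // Three events fall in the first 2-second batch, one in the second.
        List<Long> times = Arrays.asList(100L, 500L, 1900L, 2100L);
        System.out.println(countPerBatch(times, 2000L).get(0L)); // 3
    }
}
```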

Environment: Hadoop, MapReduce, HDFS, Hive, HBase, Spark Core, Spark SQL, Spark Streaming, Scala, SBT 0.13, Oozie, Kafka 0.9.0, Apache NiFi, Java (JDK 1.7 & 1.8), UNIX, SVN, Zookeeper, JEE, JSP, JSTL, Spring 2.5, Oracle 11g/10g, Maven, RESTful Web Services, SOAP, Apache Axis2, Linux, Tomcat 7.

Confidential

Tech Lead (Java/Hadoop)

Responsibilities:

  • Collaborated with the Business Intelligence team to understand the high level data roadmap and define data discovery priorities.
  • Installed and configured Hadoop 1.0.2 MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Analyzed the Hadoop cluster using big data analytical tools such as Pig, Hive, Sqoop and Flume.
  • Collected and aggregated large amounts of web log data from different sources, such as web servers, mobile and network devices, using Apache Flume, and stored the data in HDFS for analysis.
  • Developed optimal strategies for distributing the web log data over the cluster, and for importing and exporting the stored web log data into HDFS and Hive using Sqoop.
  • Involved in creating Hive tables, loading millions of records of the stored log data, and writing queries that invoke and run MapReduce jobs in the backend.
  • Transformed large sets of semi-structured and unstructured data in various formats to extract parameters such as user location, age, time spent, etc.
  • Analyzed the web log data using Hive to calculate metrics such as number of unique visitors, page views, etc.
  • Exported the analyzed data to relational databases using Sqoop for visualization and generating reports.
  • Designed efficient high-performing applications to extract, transform, load, and query very large datasets, including unstructured data.
  • Installed the Apache Oozie workflow engine to run multiple Hive and Pig jobs independently, based on time and data availability.
  • Modelled user behavior based upon previous findings and most relevant data available, and contributed to the development of tools for tracking and understanding user behavior.
  • Worked on Hive joins to produce the input data set.
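
The unique-visitors metric computed with Hive above corresponds to a COUNT(DISTINCT ...) per group; a plain-Java sketch of the same computation (the sample log records are invented) could be:

```java
import java.util.*;
import java.util.stream.*;

// Hedged sketch: the "unique visitors" metric described above, computed in
// plain Java over (visitorId, page) log records. The equivalent HiveQL would
// be roughly: SELECT page, COUNT(DISTINCT visitor_id) FROM web_logs GROUP BY page;
public class UniqueVisitorsSketch {
    public static Map<String, Long> uniqueVisitorsPerPage(List<String[]> logs) {
        // each record is {visitorId, page}; collect distinct visitors per page
        return logs.stream().collect(Collectors.groupingBy(
                r -> r[1],
                Collectors.mapping(r -> r[0], Collectors.collectingAndThen(
                        Collectors.toSet(), s -> (long) s.size()))));
    }

    public static void main(String[] args) {
        List<String[]> logs = Arrays.asList(
                new String[]{"u1", "/home"}, new String[]{"u2", "/home"},
                new String[]{"u1", "/home"}, new String[]{"u1", "/cart"});
        System.out.println(uniqueVisitorsPerPage(logs).get("/home")); // 2
    }
}
```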

Environment: Hadoop, MapReduce, HDFS, Hive, Oracle 11g/10g, HBase, Spark Core, Spark SQL, Spark Streaming, Scala, SBT 0.13, Oozie, Java (JDK 1.6), UNIX, SVN, Zookeeper, J2EE, JSP, JSTL, AngularJS, Spring 2.5, Maven, RESTful Web Services, SOAP, Apache Axis2, Linux, Tomcat 7.

Confidential

Senior Java Developer

Responsibilities:

  • Used Agile Methodologies to manage full life-cycle development of the project.
  • Developed the application using Struts, Spring and Hibernate.
  • Developed a rich user interface using JavaScript, JSTL, CSS, jQuery and JSPs.
  • Developed custom tags for implementing logic in JSPs.
  • Used JavaScript, jQuery, JSTL, CSS and Struts 2 tags for developing the JSPs.
  • Involved in making release builds for deploying the application for test environments.
  • Used Oracle database as backend database.
  • Wrote SQL to update and create database tables.
  • Used Eclipse as IDE.
  • Used the RIDC interface to get content details and create content through the application.
  • Used Spring IOC for injecting the beans.
  • Used Hibernate for connecting to the database and mapping the entities by using hibernate annotations.
  • Created JUnit test cases for unit testing application.
  • Wrote/integrated JSPs communicating with the Spring controller, passing query criteria to Hibernate to pull data and showing reports based on searches, both on the web GUI and by streaming data into Excel.
  • Involved in writing a Maven project; primary responsibilities included development of a web application using Spring MVC and Hibernate to pull queries from Oracle.
  • Development tools: Maven, Spring MVC, Hibernate.
  • App servers/web servers: Tomcat 7.
  • Used JUnit and JMock for unit testing.

Environment: J2EE 1.6, JSP, JSTL, Ajax, Spring 2.5, Struts 2.0, Hibernate 3.2, JDBC, JNDI, XML, XSLT, Web Services, WSDL, Log4j, Oracle 11g, Oracle WebLogic Server 10.3, SVN, Windows XP, UML.

Confidential

Java Developer

Responsibilities:

  • Involved in requirements gathering, design, development, unit testing and bug fixing.
  • Used Agile Methodologies to manage full life-cycle development of the project.
  • Involved in development of the GUI client using Java Swing (Java 1.6, SwingWorker); developed the entire online help for the application using the JavaHelp system. The application had perspectives for views to support multiple tabbed components in the UI, and was packaged as a Java Web Start build.
  • Involved in creating Ant scripts for the JNLP files to update client builds for the usability tests.
  • Worked with generics in Java 1.6 and other open-source tools, such as TableLayout and Jakarta POI, to build the C2C1M client; worked with JDIC (Java Desktop Integration Components) to embed a web browser displaying HTML in a Java window.
  • Worked on JMS to publish and subscribe to topic messages flowing downstream to update legacy data, using IBM MQ Series.
  • Created/configured Hibernate mapping classes and Hibernate configuration files (XML) to use the Hibernate framework to update or view database records; also created Criteria queries to retrieve data from the database.
  • Developed RMI servers so that client and server could communicate for data updates and retrievals; the combination of Hibernate with RMI was a simple and fitting solution for concurrency issues.
  • The software development methodology used was Agile (with small Scrum teams). Connected the client to a web service (JAX-RPC) to retrieve data and display it from central storage; the application was developed over the Spring framework.
  • Worked on writing/updating Ant scripts for creating web archive and enterprise archive files; the JARs were signed to maintain security for the systems.
  • Built high-performance Java servers and used Maven for building/packaging.
  • Designed the client side of the application using Java Swing (1.6) with the JSR 296 application framework.
  • Collaborated with the analysis team to review the developed UI, and packaged/deployed the application for UAT and SIT using Java Web Start build techniques.
  • Development tools: Java Swing, IntelliJ IDEA, YourKit Profiler.
  • Application server: WebSphere.
  • Server technology: RMI (Remote Method Invocation), Hibernate. Version control: ClearCase.
  • Special tools: JDIC, TableLayout (open-source Swing layout), Glazed Lists, IntelliJ IDEA, YourKit Java Profiler (to clear memory leaks and deadlocks in the client/server code), Apache POI (for exporting to Excel from the GUI tables), Hibernate.

Environment: J2EE 1.6, JSP, JSTL, Ajax, Spring 2.5, Struts 2.0, Hibernate 3.2, JDBC, JNDI, XML, XSLT, Web Services, WSDL, Log4j, Oracle 11g, Oracle WebLogic Server 10.3, SVN, Windows XP, UML, Sterling Commerce Distributed Order Management (DOM).

Confidential

Java/J2EE Developer

Responsibilities:

  • Involved in Unit Testing of the Application.
  • Developed the JSPs, modeling data using MVC architecture.
  • Involved in designing front-end screens.
  • Documented daily/weekly status reports and sent them to the client.
  • Implemented the Servlet and JSP components.
  • Used Hibernate for Object Relational Mapping and data persistence.
  • Developed the Database interaction classes using JDBC.
  • Created JUnit test cases and ANT scripts for build automation.

Environment: Java, J2EE 1.4, HTML, XML, JDBC, JMS, Servlets, JSP 1.2, Struts 1.2, Hibernate, Web services, Eclipse 3.3, WebSphere 7, Oracle 9i, ANT, Microsoft Visio.
