
Big Data/Java Developer Resume


SUMMARY:

  • 8 years of software development experience in the analysis, design, development, implementation, integration and testing of application software in web-based environments and distributed n-tier and Client/Server architectures, including over 3 years of experience in Hadoop ecosystem implementation, maintenance and Big Data analysis operations.
  • Extensive experience as a Hadoop Developer and in Java application development.
  • Experience in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions.
  • In-depth knowledge of JobTracker, TaskTracker, NameNode, DataNode, ResourceManager, NodeManager, YARN and MapReduce concepts.
  • Worked with application teams to install operating-system and Hadoop updates, patches and version upgrades as required.
  • Good knowledge of YARN configuration.
  • Hands on experience on Hortonworks and Cloudera Hadoop environments.
  • Expertise in writing Hadoop jobs for analyzing data using HiveQL (queries), Pig Latin (data-flow language), and custom MapReduce programs in Java.
  • Expert in working with the Hive data-warehouse tool: creating tables, implementing partitioning and bucketing for data distribution, and writing and optimizing HiveQL queries.
  • Good knowledge of creating custom SerDes in Hive.
  • Developed Pig Latin scripts using operators such as LOAD, STORE, DUMP, FILTER, DISTINCT, FOREACH, GENERATE, GROUP, COGROUP, ORDER, LIMIT, UNION and SPLIT to extract data from data files and load it into HDFS.
  • Extending Hive and Pig core functionality by writing custom UDFs.
  • Good knowledge of Linux shell scripting and shell commands.
  • Good experience in importing and exporting data between HDFS and relational database management systems using Sqoop.
  • Loaded streaming log data from various web servers into HDFS using Flume.
  • Very good knowledge and hands-on experience in Cassandra and Spark on YARN.
  • Good knowledge of the architecture and functionality of NoSQL databases like HBase, MongoDB and Cassandra.
  • Good Knowledge on Hadoop Cluster architecture and monitoring the cluster.
  • Knowledge of job workflow scheduling and monitoring tools like Oozie (Hive, Pig) and ZooKeeper (HBase).
  • Experience in Object Oriented Analysis, Design and Programming.
  • Extensive experience in Java and J2EE technologies like Servlets, JSP, Enterprise Java Beans (EJB), JDBC.
  • Expertise in developing web-based GUIs using Applets, Servlets, JSP, jQuery, HTML, JavaScript and CSS.
  • Experience in the development of multi-tier distributed enterprise applications and implementation of Model-View-Controller (MVC) using the Struts 1.1/1.2 and Spring frameworks.
  • Experience in working with different operating systems: Windows variants, UNIX and Linux.
  • Well versed with multiple version control tools like CVS and SVN.
  • Experience with backend databases like Oracle, MS SQL Server, MySQL and DB2.
  • Experience in writing Sub Queries, Stored Procedures, Triggers, Cursors and Functions in SQL, PL/SQL.
  • Experience in developing the projects using Agile Methodology.
  • Strong analytical and problem-solving skills.
  • An excellent team player and self-starter with good communication skills and proven abilities to finish tasks before target deadlines.
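As a flavor of the custom MapReduce work described above, here is a minimal plain-Java sketch of the map/reduce word-count pattern. A real Hadoop job would extend the Mapper/Reducer classes from org.apache.hadoop.mapreduce with job setup and Hadoop types; those are omitted here, so this is an illustrative sketch rather than production code.

```java
import java.util.HashMap;
import java.util.Map;

public class WordCountSketch {

    // "Map" phase: emit (word, count) pairs for the tokens in one line.
    static Map<String, Integer> map(String line) {
        Map<String, Integer> pairs = new HashMap<>();
        for (String token : line.toLowerCase().split("\\s+")) {
            if (!token.isEmpty()) {
                pairs.merge(token, 1, Integer::sum);
            }
        }
        return pairs;
    }

    // "Reduce" phase: sum the counts emitted for each key across all mappers.
    static Map<String, Integer> reduce(Iterable<Map<String, Integer>> mapped) {
        Map<String, Integer> totals = new HashMap<>();
        for (Map<String, Integer> partial : mapped) {
            partial.forEach((word, count) -> totals.merge(word, count, Integer::sum));
        }
        return totals;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = reduce(java.util.List.of(
                map("hive pig sqoop"), map("hive flume")));
        System.out.println(counts.get("hive")); // prints 2
    }
}
```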

TECHNICAL SKILLS:

Big Data: Hadoop, HDFS, MapReduce, Hive, Sqoop, Pig, Flume, Oozie, ZooKeeper, Spark, HBase, MongoDB, Cassandra

Java Technologies: Java, JavaScript, J2EE, JDBC

Programming and Scripting Languages: Java, SQL, PL/SQL, Shell Scripting

Databases: Oracle 8i/9i/10g/11g/12c, SQL Server 2005/2008/2008 R2/2012/2014/2016, MySQL 4.1/5.0, DB2

Cloud: AWS (Amazon Web Services)

Web/Application Servers: IBM WebSphere, JBoss, Apache Tomcat, BEA WebLogic

IDE and Database Tools: MyEclipse, RAD, NetBeans, MS Visual Studio, Toad for Oracle

Frameworks/API: Struts 1.x/2.0, Hibernate 2.0/3.x, Spring 2.x/3.x

Version control/Build tools: CVS, SVN, Maven, Git, SCM, ANT

Operating Systems: Windows variants, UNIX (AIX 5.2/5.3), Linux

PROFESSIONAL EXPERIENCE:

Confidential

Big Data/Java Developer

Responsibilities:

  • Implemented inter-layer communication from the JSP layer down to the CDB database.
  • Involved in application development using Java, Hibernate, Servlets, JSP and Spring.
  • Designed UI pages using CSS, JavaScript, JSP and the Tag library.
  • Maintained the Model-View-Controller structure for the entire enterprise application.
  • Wrote SQL scripts for pushing data into the CDB in a batch process.
  • Designed and implemented various discount-algorithm programs for issuing discounts to customers.
  • Coordinated production, business and technical calls.
  • Provided hot fixes in production.
  • Implemented the CDB fetching programs that provide DTO-to-Bean communication between layers.
  • Implemented Spring batch processing using core Java.
  • Implemented a Hadoop cluster and a Mahout recommendation system for the project.
  • Worked on a project based on the implementation of Hadoop and MapReduce on large database systems.
  • Used Pig, Hive and Cassandra to perform operations on large volumes of unstructured data.
  • Worked on AWS to fetch data from the IBM server and implement the discounts.
  • Transferred data from the drive-wise project to AWS.
  • Updated data in AWS using SOAP web services.
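The core-Java batch processing mentioned above can be sketched as a chunk-oriented read/process/write loop, the cycle that Spring Batch formalizes. The names and chunk size below are illustrative assumptions, not the original project's code.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class ChunkBatchSketch {

    // Process a list of records in fixed-size chunks, handing each chunk
    // to a writer (e.g., a JDBC batch insert); returns the chunk count.
    static <T> int runInChunks(List<T> records, int chunkSize, Consumer<List<T>> writer) {
        int chunksWritten = 0;
        for (int start = 0; start < records.size(); start += chunkSize) {
            int end = Math.min(start + chunkSize, records.size());
            writer.accept(records.subList(start, end)); // commit one chunk at a time
            chunksWritten++;
        }
        return chunksWritten;
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 10; i++) data.add(i);
        int chunks = runInChunks(data, 4, chunk -> System.out.println("writing " + chunk));
        System.out.println(chunks); // prints 3
    }
}
```

Chunking keeps memory bounded and lets each chunk commit (or fail and retry) independently, which is the main reason batch frameworks favor it over row-at-a-time writes.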

Environment: Java, JSP, Servlets, ORM, JavaScript, Spring, PL/SQL, Agile, SOAP, Hadoop, Allstate Framework, MapReduce, CDB, DTO, Spring Batch, Load Management, AWS (Amazon Web Services)

Confidential, Detroit, MI

Big Data/Hadoop Developer

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Involved in Installing and configuring Hive, Pig, Sqoop, Flume and Oozie on the Hadoop cluster.
  • Wrote simple to complex MapReduce jobs using Hive and Pig.
  • Developed MapReduce jobs that use HDFS efficiently via various compression mechanisms.
  • Executed Hadoop streaming jobs to consume and process terabytes of XML-format data.
  • Handled the import of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
  • Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior.
  • Collected log data from web servers and integrated it into HDFS using Flume.
  • Implemented NameNode backup using NFS for high availability.
  • Developed Pig Latin scripts to extract data from the web server output files and load it into HDFS.
  • Involved in the installation of CDH3 and the upgrade from CDH3 to CDH4.
  • Worked on NoSQL databases including HBase, MongoDB, and Cassandra.
  • Created Hive external tables, loaded data into the tables and queried the data using HiveQL.
  • Wrote shell scripts to automate rolling day-to-day processes.
  • Used UDFs to implement business logic in Hadoop.
  • Managed and reviewed Hadoop log files.
  • Implemented business logic by writing UDFs in Java and used various existing UDFs.
  • Hands on experience in monitoring and managing the Hadoop cluster using Cloudera Manager.
  • Worked with application teams to install operating system, Hadoop updates, patches, version upgrades as required.
  • Involved in installing the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
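For illustration, parsing a single web-server access-log line of the kind the Flume pipeline above delivers to HDFS might look like the sketch below. The common-log format is an assumption, since the actual feed formats are not specified in this role.

```java
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AccessLogParser {

    // Common log format: host ident authuser [date] "request" status bytes
    private static final Pattern COMMON_LOG = Pattern.compile(
            "^(\\S+) \\S+ \\S+ \\[([^\\]]+)\\] \"([^\"]*)\" (\\d{3}) (\\S+)");

    // Returns {host, timestamp, request, status, bytes}, or empty if malformed.
    static Optional<String[]> parse(String line) {
        Matcher m = COMMON_LOG.matcher(line);
        if (!m.find()) return Optional.empty();
        return Optional.of(new String[] {
                m.group(1), m.group(2), m.group(3), m.group(4), m.group(5) });
    }

    public static void main(String[] args) {
        String line = "127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "
                + "\"GET /index.html HTTP/1.0\" 200 2326";
        parse(line).ifPresent(f -> System.out.println(f[0] + " -> " + f[3])); // prints 127.0.0.1 -> 200
    }
}
```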

Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Java (JDK 1.6), SQL, Cloudera Manager, Sqoop, Flume, Oozie, CDH3, MongoDB, Cassandra, HBase, Eclipse, Oracle 11g/12c and Unix/Linux.

Confidential, Hoffman Estates, IL

Big Data/Hadoop Developer

Responsibilities:

  • Worked on analyzing the Hadoop cluster and different big-data analytic tools, including Pig, the HBase NoSQL database and Sqoop.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and pre-processing.
  • Ran MapReduce framework jobs in Java for data processing on the installed Hadoop/HDFS setup.
  • Performed data analysis in Hive by creating tables, loading them with data and writing Hive queries, which run internally as MapReduce jobs.
  • Assisted with integrating Hadoop processes into the IT infrastructure.
  • Installed and configured Apache Hadoop across multiple nodes.
  • Loaded and transformed large sets of structured, semi-structured and unstructured data.
  • Built, tuned and maintained HiveQL and Pig scripts for user reporting.
  • Used Sqoop to import and export data from HDFS to RDBMS and vice-versa.
  • Created HBase tables to store various formats of PII data coming from different portfolios; implemented Sqoop to load data from the Oracle database into the NoSQL database.
  • Developed MapReduce programs.
  • Defined job flows.
  • Managed and reviewed Hadoop log files.
  • Supported MapReduce programs running on the cluster.
  • Loaded data from the UNIX file system into HDFS.
  • Worked with Flume to load log data into HDFS.
  • Installed and configured Hive and wrote Hive UDFs.
  • Moved data from Hadoop to Cassandra using the BulkOutputFormat class.
  • Created Hive tables, loaded data and wrote Hive queries.
  • Developed shell scripts to automate routine DBA tasks (e.g., database refreshes, backups, monitoring).
  • Created Hive queries for data analysis to meet the business requirements.
  • Created Hive tables and worked with them using HiveQL.
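The Hive UDF work around PII data mentioned above might carry core logic like the masking sketch below. In a real UDF this would live in the evaluate() method of a class extending org.apache.hadoop.hive.ql.exec.UDF; the masking rule itself is an illustrative assumption.

```java
public class MaskUdfSketch {

    // Keep the last `visible` characters of a sensitive field and mask
    // the rest with '*'; nulls pass through, as Hive UDFs typically allow.
    static String mask(String value, int visible) {
        if (value == null) return null;
        if (value.length() <= visible) return value;
        int hidden = value.length() - visible;
        return "*".repeat(hidden) + value.substring(hidden);
    }

    public static void main(String[] args) {
        System.out.println(mask("123456789", 4)); // prints *****6789
    }
}
```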

Environment: Java, Hadoop, MapReduce, Pig, Hive, HBase, Oozie, Flume, HDFS, Sqoop, Cloudera, Cassandra, ETL, NoSQL, and UNIX Shell Scripting.

Confidential, Troy, MI

Java Developer

Responsibilities:

  • Involved in the design and prepared activity diagrams, sequence diagrams, class diagrams and use case diagrams for various use cases using Microsoft Visio.
  • Followed agile methodology and a test-driven approach in building the system.
  • The application was based on the Model-View-Controller (MVC-2) architecture; used the Spring MVC framework at the web tier to isolate each layer of the application, reducing integration complexity and easing maintenance.
  • Developed user interface using JSP, JSTL, HTML, CSS and JavaScript to simplify the complexities of the application.
  • Used the Spring validation to validate form data.
  • Interacted with the Microsoft SQL Server database through the Hibernate object/relational mapping framework, using HQL, Criteria and Named Queries.
  • Configured Hibernate mapping files and Hibernate configuration files to connect with the database.
  • Implemented various J2EE design patterns, like DTO, DAO and Singleton.
  • Exchanged data between different applications through web services (XML, WSDL, UDDI and SOAP).
  • Used Jira for project tracking, Bug tracking and Project Management.
  • Configured and used Log4J for logging all the debugging and error information.
  • Worked with ANT build scripts for compiling and building the project and with CVS for version control.
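Of the J2EE patterns listed above, the Singleton is the simplest to sketch. The ConnectionFactory name below is a hypothetical example, not the project's actual class.

```java
public class ConnectionFactory {

    // Eagerly created single instance; the private constructor prevents
    // any other instance from being created.
    private static final ConnectionFactory INSTANCE = new ConnectionFactory();

    private ConnectionFactory() { }

    public static ConnectionFactory getInstance() {
        return INSTANCE;
    }

    public static void main(String[] args) {
        // Every caller sees the same object.
        System.out.println(getInstance() == getInstance()); // prints true
    }
}
```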

Environment: JDK, HTML, JavaScript, Servlet 2.4, JSP 2.0, Spring 3.0, Hibernate 3.2, Web Services (SOAP, WSDL, UDDI), XML, Log4J, ANT, JUnit, Microsoft SQL Server 2005, JBoss 5.1, Eclipse, CVS, Windows 7/Server 2003.

Confidential

Java Developer

Responsibilities:

  • Wrote Java code to implement various POCs (proofs of concept).
  • Created the technical design document, Integration test plan for the team.
  • Wrote RESTful web services to fetch and transfer batch data from the mainframe database to the relational database.
  • Involved in writing Oracle SQL & PL/SQL- Stored Procedures, Functions, Triggers, Sequences, Indexes, and Views.
  • Wrote the unit test plan.
  • Developed dynamic user interface with JSP, HTML and CSS.
  • Client side validation was done using JavaScript.
  • Enhanced the application for multithreaded scenarios. Deployed the application under JBoss Application Server and resolved the production issues during migration onto the production server.
  • Developed Server Side EJB components for middle tier implementation.
  • Setup JDBC connectivity to database.
  • Implementation of MVC architecture by separating the business logic from the presentation logic using Struts framework.

Environment: Java, J2EE, JavaScript, Multithreading, Eclipse, HTML, CSS, JSP, Struts, EJB, Servlets, JBoss Application Server, Oracle, PL/SQL, REST.

Confidential

Java Programmer

Responsibilities:

  • Actively involved in the software development life cycle, starting from requirements gathering, and performed object-oriented analysis and design using UML.
  • Implementation of MVC architecture by separating the business logic from the presentation logic using Struts framework.
  • Developed dynamic user interface with JSP, HTML and CSS.
  • Client side validation was done using JavaScript.
  • Enhanced the application for multithreaded scenarios. Deployed the application under JBoss Application Server and resolved the production issues during migration onto the production server.
  • Developed Server Side EJB components for middle tier implementation.
  • Setup JDBC connectivity to database.
  • Developed a batch job in Java (JDK Batch API) and Perl scripts that runs on the second business day of every month.
  • Involved in designing database tables, writing procedural constructs, functions on database side to implement business logic.
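The "second business day of the month" schedule mentioned above can be computed as sketched below with java.time. This is a plain-Java illustration; the original job used the JDK batch API and Perl, and any holiday calendar is omitted as an assumption (only weekends are skipped).

```java
import java.time.DayOfWeek;
import java.time.LocalDate;

public class SecondBusinessDay {

    // Walk forward from the 1st of the month, counting weekdays,
    // and return the date of the second one found.
    static LocalDate secondBusinessDay(int year, int month) {
        LocalDate day = LocalDate.of(year, month, 1);
        int businessDays = 0;
        while (true) {
            DayOfWeek dow = day.getDayOfWeek();
            if (dow != DayOfWeek.SATURDAY && dow != DayOfWeek.SUNDAY) {
                businessDays++;
                if (businessDays == 2) return day;
            }
            day = day.plusDays(1);
        }
    }

    public static void main(String[] args) {
        // July 1-2, 2023 fall on a weekend, so the run date shifts.
        System.out.println(secondBusinessDay(2023, 7)); // prints 2023-07-04
    }
}
```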

Environment: Java, J2EE, JavaScript, Multithreading, Eclipse, HTML, CSS, JSP, Struts, EJB, Servlets, JBoss Application Server, Oracle, UML and UNIX.
