Java Programmer Resume
SUMMARY:
- A professionally qualified Hadoop developer, Sun Certified Java Programmer (SCJP), and Sun Certified Web Component Developer (SCWCD) with sound academic credentials and around 8 years of experience, including 3 years in Big Data ecosystem technologies. Expertise includes J2SE, web technologies, and implementation of Hadoop technologies.
- Hands-on experience in installing, configuring, and using Hadoop ecosystem components such as Hadoop MapReduce, HDFS, Hive, Pig, Sqoop, Flume, and Kafka.
- Experience with HBase, Oozie, and other Hadoop ecosystem components as data storage and retrieval systems.
- Experience writing MapReduce programs and Pig and Hive UDFs that serve as reusable utility classes.
- Involved in writing Hive queries to load and process data in HDFS.
- Well-versed in the Mapper, Reducer, Combiner, Partitioner, and sort-and-shuffle phases, along with custom partitioning for efficient bucketing.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Involved in importing data to HDFS using Sqoop.
- Good experience designing jobs and transformations and loading data sequentially and in parallel for initial and incremental loads.
- Experience in designing both time driven and data driven automated workflows using Oozie.
- Experience in loading logs from multiple sources directly into HDFS using Flume.
- Strong knowledge of object-oriented programming concepts.
- Proficient in Java, Servlets, JSP, JDBC, HTML, XML and Oracle 10g/11g.
- Experience with web servers such as Tomcat.
- Strong knowledge of Struts, Spring, Hibernate, and web services.
- Proficient in Building, Deploying, Debugging and Testing of various project phases.
- Experience in JUnit testing of Java/J2EE applications; familiar with TFS, AnthillPro, Visual Studio Source Control, HP Service Manager, SQuirreL SQL Client, and TeamTrack.
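The custom-partitioning-for-bucketing work above can be illustrated with a minimal, Hadoop-free Java sketch (class name and bucket count are hypothetical, for illustration only). It mirrors the Hive-style rule of hashing a row's key, masking the sign bit, and taking the result modulo the number of buckets:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class BucketingSketch {
    // Hive-style bucket assignment: mask off the sign bit so the value
    // is non-negative, then take the hash modulo the number of buckets.
    static int bucketFor(String key, int numBuckets) {
        return (key.hashCode() & Integer.MAX_VALUE) % numBuckets;
    }

    public static void main(String[] args) {
        List<String> keys = Arrays.asList("user-1", "user-2", "user-3", "user-4");
        int numBuckets = 4;
        Map<Integer, List<String>> buckets = new TreeMap<>();
        for (String k : keys) {
            // Every key deterministically lands in a bucket in [0, numBuckets).
            buckets.computeIfAbsent(bucketFor(k, numBuckets), b -> new ArrayList<>())
                   .add(k);
        }
        buckets.forEach((b, ks) -> System.out.println("bucket " + b + ": " + ks));
    }
}
```

Because the assignment depends only on the key's hash, the same key always lands in the same bucket, which is what makes bucketed joins and sampling efficient.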
TECHNICAL SKILLS:
Web Technologies: Servlets, JSP, JSTL, JNDI, JDBC
Programming Languages: Java, JavaScript, C and C++
Big Data Technologies: Hadoop MapReduce, HDFS, Hive, Pig, Sqoop, Flume, and Oozie.
RDBMS: Oracle 10g, 11g.
Servers: Apache Tomcat, TFS.
Markup Languages: XML, HTML, and CSS
Operating Systems: Windows 2008/NT/XP, UNIX.
Tools: JUnit, log4j, AnthillPro and Ant.
IDEs: Eclipse, IntelliJ.
PROFESSIONAL EXPERIENCE:
Confidential, San Jose, CA
Responsibilities:
- Configuration management of a multi-node Hadoop cluster for Sqoop and Hive.
- Integrated xCarrier to connect with the datastore using streaming APIs.
- Configured Flume to transfer the data to HDFS.
- Involved in loading data from the Linux file system to HDFS and bulk-loaded the cleaned data into HBase.
- Migrated data from RDBMS to HBase to perform real-time analytics.
- Developed MapReduce jobs and Hive and Pig scripts.
- Involved in writing MRUnit test cases and validating results.
- Involved in scheduling the Oozie workflow engine to run multiple Hive and Pig jobs.
- Imported and exported data between HDFS and Hive using Sqoop; created Hive external tables using a shared metastore.
Technologies: CDH 4.x, Linux, Oracle 11g.
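The MapReduce jobs mentioned in this role can be sketched, without a Hadoop dependency, as the classic map / shuffle / reduce flow in plain Java (the word-count job is a stand-in illustration, not one of the actual jobs from this role):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class MapReduceSketch {
    // Map: emit (word, 1) for each token.
    // Shuffle/sort: group by key (TreeMap keeps keys sorted, mirroring
    // Hadoop's sort phase).
    // Reduce: sum the emitted counts per key.
    static Map<String, Integer> wordCount(List<String> lines) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            for (String word : line.toLowerCase().split("\\s+")) {
                if (!word.isEmpty()) counts.merge(word, 1, Integer::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> out = wordCount(Arrays.asList("big data", "big cluster"));
        System.out.println(out); // {big=2, cluster=1, data=1}
    }
}
```

In a real Hadoop job the map and reduce steps run as separate `Mapper` and `Reducer` classes across the cluster; this in-memory version only shows the data flow between the phases.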
Confidential
Responsibilities:
- Evaluated the suitability of Hadoop and its ecosystem for the project and implemented various proof-of-concept (POC) applications to support their adoption under the Big Data Hadoop initiative.
- Estimated software and hardware requirements for the NameNode and DataNodes and planned the cluster.
- Configuration management of a multi-node Hadoop cluster for Sqoop and Hive.
- Involved in creating Hive tables, loading data, and writing Hive queries.
- Involved in importing data from relational databases to HDFS using Sqoop.
- Migrated data from RDBMS to HBase to perform real-time analytics.
- Developed MapReduce jobs and Hive and Pig scripts.
- Involved in writing MRUnit test cases and validating results.
- Developed bash scripts to fetch log files from an FTP server and process them for loading into Hive tables.
- Involved in a POC phase to replace a traditional message broker with Apache Kafka.
- Loaded the aggregated data into Datameer for report generation.
- Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
- Monitored system health and logs and responded to any warning or failure conditions.
Technologies: CDH 4.x, Linux, Core Java (JDK 6), Oracle 11g.
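The log-processing step above (fetching log files and preparing them for Hive) can be sketched in plain Java; the three-field layout (timestamp, level, message) is hypothetical, and a real job would match the delimiters declared in the target table's `ROW FORMAT DELIMITED` clause:

```java
public class LogToHiveRow {
    // Convert a space-delimited log line into the tab-delimited layout a
    // Hive table declared with FIELDS TERMINATED BY '\t' would expect.
    // The limit of 3 keeps the free-text message field intact.
    static String toRow(String logLine) {
        String[] parts = logLine.split("\\s+", 3);
        return String.join("\t", parts);
    }

    public static void main(String[] args) {
        System.out.println(toRow("2014-01-01T00:00:00 INFO job started"));
        // 2014-01-01T00:00:00<TAB>INFO<TAB>job started
    }
}
```

In practice a bash wrapper would fetch the files, run a transform like this over each line, and issue `LOAD DATA INPATH` to move the result into the Hive table.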
Confidential
Responsibilities:
- Performed project management routines.
- Involved in daily technical status meetings and business requirement discussions.
- Customized existing tools according to requirements.
- Worked on identification of code and case citations, bug fixes, and enhancements to the citation-related engines, including enhancements for cases and codes.
- Worked on trackers to replace legacy (mainframe) systems.
- Enhancement of Confidential Test Client Tool.
- Involved in Confidential operations such as support, builds and releases, and regression testing; provided 24x7 production support for critical production issues; analyzed Java core and heap dumps.
- Enhanced Confidential for Practical Law content and trademark support.
- Involved in queue enhancements for Confidential job submission.
- Involved in the development process and the production release cycle.
- Since all requirements originated in the US, I coordinated with the stakeholders; coordination across the team was the most challenging and critical task, which I managed efficiently.
- GAP: provided solutions and support to end users and enhancements to the product; served as module lead on this project for almost two years.
- IPA: created new engines for Patents & Trademarks support.
- WLN: DAO layer development.
- Confidential Monitor enhancements (a tool to track jobs submitted to Confidential).
Technologies: Core Java (JDK 6), JUnit, Struts 1.2, XML, Tomcat 6, Oracle 11g.
Confidential
Responsibilities:
- Communicated with customers about requirements and priority of tasks.
- Involved in Cite Advisor operations like Support, Builds and Release.
- Worked with the customers by understanding their business needs and provided an effective solution within the time limit.
- For each issue, analyzed and discussed the impact within the team via calls and e-mails, as well as through pre-testing with temporary code.
- Made decisions collaboratively with risk leads to choose a feasible solution across applications that met client requirements.
Technologies: Core Java (JDK 6), Spring, JUnit, XML, Oracle 11g.
Confidential
Responsibilities:
- Enhanced the LPA Authority GUI page.
- Communicated within the team about requirements.