
Hadoop Developer/Admin Resume


Charlotte, NC

SUMMARY

  • 7 years of IT experience, including about 3 years with Big Data ecosystem technologies and more than 4 years in requirement analysis, development, implementation, testing, documentation, and maintenance of web applications using Java/J2EE technologies.
  • Around 3 years of experience with Big Data using the Hadoop framework and related technologies such as HDFS, HBase, MapReduce, YARN, Hive, Pig, Flume, Oozie, Sqoop, and ZooKeeper.
  • Experience in using Cloudera Manager for installation and management of single-node and multi-node Hadoop clusters (CDH3, CDH4, and CDH5).
  • Experience with Amazon EC2, S3, Redshift, and DynamoDB.
  • Experience with Hortonworks cluster installations and tools such as Hive, Sqoop, IBM Tivoli, and HCatalog.
  • Experience in JMS and Java web service components such as SOAP, WSDL, UDDI, and HTTP.
  • Expertise in developing applications using Java/J2EE, Spring, Struts, web services, Servlets, EJB, JDBC, JMS, Hibernate, JSP, JSTL, HTML, XHTML, CSS, XML, AJAX, JavaScript, and jQuery.
  • Experience in developing GUIs using the scripting languages JSP, JSTL, AJAX, JavaScript, jQuery, and HTML.
  • Proficient in RDBMS concepts, writing SQL Queries, Stored Procedures, Triggers, Cursors and Functions.
  • Experience with core Java: multithreading, concurrency, exception handling, file handling, I/O, generics, data structures, collections, and garbage collection.
  • Hands on experience in developing SOAP based Web services.
  • Strong knowledge in Design Patterns like Singleton, Service Locator, MVC, Facade, Value Object and Cache Technique.
  • Experience in Tomcat and WebLogic administration: installation, domain creation, JMS and JDBC configuration, J2EE application deployment, and troubleshooting.
  • Ensure deliverables are prepared to satisfy the project requirements and schedule; escalate issues that cannot be resolved by the team and facilitate problem solving and collaboration.
  • Coordinate with internal and external (onshore/offshore) teams as necessary: establish meeting times, places, and agendas, and ensure discussions and decisions lead to closure.
  • Good experience in operations, production support, and project development; skilled at communicating project scope and working in a team environment; driven by new challenges and adept at adapting to any cultural or business environment.

TECHNICAL SKILLS

Languages: C++, Java, and J2EE

Java Front End / Scripting / Web Technologies: HTML, JavaScript, AJAX, jQuery, CoffeeScript, Sencha Ext JS (Extended JavaScript), JSP 1.2/2.0, and Servlets 2.3/2.4/2.5

Java / J2EE Frameworks: Apache Struts, Spring (Core and IoC for dependency injection, Spring MVC, Spring with JDBC), JBoss Seam for workflow management, Jolt, DSL (Domain Specific Language), and Hadoop

Distributed Technologies / Protocols / Java Web Service Implementations: Java Networking, JavaBeans (BDK), RMI, EJB 1.x/2.x/3.x, JBoss Drools rule engine, JAX-RPC, JAX-WS, and JAX-RS APIs implemented with Apache Axis 1.x/2.x, Apache CXF 2.x, and Jersey; HTTP, XML, XSD, XSLT, XPath, SOAP, WSDL, and UDDI

ORM / Transaction Frameworks: Hibernate, Spring with JDBC, Spring with Hibernate, Spring with JPA (Java Persistence API), JTA (Java Transaction API) and EJB Transactions

BIG Data Eco Systems: Cloudera Distribution for Hadoop (CDH), MapReduce, HDFS, HBase, YARN, Hive, Pig Scripting, Sqoop and Flume.

Application / Web Servers: Oracle BEA WebLogic 6.0/7.0/8.1/9.1/10.3, IBM WebSphere 6.x/7.x, JBoss 5.x/7.x, Jetty, and Apache Tomcat 5.0/6.0/7.x

Databases: Oracle, IBM DB2, Sybase, and MS Access 2000

Products / IDEs / Tools: Eclipse 3.x, MyEclipse 6.1/6.5/10, Eclipse Kepler for Java EE, Oracle/Sun NetBeans 7.x, Oracle BEA WebLogic Workshop 9.x, IBM RAD 7.5.5.5, TOAD for DB2, FileZilla, WinSCP, and PuTTY for FTP/SFTP file transfer, XML Spy, JUnit for unit testing, WinMerge for file comparison, SOAP, Apache Ant for automating software build processes, Log4j for logging, Enterprise Architect, Oracle SQL Developer, TOAD, and MySQL Workbench for data modeling

Operating Systems: Windows 2000/XP/7, UNIX, Linux, and MS-DOS

Version Controls: Tortoise SVN

Software Methodologies and Others: Waterfall model, Agile Scrum, Object-Oriented Analysis & Design, UML, and design patterns

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

Hadoop Developer/Admin

Responsibilities:

  • Worked on analyzing the Hadoop cluster and different Big Data analytic tools, including Pig, Hive, HBase, and Sqoop.
  • Installed Hadoop (MapReduce, HDFS) and developed multiple MapReduce jobs in Pig and Hive for data cleaning and pre-processing.
  • Coordinated with business customers to gather business requirements, interacted with technical peers to derive technical requirements, and delivered the BRD and TDD documents.
  • Extensively involved in Design phase and delivered Design documents.
  • Involved in Testing and coordination with business in User testing.
  • Imported and exported data between relational databases and HDFS/Hive using Sqoop (see the Sqoop sketch after this list).
  • Wrote Hive jobs to parse the logs and structure them in tabular format to facilitate effective querying of the log data.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs (see the Hive sketch after this list).
  • Experienced in defining job flows.
  • Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
  • Experienced in managing and reviewing the Hadoop log files.
  • Used Pig as an ETL tool for transformations, joins, and pre-aggregations before storing the data in HDFS (see the Pig sketch after this list).
  • Loaded and transformed large sets of structured and semi-structured data.
  • Managed data coming from different sources.
  • Utilized a Cloudera distribution of Apache Hadoop and created the data model for Hive tables.
  • Involved in Unit testing and delivered Unit test plans and results documents.
  • Exported data from HDFS into an RDBMS using Sqoop for report generation and visualization.
  • Worked on the Oozie workflow engine for job scheduling (see the Oozie client sketch after this list).
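
The Sqoop import/export work above can be illustrated with a minimal sketch. It assumes Sqoop 1.x on the classpath, whose org.apache.sqoop.Sqoop.runTool entry point can be invoked from Java; the JDBC URL, credentials, table, and paths are hypothetical placeholders.

```java
import org.apache.sqoop.Sqoop;

// Minimal sketch: import a relational table into HDFS with Sqoop 1.x.
// All connection details below are hypothetical placeholders.
public class SqoopImportSketch {
    public static void main(String[] args) {
        int exitCode = Sqoop.runTool(new String[] {
            "import",
            "--connect", "jdbc:oracle:thin:@//dbhost:1521/ORCL", // hypothetical source DB
            "--username", "etl_user",
            "--password-file", "/user/etl/.sqoop.pw",            // keeps the password off the CLI
            "--table", "TRANSACTIONS",                           // hypothetical table
            "--target-dir", "/data/raw/transactions",
            "--num-mappers", "4"                                 // parallel import tasks
        });
        System.exit(exitCode);
    }
}
```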
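
The Hive table work above might look like the following sketch, which assumes HiveServer2 and its JDBC driver; the host, table, and columns are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch: create a partitioned, bucketed Hive table for parsed logs
// and run a reporting query over JDBC. Names and schema are hypothetical.
public class HiveLogTableSketch {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hiveserver:10000/default", "hive", "");
             Statement stmt = conn.createStatement()) {

            // Partition by day so queries prune to the dates they touch;
            // bucket by user_id to speed up sampling and joins.
            stmt.execute("CREATE TABLE IF NOT EXISTS web_logs"
                + " (ts STRING, user_id STRING, url STRING, status INT)"
                + " PARTITIONED BY (log_date STRING)"
                + " CLUSTERED BY (user_id) INTO 32 BUCKETS"
                + " STORED AS TEXTFILE");

            // A per-partition metric; Hive compiles this to MapReduce.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT log_date, COUNT(*) FROM web_logs"
                    + " WHERE log_date = '2015-06-01' GROUP BY log_date")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + " -> " + rs.getLong(2));
                }
            }
        }
    }
}
```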
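
The Pig ETL pass could be driven from Java through PigServer, as in this sketch; the input path, delimiter, and field names are hypothetical.

```java
import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

// Minimal sketch of a Pig ETL pass: filter bad records, then pre-aggregate
// per user before storing back to HDFS. Paths and fields are hypothetical.
public class PigEtlSketch {
    public static void main(String[] args) throws Exception {
        PigServer pig = new PigServer(ExecType.MAPREDUCE);
        pig.registerQuery("raw = LOAD '/data/raw/transactions' USING PigStorage(',')"
            + " AS (user_id:chararray, amount:double, status:chararray);");
        pig.registerQuery("good = FILTER raw BY status == 'OK';");
        pig.registerQuery("by_user = GROUP good BY user_id;");
        pig.registerQuery("totals = FOREACH by_user GENERATE group AS user_id,"
            + " SUM(good.amount) AS total;");
        pig.store("totals", "/data/clean/user_totals");  // triggers the MapReduce job
    }
}
```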
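
Oozie job scheduling can be exercised through its Java client, as sketched below; the server URL, application path, and property values are hypothetical placeholders.

```java
import java.util.Properties;
import org.apache.oozie.client.OozieClient;

// Minimal sketch: submit and start a workflow through the Oozie Java client.
// URLs, paths, and property values are hypothetical placeholders.
public class OozieSubmitSketch {
    public static void main(String[] args) throws Exception {
        OozieClient oozie = new OozieClient("http://oozie-host:11000/oozie");
        Properties conf = oozie.createConfiguration();
        conf.setProperty(OozieClient.APP_PATH,
                         "hdfs://namenode:8020/user/etl/apps/log-workflow");
        conf.setProperty("nameNode", "hdfs://namenode:8020");
        conf.setProperty("jobTracker", "jobtracker:8021");
        String jobId = oozie.run(conf);  // submit and start the workflow
        System.out.println("Workflow job submitted: " + jobId);
    }
}
```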

Environment: Hadoop, Hive, MapReduce, Pig, Sqoop.

Confidential, Atlanta, GA

Hadoop Developer

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Responsible for cluster maintenance: adding and removing cluster nodes, cluster monitoring and troubleshooting, managing and reviewing data backups, and managing and reviewing Hadoop log files.
  • Continuously monitored and managed the Hadoop cluster through Ganglia.
  • Installed the Oozie workflow engine to run multiple Hive and Pig jobs that trigger independently on time and data availability.
  • Used Spark for computations and Impala for low-latency queries on the data sets.
  • Performed major and minor upgrade for the cluster.
  • Coded MapReduce programs in Java to extract information from huge volumes of files, loaded the results into Hive, and used HQL to build reports on top of them (see the mapper sketch after this list).
  • Used Pig Latin to analyze very large-scale data.
  • Imported and exported data between HDFS and relational database systems/mainframes using Sqoop.
  • Scaled the Hadoop cluster by commissioning and decommissioning nodes.
  • Implemented the Fair Scheduler on the JobTracker to share cluster resources among users' MapReduce jobs (see the configuration sketch after this list).
  • Configured compression codecs to compress the data in the Hadoop cluster.
  • Developed MapReduce programs to perform analysis.
  • Knowledge of ETL technologies and data mining.
  • Implemented partitioning, dynamic partitions, and buckets in Hive, and wrote MapReduce programs to analyze and process the data (see the dynamic-partition sketch after this list).
  • Extensive experience in designing jobs using MapReduce features in large-scale distributed systems.
  • Implemented a custom data serializer/deserializer (SerDe) to expose the data to end users (see the SerDe sketch after this list).
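
The extraction job mentioned above might begin with a mapper along these lines; the delimiter and field position are hypothetical, and a summing reducer would aggregate the counts for the Hive-backed report.

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Minimal sketch: extract one field of interest from each input line and
// emit it with a count of 1. Delimiter and field index are hypothetical.
public class ExtractMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text outKey = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split("\\|");  // assumed pipe-delimited input
        if (fields.length > 3) {
            outKey.set(fields[3]);        // e.g. an account or product code
            context.write(outKey, ONE);   // reducer sums these for the report
        }
    }
}
```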
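
The Fair Scheduler and compression settings could be wired up roughly as follows; the property names are the MRv1-era ones and vary across Hadoop versions, so treat this as a sketch rather than a drop-in configuration.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.SnappyCodec;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Minimal sketch of scheduler and compression settings. In practice the
// scheduler is set cluster-wide in mapred-site.xml, not per job.
public class ClusterConfSketch {
    public static Job configuredJob() throws Exception {
        Configuration conf = new Configuration();
        // Fair Scheduler on the JobTracker (MRv1-era property name)
        conf.set("mapred.jobtracker.taskScheduler",
                 "org.apache.hadoop.mapred.FairScheduler");
        // Compress intermediate map output to cut shuffle traffic
        conf.setBoolean("mapreduce.map.output.compress", true);
        conf.setClass("mapreduce.map.output.compress.codec",
                      SnappyCodec.class, CompressionCodec.class);

        Job job = Job.getInstance(conf, "compressed-analysis");
        // Compress the final job output as well
        FileOutputFormat.setCompressOutput(job, true);
        FileOutputFormat.setOutputCompressorClass(job, SnappyCodec.class);
        return job;
    }
}
```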
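
Hive dynamic partitioning as described above typically needs two session settings before the insert; this JDBC sketch uses hypothetical table and column names.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Minimal sketch: enable dynamic partitioning and load a partitioned table
// from a staging table. Table and column names are hypothetical.
public class DynamicPartitionSketch {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hiveserver:10000/default", "hive", "");
             Statement stmt = conn.createStatement()) {
            stmt.execute("SET hive.exec.dynamic.partition = true");
            stmt.execute("SET hive.exec.dynamic.partition.mode = nonstrict");
            // The partition column (region) must come last in the SELECT list
            stmt.execute("INSERT OVERWRITE TABLE sales_part PARTITION (region)"
                + " SELECT order_id, amount, region FROM sales_staging");
        }
    }
}
```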
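
A custom SerDe for the read path might be shaped like the sketch below. It assumes a Hive version that provides AbstractSerDe, hard-codes two string columns instead of reading them from table properties, and leaves the write path unimplemented.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Properties;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.serde2.AbstractSerDe;
import org.apache.hadoop.hive.serde2.SerDeException;
import org.apache.hadoop.hive.serde2.SerDeStats;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;

// Read-only SerDe sketch: split each row on the first ':' into two string
// columns. A production SerDe derives its schema from table properties.
public class ColonSerDe extends AbstractSerDe {
    private ObjectInspector inspector;

    @Override
    public void initialize(Configuration conf, Properties tbl) throws SerDeException {
        List<String> names = Arrays.asList("k", "v");  // hard-coded column names
        List<ObjectInspector> ois = Arrays.asList(
            (ObjectInspector) PrimitiveObjectInspectorFactory.javaStringObjectInspector,
            PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        inspector = ObjectInspectorFactory.getStandardStructObjectInspector(names, ois);
    }

    @Override
    public Object deserialize(Writable blob) throws SerDeException {
        String[] parts = blob.toString().split(":", 2);
        return Arrays.asList(parts[0], parts.length > 1 ? parts[1] : null);
    }

    @Override
    public ObjectInspector getObjectInspector() { return inspector; }

    @Override
    public Class<? extends Writable> getSerializedClass() { return Text.class; }

    @Override
    public Writable serialize(Object obj, ObjectInspector oi) throws SerDeException {
        throw new SerDeException("write path not implemented in this sketch");
    }

    @Override
    public SerDeStats getSerDeStats() { return null; }
}
```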

Environment: MapReduce, Pig, Impala, Oozie, Sqoop, Parquet, Ant, ZooKeeper, JDBC, Log4j

Confidential

Senior Java Developer

Responsibilities:

  • Played the role of Senior Java Developer in the "Coverage Selection Tool" project.
  • Technologies involved were EJB 3.0, web services, Dojo (a UI framework), and other J2EE server components.
  • Analyzed and prepared technical specifications with UML diagrams (use case, class, and sequence diagrams).
  • Used Rational Rose to develop the components required by client.
  • Wrote complex logic for forecasting the prices of products and subparts in future quarters.
  • Developed business components applying OOAD and using design patterns such as DAO, Value Object, DTO, Factory, and Singleton.
  • Implemented a DOM parsing module and created XSD and XSLT components (see the DOM sketch after this list).
  • Used stored procedures and triggers extensively to develop backend business logic in the Oracle database.
  • Involved in performance improvement and bug fixing.
  • Analyzed old database table fields and mapped them to the new schema tables using complex SQL queries and PL/SQL procedures.
  • Developed Apache Ant scripts for deploying the application.
  • Coordinated with functional users and testing teams in testing the application in the test environment.
  • Provided production support after the application was deployed to the production server.
  • Involved in database migration testing activities.
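
The DOM parsing module could be sketched as follows; the input file and element names are hypothetical, and each product element is assumed to carry a price child.

```java
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Minimal DOM parsing sketch: load an XML document and walk its elements.
// File name, tags, and attributes are hypothetical placeholders.
public class CoverageXmlParser {
    public static void main(String[] args) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true);               // XSD-backed documents use namespaces
        DocumentBuilder builder = factory.newDocumentBuilder();
        Document doc = builder.parse("coverage.xml");  // hypothetical input file

        NodeList products = doc.getElementsByTagName("product");
        for (int i = 0; i < products.getLength(); i++) {
            Element product = (Element) products.item(i);
            String name = product.getAttribute("name");
            // Assumes every product element contains exactly one price child
            String price = product.getElementsByTagName("price")
                                  .item(0).getTextContent();
            System.out.println(name + " -> " + price);
        }
    }
}
```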

Environment: Java, JSP, Servlets, XML, JDBC, JavaScript, PL/SQL, Ant build, CSS, HTML, Eclipse IDE

Confidential

Java Developer

Responsibilities:

  • Involved in Analysis, design and coding of business modules and functionalities.
  • Analyzed project requirements and provided required technical assistance to team members.
  • Worked on creation of session beans and message-driven beans.
  • Worked on creation of various modules using EJB and JSP.
  • Implemented AJAX for server-side validations, auto-loading of data, and improved performance.
  • Implemented the presentation layer (user interface) using JSP, JavaScript, HTML, CSS, and AJAX.
  • Created complex queries and stored procedures, and worked on performance tuning of SQL queries.
  • Developed functionalities to create Data Transfer Objects (DTOs) and Data Access Objects (DAOs) (see the DAO/DTO sketch after this list).
  • Configured and deployed the application on the WebLogic application server.
  • Assisted other team members to resolve complex problems.
  • Documented detailed design and code at various stages of development.
  • Actively involved in the integration of various modules and code reviews.
  • Worked with project architect and lead designer to provide various solutions in developing functionalities.
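
The DTO/DAO pairing might be shaped like this sketch (written with modern try-with-resources syntax for brevity); the table, columns, and class names are hypothetical.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import javax.sql.DataSource;

// Minimal DTO: an immutable carrier for one row of account data.
public class AccountDto {
    private final long id;
    private final String owner;
    public AccountDto(long id, String owner) { this.id = id; this.owner = owner; }
    public long getId() { return id; }
    public String getOwner() { return owner; }
}

// Minimal DAO: hides the JDBC plumbing behind a typed finder method.
class AccountDao {
    private final DataSource dataSource;  // e.g. a pool looked up from WebLogic JNDI
    AccountDao(DataSource dataSource) { this.dataSource = dataSource; }

    AccountDto findById(long id) throws Exception {
        String sql = "SELECT id, owner FROM accounts WHERE id = ?";  // hypothetical table
        try (Connection conn = dataSource.getConnection();
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setLong(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next()
                    ? new AccountDto(rs.getLong("id"), rs.getString("owner"))
                    : null;
            }
        }
    }
}
```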

Environment: Java 1.5, JSP, Servlets, EJB, JMS, Ext JS, CSS, AJAX, jQuery, XML, XSLT, HTML, WebLogic, Oracle.
