
Senior Hadoop Developer Resume


Bellevue, WA

SUMMARY

  • 13+ years of experience designing and building distributed, scalable, and complex applications using Java, J2EE, and Big Data technologies.
  • 4 years of experience with Hadoop ecosystem components: HDFS, MapReduce, Spark, Scala, Kafka, HBase, ZooKeeper, Sqoop, Impala, Solr, Hive, Pig, Oozie, Storm, and YARN on the Cloudera and Hortonworks distributions.
  • Experienced with Spring, Struts, EJB, Hibernate, JPA, JMS, SOA, SOAP and RESTful web services, design patterns, JBoss, Apache Tomcat, Jetty, JSP, and Servlets.
  • Experience in web user interface development using JavaScript, AngularJS, HTML, CSS, Ajax, jQuery, Flex, and Bootstrap.
  • Experienced with Jenkins, Hudson, SVN, Git, CVS, JIRA, Bugzilla, FogBugz, Python, Maven, and Ant.
  • Experience in Test-Driven Development using JUnit, PigUnit, and HtmlUnit as an acceptance test framework.
  • Experience working with databases such as MySQL, Oracle, RainStor, and Teradata.
  • Good experience with SDLC models including Scrum and Waterfall.
  • Experience in configuration, deployment, and management of enterprise applications on Jetty, Tomcat, JBoss, WebLogic, and WebSphere application servers in clustered environments, with MySQL replication and DRBD.
  • Experienced with XML-related technologies such as XSLT, XSD, DOM, and SAX.
  • Hands-on experience with IDEs such as Eclipse and JBuilder.
  • Worked extensively in Linux environments, including writing shell scripts.
  • Well versed in the Test-Driven Development (TDD) approach.
  • Excellent debugging and troubleshooting skills.
  • Experience in all phases of the Software Development Life Cycle, from analysis to production support.
  • Led a team of 7 and was involved in recruiting team members.

TECHNICAL SKILLS

Hadoop Ecosystem: HDFS, MapReduce, Spark, HBase, Hive, Pig, Sqoop, Kafka, Impala, Solr, ZooKeeper, Storm, Scala, Phoenix, and YARN

Languages: Java, XML, UML, PL/SQL, Ajax, and Python.

Technologies: J2EE, EJB 3.0, JDBC, SAX/DOM.

Web services: SOAP and RESTful

Frameworks: Spring, Struts

ORM Tools: Hibernate, JPA.

Test Frameworks: JUnit, PigUnit, HtmlUnit, and EasyMock

Software Tools & Utilities: Jenkins, Git, SVN, VSS, Microsoft Visio, TOAD, SQL Developer, and Eclipse.

Web Servers/App. Servers: WebLogic, WebSphere, Apache Tomcat, JBoss 4.x, and Jetty.

Web UI: JSP, JavaScript, AngularJS, jQuery, Ajax, Flex, Bootstrap, HTML, and CSS

Databases: Oracle, MySQL, Teradata, and RainStor

NoSQL Databases: HBase and MongoDB

Operating Systems: Linux and Windows.

Methodologies: Waterfall and Agile

PROFESSIONAL EXPERIENCE

Confidential, Bellevue, WA

Senior Hadoop Developer

Responsibilities:

  • Understand the business requirements and interact with Business Analysts for requirement analysis.
  • Work closely with Project Managers, data modeling teams, Business Analysts, and testing teams.
  • Translate the customer's requirements into formal design documents.
  • Involved in database modeling.
  • Work on Pig, Hive, and Hive UDFs in Java to process event data (XML, text, and CSV formats) and store it in HBase/Hive tables in ORC format.
  • Use the OSS Nokalva compiler for spec validation and decoding of .ber files.
  • Sqoop data from Hive tables to Teradata for use by other teams for analytics.
  • Develop the application in the Linux development environment, validate it in the Linux test environment, and move it to production.
  • Involved in post-production validation and support for production issues.
  • Document the changes made during each release.
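The Hive-to-Teradata hand-off above could be sketched roughly as follows, using Sqoop's HCatalog integration to read the ORC-backed Hive table; every name here (host, database, tables, credentials path) is an illustrative placeholder, not taken from the actual project:

```shell
# Export an ORC-backed Hive table to a Teradata table via Sqoop.
# All identifiers below are hypothetical examples.
sqoop export \
  --connect jdbc:teradata://td-host/DATABASE=analytics \
  --username etl_user \
  --password-file /user/etl/.td.password \
  --table EVENT_SUMMARY \
  --hcatalog-database default \
  --hcatalog-table event_summary_orc \
  --num-mappers 8
```

Exporting through HCatalog lets Sqoop read the ORC storage format directly instead of requiring delimited text in an `--export-dir`.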

Environment: Hortonworks 2.4.4, Storm, OSS Nokalva (spec validation), Pig, Hive, Hive UDFs, HBase, Phoenix, MapReduce, Sqoop, ZooKeeper, Java, shell scripting, Git, Jenkins, JIRA, Maven, and Eclipse.

Confidential, Richardson, TX

Lead Hadoop Consultant

Responsibilities:

  • Understand the business requirements; involved in the design and development of the project.
  • Work on the implementation on a Hortonworks 2.4.2 cluster.
  • Work closely with Product Managers, data modeling teams, Business Analysts, and testing teams.
  • Involved in low-level and high-level design.
  • Processed data using Spark with Scala.
  • Implemented Spark jobs using Scala and Spark SQL for faster testing and processing of data.
  • Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs and Scala.
  • Tuned Spark applications: setting the right batch interval, caching, the correct level of parallelism, and memory settings.
  • Work on Pig scripts and Pig UDFs in Java to process event data (XML, text, and CSV formats) and store it in HBase/Hive tables in ORC, Sequence, and XML formats.
  • Wrote shell scripts to run the jobs and scheduled them with the ASG-Zena scheduler; configured and created both event-based and time-based jobs.
  • Sqoop data from Hive tables to Teradata for use by different teams on different projects.
  • Verify and analyze the Sqooped data in Teradata tables using SQL Assistant; validate the data in the partitions.
  • Load data from Teradata to HDFS, then from HDFS into Hive tables.
  • Participate in daily Scrum and Sprint Planning/Retrospective meetings.
  • Develop the application in the Linux development environment, validate it in the Linux test environment, and move it to production.
  • Involved in post-production validation and support for production issues.
  • Document the changes made during each release.
  • Took ownership of the complete process end to end, integrating with multiple teams.
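A spark-submit invocation showing the kinds of tuning knobs mentioned above (parallelism, executor memory, cores) might look like this; the class name, jar, and all values are placeholders rather than the project's actual settings:

```shell
# Illustrative spark-submit on YARN; every value below is a
# hypothetical example, to be sized against the real workload.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 20 \
  --executor-cores 4 \
  --executor-memory 8g \
  --conf spark.default.parallelism=160 \
  --conf spark.sql.shuffle.partitions=160 \
  --class com.example.events.EventPipeline \
  event-pipeline.jar
```

A common starting point is setting parallelism to a small multiple of the total executor cores (here 20 × 4 = 80 cores, so 160 partitions), then adjusting from observed stage times.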

Environment: Hortonworks 2.4.2, Spark, Scala, Pig, Hive, HBase, MapReduce, Sqoop, ZooKeeper, Oozie, ASG-Zena, shell scripting, and Python.

Confidential

Principal Software Developer / Senior Hadoop Consultant

Responsibilities:

  • Involved in the design and development of the product, working extensively on the design and feasibility of required features against the product's existing design.
  • Work on the implementation and maintenance of a Cloudera Hadoop cluster.
  • Assisted in upgrading, configuring, and maintaining Hadoop ecosystem components such as Pig, Hive, HBase, Kafka, and ZooKeeper.
  • Developed and executed Pig Latin scripts and Pig UDFs.
  • Used Hadoop FS shell scripts for HDFS (Hadoop Distributed File System) data loading and manipulation.
  • Analyzed business requirements and cross-verified them against the functionality and features of NoSQL databases such as HBase; developed various counters to store aggregated values in HBase and MySQL tables.
  • Work on implementation updates, adopting new technology and optimizations to assure quality.
  • Work closely with development teams.
  • Ensure that approved changes are integrated into the existing code.
  • Participate in daily Scrum and Scrum of Scrums meetings with the customer.
  • Follow the TDD approach, writing JUnit, PigUnit, and acceptance test cases for functionality using the extractor framework.
  • Developed RESTful web services to get and post data to other applications.
  • Review and refactor code to meet code-quality standards and improve performance; implement review comments on my own code.
  • Used Jenkins with JUnit, Cobertura, FindBugs, and PMD for continuous integration and code-quality metrics.
  • Document the changes made during each release.
  • Interview new team members and train them on the products and platform.
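The HDFS loading and manipulation referred to above typically uses the Hadoop FS shell; a minimal sketch with placeholder paths (none of these directories come from the original project):

```shell
# Stage a local export file into HDFS, then move it once processed.
# Paths are hypothetical examples.
hdfs dfs -mkdir -p /data/events/incoming /data/events/processed
hdfs dfs -put /local/exports/events.csv /data/events/incoming/
hdfs dfs -ls /data/events/incoming
hdfs dfs -mv /data/events/incoming/events.csv /data/events/processed/
```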

Environment: Java, J2EE, Hadoop, Hive, HBase, Pig, ZooKeeper, Kafka, CDH 5.0, Spring MVC, Spring AOP, Hibernate, Spring RESTful web services, MySQL, Ajax, jQuery, Python, Jetty, SVN, Jenkins, and JIRA.

Confidential

Tech Lead

Responsibilities:

  • Involved in understanding and analyzing the requirements.
  • Followed the TDD approach while developing the application.
  • Executed unit test cases; monitored and fixed code based on the development dashboards.
  • Worked closely with development leads to implement code review suggestions.
  • Analyzed and understood the requirements and the existing system to develop a strategic change plan.
  • Wrote JUnit, EasyMock, and HtmlUnit tests for the acceptance framework.
  • Ensured that approved changes were integrated into the existing code.
  • Developed various counters to store aggregated values in HBase and MySQL tables.
  • Wrote Spring beans; created SQL tables and statements.
  • Developed controllers, model objects, and service classes.
  • Developed RESTful web services to get and post data to other applications.
  • Reviewed and refactored code to meet code-quality standards and improve performance; implemented review comments on my own code.
  • Used Jenkins with JUnit, Cobertura, FindBugs, and PMD for continuous integration and code-quality metrics.
  • Documented the changes made during each release.
  • Deployed the application in a clustered environment.

Environment: Java, J2EE, Spring MVC, Spring AOP, RESTful web services, Hibernate, Spring Web Services, Struts, JSP, JavaScript, Ajax, jQuery, Flex, MySQL, Jetty, JUnit, HtmlUnit, EasyMock, Ant, SVN, and Jenkins.

Consultant

Confidential

Responsibilities:

  • Involved in analysis, design, and development activities.
  • Played a vital role in analyzing the business requirements and transforming them into design and development.
  • Followed test-driven development.
  • Involved in code reviews.
  • Analyzed the specifications provided by the clients.
  • Active participant in gathering requirements from the onsite coordinator.
  • Participated in designing the application.
  • Coded in Java, Spring MVC, and Hibernate.
  • Led a team of 2 members.
  • Performed functional and integration testing and bug fixing.

Environment: Java, J2EE, Spring, Spring MVC, XML (using Apache Axis), XML Schemas, Xfactor Framework, web services, JUnit, CVS, Oracle 9i, JBoss 4.0.2, Windows NT, and Eclipse.

Module Leader

Confidential, USA

Responsibilities:

  • Whenever a new report request comes in, study the requirements and prepare the Software Requirements Specification.
  • Configure the report fields with appropriate validations.
  • Integrate all the supporting classes developed by the team.
  • Deploy the report on a remote server.
  • Defect tracking, analysis, and resolution during integration testing, system testing, and UAT.

Environment: Java, JDBC 2.0, XML (using the Castor API), JSP 1.1, Struts 1.1, JavaScript, Servlets, web services, Rational ClearCase, Composite Studio, Sybase, DB2, WebLogic 8.1, Windows NT, and Eclipse.

Software Engineer

Confidential

Responsibilities:

  • Developed the application using Java, Servlets, JSP, EJB 2.0, Oracle 8i, and SQL Server 2000.
  • Implemented J2EE design patterns.
  • Developed a module for common authentication across all the products using JAAS, Document Attach, and Diary Control.
  • Created session beans (for business logic).
  • Unit and integration testing of the system.
  • Database and functionality testing using DbUnit, plus regression testing.

Environment: Java, JSP, EJB, JDBC, Struts, design patterns, Ant, DbUnit, TestMaker, SQL Server, Oracle, Windows 2000, Eclipse, and JBuilder
