
Hadoop DevOps Resume

O'Fallon, MO

SUMMARY

  • Six years of IT experience, including 3+ years in the Hadoop ecosystem.
  • Hands-on experience working with HDFS, MapReduce, Hive, HBase, Pig, Sqoop, Flume, Oozie, YARN, Impala, Spark, Cloudera and Hadoop cluster administration.
  • Experience in Apache Hadoop installation and in setting up clusters using tarballs and RPMs.
  • Experience in performance tuning Hadoop clusters by gathering and analyzing the existing infrastructure and the workloads running on the cluster.
  • Experience in automating Hadoop installation and configuration and maintaining the cluster using tools such as Cloudera Manager, Whirr, Ambari, Puppet and Chef.
  • Hands-on experience deploying secure clusters that are ready to scale for increasing workloads.
  • Hands-on experience deploying applications on AWS EC2.
  • Installed, configured and managed LDAP servers acting as the authentication servers for the UNIX environment.
  • Experienced in setting up automated 24x7x365 monitoring and escalation infrastructure for Hadoop clusters using Nagios and Ganglia.
  • Excellent understanding of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, DataNode and the MapReduce programming paradigm.
  • Experience with unit testing using MRUnit and the local job runner against small amounts of known data (see the MRUnit sketch following this summary).
  • Experience in managing and reviewing Hadoop log files.
  • Experience in analyzing data using HiveQL, Pig Latin, HBase and custom MapReduce programs in Java.
  • Extended Hive and Pig core functionality by writing custom UDFs.
  • 4+ years of work experience as a Java/J2EE programmer developing applications using Servlets, JSP, JSTL, RMI, EJB, Struts, Spring, JSF, JavaBeans, JDBC, JMS, Hibernate and MVC architecture.
  • Experience in client-side design and validation using HTML, DHTML, CSS, JavaScript, AJAX, JSP and Swing.
  • Extensive knowledge of J2EE architecture, Patterns, Design and development.
  • Experience in core Java, including multithreading, JDBC, RMI and network programming.
  • Experienced in the functional usage and deployment of applications on JBoss, WebLogic, WebSphere and Apache Tomcat servers.
  • Working knowledge of database connectivity (JDBC) for databases such as Oracle, DB2, SQL Server, MySQL, MS Access and PostgreSQL.
  • Good knowledge of EJB session beans with JNDI-mapped naming and JMS message-driven beans.
  • Extensive experience with Eclipse, plus hands-on experience with the NetBeans and IntelliJ IDEs.
  • Proficient in using XML Suite of Technologies (XML, XSL, XSLT, DTD, XML Schema, SAX, DOM).
  • Good knowledge of Log4j for error logging.
  • Excellent analytical and problem solving capabilities.
  • Good elicitation and communication skills.
  • Quick learner with the ability to adapt to new environments and to get acquainted with new technologies quickly.
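
Below is a minimal MRUnit sketch of the style of mapper test described above. The LogLevelMapper class and its behavior (emitting a count of 1 per log level) are hypothetical placeholders for illustration, not code from the projects listed here:

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mrunit.mapreduce.MapDriver;
    import org.junit.Test;

    public class LogLevelMapperTest {

        // Hypothetical mapper under test: emits (log level, 1) per line.
        public static class LogLevelMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                context.write(new Text(value.toString().split(" ")[0]), ONE);
            }
        }

        @Test
        public void emitsOneCountPerLogLevel() throws IOException {
            // MapDriver runs the mapper against in-memory records, so no
            // cluster or local job runner is needed for the test.
            MapDriver.newMapDriver(new LogLevelMapper())
                     .withInput(new LongWritable(0), new Text("ERROR disk full"))
                     .withOutput(new Text("ERROR"), new IntWritable(1))
                     .runTest();
        }
    }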

TECHNICAL SKILLS

Hadoop and Ecosystem Projects: HDFS, MapReduce, MR1 (classic), MR2 (YARN), MRUnit, Pig, Hive, HBase, Oozie, Flume, Sqoop, Spark, Impala, ZooKeeper.

Operating Systems: Unix, Red Hat Linux, CentOS

Scripting and web development: Perl, PHP, Python, jQuery and JavaScript.

Java Technologies: Spring, Struts, Hibernate, JSP, Servlets, JDBC, JNDI, JMS, MQSeries, RMI, Ant, Log4j, JUnit.

Cloud Computing: Amazon AWS (EC2, RDS)

XML related Technologies: XML, XSL, XSLT, DTD, XML Schema, SAX, DOM.

Databases: Oracle, MS SQL Server, MySQL, DB2, Microsoft Access, PostgreSQL

Application servers: WebLogic, JBoss, Tomcat

Other Tools/Frameworks: Eclipse, Ant, Maven, NetBeans, Jenkins

PROFESSIONAL EXPERIENCE

Confidential, O’Fallon, MO

Hadoop DevOps

Environment: AWS EC2, MapReduce, Pig, Hive, Sqoop, Flume, HBase, SOAP Web Services, WSDL, JDK 1.6, Maven, Subversion, XML, JSP, HTML, Linux, AJAX, jQuery, JavaScript, CSS, Unix shell scripting and Agile methodology.

Responsibilities:

  • Developed various Big Data workflows using custom MapReduce, Pig, Hive and Sqoop.
  • Collected data from distributed sources into Avro models, applied transformations and standardizations, and loaded it into Hive for further data processing.
  • Devised schemes to collect and stage large data sets in HDFS, and compressed the data using various formats to achieve optimal storage capacity.
  • Built various big data workflows and scheduled them using Oozie.
  • Deployed an on-premises cluster and tuned it for optimal performance against the job execution needs of large data sets.
  • Built reusable Hive UDF libraries that enabled business analysts to use these UDFs in their Hive queries (a sample UDF sketch follows this list).
  • Used Flume to dump application server logs into HDFS.
  • Analyzed the logs stored on HDFS and imported the cleaned data into the Hive warehouse, enabling end business analysts to write Hive queries.
  • Configured various big data workflows to run on top of Hadoop using Spring Batch; these workflows comprise heterogeneous jobs such as Pig, Hive, Sqoop and MapReduce.
  • Benchmarked cluster performance using techniques such as TeraGen, TeraSort and TeraValidate.
  • Experience working with the NoSQL database HBase for real-time data analytics.
  • Developed a suite of unit test cases for Mapper, Reducer and Driver classes using the MRUnit testing library.
  • Developed SOAP based web services to communicate with the client applications API.
  • Used Maven extensively for building jar files of MapReduce programs and deployed to Cluster.
  • Developed custom Writable Java programs to load data into HBase, using the Apache Crunch Java API to build data pipelines.
  • Designed and Developed Java Portlets rendering Analytical Information.
  • Assigned the tasks of resolving defects found in testing the new application and existing applications.
  • Fixed bugs and provided 24x7 production support for running big data jobs.
  • Designed and coded application components in an Agile environment using a test-driven development approach.
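
The reusable Hive UDFs mentioned above followed the classic UDF pattern. A minimal sketch, assuming a simple normalization rule (trim and uppercase); the NormalizeField class name and its behavior are illustrative assumptions, not actual project code:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public final class NormalizeField extends UDF {
        private final Text result = new Text();

        // Hive resolves evaluate() by reflection; returning null for null
        // input keeps the UDF safe inside outer joins.
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            result.set(input.toString().trim().toUpperCase());
            return result;
        }
    }

Analysts would then register the packaged JAR in their session and expose the function with a statement along the lines of CREATE TEMPORARY FUNCTION normalize_field AS 'NormalizeField'.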

Confidential, Dallas, TX

Hadoop DevOps

Environment: Hadoop, MapReduce, HDFS, Pig, Hive, HBase, ZooKeeper, Oozie, Java (JDK 1.7), Oracle 11g/10g, PL/SQL, SQL*Plus, UNIX Shell Scripting.

Responsibilities:

  • Responsible for architecting Hadoop clusters with CDH3.
  • Installed and configured Cloudera's Distribution of Hadoop (CDH) 2 and 3, including the NameNode, Secondary NameNode, JobTracker, TaskTrackers and DataNodes.
  • Strong experience in installing and configuring Hadoop ecosystem components such as HBase, Flume, Pig and Sqoop.
  • Expertise in adding and removing nodes without any effect on running MapReduce jobs and data.
  • Managed and reviewed Hadoop log files.
  • Loaded log data into HDFS using Flume; worked extensively on creating MapReduce jobs to power data for search and aggregation.
  • Worked extensively with Sqoop to import data from Oracle.
  • Designed a data warehouse using Hive.
  • Created partitioned tables in Hive.
  • Mentored the analyst and test teams in writing Hive queries.
  • Extensively used Pig for data cleansing.
  • Developed Pig Latin scripts to extract data from web server output files and load it into HDFS.
  • Developed Pig UDFs to pre-process the data for analysis (see the sketch after this list).
  • Developed Oozie workflows to automate loading data into HDFS and pre-processing it with Pig.
  • Provided cluster coordination services through ZooKeeper.
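
Pig UDFs of the kind referenced above extend EvalFunc. A minimal sketch, assuming a simple cleansing rule (stripping non-alphanumeric characters); the CleanseField name and the rule itself are illustrative assumptions:

    import java.io.IOException;

    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    public class CleanseField extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            // Pig passes each record's arguments as a Tuple; guard empty input.
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;
            }
            return input.get(0).toString().replaceAll("[^A-Za-z0-9 ]", "");
        }
    }

In a Pig Latin script, the packaged JAR would be registered with REGISTER and the function invoked inside a FOREACH ... GENERATE statement.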

Confidential

Java/J2EE Developer

Environment: JDK 1.6, Java, Java EE 5.0, JSF 2.0, JSP, RichFaces, Spring IoC, Spring Security, WebLogic 10.3.5, Maven, CVS, PuTTY, JUnit, JavaScript, jQuery, CSS3, JAXP, JAXB, DB2 database, Web Services, JAX-WS, SOAP, WebLogic Server, Windows, Linux.

Responsibilities:

  • Involved in analyzing requirements according to the project goals and specifications with business analysts and the team.
  • Designed and developed front-end screens using JSF 2.0, RichFaces, JSP, CSS, JavaScript and jQuery.
  • Wrote custom classes in JSF as per the GUI requirements.
  • Worked on the backend server-side logic to process requests and render responses in table and tree format on the search result screens.
  • Used Spring IoC for dependency injection and Spring Security for securing the web application.
  • Applied SOA principles based on business requirements and performed integration and data transfer through an Enterprise Service Bus (ESB).
  • Created SOAP-based web services, along with WSDL creation, using JAX-WS, and used JAXB binding for XML processing (a minimal endpoint sketch follows this list).
  • Exposed Java code as a web service using Apache CXF.
  • Implemented Hibernate relational mapping using XML configuration and developed optimized HQL/SQL queries and criteria queries.
  • Performed SQL query tuning, using JMeter to analyze and improve query retrieval times, handle multiple transactions and improve performance for user search criteria.
  • Used Maven for building the project and managing dependency JARs, and used a Tomcat server while developing the code locally.
  • Wrote JUnit test cases for unit testing the modules I worked on.
  • Used JIRA as a bug tracking and team activity tool throughout the project.
  • Used SOAPUI for web service testing and WebLogic for deploying the developed web application.
  • Used the EMMA plugin in Eclipse to measure code coverage during JUnit testing.
  • Involved in defect fixing during Integration testing, UAT and post production.
  • Used CVS as the version control system for maintaining the common code base.
  • Worked on Phase 1 production support in parallel with Phase 2 development.
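
A minimal JAX-WS sketch of the kind of SOAP service described above; the SearchService name and its lookup operation are hypothetical placeholders, and publishing via Endpoint stands in for a full WebLogic deployment:

    import javax.jws.WebMethod;
    import javax.jws.WebService;
    import javax.xml.ws.Endpoint;

    @WebService
    public class SearchService {

        @WebMethod
        public String lookup(String key) {
            // Real logic would query the backend; echoed here for brevity.
            return "result-for-" + key;
        }

        public static void main(String[] args) {
            // Endpoint.publish starts a lightweight server and exposes the
            // generated WSDL at the endpoint URL plus ?wsdl, which is handy
            // for quick SOAPUI testing.
            Endpoint.publish("http://localhost:8080/search", new SearchService());
        }
    }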

Confidential, Northbrook, IL

Java/J2EE Consultant

Environment: Java/J2EE, Oracle 10g, SQL, PL/SQL, JSP, EJB, Struts, Hibernate, WebLogic 8.0, HTML, AJAX, Java Script, JDBC, XML, JMS, XSLT, UML, JUnit, log4j, MyEclipse 6.0

Responsibilities:

  • Involved in various phases of Software Development Life Cycle.
  • Created UML diagrams (class and sequence) during the design phase using Visio.
  • Used MyEclipse 6.0 as the IDE for application development.
  • Validated all forms using the Struts validation framework and implemented the Tiles framework in the presentation layer.
  • Configured the Struts framework to implement the MVC design pattern.
  • Designed and developed the GUI using JSP, HTML, DHTML and CSS.
  • Worked with JMS for the messaging interface.
  • Used Hibernate for handling database transactions and persisting objects (see the sketch after this list).
  • Deployed the entire project on the WebLogic application server.
  • Used AJAX for interactive user operations and client-side validations.
  • Used XML for the ORM mapping relations between the Java classes and the database.
  • Used XSL transforms on certain XML data.
  • Developed ANT scripts for compilation and deployment.
  • Performed unit testing using JUnit.
  • Extensively used log4j for logging.
  • Used Subversion as the version control system.
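
A minimal sketch of the Hibernate transaction handling described above; the OrderDao class and the Order entity (assumed to be mapped in an XML mapping file) are illustrative assumptions:

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.Transaction;
    import org.hibernate.cfg.Configuration;

    public class OrderDao {
        // Reads hibernate.cfg.xml from the classpath to build the factory.
        private static final SessionFactory FACTORY =
                new Configuration().configure().buildSessionFactory();

        public void save(Order order) {
            Session session = FACTORY.openSession();
            Transaction tx = session.beginTransaction();
            try {
                session.save(order);  // persists the XML-mapped object
                tx.commit();          // flushes and commits the unit of work
            } catch (RuntimeException e) {
                tx.rollback();        // keep the database consistent on failure
                throw e;
            } finally {
                session.close();
            }
        }
    }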

Confidential

Java/J2EE Developer

Environment: Servlets, JSP, EJB, Struts, Hibernate, LDAP, JNDI, HTML, XML, DOM, SAX, Ant, WebLogic Server, Oracle 9i

Responsibilities:

  • Played a key role in the design for the implementation of this application.
  • Prepared documentation for the high-level design, the low-level design and the process flow of control for the entire application.
  • Designed the web application implementing the Struts framework's Model-View-Controller (MVC) pattern to make it extensible and flexible.
  • Implemented the consolidated application's front-end pages using JSPs, JSTL and Struts tag libraries.
  • Used the Spring Framework for dependency injection and integrated it with the Struts framework and Hibernate.
  • Configured connection caches for JDBC connections.
  • Used JavaScript extensively to create global templates that could be used across the JSP pages.
  • Configured loggers, appenders and layouts using log4j (a configuration sketch follows this list).
  • Used Hibernate for object-relational mapping.
  • Used Ant for building JARs and WARs.
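
A short sketch of the programmatic log4j 1.x logger/appender/layout configuration described above; in practice this is often done in a properties file instead, and the log file path here is an illustrative assumption:

    import org.apache.log4j.FileAppender;
    import org.apache.log4j.Level;
    import org.apache.log4j.Logger;
    import org.apache.log4j.PatternLayout;

    public class LoggingSetup {
        public static void main(String[] args) throws Exception {
            // The layout controls the line format; this pattern prints
            // timestamp, level, logger name and message.
            PatternLayout layout = new PatternLayout("%d{ISO8601} %-5p [%c] %m%n");
            // The appender routes events to a destination, here an appending file.
            FileAppender appender = new FileAppender(layout, "logs/app.log", true);

            Logger root = Logger.getRootLogger();
            root.addAppender(appender);  // all loggers inherit the root appender
            root.setLevel(Level.INFO);   // suppress DEBUG noise in production
            root.info("log4j configured programmatically");
        }
    }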
