Hadoop Admin Resume
Dallas, TX
SUMMARY
- Overall 8 years of experience at Mphasis, focused on Big Data technologies and Hadoop administration.
- 3 Years of extensive experience as Hadoop Administrator with strong expertise in setting up Hadoop/HBase clusters.
- Excellent understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode and the MapReduce programming paradigm. Strong understanding of NoSQL databases (HBase, Cassandra).
- Experience in installation, configuration, support and management of Hadoop clusters using the Apache, Hortonworks and Cloudera (CDH3, CDH4) distributions, and on Amazon Web Services (AWS).
- Hands-on experience with major components of the Hadoop ecosystem, including Hive, HBase, HBase-Hive integration, Pig, Sqoop and Flume, plus knowledge of the MapReduce/HDFS framework.
- Set up standards and processes for Hadoop based application design and implementation.
- Worked on NoSQL databases including HBase, Cassandra and MongoDB.
- Good experience in analysis using Pig and Hive, and an understanding of Sqoop and Puppet.
- Expertise in database performance tuning & data modeling.
- Developed automated Unix shell scripts for performing RUNSTATS, REORG, REBIND, COPY, LOAD, BACKUP, IMPORT, EXPORT and other database maintenance activities.
- Experienced in developing MapReduce programs using Apache Hadoop for working with Big Data.
- Good understanding of XML methodologies (XML, XSL, XSD) including Web Services and SOAP.
- Expertise in working with different databases such as Oracle, MS SQL Server, PostgreSQL and MS Access 2000, along with exposure to Hibernate for mapping an object-oriented domain model to a traditional relational database.
- Extensive experience in data analysis using tools like Syncsort and HZ along with Shell Scripting and UNIX.
- Involved in log file management: logs older than 7 days were removed from the log folder, loaded into HDFS and retained there for 3 months.
- Expertise in development support activities including installation, configuration and successful deployment of changes across all environments
- Familiarity and experience with data warehousing and ETL tools.
- Good working Knowledge in OOA & OOD using UML and designing use cases.
- Good understanding of Scrum methodologies, Test Driven Development and continuous integration.
- Experience in production support and application support, including bug fixing.
- Used HP Quality Center for logging test cases and defects.
- Major strengths: familiarity with multiple software systems; the ability to learn new technologies quickly and adapt to new environments; and being a self-motivated, focused team player with excellent interpersonal, technical and communication skills.
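The log-file management bullet above can be sketched as a small Unix shell script. The directory paths, the `*.log` pattern and the HDFS target below are illustrative assumptions, not details from the original cluster, and the `hadoop` CLI is stubbed so the sketch runs on a machine without a Hadoop client:

```shell
#!/usr/bin/env bash
# Sketch of the log-management policy described above: local logs older than
# 7 days are copied into HDFS (where they are retained for 3 months) and
# removed from the local log folder. All paths are hypothetical.

LOG_DIR="${LOG_DIR:-/var/log/myapp}"          # assumed local log folder
HDFS_ARCHIVE="${HDFS_ARCHIVE:-/archive/logs}" # assumed HDFS directory

# Stub so the sketch can run without a Hadoop client installed.
if ! command -v hadoop >/dev/null 2>&1; then
  hadoop() { echo "hadoop $*"; }
fi

archive_old_logs() {
  # -mtime +7 matches files last modified more than 7 days ago.
  find "$LOG_DIR" -type f -name '*.log' -mtime +7 2>/dev/null |
  while read -r f; do
    # Copy into HDFS, then delete locally only if the copy succeeded.
    hadoop fs -put "$f" "$HDFS_ARCHIVE/" && rm -f "$f"
  done
}

archive_old_logs
```

In production this would typically run from cron, with the 3-month HDFS retention handled by a separate cleanup job.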
PROFESSIONAL EXPERIENCE
Hadoop Admin
Confidential - Dallas, TX
Responsibilities:
- Solid Understanding of Hadoop HDFS, Map-Reduce and other Eco-System Projects
- Installation and Configuration of Hadoop Cluster
- Working with the Cloudera support team to fine-tune the cluster
- Worked with a plugin that lets Hadoop MapReduce programs, HBase, Pig and Hive run unmodified and access files directly; the plugin also provided data locality for Hadoop across host nodes and virtual machines.
- Developed MapReduce jobs to analyze data and produce heuristic reports
- Adding, Decommissioning and rebalancing nodes
- Created POC to store Server Log data into Cassandra to identify System Alert Metrics
- Rack Aware Configuration
- Configuring Client Machines
- Configuring, Monitoring and Management Tools
- HDFS Support and Maintenance
- Cluster HA Setup
- Applying patches and performing version upgrades
- Incident Management, Problem Management and Change Management
- Performance Management and Reporting
- Recover from Name Node failures
- Scheduling MapReduce jobs with the FIFO and Fair schedulers
- Installation and Configuration of other Open Source Software like Pig, Hive, HBASE, Flume and Sqoop
- Integration with RDBMSs using Sqoop and JDBC connectors
- Working with the development team to tune jobs
- Knowledge of writing Hive jobs
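The add/decommission/rebalance bullets above usually come down to a short admin command sequence. A minimal sketch follows; the hostname and excludes-file path are assumptions, and `hadoop` is stubbed so the commands can be shown without a live cluster:

```shell
#!/usr/bin/env bash
# Sketch of decommissioning a DataNode and rebalancing the cluster.
# Hostname and excludes-file path are illustrative assumptions.

if ! command -v hadoop >/dev/null 2>&1; then
  hadoop() { echo "hadoop $*"; }   # stub for machines without Hadoop
fi

EXCLUDES_FILE="${EXCLUDES_FILE:-/tmp/dfs.exclude}"  # file named by dfs.hosts.exclude

# 1. List the node to retire in the excludes file read by the NameNode.
echo "datanode07.example.com" >> "$EXCLUDES_FILE"

# 2. Tell the NameNode to re-read its include/exclude lists; the node's
#    blocks are re-replicated before it is marked Decommissioned.
hadoop dfsadmin -refreshNodes

# 3. After adding new nodes, spread existing blocks evenly; -threshold is
#    the allowed deviation (in percent) from average disk utilisation.
hadoop balancer -threshold 10
```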
Environment: RHEL, Puppet, CDH3 distribution, Tableau, Datameer, HBase, CDH Manager, YARN, Hive, Flume.
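The RDBMS integration mentioned in this role is typically realised with a Sqoop import of the following shape. The JDBC URL, credentials file, table and target directory are illustrative assumptions, and `sqoop` is stubbed so the sketch runs without a cluster:

```shell
#!/usr/bin/env bash
# Illustrative Sqoop import from an RDBMS into HDFS; all names are assumed.

if ! command -v sqoop >/dev/null 2>&1; then
  sqoop() { echo "sqoop $*"; }   # stub for machines without Sqoop installed
fi

sqoop import \
  --connect "jdbc:mysql://dbhost.example.com:3306/sales" \
  --username etl_user \
  --password-file /user/etl/.dbpass \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4
```

The `--num-mappers` setting controls how many parallel map tasks split the table; Sqoop splits on the table's primary key by default.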
Database Admin
Confidential
Responsibilities:
- Created the Database, User, Environment, Activity, and Class diagram for the project (UML).
- Implemented the database using the Oracle database engine.
- Designed and developed a fully functional, generic n-tiered J2EE application platform in an Oracle-technology-driven environment. The entire infrastructure application was developed using Oracle JDeveloper in conjunction with Oracle ADF-BC and Oracle ADF RichFaces.
- Created entity objects (business rules and policies, validation logic, default-value logic, security).
- Created View objects, View Links, Association Objects, Application modules with data validation rules (Exposing Linked Views in an Application Module), LOV, dropdown, value defaulting, transaction management features.
- Web application development using J2EE: JSP, Servlets, JDBC, Java Beans, Struts, Ajax, JSF, JSTL, Custom Tags, EJB, JNDI, Hibernate, ANT, JUnit and Apache Log4J, Web Services, Message Queue (MQ).
- Designed GUI prototypes using ADF 11g GUI components before finalizing them for development.
- Experience using Version controls such as CVS, PVCS, and Rational Clear Case.
- Created modules using bounded and unbounded task flows
- Generated WSDL (web services) and created workflows using BPEL
- Handled AJAX functionality (partialTriggers, partialSubmit, autoSubmit)
- Extensively used XML documents with XSLT and CSS to translate content into HTML for presentation in the GUI
- Developed Enterprise JavaBeans: entity beans, session beans (both stateless and stateful) and message-driven beans.
- Developed, Tested and Debugged the Java, JSP and EJB components using Eclipse.
- Used Struts Framework to implement J2EE design patterns (MVC).
Environment: Java, J2EE, EJB 2.1, JSP 2.0, Servlets 2.4, JNDI 1.2, JavaMail 1.2, JDBC 3.0, Struts, HTML, XML, CORBA, XSLT, JavaScript, Eclipse 3.2, Oracle 10g, WebLogic 8.1, Windows 2003.
Java Developer
Confidential
Responsibilities:
- Involved in creating use case, class, sequence, package dependency diagrams using UML.
- Also involved in analysis and requirements gathering phase.
- Developed server-side code using Servlets and JSPs running on Apache Tomcat 3.0, and Enterprise Beans running on IBM WebSphere Application Server.
- Developed web pages using HTML, JSP, DHTML and CSS.
- Used JavaScript for certain form validations, submissions and other client side operations.
- Created Stateless Session Beans to communicate with the client.
- Created the database tables in Oracle 7i; created the required SQL queries and used JDBC to perform database operations.
- Developed a prototype of the application and demonstrated to business users to verify the application functionality.
- Design, develop and implement MVC Pattern based Keyword Driven automation testing framework utilizing Java, JUnit and Selenium WebDriver.
- Used automated scripts and performed functionality testing during the various phases of the application development using Selenium.
- Prepared user documentation with screenshots for UAT (User Acceptance testing).
Environment: Java, JavaScript, HTML, CSS, XPath, Selenium WebDriver, Eclipse, JUnit, JMeter, Jira, Windows, Mac OS X, Oracle 10g, Agile methodology.
- Developed and implemented the MVC Architectural Pattern using Struts Framework including JSP, Servlets, EJB, Form Bean and Action classes.
- Used parsers such as SAX and DOM for parsing XML documents, and performed XML transformations using XSLT.
- Worked with QA team for testing and resolving defects.
- Used ANT automated build scripts to compile and package the application.
- Used Jira for bug tracking and project management.
Environment: Java, HTML, JSP, CSS, Servlets, JavaScript, JDBC, Oracle 7i, EJB 1.1, Apache Tomcat 3.0, IBM WebSphere