
Sr Hadoop Admin Resume


Madison, WI

SUMMARY

  • 7+ years of rich and insightful experience across Big Data and cloud (Hadoop, AWS, OpenStack) projects and Java technologies. Involved in large, complex information systems and projects as well as system integration, multi-vendor service management, and business continuity management.
  • 4 years of experience in Hadoop, AWS, and OpenStack.
  • 3+ years of experience working with Java/J2EE technologies.
  • Strong expertise in SLA management and root cause analysis.
  • Excellent installation skills for Hadoop components: HDFS, MapReduce, and HBase.
  • Proficient in supporting Linux operating systems: CentOS, Debian, and Fedora.
  • Solid understanding of open-source monitoring tools: Nagios, Ganglia, and Cloudera Manager.
  • HDFS: balancing block data, file system integrity checks, performance optimization, backup, monitoring, troubleshooting, and adding/decommissioning DataNodes (a command sketch follows this list).
  • Experience with Hadoop cluster performance tuning.
  • Experience in planning, designing, deploying, fine-tuning, and administering large-scale production Hadoop clusters.
  • Experience in administering Hadoop ecosystem components such as MapReduce, Pig, Hive, HAWQ, SpringXD, and Sqoop, as well as setting up Hadoop clusters backed by Isilon storage.
  • Experience in setting up and configuring monitoring tools Nagios and Ganglia for monitoring health of the cluster.
  • Experience in configuring and setting up various components of the Hadoop ecosystem using Pivotal manager.
  • Experience in designing and implementing security for Hadoop cluster with Kerberos secure authentication.
  • Experience in setting up, configuring, and administering Hadoop clusters on the Pivotal distribution, including fine-tuning and benchmarking.
  • Work with various project teams to identify opportunities to optimize client infrastructure.
  • Coordinate with other business units (BUs) to provide an integrated solution.
  • Collaborate with Business Analysts in response development
  • Leverage company capabilities, IP, and alliances as appropriate in the overall solution model.
  • Optimized the utilization of data systems and improved the efficiency of security and storage solutions by controlling and preventing loss of sensitive data.
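
The HDFS housekeeping described above usually reduces to a handful of standard commands. A minimal sketch follows; the hostname, exclude-file path, and balancer threshold are illustrative assumptions, not values from any specific cluster.

```sh
# Rebalance block data across DataNodes (threshold = max % utilization spread; value illustrative)
hdfs balancer -threshold 10

# File system integrity check: report files, block counts, and block locations
hdfs fsck / -files -blocks -locations

# Decommission a DataNode: add it to the excludes file named by dfs.hosts.exclude
# (hostname and path are hypothetical), then tell the NameNode to re-read it
echo "datanode07.example.com" >> /etc/hadoop/conf/dfs.exclude
hdfs dfsadmin -refreshNodes

# Confirm live/dead nodes and remaining capacity
hdfs dfsadmin -report
```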

TECHNICAL SKILLS

Cloud: Huawei, Cisco, VMware, Amazon, Eucalyptus, Hadoop

Hadoop Ecosystem Components: HDFS, MapReduce, Pig, Hive, Sqoop, Flume & ZooKeeper.

Hadoop-related applications: SpringXD and Alpine.

Hadoop Distribution: Pivotal and Cloudera

Languages and Technologies: Perl, UNIX scripting, Windows scripting, Core Java, C, C++, data structures, and algorithms.

Scripting Languages: Shell scripting and Perl

Networking: TCP/IP Protocol, Switches & Routers, HTTP, NTP & NFS

Databases: MySQL, Oracle & MongoDB

Storage: EMC VMAX, DMX-3000/4000, Symmetrix, CLARiiON; HP XP, EVA, MSA, P7000; IBM DS 8300/8700/4700/8000; Hitachi USP, AMS; NetApp FAS series arrays; Symantec

PROFESSIONAL EXPERIENCE

Confidential, Madison, WI

Sr. Hadoop Admin

Responsibilities:

  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, and managing and reviewing data backups and Hadoop log files.
  • Played a key role, along with other teams in the company, in deciding hardware configurations for the cluster.
  • Resolved tickets submitted by users, including P1 issues; troubleshot, documented, and resolved errors.
  • Added new DataNodes when needed and ran the balancer.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Continuously monitored and managed the Hadoop cluster through Ganglia and Nagios.
  • Performed major and minor upgrades to the Hadoop cluster.
  • Performed stress and performance testing and benchmarked the cluster (a sample benchmark run follows this list).
  • Worked closely with both internal and external cybersecurity customers.
  • Researched and implemented algorithms for large, power-law-skewed datasets.
  • Developed an interactive graph visualization tool based on the Prefuse visualization toolkit.
  • Developed machine-learning capabilities via Apache Mahout.
  • Participated in a research effort to tightly integrate Hadoop and HPC systems.
  • Deployed and administered a 70-node Hadoop cluster, and administered two smaller clusters.
  • Compared Hadoop to commercial big-data appliances from Netezza, XtremeData, and LexisNexis; published and presented the results.
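
A hedged example of the stress/benchmark runs mentioned in this list, using the TeraSort suite that ships with Hadoop; the examples-jar path and the data size are assumptions that vary by distribution.

```sh
# Generate ~1 TB of input (10 billion 100-byte rows; size illustrative)
hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar teragen 10000000000 /benchmarks/tera-in

# Sort it: the core stress test for MapReduce and HDFS throughput
hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar terasort /benchmarks/tera-in /benchmarks/tera-out

# Validate that the output is globally sorted
hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar teravalidate /benchmarks/tera-out /benchmarks/tera-check
```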

Confidential, Charlotte, NC

Hadoop Admin

Responsibilities:

  • Involved in various POC activities using technologies such as MapReduce, Hive, Pig, and Oozie.
  • Handled 24x7 support for Hadoop issues.
  • Involved in the design and implementation of a service layer over the HBase database.
  • Assisted in the design, development, and architecture of Hadoop and HBase systems.
  • Coordinated with technical teams to install Hadoop and related third-party applications.
  • Formulated procedures for planning and execution of system upgrades for all existing Hadoop clusters.
  • Supported technical team members for automation, installation and configuration tasks.
  • Suggested improvements for all process automation scripts and tasks.
  • Provided technical assistance for configuration, administration, and monitoring of Hadoop clusters.
  • Conducted detailed analysis of system and application architecture components as per functional requirements.
  • Created scripts to provision EC2 clusters for training and for processing.
  • Implemented performance monitoring tools (HP).
  • Worked on an Amazon cloud migration project.
  • Worked on an Amazon Elastic Cloud project using Agile methodologies.
  • Assisted business analysts on the post-migration project.
  • Reviewed firewall settings (security group) and updated on Amazon AWS.
  • Created access documents for level/tier 3/4 production support groups.
  • Created Cassandra Advanced Data Modeling course for DataStax.
  • Worked with Agile methodologies.
  • Participated in evaluation and selection of new technologies to support system efficiency.
  • Imported data from various sources, such as Oracle and Comptel servers, into HDFS using tools such as Sqoop and MapReduce (see the import sketch after this list).
  • Analyzed the data with Hive queries and Pig scripts to understand user behavior, such as call frequency and top calling customers.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Developed Hive queries to process the data and generate data cubes for visualization.
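
A minimal sketch of the Oracle-to-HDFS import and the call-behavior analysis described above. The JDBC URL, credentials, table, and column names are hypothetical placeholders.

```sh
# Pull a call-detail table from Oracle into HDFS with 4 parallel mappers
sqoop import \
  --connect jdbc:oracle:thin:@oradb01.example.com:1521:ORCL \
  --username etl_user -P \
  --table CALL_RECORDS \
  --target-dir /data/raw/call_records \
  --num-mappers 4

# Top calling customers by call frequency (column names hypothetical)
hive -e "SELECT caller_id, COUNT(*) AS calls
         FROM call_records
         GROUP BY caller_id
         ORDER BY calls DESC
         LIMIT 20;"
```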

Environment: Hadoop, MapReduce, HDFS, Hive, Oozie, Java (JDK 1.6), Cloudera, NoSQL, Oracle 11g, 10g, Toad 9.6, Windows 2000, Solaris, Linux

Confidential, Wilmington, DE

Cloud Admin

Responsibilities:

  • Involved in various POC activities using technologies such as MapReduce, Hive, Pig, and Oozie.
  • Involved in the design and implementation of a service layer over the HBase database.
  • Imported data from various sources, such as Oracle and Comptel servers, into HDFS using tools such as Sqoop and MapReduce.
  • Analyzed the data with Hive queries and Pig scripts to understand user behavior, such as call frequency and top calling customers.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Developed Hive queries to process the data and generate data cubes for visualization (see the cube query after this list).
  • Designed and developed scalable, custom Hadoop solutions for evolving data needs, and coordinated with the technical team on production deployment and maintenance of software applications.
  • Provided operational support for Hadoop infrastructure and application installation, and supported technical team members in managing and reviewing Hadoop log files and data backups.
  • Participated in development and execution of system and disaster recovery processes.
  • Formulated procedures for installation of Hadoop patches, updates and version upgrades.
  • Automated processes for troubleshooting, resolution and tuning of Hadoop clusters.
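
One plausible reading of "generate data cubes" above is HiveQL's built-in cube grouping, sketched here; the table and column names are hypothetical.

```sh
# Aggregate call volume across every combination of region and plan type
hive -e "SELECT region, plan_type, COUNT(*) AS calls
         FROM call_records
         GROUP BY region, plan_type WITH CUBE;"
```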

Environment: Hadoop, MapReduce, HDFS, Hive, Oozie, Java (JDK 1.6), Cloudera, NoSQL, Oracle 11g, 10g, Toad 9.6, Windows NT, UNIX (Linux), Agile

Confidential, Pittsburgh, PA

Java Developer

Responsibilities:

  • Developed the persistence module with Hibernate and Spring on DB2.
  • Converted a handful of EJB services to RESTful web services using the JAX-RS API.
  • Analyzed new opportunities for my group, including daily interaction with the trading desk to understand the business flow and assess how technology could improve the time efficiency of the business workflow.
  • Prepared proofs of concept and presentations to demonstrate the solution to business users on Cloud Foundry.
  • Developed schedulers for automated triggering of tasks using the Spring Quartz scheduler; automated tasks included end-of-processing and intraday processing.
  • Developed Perl and shell scripts to manage daily housekeeping tasks (a sample script follows this list).
  • Created and implemented jobs and box JILs for the Autosys job scheduler.
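
A hedged sketch of the kind of daily-housekeeping shell script referenced above; the paths and retention window are assumptions.

```sh
#!/bin/sh
# Daily housekeeping: compress yesterday's application logs and prune old archives
LOG_DIR=/opt/app/logs          # hypothetical application log directory
ARCHIVE_DIR=$LOG_DIR/archive
RETENTION_DAYS=30              # illustrative retention window

mkdir -p "$ARCHIVE_DIR"

# Compress logs not modified in the last 24 hours, then move them to the archive
find "$LOG_DIR" -maxdepth 1 -name '*.log' -mtime +0 -exec gzip {} \;
mv "$LOG_DIR"/*.log.gz "$ARCHIVE_DIR"/ 2>/dev/null

# Drop archives older than the retention window
find "$ARCHIVE_DIR" -name '*.log.gz' -mtime +"$RETENTION_DAYS" -delete
```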

Environment: Java, J2EE, web services, Spring, Hibernate, WebLogic, SoapUI, UNIX.

Confidential, San Diego, CA

Java Developer

Responsibilities:

  • System study, requirements analysis, and documentation of requirement specifications.
  • Interacting with the system analysts & business users for design & requirement clarifications.
  • Involved in the creation of sample UI prototype for the client.
  • Used Connection Pooling to get JDBC connection and access database procedures.
  • Responsible for designing the database tables.
  • Used VSS repository for version control.
  • Used Log4j as a debugging tool and was involved in Java/J2EE coding.
  • Tested the application manually and was involved in coding the tariff planning module.
  • Used Ant for compilation and building EAR files, and worked with the CI tool Jenkins (a sample build-and-test invocation follows this list).
  • Used JUnit/Eclipse for the unit testing of various modules.
  • Responsible for reviewing code developed by team members and making performance-tuning changes; involved in unit testing using the JUnit framework.
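
A minimal example of the Ant build and command-line JUnit run referenced above; the target names, jar versions, and test class are assumed for illustration, not taken from the actual project.

```sh
# Clean and rebuild the EAR (target names assumed from a typical build.xml)
ant -f build.xml clean ear

# Run one JUnit 4 test class directly from the command line (class name hypothetical)
java -cp build/classes:lib/junit-4.12.jar:lib/hamcrest-core-1.3.jar \
  org.junit.runner.JUnitCore com.example.tariff.TariffPlannerTest
```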

Environment: Java, J2EE, Struts, Maven, Eclipse 3.0, JDBC, SQL Server 2005, Tomcat, Log4j, VSS, UNIX, Jenkins, JUnit.

Confidential, Waltham, MA

Java Developer

Responsibilities:

  • System study, requirements analysis, and documentation of requirement specifications.
  • Interacting with the system analysts & business users for design & requirement clarifications.
  • Involved in the creation of sample UI prototype for the client.
  • Used Connection Pooling to get JDBC connection and access database procedures.
  • Responsible for designing the database tables.
  • Used VSS repository for version control.
  • Used Log4j as a debugging tool and was involved in Java/J2EE coding.
  • Tested the application manually and was involved in coding the reporting module.

Environment: Java, J2EE, Struts, Ant 1.6, Eclipse 3.0, SQL Server 2005, Tomcat, Log4j.
