Hadoop Developer Resume

Las Vegas, NV

SUMMARY

  • 14 years of extensive experience in UNIX, Linux, LAN administration, Oracle, and Java, including two years with Hadoop and related Big Data ecosystem technologies.
  • Experience in installation, configuration, management and deployment of Big Data solutions and the underlying infrastructure of Hadoop Cluster.
  • Good understanding/knowledge of Hadoop Architecture.
  • Hands-on experience with major components of the Hadoop ecosystem, including Hive, HBase, HBase-Hive integration, Pig, Sqoop, and Flume, plus knowledge of the MapReduce/HDFS framework.
  • Set up standards and processes for Hadoop based application design and implementation.
  • Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java; extended Hive and Pig core functionality by writing custom UDFs (a minimal sketch follows this list).
  • Good experience in data analysis using Pig and Hive, and a solid understanding of Sqoop.
  • Experienced in developing MapReduce programs using Apache Hadoop for working with Big Data.
  • Experience in importing and exporting data between HDFS and relational database systems/mainframes using Sqoop.
  • Hands on experience in application development using Java, RDBMS, and Linux shell scripting.
  • Diverse experience utilizing Java tools in business, web, and client-server environments, including Java Platform, J2EE, EJB, JSP, Java Servlets, Struts, and Java Database Connectivity (JDBC) technologies.
  • Major strengths include familiarity with multiple software systems, the ability to learn new technologies quickly, and adaptability to new environments; self-motivated team player and quick learner.
  • Strong communication skills and work ethic, with the ability to work efficiently in a team and good leadership skills.
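
A minimal sketch of the kind of custom Hive UDF referenced above, written in Java. The package, class name, and ZIP-code semantics are illustrative assumptions, not taken from any actual project:

    package com.example.hive.udf;                  // hypothetical package

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Registered from the Hive CLI with, e.g.:
    //   ADD JAR /path/to/udfs.jar;
    //   CREATE TEMPORARY FUNCTION normalize_zip AS 'com.example.hive.udf.NormalizeZip';
    public final class NormalizeZip extends UDF {
        // Trims a raw ZIP-code string to its five-digit prefix, returning
        // null for malformed input so bad rows can be filtered out in Hive.
        public Text evaluate(final Text raw) {
            if (raw == null) return null;
            String s = raw.toString().trim();
            if (s.length() < 5) return null;
            return new Text(s.substring(0, 5));
        }
    }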

TECHNICAL SKILLS

Big data/Hadoop: HDFS, MapReduce, Hive, Pig, HBase, Sqoop, Flume, Oozie.

Java Technologies: Core Java, JSP, JDBC, Java Beans, Servlets.

Operating Systems: UNIX, Linux, Macintosh, Windows 95/2000/XP/Vista/7.

Programming Languages: C, C++, Visual C, Java, Python, Linux shell scripts.

Databases: Oracle 11g/10g/9i, MySQL, MS SQL Server, DB2.

Web Servers: WebLogic, Apache Tomcat.

Web Technologies: HTML, XML, JavaScript.

Networking: TCP/IP, UDP, HTTP, DNS, DHCP, LAN, WAN.

PROFESSIONAL EXPERIENCE

Confidential, Las Vegas, NV

Hadoop Developer

Responsibilities:

  • Involved in the complete software development life cycle (SDLC) of the application.
  • Worked on analyzing the Hadoop cluster and different big data analytics tools, including Pig, HBase, and Sqoop.
  • Involved in loading data from the Linux file system to HDFS.
  • Managed and reviewed Hadoop log files.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Implemented test scripts to support test-driven development and continuous integration.
  • Supported setup of the QA environment and updated configurations for implementing scripts with Pig and Sqoop.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing (see the sketch after this list).
  • Created Pig Latin scripts to sort, group, join, and filter enterprise-wide data.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs (see the JDBC sketch at the end of this section).
  • Supported MapReduce programs running on the cluster.
  • Analyzed large data sets by running Hive queries and Pig scripts.
  • Worked on tuning the performance of Pig queries.
  • Mentored the analyst and test teams on writing Hive queries.
  • Installed Oozie workflow engine to run multiple MapReduce jobs.
  • Worked with application teams to install operating system, Hadoop updates, patches, version upgrades as required.
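
As referenced above, a minimal sketch in Java of a map-only MapReduce cleaning job; the tab-delimited input and the expected field count of eight are illustrative assumptions:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class CleanRecordsJob {

        public static class CleanMapper
                extends Mapper<LongWritable, Text, Text, NullWritable> {
            @Override
            protected void map(LongWritable key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                // Keep only rows with the expected number of fields;
                // anything else is dropped as malformed.
                String[] fields = value.toString().split("\t", -1);
                if (fields.length == 8) {              // assumed schema width
                    ctx.write(value, NullWritable.get());
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "clean-records");
            job.setJarByClass(CleanRecordsJob.class);
            job.setMapperClass(CleanMapper.class);
            job.setNumReduceTasks(0);                  // map-only cleaning pass
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(NullWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Running with zero reducers writes the surviving records straight back to HDFS, which keeps the cleaning pass cheap.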

Environment: Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, Linux, Java, Oozie, HBase.
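
Hive tables and queries like those described in this section can also be driven programmatically. Below is a minimal sketch over JDBC, assuming a HiveServer2 endpoint; the host, table, and column names are illustrative assumptions:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveJdbcExample {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://localhost:10000/default", "hive", ""); // placeholder endpoint/user
                 Statement stmt = conn.createStatement()) {

                // External table over data already loaded into HDFS.
                stmt.execute("CREATE EXTERNAL TABLE IF NOT EXISTS events ("
                        + "event_id BIGINT, event_type STRING, ts STRING) "
                        + "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t' "
                        + "LOCATION '/data/clean/events'");

                // A simple aggregate; Hive compiles this to MapReduce jobs.
                ResultSet rs = stmt.executeQuery(
                        "SELECT event_type, COUNT(*) FROM events GROUP BY event_type");
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
                }
            }
        }
    }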

Confidential, Washington D.C.

Hadoop Developer

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, and managing and reviewing data backups and log files.
  • Analyzed data using the Hadoop components Hive and Pig.
  • Responsible for running Hadoop streaming jobs to process terabytes of XML data.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data using Hadoop/Big Data concepts.
  • Responsible for creating Hive tables, loading data, and writing Hive queries.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded the data into HDFS.
  • Extracted data from Teradata into HDFS using Sqoop (see the sketch at the end of this section).
  • Exported the analyzed patterns back to Teradata using Sqoop.
  • Installed the Oozie workflow engine to run multiple Hive and Pig jobs, which run independently based on time and data availability.

Environment: Hadoop, HDFS, Pig, Hive, MapReduce, Sqoop, Linux, Big Data.
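
Sqoop transfers such as the Teradata import above are normally run from the command line; Sqoop 1.x also exposes a Java entry point, sketched below. The host, credentials, and table names are illustrative assumptions, and the Teradata JDBC driver is assumed to be on the classpath:

    import org.apache.sqoop.Sqoop;

    public class TeradataImport {
        public static void main(String[] args) {
            // Equivalent to invoking `sqoop import ...` from the shell.
            String[] sqoopArgs = {
                "import",
                "--connect", "jdbc:teradata://tdhost/DATABASE=sales", // hypothetical host/DB
                "--driver", "com.teradata.jdbc.TeraDriver",
                "--username", "etl_user",                             // placeholder credentials
                "--password", "********",
                "--table", "ORDERS",                                  // illustrative table
                "--target-dir", "/data/raw/orders",                   // HDFS landing directory
                "--num-mappers", "4"
            };
            System.exit(Sqoop.runTool(sqoopArgs));
        }
    }

The corresponding export back to Teradata swaps "import" for "export" and points --export-dir at the analyzed output in HDFS.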

Confidential, White Plains, NY

Java Developer

Responsibilities:

  • Built servers using Kickstart (managing Kickstart files to meet client requirements), Red Hat Satellite Server, and vSphere Client.
  • Applied patches to keep servers protected against operating system bugs, using Red Hat Satellite Server, yum, etc.
  • Worked exclusively in a VMware virtual environment.
  • Worked on data and system migrations from one data center to another using VMs.
  • Experienced in developing automated patch delivery systems; also worked on system hardening.
  • Installed packages using yum and Red Hat Package Manager (RPM) on various servers.
  • Worked on troubleshooting network administration issues, IIS configuration, DNS setup and modifications, firewall rule sets, local and distributed directors, connectivity, and supporting applications.
  • Performed daily system administration tasks such as managing system resources, end-user support operations, and security.
  • Worked independently on Tier 2 support issues such as reboots, starting/stopping services, resetting Terminal Services and pcAnywhere connections, and administrative server maintenance.
  • Followed up with clients daily to resolve issues.
  • Worked with database systems on Linux.
  • Provided support to account managers, UNIX and Windows technicians, and other departments.
  • Coordinated with cross-functional teams across IT operations to ensure smooth functioning of projects.
  • Worked closely with the DBA team to adjust kernel parameters as required.
  • Resolved Linux-based issues daily through the ticketing system (SMP) in compliance with SLA cycles.

Environment: Red Hat Enterprise Linux 4.x/5.x/6.4, AIX 6.x, Solaris 8/9/10, Tivoli Storage Manager, VMware ESX 5, Tivoli NetBackup, WebSphere, Windows 2008/2003 servers, IIS 7.0 & 7.5.

Confidential

Technical Support

Responsibilities:

  • Supported Solaris and Linux servers in production, stage, and development environments.
  • Unix/Linux user creation, file/directory-level permissions, sudo permissions, etc.
  • Installed and created Oracle 9i and 10g databases for testing, development, and production.
  • Identified performance bottlenecks on high-volume transaction systems; analyzed SQL statements, reorganized database objects, and designed indexes to improve query response times.
  • Performed system troubleshooting, performance tuning, and capacity planning.
  • Applying operating system updates, patches, and configuration changes.
  • Worked with development teams to onboard new applications, configure monitoring, update and approve documentation, and adhere to standards.
  • Performed WebLogic Server and Portal administration tasks such as installing, configuring, monitoring, and performance tuning on both UNIX and Windows platforms.
  • Worked extensively on Solaris, Linux and Windows platforms for different applications.
  • Worked closely with database administration staff to ensure optimal performance of databases, and maintain development applications and databases.
  • Shell (ksh, bash) scripting for various admin-related jobs.
  • Design and implement Backup Policies.
  • Supported issues arising on Linux servers and the applications that run on them.
  • Installed MySQL and Oracle databases on Linux and monitored them using Nagios.
  • Configured and managed cluster instances for MySQL and Oracle databases.
  • Performed connection diagnostics to obtain route specifics and diagnostics from the switching infrastructure.
  • Also oversaw day-to-day growth and inspections to ensure quality and goals were met.

Environment: Solaris 8/9/10, AIX 5.3/6.1, SAN (NetApp), Red Hat Enterprise Linux servers (HP ProLiant DL 585, BL 465/485, ML series; IBM HS22 BladeCenter and 7870 blades), ALU ATCAv2, BladeLogic, VERITAS Cluster Server 5.0, Windows 2003 Server, shell programming, JBoss 4.2, JDK 1.5/1.6, VMware Virtual Client 3.5, VMware Infrastructure 3.5.

Confidential

UNIX Administrator

Responsibilities:

  • Responsible for UNIX server administration, OS installation, setup, and configuration.
  • Created users and granted privileges.
  • Responsible for the implementation of a Data Domain backup and archive solution.
  • Maintained UNIX servers in general (security, software updates, system diagnosis, debugging, tuning, advanced troubleshooting, etc.).
  • Used JumpStart to build Solaris servers, custom configure them, and install packages and patches.
  • Managed routine system backups, scheduled jobs, and enabled system and network logging of servers for maintenance, performance tuning, and testing.
  • Integrated UNIX file systems; set up printers on UNIX/NT machines to work seamlessly across platforms.
  • Administered NFS mounts.
  • Interfaced with equipment vendors to troubleshoot service-affecting issues.
  • Monitored managed-customer firewall and VPN networks for access violations.

Environment: UNIX, shell scripting, SQL Server, Workstation, Oracle
