
Hadoop Developer Resume


Camarillo, CA

SUMMARY

  • 9+ years of overall IT experience in a variety of industries, including hands-on experience in Big Data technologies
  • 2.5+ years of comprehensive experience as a Hadoop Developer
  • Passionate about working in Big Data and analytics environments
  • Expertise in writing Hadoop Jobs for analyzing data using Hive and Pig
  • Experience in developing MapReduce programs using Apache Hadoop to process Big Data
  • Experience in importing and exporting data between HDFS and relational database systems using Sqoop (see the sketch after this list)
  • In-depth understanding of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce concepts
  • Experience extending Hive and Pig core functionality by writing custom UDFs
  • Good understanding of Data Mining and Machine Learning techniques
  • Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java
  • Extensive experience with SQL, PL/SQL and database concepts
  • Knowledge of NoSQL databases such as HBase and Cassandra
  • Knowledge of job workflow scheduling and monitoring tools such as Oozie and ZooKeeper
  • Experience in developing solutions to analyze large data sets efficiently
  • Knowledge of administrative tasks such as installing Hadoop and its ecosystem components such as Hive and Pig
  • Handled several techno-functional responsibilities including estimates, identifying functional and technical gaps, requirements gathering, designing solutions, development, developing documentation, and production support
  • An individual with excellent interpersonal and communication skills, strong business acumen, creative problem solving skills, technical competency, team-player spirit, and leadership skills
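
As one concrete illustration of the Sqoop usage listed above, here is a minimal sketch; the connection string, credentials, table names, paths, and mapper count are all hypothetical placeholders:

  # Import an Oracle table into HDFS (all identifiers below are placeholders)
  sqoop import \
    --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
    --username etl_user -P \
    --table CUSTOMERS \
    --target-dir /user/etl/customers \
    -m 4

  # Export aggregated results from HDFS back to the relational store
  sqoop export \
    --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
    --username etl_user -P \
    --table CUSTOMER_SUMMARY \
    --export-dir /user/etl/customer_summary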

TECHNICAL SKILLS

Hadoop/Big Data: HDFS, MapReduce, Pig, Hive, Impala, HBase, Cassandra, Sqoop, Oozie, Flume

Java & J2EE Technologies: Core Java

IDE Tools: Eclipse, NetBeans

Programming Languages: COBOL, Java, KSH, and markup languages

Databases: Oracle, MySQL, DB2, IMS

Operating Systems: Windows 95/98/2000/XP/Vista/7, Unix

Reporting Tools: Tableau

Other Tools: PuTTY, WinSCP, EDI (Gentran), Streamweaver, Compuset

PROFESSIONAL EXPERIENCE

Confidential, Camarillo, CA

Hadoop Developer

Responsibilities:

  • Analyzed large data sets by running Hive queries and Pig scripts
  • Worked with the Data Science team to gather requirements for various data mining projects
  • Involved in creating Hive tables, and loading and analyzing data using Hive queries
  • Developed simple to complex MapReduce jobs using Hive and Pig
  • Involved in running Hadoop jobs to process millions of records of text data
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing
  • Involved in loading data from the Linux file system to HDFS
  • Responsible for managing data from multiple sources
  • Extracted files from Oracle using Sqoop, placed them in HDFS, and processed them
  • Experienced in running Hadoop streaming jobs to process terabytes of XML-format data (see the sketch after this list)
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data
  • Assisted in reporting analyzed data in Tableau
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts
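
A minimal sketch of the kind of Hadoop streaming job referenced above; the mapper and reducer scripts are hypothetical, and the streaming jar path varies by distribution:

  # Hypothetical streaming job over XML data; parse_xml.py and summarize.py are placeholders
  # STREAMING_JAR should point to the distribution-specific hadoop-streaming.jar
  hadoop jar "$STREAMING_JAR" \
    -input /data/raw/xml \
    -output /data/processed/xml_summary \
    -mapper parse_xml.py \
    -reducer summarize.py \
    -file parse_xml.py \
    -file summarize.py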

Environment: Hadoop, HDFS, Pig, Hive, MapReduce, Linux, Hue, Tableau

Confidential, Peoria, IL

Hadoop Developer

Responsibilities:

  • Worked on analyzing the Hadoop cluster using different big data analytic tools, including Pig, Hive, and MapReduce
  • Collected and aggregated large amounts of log data using Apache Flume and staged the data in HDFS for further analysis
  • Worked on debugging and performance tuning of Hive and Pig jobs
  • Created HBase tables to store various formats of PII data coming from different portfolios
  • Implemented test scripts to support test-driven development and continuous integration
  • Worked on tuning the performance of Pig queries
  • Involved in loading data from the Linux file system to HDFS
  • Imported and exported data into HDFS and Hive using Sqoop
  • Worked on processing unstructured data using Pig and Hive
  • Supported MapReduce programs running on the cluster
  • Gained experience in managing and reviewing Hadoop log files
  • Involved in scheduling the Oozie workflow engine to run multiple Hive and Pig jobs (see the sketch after this list)
  • Assisted in monitoring the Hadoop cluster using tools such as Nagios and Ganglia
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts
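
A minimal sketch of scheduling chained Hive and Pig actions through the Oozie command-line client, as mentioned above; the server URL, properties path, and job ID are hypothetical:

  # Submit and start a workflow that chains Hive and Pig actions
  oozie job -oozie http://oozie-host:11000/oozie \
    -config /home/etl/daily_workflow/job.properties -run

  # Check the status of a submitted workflow (job ID is a placeholder)
  oozie job -oozie http://oozie-host:11000/oozie \
    -info 0000001-140101000000000-oozie-W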

Environment: Hadoop, HDFS, Pig, Hive, MapReduce, Sqoop, Oozie, Nagios, Ganglia, Linux, Hue

Confidential, Indianapolis, IN

UNIX, Compuset Developer & Hadoop Exposure

Responsibilities:

  • Implemented 60+ minor enhancements covering customer management, recoveries, fulfillment management, customer acquisitions, rewards, high-yield accounts, and money market accounts. Changes included the creation of new communication methods, including but not limited to letters, statements, and checks.
  • Represented the platform in work identification, scoping, allocation, and prioritization meetings with business users, application owners, and architects, including VPs.
  • Identified several new business streams for the application, including online presentation of certain instruments, simplified handling of change requests, and simplified content changes.
  • Built the application from scratch for 40+ users, identifying user requirements, technology requirements, and functional and non-functional specifications. Offer presentation had previously been handled using manual updates.
  • Identified application performance parameters using industry-standard tools and documented them for future applications built in the same space.
  • Automated all business processes related to message processing, namely offer submission, offer review, artwork upload, text submission, rules coding, offer approval, and final database updates.
  • Recognized as a major contributor to Customer Acquisitions for work in letters processing. Worked extensively on Visual Compuset and built several reusable styles for different needs.
  • Interacted with 4 external vendors to set up file transfers and reports. Set up new interfaces for the application internally and externally. Worked with different transfer protocols, including FTP, SFTP, and Connect:Direct.
  • Acted as the enhancements lead managing 5 resources, with work done in both Waterfall and, periodically, Agile.
  • Loaded data into HDFS, processed it using Pig scripts, and loaded the results into Hive for reporting in Tableau (sketched below).
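
A minimal sketch of that HDFS-to-Pig-to-Hive flow in KSH; the file, script, parameter, and table names are hypothetical:

  # Land the extract file in HDFS
  hadoop fs -put /staging/offers_extract.dat /data/incoming/offers/

  # Clean and transform the data with a parameterized Pig script
  pig -param INPUT=/data/incoming/offers \
      -param OUTPUT=/data/cleaned/offers \
      clean_offers.pig

  # Load the Pig output into the Hive table that backs the Tableau reports
  hive -e "LOAD DATA INPATH '/data/cleaned/offers' INTO TABLE offer_reporting;"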

Environment: UNIX (AIX), KSH, Compuset, Java, HDFS, Pig, Hive, Tableau

Confidential, Indianapolis, IN

COBOL and DB2 Developer

Responsibilities:

  • Implementation of convenience check and ACH origination programs for payment processing.
  • Resolution of file formatting issues facilitating ACH origination.
  • Initiation of specifically formatted ACH addenda records to process a high volume of child support payments.
  • Delivered on a client request for rapid implementation, adding high-volume processing to the ACH origination process.
  • Developed materials in support of operations training related to check processing and ACH origination.

Environment: IBM Mainframe OS/390, COBOL, DB2

Confidential, Santa Ana, CA

EDI Specialist

Responsibilities:

  • Worked on new EDI implementations for EDI X12 documents (850, 855, 810, 856, etc.), including test case execution, build deployment, and post-production support. Enhanced existing EDI documents based on customer requirements.
  • Worked on JCL and COBOL component enhancements related to EDI implementations.
  • Supported business and customer representatives with EDI-related queries.
  • Worked on tickets and job failures raised by business users and clients; responsible for finding the root cause and resolution for severity 3 and 4 problems. Most fixes involved changes to COBOL, IMS, and DB2 programs and JCL.
  • Performed regression testing for the purchase order processing program, ensuring that changes did not adversely affect PO processing, which was considered critical to the business.

Environment: IBM Mainframe OS/390, COBOL, DB2 (Platinum, InterTest, etc.), IMS, EDI Gentran

Confidential

Test Engineer

Responsibilities:

  • Understood the business requirements and technical requirements.
  • Prepared test cases according to the requirements.
  • Executed the prepared test cases.

Environment: Test Director
