
Hadoop Developer Resume


Charlotte

SUMMARY:

  • Over 10 years of IT experience in a leading multinational Information Technology firm, with in-depth knowledge of the Banking, Telecom, Multimedia and PKI (Public Key Infrastructure) domains and of Big Data technology, using multiple programming languages such as Scala, Java and C++.
  • Strong experience in Big Data projects across multiple domains and in all phases of the SDLC: requirements gathering, system design, development, enhancement, maintenance, testing, deployment and production support.
  • Strong experience in configuring and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, Flume, Kafka and Spark.
  • Experience in developing custom UDFs in Java to extend Hive and Pig Latin functionality.
  • Good understanding of HDFS design, daemons, federation and HDFS high availability (HA).
  • Good understanding of Spark Core, Spark SQL and Spark Streaming.
  • Good knowledge of UNIX and shell scripting.
  • Knowledge of NoSQL databases such as HBase and DynamoDB.
  • Technical expertise in EJB, JBoss, RESTful web services, Maven, JUnit and the Arquillian integration testing framework.
  • Technical expertise with the GCC compiler, GDB, Wireshark and TCP/UDP sockets.
  • Experience with IDEs such as Eclipse, IntelliJ and Visual Studio.
  • Experience with version control tools such as Rational ClearCase, Git, Visual SourceSafe and SVN.
  • Experience working with relational databases (Postgres) and SQL programming.
  • Experienced in Agile methodology as a Subject Matter Expert and Technical Coordinator, responsible for work effort estimation, allocation and technical documentation.
  • Possess superior design and debugging capabilities, innovative problem solving and excellent analytical skills.
  • Involved in process improvement activities that reduce operational costs.
  • Created multiple tools that automate repetitive manual tasks and reduce effort by up to 50%.
  • Experienced in Quality Assurance activities including peer reviews, causal analysis of defects and creation of checklists for improving deliverable quality.
  • Focused on quality and process; excellent written and verbal communication skills and a strong team player.
  • Quick to adapt to new software applications and products; a self-starter with a good understanding of business workflow.

TECHNICAL SKILLS:

Big Data Technologies: Apache Spark, HDFS, YARN, Hive, MapReduce, Pig, Sqoop, Flume, Oozie, Kafka

Programming Languages: Scala, Java, C/C++

RDBMS: Postgres, MySQL

NoSQL Databases: HBase, DynamoDB

Operating Systems: Linux (CentOS and SUSE), HP-UX (Itanium) and Windows

Special tools: Maven, Autosys, GDB, Wireshark, Make.

Version Control: SVN, Git, ClearCase, Visual SourceSafe

PROFESSIONAL EXPERIENCE:

Confidential, Charlotte

Hadoop Developer

Responsibilities:

  • Developed a Sqoop job to pull PRDS (Party Reference Data) from Teradata into HDFS.
  • Prepared XML definitions for each source system (ATM, Loans, Teller, etc.) to validate every record in the HDFS source files; these XMLs are themselves validated against an XSD.
  • Delimited, position-based and binary file types are loaded through the SparkContext and validated against the XML definitions.
  • Implemented repartitioning, caching and broadcast variables on RDDs and DataFrames to achieve better performance on the cluster.
  • Created Parquet files for valid and invalid records separately for all systems.
  • Stored the Parquet data in the Hive database with daily date partitions for further querying.
  • The validated Parquet files of two or more systems are combined in the curation module to derive the common transaction data.
  • DataFrames are created by reading the validated Parquet files, and SQL queries are run through the SQLContext to obtain the common transaction data across all systems (see the sketch after this list).
  • Developed Spark jobs using Scala in the test environment for faster data processing and used Spark SQL for querying.
  • Worked with Spark on top of YARN/MRv2 for interactive and batch analysis.
  • Executed Oozie workflows to run multiple Hive and Pig jobs.
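
The Spark jobs themselves were written in Scala (per the bullets above); the following minimal Java sketch, using hypothetical paths, a hypothetical txn_id join key and a hypothetical load_date partition column, illustrates the Parquet-to-SQLContext curation flow on Spark 1.6:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class CurationSketch {
    public static void main(String[] args) {
        // args: <atm parquet path> <teller parquet path> <output path> (illustrative)
        SparkConf conf = new SparkConf().setAppName("CurationSketch");
        JavaSparkContext sc = new JavaSparkContext(conf);
        SQLContext sqlContext = new SQLContext(sc);

        // Read the validated Parquet files of two source systems into DataFrames
        DataFrame atm = sqlContext.read().parquet(args[0]);
        DataFrame teller = sqlContext.read().parquet(args[1]);

        // Register temp tables so the curation step can be expressed as SQL
        atm.registerTempTable("atm");
        teller.registerTempTable("teller");

        // Join on a hypothetical transaction id to get the common transaction data
        DataFrame common = sqlContext.sql(
                "SELECT a.* FROM atm a JOIN teller t ON a.txn_id = t.txn_id");

        // Persist the curated result with a daily date partition (illustrative column)
        common.write().partitionBy("load_date").parquet(args[2]);

        sc.stop();
    }
}
```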

Environment: JDK 1.8, Apache Spark 1.6, Scala 2.10, Sqoop, Oozie, Hive, AutoSys, YARN cluster, Cloudera Distribution, IntelliJ IDE, Maven.

Confidential

Hadoop Developer

Responsibilities:

  • Ran a Flume spooling-directory source to bring the alarm data from the local file system (LFS) into HDFS.
  • Ran a MapReduce JAR to extract the node IP address as the key and the alarm severity (minor: 5, major: 10, critical: 15) as the value from all the alarm input JSON data (see the sketch after this list).
  • Wrote a Pig UDF that arranges the MapReduce part-r output per node, e.g. (172.16.30.140, 5,10,15), and stores it into a file.
  • The file is loaded into a Hive table so that users can see all alarm details along with their severity.
  • Ran Hive queries to find the network elements with the most critical alarms and stored the results in weekly partitions.
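
A minimal Java sketch of the severity-mapping step described above, assuming hypothetical JSON field names (nodeIp, severity) and a naive string-based extraction in place of a real JSON parser:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class AlarmSeverityJob {

    /** Maps one JSON alarm record to (node IP, numeric severity). */
    public static class AlarmMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String json = value.toString();
            String ip = extract(json, "nodeIp");         // hypothetical field name
            String severity = extract(json, "severity"); // hypothetical field name
            if (ip == null || severity == null) {
                return; // skip malformed records
            }
            int score = "critical".equalsIgnoreCase(severity) ? 15
                      : "major".equalsIgnoreCase(severity) ? 10 : 5;
            context.write(new Text(ip), new IntWritable(score));
        }

        /** Naive extraction of a quoted string field; a real job would use a JSON library. */
        private static String extract(String json, String field) {
            int i = json.indexOf("\"" + field + "\"");
            if (i < 0) return null;
            int start = json.indexOf('"', json.indexOf(':', i) + 1) + 1;
            int end = json.indexOf('"', start);
            return (start > 0 && end > start) ? json.substring(start, end) : null;
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "alarm-severity");
        job.setJarByClass(AlarmSeverityJob.class);
        job.setMapperClass(AlarmMapper.class);
        job.setNumReduceTasks(1); // one part-r file for the downstream Pig step
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```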

Environment: JDK 1.7, CentOS Linux, HDFS, MapReduce, Hive, Pig, Flume.

Confidential

Java Developer

Responsibilities:

  • Developed the Entity Profile module, which is common to a group of profiles for which certificates have to be issued.
  • Developed validations for Entity Profile and Certificate Profile fields such as Subject and SAN according to RFC 5280.
  • Involved in development of the Entity module, which is common to a group of entities for which certificates have to be issued.
  • Validated the input XML data against the profile and entity XSDs (see the sketch after this list).
  • Wrote JUnit and Arquillian tests covering both positive and negative cases for better code coverage.
  • Implemented asynchronous calls for bulk creation of profiles/entities as part of a performance improvement in the design.
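
A minimal Java sketch of the XSD validation step, using the standard javax.xml.validation API; the file names are hypothetical and the real profile/entity XSDs are product-specific:

```java
import java.io.File;
import java.io.IOException;

import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

import org.xml.sax.SAXException;

public class ProfileXmlValidator {

    /** Returns true when the profile/entity XML conforms to the given XSD. */
    public static boolean isValid(File xml, File xsd) {
        try {
            SchemaFactory factory =
                    SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = factory.newSchema(xsd);
            Validator validator = schema.newValidator();
            validator.validate(new StreamSource(xml));
            return true;
        } catch (SAXException | IOException e) {
            // Schema violation or read error: surfaced to the caller as a validation failure.
            return false;
        }
    }

    public static void main(String[] args) {
        // Hypothetical file names for an entity profile request and its schema.
        System.out.println(isValid(new File("entityProfile.xml"), new File("entityProfile.xsd")));
    }
}
```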

Environment: Java, EJB 3.1, RESTful web services, Postgres, JBoss EAP 6.2.4, JSON, PKI, Arquillian integration framework, Eclipse, JIRA, Maven.

Confidential

C++/Core Java Developer, Onsite Coordinator

Responsibilities:

  • Handled Trouble Reports and Change Requests for the NET level of the plug-in.
  • Implemented new Implementation Proposals for the NET level of the plug-in.
  • Actively involved in various team activities involving communication with onsite associates.
  • Developed the Linux port of the PMH plug-in from HP-UX by converting the HP-UX OTS stack to the Linux third-party Xelas stack for TCP socket communication between the element node and the plug-in.
  • Wrote many technical documents for the implementation of new features in the plug-in.
  • Worked as Onsite Coordinator at the client location for one year.
  • Implemented the Linux port of the IMT plug-in from HP-UX; the IMT plug-in acts as a driver for the OMS32xx node. Handled Trouble Reports and Change Requests for the NET level of this plug-in.
  • Involved in implementing the Xelas calls that replace the OTS calls in the plug-in.
  • Changed the NMCOM framework (used on the HP-UX platform) to the SOOCM framework (compatible with both HP-UX and Linux).
  • Handled cross-compilation with g++, whereas HP-UX uses the aCC compiler.
  • Handled Trouble Reports and Change Requests for the EM and NM levels of the plug-in.
  • Developed a plug-in in Java that acts as the interface between the Network Element (DXC 1600) and the Element Manager.
  • Handled the TL1 driver for communication between the plug-in and the Network Element.
  • Implemented FCAPS functionality in the plug-in.
  • Prepared Implementation Proposals and Change Requests for all plug-ins.
  • Created and reviewed test cases based on the requirements and design documents.
  • Ensured that all the components were thoroughly tested.
  • Fixed identified bugs by finding the root cause.

Environment: JDK 1.7, HP-UX (Itanium) and SUSE Linux, aCC/GCC compilers (C/C++), GDB, Wireshark, TCP/UDP sockets, Trouble Report MH-Web, Make, shell scripting.

Confidential

Developer

Responsibilities:

  • The ‘Host Window’ of size 640x480 is encoded as a frame and divided into fragments of 1300 bytes each.
  • A header consisting of the frameID and the total number of fragments for the frame is added to each fragment, which is then sent over UDP (see the sketch after this list).
  • The fragments are accumulated into a single frame using the header information, decoded, and finally displayed in the ‘Viewer Window’.
  • Tested and debugged the Desktop Sharing application.
  • Integrated this Desktop Sharing application into a four-party video conference.
  • Ensured that all the components were thoroughly tested.
  • Prepared technical documentation and user manuals.
  • Fixed identified bugs by finding the root cause.
  • Enabled unicast video conferencing to work between public and private networks.
  • If the PC is connected to a private network, dummy packets are sent to the public destination IP through a specific port (so that the public side learns the NAT IP) before the actual audio, video and probe packets are sent.
  • If the PC is connected to a public network, dummy packets are received from the private side before the actual audio, video and probe packets are sent to the NAT address.
  • Tested this application with an STB (set-top box) in different scenarios.
  • Developed a feature for matching the logo (channel symbol such as ESPN or Zee) displayed on the monitor.
  • A template logo is matched against the logo in a video using the sum of absolute differences (SAD) on the gray-level image; where the template matches the channel logo, the SAD is at its minimum (see the sketch after this list).
  • The template logo is also matched using a correlation function on the gray-level image.
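
The Desktop Sharing application itself was written in VC++/Win32; a minimal Java sketch of the frame-fragmentation idea is shown below. The fragment-index field is an added assumption (the original header carries only the frameID and the total fragment count), and all names are illustrative:

```java
import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class FrameSender {

    private static final int FRAGMENT_SIZE = 1300; // payload bytes per UDP fragment
    private static final int HEADER_SIZE = 12;     // frameId + totalFragments + fragmentIndex

    /** Splits one encoded frame into 1300-byte fragments, prepends a header, sends each over UDP. */
    public static void sendFrame(DatagramSocket socket, InetAddress dest, int port,
                                 int frameId, byte[] encodedFrame) throws IOException {
        int totalFragments = (encodedFrame.length + FRAGMENT_SIZE - 1) / FRAGMENT_SIZE;
        for (int i = 0; i < totalFragments; i++) {
            int offset = i * FRAGMENT_SIZE;
            int length = Math.min(FRAGMENT_SIZE, encodedFrame.length - offset);

            ByteBuffer buf = ByteBuffer.allocate(HEADER_SIZE + length);
            buf.putInt(frameId);        // which frame this fragment belongs to
            buf.putInt(totalFragments); // lets the receiver know when the frame is complete
            buf.putInt(i);              // fragment index (assumed) used to reassemble in order
            buf.put(encodedFrame, offset, length);

            socket.send(new DatagramPacket(buf.array(), buf.position(), dest, port));
        }
    }
}
```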
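Similarly, a minimal Java sketch of SAD-based template matching over gray-level pixel arrays (the original implementation was VC++/MFC on TI DaVinci and Intel x86):

```java
/** Locates a template logo inside a gray-level image by minimizing the sum of absolute differences. */
public class SadLogoMatcher {

    /** Returns {bestRow, bestCol} of the position where the SAD between image and template is minimal. */
    public static int[] match(int[][] image, int[][] template) {
        int bestRow = 0, bestCol = 0;
        long bestSad = Long.MAX_VALUE;

        for (int r = 0; r + template.length <= image.length; r++) {
            for (int c = 0; c + template[0].length <= image[0].length; c++) {
                long sad = 0;
                // Early exit once this candidate is already worse than the best so far.
                for (int i = 0; i < template.length && sad < bestSad; i++) {
                    for (int j = 0; j < template[0].length; j++) {
                        sad += Math.abs(image[r + i][c + j] - template[i][j]);
                    }
                }
                if (sad < bestSad) { // the logo is declared at the minimum-SAD position
                    bestSad = sad;
                    bestRow = r;
                    bestCol = c;
                }
            }
        }
        return new int[] { bestRow, bestCol };
    }
}
```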

Environment: VC++ Win32, MFC, TI DaVinci, Intel x86, Visual Studio, Visual SourceSafe, Image Processing.
