
Hadoop Developer Resume


Fairfax, VA

SUMMARY

  • Around 8 years of professional experience working with Java and Big Data technologies: the Hadoop ecosystem (HDFS, MapReduce framework), NoSQL databases (HBase), Hive, and Sqoop.
  • Involved in all the phases of Software Development Life Cycle (SDLC): Requirements gathering, analysis, design, development, testing, production and post-production support.
  • Experience in writing MapReduce programs for analyzing Big Data in both structured and unstructured formats.
  • Extensive experience in analyzing social media data and optimization of social media content.
  • Experience in developing and deploying applications using WebSphere Application Server, Tomcat, and WebLogic.
  • Hands on experience with supervised and unsupervised machine learning methods.
  • Experience working with batch-processing and real-time systems using open-source technologies such as Hadoop, NoSQL databases, and Storm.
  • Collected data from different sources like web servers and social media for storing in HDFS and analyzing the data using other Hadoop technologies.
  • Involved in developing Social Media Analytics tools.
  • Experienced in analyzing business requirements and translating requirements into functional and technical design specifications using UML.
  • Experience in working with NoSQL databases such as HBase.
  • Good knowledge of machine-learning concepts using the Mahout and Mallet packages.
  • Capable of processing large sets of structured, semi-structured, and unstructured data; researching and applying machine-learning methods; and implementing and leveraging open-source technologies.
  • Hands-on experience in creating Ontologies.
  • Experience in creating and using the RDF, RDFS, and OWL languages.
  • Contributed enhancements to open-source projects such as Maven and Mahout.
  • Implemented complex, high-complexity projects handling data volumes ranging from gigabytes to petabytes.
  • Experienced with IDEs such as Eclipse and NetBeans.
  • Hands-on experience with graph visualization tools such as Gephi, Gruff, and Neo4j.
  • Understanding of semantic knowledge base modeling techniques, including knowledge store technologies: RDF/OWL-based ontology modeling, SPARQL query engines, and conceptual data modeling concepts.
  • Working knowledge of the statistical software R.
  • Working knowledge of the Stanford Natural Language Processing (NLP) library.
  • Experience in creating and querying triple stores.
  • Used the Agile methodology to develop the applications.
  • Strong and effective problem-solving, analytical and interpersonal skills, besides being a valuable team player.

TECHNICAL SKILLS

Languages: Java, C/C++, Assembly Language (8085/8086)

Big Data Ecosystems: Hadoop, Map Reduce, HDFS, Hbase, Zookeeper, Hive, Pig, Sqoop, Apache Storm.

Web Technologies: JSP, Servlets, XML, HTML, JSON

Databases: NoSQL (HBase, Neo4j), Oracle; Gruff (AllegroGraph graph browser)

IDEs: Eclipse, NetBeans

Application Servers: Apache Tomcat 5.x/6.0, JBoss 4.0

Toolkits and Packages: Mahout, Mallet, and Stanford NLP

Data Visualization Tools: Gephi and Neo4j

Networking Protocols: SOAP, HTTP and TCP/IP

Operating System: Windows 98/2000/XP/7/2003 Server, Linux

PROFESSIONAL EXPERIENCE

Confidential, Fairfax, VA

Hadoop Developer

Responsibilities:

  • Involved in the full life cycle of the project, from design and analysis through logical and physical architecture modeling, development, implementation, and testing.
  • Developed MapReduce programs to parse the raw data and store the refined data in tables.
  • Designed and modified database tables and used HBase queries to insert and fetch data from tables.
  • Developed algorithms for identifying influencers within specified social network channels.
  • Developed and updated social media analytics dashboards on a regular basis.
  • Involved in fetching brand data from social media applications such as Facebook and Twitter.
  • Performed data mining investigations to find new insights related to customers.
  • Involved in forecasting based on present results and insights derived from data analysis.
  • Developed a domain-specific sentiment analysis system using supervised machine learning.
  • Involved in collecting the data and identifying data patterns to build trained model using Machine Learning.
  • Responsible for managing data coming from different sources.
  • Involved in generating analytics for brand pages.
  • Experienced in working with Apache Storm.
  • Responsible for maintaining and supporting application.
  • Developed and generated insights based on brand conversations, which helped effectively drive brand awareness, engagement, and traffic to social media pages.
  • Involved in identifying topics and trends and building context around the brand.
  • Developed different formulas for calculating engagement on social media posts.
  • Maintained project documentation for the module.
  • Involved in identifying and analyzing defects, questionable function errors, and inconsistencies in output.
  • Reviewed technical documentation and provided feedback.
  • Involved in fixing issues arising out of duration testing.

Environment: Java, NLP, HBase, Machine Learning, Hadoop, HDFS, MapReduce, Apache Storm, Hibernate, SPARQL, RDF, MySQL, Struts 2, and Tiles framework
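The engagement formulas mentioned above are not spelled out in the resume, so the sketch below is only a hypothetical example of the general style of metric: a weighted interaction count normalized by audience size. The field names and weights are assumptions for illustration, not the formulas actually used.

```java
// Illustrative sketch only: the weights and inputs below are assumed,
// not the actual engagement formulas used on the project.
public class EngagementScore {

    /**
     * One common style of engagement metric: weighted interactions
     * divided by audience size, expressed as a percentage.
     */
    public static double engagementRate(long likes, long comments,
                                        long shares, long followers) {
        if (followers <= 0) {
            return 0.0; // no audience, no meaningful rate
        }
        // Hypothetical weights: shares and comments count more than likes.
        double weighted = likes + 2.0 * comments + 3.0 * shares;
        return 100.0 * weighted / followers;
    }

    public static void main(String[] args) {
        // A post with 120 likes, 30 comments, 10 shares on a 10,000-follower page.
        System.out.printf("engagement = %.2f%%%n",
                engagementRate(120, 30, 10, 10_000));
    }
}
```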

Confidential, San Mateo, CA

Software Engineer

Responsibilities:

  • Involved in the full life cycle of the project, from design and analysis through logical and physical architecture modeling, development, implementation, and testing.
  • Designed and deployed sentiment analysis classifiers to predict the sentiment of unseen data using Maximum Entropy and Naïve Bayes methods, serving online and offline use cases at scale.
  • Established process to use training sets developed by humans for classifiers, significantly scaling up innovation and deployment of targeted classifiers for specific business use cases.
  • Involved in identifying the patterns for training the data.
  • Involved in evaluating the trained model on test data.
  • Achieved 90% accuracy in sentiment analysis in the retail domain.
  • Developed trained models for the retail, insurance, and power domains.
  • Involved in fixing issues found during duration testing.
  • Responsible for understanding the scope of the project and developing an algorithm for identifying influencers per brand across various social networking sites.
  • Involved in fetching data of different users from social media applications such as Facebook and Twitter.
  • Involved in upgrading the algorithm.
  • Involved in exploring different open-source packages for generating sentiment analysis.
  • Maintained project documentation for the module.
  • Involved in identifying and analyzing defects, questionable function errors, and inconsistencies in output.
  • Reviewed technical documentation and provided feedback.
  • Involved in fixing issues arising out of duration testing.

Environment: Java, Machine Learning, Mallet, Mahout, HBase, Hadoop, HDFS, MapReduce.
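The classifiers described above were built with Mallet and Mahout at scale; purely as an illustration of the underlying Naïve Bayes method, here is a minimal multinomial Naïve Bayes sentiment classifier in plain Java with Laplace smoothing. The tiny training sentences are invented for the example and are not project data.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Minimal multinomial Naive Bayes sentiment sketch in plain Java.
// Toy illustration only; the real project used Mallet/Mahout classifiers.
public class NaiveBayesSentiment {
    private final Map<String, Map<String, Integer>> wordCounts = new HashMap<>();
    private final Map<String, Integer> docCounts = new HashMap<>();
    private final Map<String, Integer> totalWords = new HashMap<>();
    private final Set<String> vocab = new HashSet<>();
    private int totalDocs = 0;

    /** Add one labeled training document, updating per-class word counts. */
    public void train(String label, String text) {
        totalDocs++;
        docCounts.merge(label, 1, Integer::sum);
        Map<String, Integer> counts =
                wordCounts.computeIfAbsent(label, k -> new HashMap<>());
        for (String w : text.toLowerCase().split("\\s+")) {
            counts.merge(w, 1, Integer::sum);
            totalWords.merge(label, 1, Integer::sum);
            vocab.add(w);
        }
    }

    /** Return the label with the highest log posterior for the text. */
    public String classify(String text) {
        String best = null;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (String label : docCounts.keySet()) {
            // log prior + sum of log likelihoods with Laplace (add-one) smoothing
            double score = Math.log((double) docCounts.get(label) / totalDocs);
            Map<String, Integer> counts = wordCounts.get(label);
            int denom = totalWords.get(label) + vocab.size();
            for (String w : text.toLowerCase().split("\\s+")) {
                score += Math.log((counts.getOrDefault(w, 0) + 1.0) / denom);
            }
            if (score > bestScore) {
                bestScore = score;
                best = label;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        NaiveBayesSentiment nb = new NaiveBayesSentiment();
        nb.train("pos", "great product love the quality");
        nb.train("pos", "excellent service great price");
        nb.train("neg", "terrible quality waste of money");
        nb.train("neg", "awful service very disappointed");
        System.out.println(nb.classify("great service"));  // → pos
        System.out.println(nb.classify("terrible waste")); // → neg
    }
}
```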

Confidential, San Mateo, CA

Software Engineer

Responsibilities:

  • Involved in the full life cycle of the project, from design and analysis through logical and physical architecture modeling, development, implementation, and testing.
  • Designed and deployed sentiment analysis classifiers to predict the sentiment of unseen data using Maximum Entropy and Naïve Bayes methods, serving online and offline use cases at scale.
  • Established process to use training sets developed by humans for classifiers, significantly scaling up innovation and deployment of targeted classifiers for specific business use cases.
  • Involved in identifying the patterns for training the data.
  • Involved in evaluating the trained model on test data.
  • Involved in fixing issues found during duration testing.
  • Involved in fetching data of different users from social media applications such as Facebook and Twitter.
  • Involved in upgrading the algorithm.
  • Maintained project documentation for the module.
  • Reviewed technical documentation and provided feedback.
  • Involved in fixing issues arising out of duration testing.

Environment: Java, Machine Learning, Mallet, Mahout, HBase.

Confidential

Java Developer

Responsibilities:

  • Involved in the full life cycle of the project, from design and analysis through logical and physical architecture modeling, development, implementation, and testing.
  • Developed algorithms for calculating a user's influence.
  • Developed algorithms to identify the niche users related to a particular user.
  • Developed an algorithm for providing recommendations to a particular user across different social networking applications.
  • Involved in fetching user data from social media applications such as Facebook and Twitter.
  • Designed and modified database tables and used HBase queries to insert and fetch data from tables.
  • Involved in creating triples using different OWL properties.
  • Used different OWL inference properties.
  • Involved in generating data to build the user's interest graph.
  • Involved in creating ontologies for different social applications.
  • Hands-on experience working with AllegroGraph server.
  • Experience in working with OWL Lite, OWL DL, and OWL Full properties.
  • Involved in building context around the user, which in turn was used for giving recommendations.
  • Maintained project documentation for the module.
  • Involved in identifying and analyzing defects, questionable function errors, and inconsistencies in output.
  • Reviewed technical documentation and provided feedback.
  • Involved in fixing issues arising out of duration testing.

Environment: Java, GATE, AllegroGraph Server, Gephi, Gruff, Web Ontology Language (OWL), Machine Learning, SPARQL, Hibernate, RDF, MySQL, Struts 2, and Tiles framework.
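The triple creation and querying described above ran on AllegroGraph with SPARQL. As a toy illustration of the subject/predicate/object model only, the sketch below implements a tiny in-memory triple store in which `null` plays the role of a SPARQL variable; the people and the `likes`/`follows` predicates are invented for the example.

```java
import java.util.ArrayList;
import java.util.List;

// Toy in-memory triple store. The actual project used AllegroGraph with
// SPARQL; this sketch only illustrates subject/predicate/object matching.
public class TripleStore {

    /** One RDF-style statement: subject, predicate, object. */
    public static class Triple {
        final String s, p, o;
        Triple(String s, String p, String o) { this.s = s; this.p = p; this.o = o; }
    }

    private final List<Triple> triples = new ArrayList<>();

    public void add(String s, String p, String o) {
        triples.add(new Triple(s, p, o));
    }

    /** null acts as a wildcard, like a variable in a SPARQL basic graph pattern. */
    public List<Triple> match(String s, String p, String o) {
        List<Triple> out = new ArrayList<>();
        for (Triple t : triples) {
            if ((s == null || t.s.equals(s))
                    && (p == null || t.p.equals(p))
                    && (o == null || t.o.equals(o))) {
                out.add(t);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        TripleStore store = new TripleStore();
        store.add("alice", "likes", "hadoop");
        store.add("alice", "follows", "bob");
        store.add("bob", "likes", "machine-learning");
        // Analogous to: SELECT ?o WHERE { :alice :likes ?o }
        for (Triple t : store.match("alice", "likes", null)) {
            System.out.println(t.o); // → hadoop
        }
    }
}
```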
