
Big Data Engineer Resume

New Jersey

SUMMARY

  • Technology professional with 6 years of progressive and diverse experience developing Java, Spark with Scala, and Big Data applications, with an emphasis on Hadoop ecosystem tools and technologies, using industry-accepted methodologies and procedures.
  • Strong ability and experience working with the object-oriented and functional features of JVM-based languages such as Java 8 and Scala, with good knowledge of Python.
  • Hands-on experience with multiple tools in the Hadoop ecosystem, including Spark SQL, Spark Streaming, Hive, Pig, Sqoop, Kafka, and MapReduce, on Hadoop distributions such as Hortonworks, Cloudera, and EMR.
  • Experience in the full life cycle of migrating data from an existing in-house data center to the cloud built on top of AWS, including the use of multiple compression techniques.
  • Experience working with multiple RDBMSs such as MySQL and Oracle, and column-oriented NoSQL databases such as Cassandra and HBase.
  • Experience with DevOps tools such as Jenkins for continuous integration, Git as a repository, and Chef for automating system-level patching and upgrades.
  • Possesses an excellent work ethic and always likes to explore new technology trends and apply them when the opportunity presents itself.

TECHNICAL SKILLS

Big data Ecosystem: Spark Streaming, Spark SQL, Kafka, Map Reduce, Hive, Pig, Sqoop, Oozie

Operating Systems: Windows, Linux distributions (Ubuntu, Mint, Fedora)

Languages: Java, Scala, Python

Scripting Languages: Unix Shell Scripting, JavaScript

RDBMS DB: Oracle, MySQL

NoSQL DB: HBase, Cassandra

Servers: Tomcat, JBoss

Operations: Maven, Jenkins, SVN, Git, Tableau

Web Services: REST, SOAP

Markup Languages: HTML/HTML5, XML, XML Schema, CSS/CSS3

Search Engines: Elasticsearch, Kibana, Solr

PROFESSIONAL EXPERIENCE

Big Data Engineer

Confidential, New Jersey

Responsibilities:

  • Actively participated from the design phase of the data lake, starting with POCs using multiple Big Data tools (Cassandra, HBase, Kafka, DynamoDB, AWS, MS Azure, etc.) to identify the tools that best solve the business problem at hand.
  • Shared responsibility for the complete life cycle of migrating data from the existing in-house infrastructure to the cloud built on top of AWS S3.
  • Worked on creating a data model to establish relationships between the tables in Elasticsearch.
  • Used Kibana extensively to search the data in Elasticsearch.
  • Extensively involved in writing Spark applications with Spark SQL/Streaming for faster data processing.
  • Responsible for the design and creation of Hive tables, partitioning, bucketing, loading data, and writing Hive queries. Migrated existing Hive scripts to Spark SQL for better performance.
  • Created RESTful services and converted data formats to make the data consumable by those services.
  • Developed Spark Streaming jobs in Java 8 to receive real-time data from Kafka, process it, and store it in HDFS.
  • Validated and cleansed the data using Pig statements; hands-on experience developing Pig macros.
  • Used the Agile (Scrum) methodology for software development.
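The Kafka-to-HDFS streaming work above can be sketched roughly as follows. This is a minimal illustration in Python rather than the Java 8 used on the project, and the broker address, topic name, record fields, and HDFS paths are all hypothetical; the cleansing step is kept as a pure function so it can be exercised without a cluster.

```python
import json

def parse_event(raw: str):
    """Parse one raw Kafka record (JSON) into a flat dict, or None if malformed.

    The field names ("user_id", "ts") are illustrative, not from the project.
    Returning None for bad records is the cleansing step.
    """
    try:
        event = json.loads(raw)
        return {"user_id": event["user_id"], "ts": event["ts"]}
    except (json.JSONDecodeError, KeyError, TypeError):
        return None  # drop malformed records

def run_stream():
    """Wire the parser into a Spark Structured Streaming job.

    Assumes a cluster with PySpark and the Kafka connector available;
    imported lazily because this part is not runnable standalone.
    """
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka-to-hdfs").getOrCreate()
    raw = (spark.readStream.format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
           .option("subscribe", "events")                     # hypothetical topic
           .load())
    # Kafka delivers the payload as bytes; cast to string, then persist to HDFS.
    query = (raw.selectExpr("CAST(value AS STRING) AS value")
             .writeStream.format("parquet")
             .option("path", "hdfs:///data/events")            # hypothetical path
             .option("checkpointLocation", "hdfs:///chk/events")
             .start())
    query.awaitTermination()
```

Keeping the parse/cleanse logic separate from the Spark wiring makes it unit-testable without standing up Kafka or HDFS.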

Environment: Java 8, Sqoop, Spark SQL, Spark Streaming, Kafka, Scala, Amazon Web Services (AWS EMR), Hive, Pig, REST, Oozie, Maven, Elasticsearch, Kibana, Jenkins.

Big Data Engineer (Hadoop, Spark)

Confidential, New Jersey

Responsibilities:

  • Actively participated in the complete software development lifecycle (scope, design, implement, deploy, test), including design and code reviews.
  • Moved bulk data into HBase using MapReduce integration.
  • Developed Pig programs for loading and filtering streaming data into HDFS using Kafka.
  • Used MapReduce and Spark with Scala for operations such as clickstream analysis and analysis of batch data.
  • Developed an HBase data model on top of HDFS data to perform real-time analytics.
  • Installed the ES-Hadoop connector to load data from Hadoop into Elasticsearch.
  • Handled Avro data files by passing the schema into HDFS using Avro tools and MapReduce.
  • Optimized Hive queries using partitioning and bucketing techniques to control data distribution.
  • Developed visualizations and dashboards using BI tools such as Tableau and Platfora.
  • Used the Oozie scheduler to automate the pipeline workflow and orchestrate the MapReduce, Sqoop, Hive, and Pig jobs that extract the data on a schedule.
  • Followed a story-driven Agile development methodology and actively participated in daily scrum meetings.
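The partitioning and bucketing optimization mentioned above controls data distribution by routing each row to a fixed bucket file based on a hash of the clustering column. A minimal Python sketch of that routing idea (Hive uses its own Java hash internally; a stable stdlib hash stands in here, and the keys and bucket count are made up):

```python
import hashlib

def bucket_for(key: str, num_buckets: int) -> int:
    """Assign a row to a bucket the way Hive does conceptually:
    hash the clustering column value, then take it modulo the bucket count.
    """
    digest = int(hashlib.md5(key.encode("utf-8")).hexdigest(), 16)
    return digest % num_buckets

# Rows with the same key always land in the same bucket, so a bucketed
# join can match files bucket-to-bucket instead of shuffling everything.
rows = ["user_1", "user_2", "user_1", "user_3"]
buckets = [bucket_for(r, 4) for r in rows]
assert buckets[0] == buckets[2]  # same key, same bucket
```

This stable key-to-bucket mapping is what lets Hive prune work and co-locate join keys without scanning the full table.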

Environment: MapReduce, Spark with Scala, Hive, Pig, Sqoop, Oozie, HBase, Platfora, Redis, REST services, Linux, Elasticsearch, Maven, Jenkins, HDFS.

Java Developer

Confidential

Responsibilities:

  • Designed, developed, and validated the user interface using HTML, JavaScript, and XML.
  • Handled database access by implementing a controller servlet.
  • Implemented PL/SQL stored procedures and triggers.
  • Extensively worked with Struts, Hibernate, and Spring (Spring Core, Spring MVC) in application design and development.
  • Experience using various design patterns, including Business Delegate, Session Facade, Service Locator, Singleton, and Model-View-Controller.
  • Extensively worked with web services, including SOAP over JMS and HTTP, and REST (JAX-RS).
  • Experience with multithreading and Enterprise JavaBeans (EJB); worked with session, entity, and message-driven beans.
  • Experience preparing test procedures, test scenarios, test cases, and test data.
  • Created test cases using element locators and Selenium WebDriver methods.
  • Executed Selenium test cases and reported defects.
  • Involved in regression testing and automation infrastructure development using Selenium.
  • Expertise in implementing automation frameworks using Selenium, LoadRunner, and UFT.
  • Experience developing applications on WebSphere 8.5 and 7, JBoss, Tomcat, and BEA WebLogic. Developed server run scripts for automation on the JBoss 6.3 application server.
  • Strong experience in SOA (Service-Oriented Architecture), EAI (Enterprise Application Integration), and ESB (Enterprise Service Bus).
  • Extensive experience developing web pages quickly and effectively using JSF, Ajax, jQuery, JavaScript, HTML, and CSS, and making web pages cross-browser compatible.
  • Good unit testing skills using the JUnit framework, including functional JUnit tests that capture user-entered data and map it back to the database to provide accurate test results.
  • Interacted with the application architect to design the workflow and service integration on top of the Spring MVC, Ajax, and web services layers.
  • Set up the WebSphere application server and used Ant to build the application and deploy it in WebSphere.
  • Used the Spring Framework for dependency injection and integrated it with Hibernate.
  • Involved in writing JUnit test cases.
  • Used Log4j to log errors in the application.
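Of the design patterns listed above, Singleton is the most compact to illustrate: one shared instance per process, reused everywhere it is requested. A minimal sketch in Python rather than the Java used on this project; the class name and settings dict are illustrative only.

```python
class ConfigRegistry:
    """Classic Singleton: every construction returns the same shared instance.

    Hypothetical example class, not from the project codebase.
    """
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            # First call: create and remember the single instance.
            cls._instance = super().__new__(cls)
            cls._instance.settings = {}
        return cls._instance

a = ConfigRegistry()
b = ConfigRegistry()
a.settings["env"] = "prod"
assert a is b                      # both names refer to the same object
assert b.settings["env"] == "prod" # state is shared through either name
```

In the Java/Spring setting above the same effect is normally obtained by letting the container manage singleton-scoped beans rather than hand-rolling the pattern.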

Environment: Java, Spring Framework, J2EE, HTML, JUnit, XML, JavaScript, Eclipse, WebLogic, PL/SQL, Maven, Oracle.
