
Hadoop Developer Resume


Atlanta, GA

SUMMARY

  • Over 7 years of experience in Information Technology, including Big Data, the Hadoop ecosystem, and Java development, with strengths in design, software processes, requirements gathering, analysis, and development of software applications in the roles of Programmer Analyst, Big Data Developer, and ETL Expert.
  • Excellent hands-on experience developing Hadoop architectures on Windows and Linux platforms.
  • Experience building Big Data solutions with the Lambda Architecture using the Cloudera distribution of Hadoop, Twitter Storm, Trident, MapReduce, Cascading, Hive, Pig, and Sqoop.
  • Experience analyzing and recommending Big Data solutions.
  • Experience analyzing marketing requirements and translating them into technical specifications.
  • Experience in capacity planning, hardware recommendations, performance tuning and benchmarking.
  • Experience working both independently and collaboratively to solve problems and deliver high-quality results in a fast-paced, unstructured environment.
  • Good experience optimizing MapReduce algorithms using mappers, reducers, combiners, and partitioners to deliver the best results for large datasets.
  • Set up standards and processes for Hadoop based application design and implementation.
  • Performed data analytics using Pig and Hive for data scientists within the team; extended Hive and Pig core functionality with custom UDFs.
  • Strong work experience in enterprise financial transactions and money-movement activities. Dealt with huge transaction volumes while interfacing with front-end applications written in Java, JSP, Struts, WebWork, Spring, JSF, Hibernate, web services, and EJB on WebSphere Application Server and JBoss.
  • Experience with NoSQL database concepts, HBase, and MongoDB.
  • Working knowledge of Node.js and the Express JavaScript framework.
  • Exposure to Android and iPhone mobile application development.
  • Over 5 years of experience in design and development of various web applications with n-tier Architecture using MVC and J2EE Architecture techniques.
  • Strong working experience using Agile methodologies, including Scrum and Kanban.
  • Experience in Test-Driven Development (TDD), mocking frameworks, and Continuous Integration (Hudson and Jenkins).
  • Delivered zero defect code for three large projects which involved changes to both front end (Java, Presentation services) and back-end (DB2).
  • Proactively suggested a tactical solution for the Quiet Time Alerts project by utilizing the message broker architecture, saving 400 man-hours off the initial project estimate.
  • Strong experience designing message flows, writing complex ESQL scripts, and invoking web services through message flows.
  • Designed and developed a Batch Framework similar to Spring Batch framework.
  • Expertise in using Design and Architectural patterns.
  • Exposure to PCI and HIPAA compliance policies and to HL7 standards using Mirth Connect.

TECHNICAL SKILLS

Big Data: Hadoop, Storm, Trident, HBase, Hive, Flume, Cassandra, Sqoop, Oozie, Pig, MapReduce, ZooKeeper, YARN

Operating Systems: UNIX, Mac, Linux, Windows 2000 / NT / XP / Vista, Android

Programming Languages: Java (JDK 1.4/1.5/1.6), C/C++, MATLAB, R, HTML, SQL, PL/SQL

Frameworks: Hibernate 2.x/3.x, Spring 2.x/3.x, Struts 1.x/2.x, and JPA

Web Services: WSDL, SOAP, Apache CXF/XFire, Apache Axis, REST, Jersey

Databases/technologies: Oracle 8i/9i/10g, Microsoft SQL Server, DB2 & MySQL 4.x/5.x

Middleware Technologies: WebSphere MQ, WebSphere Message Broker, XML Gateway, JMS

Web Technologies: J2EE, SOAP and REST web services, JSP, Servlets, EJB, JavaScript, Struts, Spring, WebWork, Direct Web Remoting, HTML, XML, JMS, JSF, Ajax

Testing Frameworks: Mockito, PowerMock, EasyMock

Web/Application Servers: IBM WebSphere Application Server, JBoss, Apache Tomcat

Other Software: Borland StarTeam, ClearCase, JUnit, ANT, Maven, Android Platform, Microsoft Office, SQL Developer, DB2 Control Center, Microsoft Visio, Hudson, Subversion, Git, Nexus, Artifactory, and Trac

Development Strategies: Agile, Lean Agile, Pair Programming, Waterfall, and Test-Driven Development

PROFESSIONAL EXPERIENCE

Confidential, Atlanta, GA

Hadoop Developer

Responsibilities:

  • Gathered the business requirements from the Business Partners and Subject Matter Experts.
  • Worked with Data Modeler and DBAs to build the data model and table structures.
  • Actively participated in discussion sessions to design the ETL job flow.
  • Worked with 10+ source systems, receiving batch files from heterogeneous systems such as Unix, Windows, Oracle, mainframe, and DB2.
  • Handled 20 TB of data volume with 10 Node cluster in Test environment.
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.
  • Managed and reviewed the Hadoop log files.
  • Supported HBase architecture design with the Hadoop architect team to develop a database design in HDFS.
  • Supported MapReduce programs running on the cluster and wrote MapReduce jobs using the Java API.
  • Involved in HDFS maintenance and loading of structured and unstructured data.
  • Imported data from mainframe datasets to HDFS using Sqoop; also handled importing data from various sources (Oracle, DB2, Cassandra, and MongoDB) into Hadoop and performed transformations using Hive and MapReduce.
  • Wrote Hive queries for data analysis to meet the business requirements.
  • Wrote Pig Latin scripts and also developed UDFs for Pig Data Analysis.
  • Developed scripts and batch jobs to schedule various Hadoop programs.
  • Utilized Agile Scrum Methodology to help manage and organize a team of 4 developers with regular code review sessions.
  • Upgraded the Hadoop cluster from CDH4 to CDH5 and set up a high-availability cluster to integrate Hive with existing applications.
  • Analyzed the data by running Hive queries and Pig scripts to understand user behavior.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Developed Hive queries to process the data and generate data cubes for visualization.
  • Optimized the mappings using various optimization techniques and also debugged some existing mappings using the Debugger to test and fix the mappings.
  • Updated mappings, sessions, and workflows as part of ETL changes; modified existing ETL code and documented the changes.
  • Installed Oozie workflow engine to run multiple Hive and Pig jobs.
  • Familiar with Scala: closures, higher-order functions, and monads.
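Custom Hive UDFs like the ones mentioned above typically wrap a single evaluate() method. Below is a minimal sketch of one such UDF's core logic, using a hypothetical account-masking function as the example; in Hive the class would extend org.apache.hadoop.hive.ql.exec.UDF and be registered with CREATE TEMPORARY FUNCTION, but the method is shown as plain Java so the logic stands alone.

```java
// Hypothetical example: mask all but the last four characters of an account
// number. In a real Hive deployment this class would extend
// org.apache.hadoop.hive.ql.exec.UDF; the evaluate() signature is the same.
public class MaskAccountUdf {
    public String evaluate(String account) {
        // Pass through nulls and short values unchanged, as Hive UDFs
        // conventionally do for NULL columns.
        if (account == null || account.length() <= 4) {
            return account;
        }
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < account.length() - 4; i++) {
            masked.append('*');
        }
        masked.append(account.substring(account.length() - 4));
        return masked.toString();
    }
}
```

Once packaged in a jar, such a function would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION and then used like any built-in in a SELECT.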

Environment: Hadoop, Java, MapReduce, HDFS, Hive, Pig, Linux, XML, Eclipse, Cloudera, CDH4/5 Distribution, DB2, SQL Server, Oracle 11i, MySQL

Confidential, Lewisville, TX

Big data / Hadoop Developer

Responsibilities:

  • Maintained system integrity of all sub-components (primarily HDFS, MapReduce, HBase, and Hive).
  • Integrated the Hive warehouse with HBase.
  • Migrated the needed data from MySQL into HDFS using Sqoop and imported various formats of flat files into HDFS.
  • Loaded the data into HBase tables for the UI web application.
  • Wrote customized Hive UDFs in Java where the required functionality was too complex for built-in functions.
  • Designed and created Hive external tables using a shared metastore instead of Derby, with partitioning, dynamic partitioning, and bucketing.
  • Wrote HiveQL scripts to create, load, and query tables in Hive.
  • Worked with HiveQL on large volumes of log data to perform trend analysis of user behavior across various online modules.
  • Supported MapReduce programs running on the cluster.
  • Monitored system health and logs and responded to any warning or failure conditions.
  • Performed advanced procedures like text analytics and processing, using the in-memory computing capabilities of Spark using Scala.
  • Worked on Big Data integration and analytics based on Hadoop, SOLR, Spark, Kafka, Storm, and webMethods technologies.
  • Implemented Spark using Scala and Spark SQL for faster testing and processing of data.
  • Streamed data in real time using Spark with Kafka.
  • Worked on migrating MapReduce programs into Spark transformations using Spark and Scala.
  • Generated final reporting data using Tableau for testing by connecting to the corresponding Hive tables through the Hive ODBC connector.
  • Strongly recommended bringing in Elasticsearch and was responsible for its installation, configuration, and administration.
  • Developed and maintained efficient Talend ETL jobs for data ingest.
  • Worked on the Talend RTX ETL tool; developed jobs and scheduled them in Talend Integration Suite.
  • Modified reports and Talend ETL jobs based on feedback from QA testers and users in the development and staging environments.
  • Involved in migrating Hadoop jobs into higher environments such as SIT, UAT, and Prod.
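The trend analysis described above boils down to aggregating log records per online module. A plain-Java sketch of that aggregation follows, with a hypothetical tab-separated log format; in production this ran as HiveQL over Hive tables, but the core grouping logic is the same.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Plain-Java illustration of the per-module trend aggregation. Assumed
// (hypothetical) log line format: "timestamp<TAB>userId<TAB>module".
public class ModuleTrend {
    public static Map<String, Integer> hitsPerModule(List<String> logLines) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String line : logLines) {
            String[] fields = line.split("\t");
            if (fields.length < 3) {
                continue; // skip malformed records
            }
            // Count one hit for the module named in the third field.
            counts.merge(fields[2], 1, Integer::sum);
        }
        return counts;
    }
}
```

In HiveQL the equivalent would be a GROUP BY over the module column of the log table.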

Environment: Hortonworks Hadoop 2.3, HDFS, Hive, HiveQL scripts, Scala, MapReduce, Storm, Java, HBase, Pig, Sqoop, shell scripts, Oozie Coordinator, MySQL, Tableau, Elasticsearch, Talend, and SFTP.

Confidential, Jacksonville, FL

Hadoop / Java Developer

Responsibilities:

  • Re-architected all the applications to utilize the latest infrastructure in a span of three months and helped the developers to implement successfully.
  • Designed the Hadoop jobs to create the product recommendation using collaborative filtering.
  • Designed the COSA pretest utility framework using JSF MVC, JSF validation, tag libraries, and JSF backing beans.
  • Integrated the order capture system with Sterling OMS using a JSON web service.
  • Configured the ESB to transform the Order capture XML to Sterling message.
  • Configured and Implemented Jenkins, Maven and Nexus for continuous integration.
  • Mentored and implemented the test driven development (TDD) strategies.
  • Loaded the data from Oracle to HDFS (Hadoop) using Sqoop.
  • Developed the data transformation scripts using Hive and MapReduce.
  • Designed and developed User Defined Functions (UDFs) for Hive in Java.
  • Loaded the data into HBase using bulk load and the HBase API.
  • Designed and implemented the Open API using Spring REST web services.
  • Proposed the integration-pipeline testing strategy using Cargo.
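The product-recommendation work above relied on collaborative filtering, whose central step is scoring how similar two items are from their user-rating vectors. A minimal plain-Java sketch of item-item cosine similarity follows (hypothetical data; the real computation ran as Hadoop jobs over the full purchase history).

```java
import java.util.Map;

// Item-item cosine similarity: each map holds one item's ratings, keyed by
// user ID. Items rated similarly by the same users score close to 1.0.
public class ItemSimilarity {
    public static double cosine(Map<String, Double> a, Map<String, Double> b) {
        double dot = 0.0, normA = 0.0, normB = 0.0;
        for (Map.Entry<String, Double> e : a.entrySet()) {
            Double other = b.get(e.getKey());
            if (other != null) {
                dot += e.getValue() * other; // only co-rating users contribute
            }
            normA += e.getValue() * e.getValue();
        }
        for (double v : b.values()) {
            normB += v * v;
        }
        if (normA == 0.0 || normB == 0.0) {
            return 0.0; // no ratings on one side: treat as dissimilar
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }
}
```

In a MapReduce setting, the mapper would emit (item-pair, rating-product) pairs and the reducer would accumulate the dot products and norms.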

Environment: Java, JSP, Spring, JSF, REST web services, AspectJ, IntelliJ, WebLogic, Subversion, Git, Jenkins, Nexus, jQuery, Oracle, Mockito, PowerMock, Hadoop, Sqoop, HBase, Hive, Sterling OMS, TDD, and Agile

Confidential

Java developer

Responsibilities:

  • Designed and coded the appeals module to replicate declined credit applications for further approval processing with the necessary changes, using Spring, JSF, and Hibernate.
  • Designed the Comparison utility to compare the Credit applications to check whether repricing is required.
  • Created the generic Castor mappings and applied XSLT to generate the SOAP requests that invoke the middleware services.
  • Developed the pre and post request interceptors to perform the basic Database operations.
  • Developed the asset-level, contract-level, and short-funding modules and integrated them with the existing modules.
  • Designed and developed the Cost modules for Assets and Properties.
  • Generated the contract documents using iText.
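The XSLT step above (wrapping an application payload into a SOAP request) can be sketched with the JDK's built-in XSLT processor. The stylesheet and element names below are hypothetical; the real mappings were generated from the Castor bindings.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

// Sketch: apply an XSLT stylesheet that wraps an arbitrary payload document
// in a SOAP 1.1 envelope, using the JDK's JAXP transformer.
public class SoapRequestBuilder {
    // Hypothetical inline stylesheet; in practice this lived in its own file.
    private static final String STYLESHEET =
        "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + "<xsl:template match='/'>"
      + "<soap:Envelope xmlns:soap='http://schemas.xmlsoap.org/soap/envelope/'>"
      + "<soap:Body><xsl:copy-of select='.'/></soap:Body>"
      + "</soap:Envelope>"
      + "</xsl:template></xsl:stylesheet>";

    public static String toSoap(String payloadXml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(STYLESHEET)));
        t.setOutputProperty("omit-xml-declaration", "yes");
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(payloadXml)),
                    new StreamResult(out));
        return out.toString();
    }
}
```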

Environment: Java, JSP, Web services, Hibernate, Log4j, Eclipse, JBoss, Subversion, Castor mappings, Direct Web Remoting, Spring, JSF, XSLT, WebWork, Oracle, TOAD, Waterfall.

Confidential

Java Developer

Responsibilities:

  • Designed the high-level and detailed design for Quiet Time Alerts by utilizing the existing message broker infrastructure; the architecture reduced the initial estimate by 400 man-hours.
  • Invoked the web service from Message flows through XML gateway in a secured manner.
  • Developed the user interface allowing members to set up their preferences, using JSP based on Struts and the custom tag library facility with form beans and action classes.
  • Developed ANT scripts to build the different modules for the Project, such as building the binary files and scripts for deploying to the server.
  • Provided technical assistance to the Infrastructure team in successfully maneuvering the challenges faced during installation and execution of the project.
  • Worked closely with the Infrastructure team during the build process in deployment and configuration of the Message Broker. Resolved environmental issues in a timely manner and performed IT checkout to ensure successful deployment on all the Message Broker servers.
  • Created the technical design documents and test scripts adhering to quality control standards.
  • Coded the complex ESQL work modules for the new services.
  • Mentored three new team members by involving them in hands-on training sessions.

Environment: Message Broker, ESQL, Java, JSP, Struts, Hibernate, Apache Ant, Log4j, XML Gateway, Rational Software Architect.
