
Hadoop Developer Resume


Harrisburg, PA

SUMMARY:

  • An information technology professional with 7+ years of industry experience as a Big Data technical consultant.
  • In-depth understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce.
  • Experienced in Waterfall and Agile development methodologies.
  • Expertise in writing Hadoop jobs to analyze data using MapReduce, Hive, and Pig.
  • Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
  • Experienced in extending Hive and Pig core functionality by writing custom UDFs in Java.
  • Experience with developing large-scale distributed applications.
  • Experience in developing solutions to analyze large data sets efficiently
  • Experience in Data Warehousing and ETL processes.
  • Knowledge of star schema and snowflake modeling, fact and dimension tables, and physical and logical modeling.
  • Strong database, SQL, ETL and data analysis skills.
  • Good understanding of Data Mining and Machine Learning techniques
  • Experienced in NoSQL databases such as HBase, and MongoDB
  • Experienced in job workflow scheduling and monitoring with Oozie and in distributed coordination with ZooKeeper.
  • Knowledge of administrative tasks such as installing Hadoop and its ecosystem components such as Hive and Pig
  • Experienced in developing applications using Java/J2EE technologies such as Servlets, JSP, EJB, JDBC, JNDI, and JMS.
  • Experienced in developing applications using Hibernate (an object/relational mapping framework).
  • Experienced in developing Web Services using JAX-RPC, JAXP, SOAP, and WSDL; also knowledgeable in the WSIF (Web Services Invocation Framework) API.
  • Thorough knowledge and experience of XML technologies (DOM and SAX parsers), with extensive experience in XPath, XML Schema, DTDs, XSLT, XML Spy, and the MapForce editor.
  • Experience in Message based systems using JMS, TIBCO & MQSeries.
  • Experience in writing SQL and database objects such as stored procedures, triggers, PL/SQL packages, and cursors for Oracle, SQL Server, DB2, and Sybase.
  • Proficient in writing build scripts using Ant & Maven.
  • Experienced in using CVS, SVN, and SharePoint for version management.
  • Proficient in unit testing applications using JUnit and MRUnit, and in application logging using Log4j.
  • Ability to learn and adapt quickly and to correctly apply new tools and technology.
  • Strong communication and analytical skills with very good experience in programming & problem solving.

ADDITIONAL SKILLS:

  • Excellent interpersonal, communication and relationship-building skills.
  • Listen attentively, communicate persuasively and follow through diligently.
  • Interpersonal, managerial, and organizational skills, with the ability to multi-task.
  • Fluent in Hindi, Urdu, and Punjabi.
  • Comprehensive problem-solving and decision-making capabilities.
  • Self-motivated, innovative, and analytical team player with strong interpersonal skills.
  • Determined, with the ability to deliver with minimal guidance from seniors.
  • Globally minded and delivery-centric.
  • Proficient in Microsoft Office

TECHNICAL SKILLS:

Big Data Ecosystem: Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Oozie, Kafka, ZooKeeper

NoSQL Databases: HBase, Cassandra, CouchDB

Databases & ETL: Teradata, MS SQL Server, Oracle, Informix, Sybase, Informatica, DataStage

Java/J2EE: Java, J2EE, Spring, Hibernate, EJB, Web services (JAX-RPC, JAXP, JAXM), JMS, JNDI, Servlets, JSP, Jakarta Struts

Application Servers: BEA WebLogic, IBM WebSphere, JBoss, Tomcat

Design & Modeling: UML, OOAD

Web Technologies: HTML, AJAX, CSS, XHTML, XML, XSL, XSLT, WSDL, SOAP

Tools: CVS, SVN, SharePoint, ClearCase, ClearQuest, WinCVS, JUnit, MRUnit, Ant, Maven, Log4j, FrontPage

IDEs: Eclipse, NetBeans

Operating Systems: Linux, UNIX, Windows

PROFESSIONAL EXPERIENCE:

Confidential, Harrisburg, PA

Hadoop Developer

Responsibilities:

  • Created Hive tables and loaded retail transactional data from Teradata using Sqoop.
  • Loaded home-mortgage data from the existing DWH tables (SQL Server) to HDFS using Sqoop.
  • Wrote Hive queries to provide a consolidated view of the mortgage and retail data.
  • Loaded data back to Teradata for BASEL reporting and for business users to analyze and visualize it using Datameer.
  • Orchestrated hundreds of Sqoop scripts, Pig scripts, and Hive queries using Oozie workflows and sub-workflows.
  • Loaded load-ready files from mainframes to Hadoop, converting the files to ASCII format.
  • Developed Pig scripts to replace the existing home-loans legacy process on Hadoop, feeding the data back to legacy retail mainframe systems.
  • Developed MapReduce programs to write data with headers and footers and Shell scripts to convert the data to fixed-length format suitable for Mainframes CICS consumption.
  • Used Maven for continuous build integration and deployment.
  • Followed Agile methodology for development, using XP practices (TDD, continuous integration).
  • Participated in daily scrum meetings and iterative development.
  • Exposure to burn-up, burn-down charts, dashboards, velocity reporting of sprint and release progress.
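
The mainframe hand-off described above (headers, footers, fixed-length records for CICS consumption) can be sketched in plain Java. This is a minimal illustration only: the three-column layout, field widths, and HDR/TRL markers are hypothetical stand-ins, not the project's actual record format.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Minimal sketch of converting pipe-delimited records into a
// fixed-length layout with header and trailer records, as a CICS
// consumer might expect. Field widths and the HDR/TRL markers
// are hypothetical.
public class FixedLengthWriter {

    // Hypothetical field widths for a three-column record.
    static final int[] WIDTHS = {10, 20, 8};
    static final int TOTAL_WIDTH = Arrays.stream(WIDTHS).sum();

    // Right-pads (or truncates) a value to the given width.
    public static String pad(String value, int width) {
        if (value.length() >= width) {
            return value.substring(0, width);
        }
        StringBuilder sb = new StringBuilder(value);
        while (sb.length() < width) {
            sb.append(' ');
        }
        return sb.toString();
    }

    // Converts one pipe-delimited record to a fixed-length line.
    public static String toFixedLength(String record) {
        String[] fields = record.split("\\|", -1);
        StringBuilder out = new StringBuilder();
        for (int i = 0; i < WIDTHS.length; i++) {
            out.append(pad(i < fields.length ? fields[i] : "", WIDTHS[i]));
        }
        return out.toString();
    }

    // Wraps the data records with a header and a trailer carrying
    // the record count, a common mainframe file convention.
    public static List<String> withHeaderAndTrailer(List<String> records) {
        List<String> out = new ArrayList<>();
        out.add(pad("HDR", TOTAL_WIDTH));
        for (String r : records) {
            out.add(toFixedLength(r));
        }
        out.add(pad("TRL" + records.size(), TOTAL_WIDTH));
        return out;
    }
}
```

In the real job this logic would sit inside a MapReduce task or shell pipeline; here it is kept stand-alone so the formatting rules are easy to see.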

Technology: Hadoop, MapReduce, Hive, Pig, Sqoop, Avro, Datameer, Teradata, SQL Server, IBM Mainframes, Java 7.0, Log4J, Junit, MRUnit, SVN, JIRA.

Confidential, Atlanta, GA

Hadoop Developer

Responsibilities:

  • Worked with technology and business groups for Hadoop migration strategy.
  • Researched and recommended a suitable technology stack for the Hadoop migration, considering the current enterprise architecture.
  • Validated and made recommendations on Hadoop infrastructure and data-center planning, accounting for data growth.
  • Transferred data to and from the cluster using Sqoop and various storage media, such as Informix tables and flat files.
  • Developed MapReduce programs and Hive queries to analyze sales patterns and a customer-satisfaction index over data in various relational database tables.
  • Worked extensively on performance optimization, arriving at appropriate design patterns for the MapReduce jobs by analyzing I/O latency, map time, combiner time, reduce time, etc.
  • Developed Pig scripts in areas where extensive coding needed to be reduced.
  • Developed UDFs for Pig as needed.
  • Followed Agile methodology for the entire project.
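
The satisfaction-index analysis above ran as MapReduce and Hive jobs; its core reduce-side aggregation can be sketched stand-alone in plain Java. The "product,score" record layout is a hypothetical illustration, and in the real job this logic would live inside a Reducer.

```java
import java.util.HashMap;
import java.util.Map;

// Stand-alone sketch of the reduce-side aggregation behind a
// customer-satisfaction index: average score per product.
// The "product,score" record layout is hypothetical.
public class SatisfactionIndex {

    // Computes the average score per product from "product,score" lines.
    public static Map<String, Double> averageByProduct(Iterable<String> lines) {
        Map<String, double[]> acc = new HashMap<>(); // product -> {sum, count}
        for (String line : lines) {
            String[] parts = line.split(",");
            String product = parts[0].trim();
            double score = Double.parseDouble(parts[1].trim());
            double[] a = acc.computeIfAbsent(product, k -> new double[2]);
            a[0] += score; // running sum
            a[1] += 1;     // running count
        }
        Map<String, Double> result = new HashMap<>();
        for (Map.Entry<String, double[]> e : acc.entrySet()) {
            result.put(e.getKey(), e.getValue()[0] / e.getValue()[1]);
        }
        return result;
    }
}
```

Keeping sum and count separately, rather than averages, is what makes the same logic usable in a combiner, where partial results from different mappers must still merge correctly.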

Technology: Hadoop, MapReduce, Hive, Pig, Sqoop, Kafka, Java 7.0, XML, WSDL, SOAP, Webservices, Oracle/Informix, Log4J, Junit, SVN.

Confidential, Lynchburg, VA

Sr. JAVA Developer

Responsibilities:

  • Involved in design process using UML & RUP (Rational Unified Process).
  • Developed different Components and Adapters of the integration framework using Stateless Session EJB.
  • Developed various interfaces to integrate the SAP and various legacy systems such as PMWeb, CCWeb, Etime, EEDS, JDEdwards etc.
  • Developed different interfaces using EJB Session Beans (Stateless) and Message Driven Beans for both synchronous and asynchronous communication.
  • Developed various interfaces such as FILE-RFC, RFC-FILE, JMS-RFC, RFC-Database using EJB, JMS, JDBC, JNDI, JCO, Java-Proxy and Webservices.
  • Developed the JDBC adapter using Hibernate for persisting and retrieving data in the database.
  • Developed Webservices using SOAP, WSDL, JAXP and AXIS to integrate with the external systems.
  • Represented internal data in XML; applied business validation and mapping rules to the XML data and transformed it to the target XML structure, largely using the DOM parser.
  • Implemented J2EE Design patterns such as Session Façade, MVC, Business Delegate, Value Object, Data Access Object etc.
  • Extensively interacted with SAP functional and technical teams in resolving technical and functional issues.
  • Effectively performed code refactoring to modularize the code and improve error handling and fault tolerance.
  • Provided second- and third-level production support, resolving issues related to the interfaces.
  • Developed the interfaces using Eclipse. Deployed the application in SAP Web Application Server.
  • Managed the code using the configuration-management tool CVS.
  • Worked on Unit and Integration testing of the interfaces.
  • Involved in designing test plans, test cases and overall Unit and Integration testing of system.
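
The DOM-based validation-and-mapping step described above can be sketched with the JDK's built-in parser. The element names (customerId, custNo) and the single validation rule are hypothetical illustrations, not the project's actual message schema.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Sketch of DOM-based validation and mapping: parse a source XML
// message, check a required field, and copy it into a target
// structure under a new name. Element names are hypothetical.
public class XmlMapper {

    // Parses XML text into a DOM document using the JDK parser.
    public static Document parse(String xml) {
        try {
            return DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        } catch (Exception e) {
            throw new RuntimeException("XML parse failed", e);
        }
    }

    // Validates that <customerId> is present, then maps it onto a
    // target document as <order><custNo>...</custNo></order>.
    public static Document mapToTarget(String xml) {
        NodeList nodes = parse(xml).getElementsByTagName("customerId");
        if (nodes.getLength() == 0) {
            throw new IllegalArgumentException("missing customerId"); // validation rule
        }
        try {
            Document target = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            Element root = target.createElement("order");
            Element custNo = target.createElement("custNo");
            custNo.setTextContent(nodes.item(0).getTextContent());
            root.appendChild(custNo);
            target.appendChild(root);
            return target;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

DOM suits this kind of in-memory transform because the whole message is available for cross-field validation; for very large documents a streaming SAX parser would be the usual alternative.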

Technology: EJB, JSP, Struts, Webservices, JMS, JNDI, JDBC, SAP Web Application Server, Eclipse, Hibernate, SAP XI, SQL, Sybase, XML, XSD, WSDL, SOAP, CVS, Win 2003 Server.

Confidential, Winston Salem, NC

Sr. JAVA Developer

Responsibilities:

  • Analyzed business requirements, performed gap analysis, and transformed the requirements into detailed design specifications.
  • Involved in design process using UML & RUP (Rational Unified Process).
  • Performed Code Reviews and responsible for Design, Code and Test signoff.
  • Assisted the team in development, clarified design issues, and fixed defects.
  • Involved in designing test plans, test cases and overall Unit and Integration testing of system.
  • Developed the business-tier logic using stateful and stateless session beans.
  • Developed Web Services using JAX-RPC, JAXP, WSDL, SOAP, XML to provide facility to obtain quote, receive updates to the quote, customer information, status updates and confirmations.
  • Responsible for Design and development of web services to test the security aspects of Web Services enabled CICS Transaction Gateway.
  • Extensively used SQL queries, PL/SQL stored procedures & triggers in data retrieval and updating of information in the Oracle database using JDBC.
  • Wrote, configured, and maintained the Hibernate configuration files, and wrote and updated Hibernate mapping files for each Java object to be persisted.
  • Wrote Hibernate Query Language (HQL) queries and tuned them for better performance.
  • Used the design patterns such as Session Façade, Command, Adapter, Business Delegate, Data Access Object, Value Object and Transfer Object.
  • Deployed the application in Weblogic and used Weblogic Workshop for development and testing.
  • Involved in application performance tuning (code refactoring).
  • Wrote test cases using JUnit, following test-first development.
  • Used Rational Clear Case & PVCS for source control. Also used Clear Quest for defect management.
  • Wrote build files using Ant, and used Maven in conjunction with Ant to manage builds.
  • Ran nightly builds to deploy the application on different servers.
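
Of the design patterns listed above, the Transfer Object (Value Object) pattern is the simplest to sketch: an immutable, serializable carrier that moves related data across tiers in one call instead of many fine-grained remote getters. The quote fields below are hypothetical.

```java
import java.io.Serializable;

// Minimal sketch of the Transfer Object (Value Object) pattern:
// an immutable, serializable carrier for a quote's data.
// Field names are hypothetical.
public final class QuoteTO implements Serializable {

    private final String quoteId;
    private final double premium;

    public QuoteTO(String quoteId, double premium) {
        this.quoteId = quoteId;
        this.premium = premium;
    }

    public String getQuoteId() { return quoteId; }
    public double getPremium() { return premium; }
}
```

With remote EJBs, each getter on a remote reference is a network round-trip; bundling the fields into one serializable object reduces that to a single call, which is the point of the pattern.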

Technology: EJB, Webservices, Hibernate, Struts, JSP, JMS, JNDI, JDBC, Weblogic, SQL, PL/SQL, Oracle, Sybase, XML, XSLT, WSDL, SOAP, UML, Rational Rose, Weblogic Workshop, OptimizeIt, Ant, JUnit, ClearCase, PVCS, ClearQuest, Win XP, Linux.

Confidential, NYC, NY

JAVA Developer

Responsibilities:

  • Involved in designing and development using UML with Rational Rose
  • Played a significant role in performance tuning and optimizing the memory consumption of the application.
  • Developed various enhancements and features using Java 5.0
  • Developed advanced server-side classes using networking, I/O, and multithreading.
  • Led the issue-management team and brought significant stability to the product by reducing the bug count to single digits.
  • Designed and developed various complex and advanced user interfaces using Swing.
  • Used SAX/DOM XML parsers for parsing XML files.
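
The multithreaded server-side processing mentioned above can be sketched with a plain fixed-size thread pool; the string-uppercasing "request" is a hypothetical stand-in for the real network-backed tasks.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of multi-threaded server-side processing: a fixed thread
// pool handles independent requests concurrently while results
// are collected in the original submission order. The string
// "request" here is a hypothetical stand-in for real work.
public class RequestProcessor {

    public static List<String> processAll(List<String> requests, int threads) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<String>> futures = new ArrayList<>();
            for (String request : requests) {
                Callable<String> task = () -> request.toUpperCase(); // the "work"
                futures.add(pool.submit(task));
            }
            List<String> results = new ArrayList<>();
            for (Future<String> f : futures) {
                results.add(f.get()); // blocks; preserves submission order
            }
            return results;
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }
}
```

Collecting results through the Future list, rather than from a shared collection, avoids explicit locking while still keeping the output aligned with the input order.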

Technology: Java 5.0, JFC Swing, Multi-Threading, IO, Networks, XML, JBuilder, UML, CVS, WinCVS, Ant & JUnit, Win XP, Unix.
