
Senior Hadoop Developer Resume


Bloomington, IL

SUMMARY

  • Over 8 years of experience in Information Technology, including analysis, design, development, and testing of complex applications.
  • Excellent understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Strong experience in writing custom UDFs for Hive and Pig, with a solid grasp of Hive and Pig analytical functions (see the Hive UDF sketch after this list).
  • Over 30 months of strong working experience with Big Data and the Hadoop ecosystem.
  • Hands-on experience with Cloudera distributions (CDH).
  • Strong experience with Hadoop components: MapReduce, HDFS, Hive, Pig, HBase, and Sqoop.
  • Experience in working with BI teams to transform big data requirements into Hadoop-centric technologies.
  • Proficient in writing HiveQL queries and Pig scripts.
  • Well versed in Core Java.
  • Proficient in writing MapReduce jobs.
  • Optimization and performance tuning of MapReduce, Pig, and Hive queries.
  • Proficient in designing row keys and schemas for NoSQL databases (see the HBase row-key sketch after this list).
  • Good understanding of Cassandra and MongoDB implementations.
  • Participated in a proof of concept (POC) on Talend Big Data Integration with Hadoop.
  • Experience in importing and exporting data with Sqoop between HDFS and relational database systems.
  • Experience in using Flume to load log data from multiple sources directly into HDFS.
  • Experience in designing both time-driven and data-driven automated workflows using Oozie.
  • Experience in supporting data analysts in running Pig and Hive queries.
  • Experience in working with customer engineering teams to assist with their validation cycles.
  • Experience in coordinating offshore/onsite teams.
  • Extensive experience with SQL: constructing triggers and tables and implementing stored procedures, functions, and views.
  • Good understanding of database and data warehousing concepts (OLTP and OLAP).
  • Excellent SQL development skills, including complex multi-table queries and the development and maintenance of stored procedures, triggers, and user-defined functions.
  • Experience in performance tuning and query optimization.
  • Experienced in using ETL tools such as Informatica.
  • Experience with logical and physical data modeling and database design using SQL Developer.
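
As a concrete illustration of the custom Hive UDFs mentioned above, here is a minimal, hedged sketch: a UDF that trims and upper-cases a string column. The class name NormalizeUDF and its behavior are hypothetical assumptions for illustration, not code from any project listed here.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Minimal Hive UDF sketch (hypothetical name and behavior).
    // Register in Hive with, e.g.:
    //   ADD JAR normalize-udf.jar;
    //   CREATE TEMPORARY FUNCTION normalize AS 'NormalizeUDF';
    public class NormalizeUDF extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null; // pass NULLs through, as Hive built-ins do
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }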
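
And for the NoSQL row-key design bullet, a minimal sketch of a composite, salted HBase row key; the field names, salt scheme, and bucket count are assumptions chosen only to show the technique.

    import org.apache.hadoop.hbase.util.Bytes;

    // Sketch of a composite HBase row key (hypothetical fields): a one-byte
    // salt spreads writes across regions, the customer id keeps one
    // customer's rows together, and a reversed timestamp makes a scan
    // return the newest events first.
    public final class EventRowKey {
        private static final int SALT_BUCKETS = 16;

        public static byte[] of(String customerId, long eventTimeMillis) {
            byte salt = (byte) Math.abs(customerId.hashCode() % SALT_BUCKETS);
            byte[] reversedTs = Bytes.toBytes(Long.MAX_VALUE - eventTimeMillis);
            return Bytes.add(new byte[] { salt }, Bytes.toBytes(customerId), reversedTs);
        }
    }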

TECHNICAL SKILLS

Big Data Technologies: Hadoop, HDFS, Pig, Hive, MapReduce, Sqoop, Oozie, HBase, Spark, Storm, Cassandra

Java Technologies: Core Java, Servlets, JSP, JDBC, Collections Framework, Web Services (RESTful)

Build Tools & IDE: Maven, Eclipse

Databases: DB2, SQL, IMS-DB

Version Control Systems: SVN, GitHub

Operating Systems: Windows XP/NT/2000, DOS, z/OS, Ubuntu, Unix; Shell scripting

Tools: HPSM, Microsoft Visio, SharePoint, Lotus Notes, Informatica

PROFESSIONAL EXPERIENCE

Confidential, Bloomington, IL

Senior Hadoop Developer

Responsibilities:

  • Involved in the architecture design, development, and implementation of Hadoop.
  • Designed and developed data marts on Hive.
  • Designed and developed deduplication logic for different source systems using Pig scripts.
  • Helped the team in optimizing Hive queries.
  • Extracted 500+ RDBMS tables into Hadoop using Sqoop.
  • Integrated Autosys with Oozie for scheduling the workflows.
  • Designed and created build and deploy scripts using Maven.
  • Developed a proof of concept (POC) with Talend Big Data Integration.
  • Designed and developed generic workflows to support all the source feeds.
  • Developed multiple MapReduce jobs for data cleansing and preprocessing of huge volumes of data (see the mapper sketch after this section).
  • Worked with various Hadoop file formats, including TextFile, SequenceFile, and RCFile.

Environment & Technology Used: Hadoop, HDFS, Hive, Pig, MapReduce, Java, Sqoop, Flume, Oozie, Eclipse, DB2, BIRA on Linux.
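
As a hedged sketch of the data-cleansing MapReduce jobs mentioned above: a map-only job that drops malformed pipe-delimited rows and counts what it discarded. The delimiter, the 12-field record layout, and the class name CleanseMapper are illustrative assumptions.

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Map-only cleansing sketch: keeps rows with the expected number of
    // pipe-delimited fields and counts the rows it throws away.
    public class CleanseMapper extends Mapper<LongWritable, Text, Text, NullWritable> {
        private static final int EXPECTED_FIELDS = 12; // assumed record layout

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\\|", -1);
            if (fields.length == EXPECTED_FIELDS) {
                context.write(value, NullWritable.get());
            } else {
                context.getCounter("cleansing", "malformed").increment(1);
            }
        }
    }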

Confidential

Responsibilities:

  • Involved in the architecture design, development, and implementation of Hadoop.
  • Coordinated with customers to gain deep insights into the data.
  • Helped the team in optimizing Hive queries.
  • Extracted 500+ RDBMS tables into Hadoop using Sqoop.
  • Integrated Autosys with Oozie for scheduling the workflows.
  • Designed and created build and deploy scripts using Maven.
  • Defined job flows.
  • Developed Pig UDFs to preprocess the data for analysis.
  • Used MRUnit for unit testing (see the test sketch after this section).
  • Used Sqoop to move data from relational databases to HDFS.

Environment & Technology Used: Hadoop, MapReduce, Hive, Sqoop, Pig
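
To illustrate the MRUnit-based unit testing noted above, here is a minimal sketch that exercises a mapper like the hypothetical CleanseMapper shown earlier (same package assumed); the input rows and the 12-field expectation are illustrative.

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mrunit.mapreduce.MapDriver;
    import org.junit.Before;
    import org.junit.Test;

    // MRUnit sketch: a well-formed row passes through; a short row is dropped.
    public class CleanseMapperTest {
        private MapDriver<LongWritable, Text, Text, NullWritable> driver;

        @Before
        public void setUp() {
            driver = MapDriver.newMapDriver(new CleanseMapper());
        }

        @Test
        public void keepsWellFormedRow() throws Exception {
            Text row = new Text("a|b|c|d|e|f|g|h|i|j|k|l"); // 12 fields
            driver.withInput(new LongWritable(0), row)
                  .withOutput(row, NullWritable.get())
                  .runTest();
        }

        @Test
        public void dropsMalformedRow() throws Exception {
            driver.withInput(new LongWritable(0), new Text("only|three|fields"))
                  .runTest(); // no expected output: the row is discarded
        }
    }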

Confidential

Responsibilities:

  • Used Flume to load log-file data into HDFS.
  • Responsible for preparing action plans to analyze the data.
  • Coordinated with customers to gain deep insights into the data.
  • Helped the team in optimizing Hive queries.
  • Involved in running MapReduce jobs to process millions of records.
  • Developed job flows to automate the workflow for Pig and Hive jobs.
  • Involved in loading data from the local (Linux) file system into HDFS.
  • Defined job flows.
  • Managed and reviewed Hadoop log files.
  • Developed Pig UDFs to preprocess the data for analysis (see the UDF sketch after this section).
  • Worked with various Hadoop file formats, including TextFile, SequenceFile, and RCFile.
  • Analyzed the data using Apache Hive.

Environment & Technology Used: Hadoop, MapReduce, Hive, Flume, Pig
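
As a minimal sketch of the Pig UDFs used for preprocessing, the example below trims and lower-cases a chararray field; the function name CleanField and its behavior are hypothetical assumptions.

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Pig UDF sketch (hypothetical): normalizes a field before analysis.
    // Use from Pig Latin as, e.g.:
    //   REGISTER preprocess-udf.jar;
    //   clean = FOREACH raw GENERATE CleanField(line);
    public class CleanField extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null; // propagate nulls rather than failing the job
            }
            return input.get(0).toString().trim().toLowerCase();
        }
    }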

Confidential

Responsibilities:

  • Monitored incidents, problem tasks, and requests in HP Service Manager.
  • Owned tickets, analyzed issues, and found fixes for them.
  • Generated reports based on user requests.
  • Reassigned tickets to the appropriate workgroup when an issue could not be resolved at my level.
  • Ran maintenance jobs and monitored them.
  • Took onsite calls regarding the work status of tickets.

Environment & Technology Used: Java, J2EE, JUnit, XML, JavaScript, Eclipse.

Confidential

Responsibilities:

  • Performed analysis of existing system application code and documents to develop requirements.
  • Worked on reengineering (reverse engineering) efforts.
  • In collaboration with Business Analysts, prepared use-case requirements documents and specifications.
  • Prepared flow diagrams using Microsoft Visio.
  • Coded, configured, and tested assigned use cases using the core API, J2EE, XML, Spring JDBC, and RESTful web services (see the Spring JDBC sketch after this section).
  • Performed unit testing using SoapUI.

Environment & Technology Used: Java, J2EE, JUnit, XML, JavaScript, Eclipse, Apache Tomcat.
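
A minimal sketch of the Spring JDBC data-access style referenced above; the table and column names (customer, name, active, id) are hypothetical, chosen only to show the JdbcTemplate pattern.

    import java.util.List;
    import javax.sql.DataSource;
    import org.springframework.jdbc.core.JdbcTemplate;

    // DAO sketch using Spring's JdbcTemplate (hypothetical schema).
    public class CustomerDao {
        private final JdbcTemplate jdbcTemplate;

        public CustomerDao(DataSource dataSource) {
            this.jdbcTemplate = new JdbcTemplate(dataSource);
        }

        // Returns the names of all active customers.
        public List<String> findActiveCustomerNames() {
            return jdbcTemplate.queryForList(
                    "SELECT name FROM customer WHERE active = 1", String.class);
        }

        // Looks one customer up by primary key, using a bind variable.
        public String findNameById(long id) {
            return jdbcTemplate.queryForObject(
                    "SELECT name FROM customer WHERE id = ?", String.class, id);
        }
    }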

Confidential

Responsibilities:

  • Performed analysis of existing system application code and documents to develop requirements.
  • Worked on reengineering (reverse engineering) efforts.
  • In collaboration with Business Analysts, prepared use-case requirements documents and specifications.
  • Prepared flow diagrams using Microsoft Visio.
  • Formatted and documented code in accordance with the team's coding standards.
  • Provided build release notes to the QA team.
  • Submitted SQL review forms when query changes were required.
  • Developed and coded bug fixes and documented them in the bug-tracking system.
  • Participated in weekly progress meetings with developers, architects, business analysts, and managers.

Environment & Technology Used: Java, MySQL, HTML, JavaScript, Mainframes, PL/1, COBOL, DB2, IMS DB

Confidential

Responsibilities:

  • Analyzed where code changes were required for the requested functionality enhancements.
  • Prepared specification documents for code changes.
  • Submitted SQL review forms when query changes were required.
  • Performed unit testing and documented the results.
  • Supported the system testing team.

Confidential

Responsibilities:

  • Identified possible scenarios where discrepancies could arise from the initial population.
  • Prepared check SQL queries for those scenarios.
  • Checked active and inactive role counts and prepared Elib (Electronic Library) reports accordingly.
  • Validated SQL queries prepared by others and tested the output records.
