
Hadoop Developer Resume


Basking Ridge, NJ

SUMMARY

  • Overall 10+ years of experience in the IT industry, including 3+ years in Big Data implementing end-to-end Hadoop solutions and 6+ years in Mainframe technologies.
  • Expertise in Hadoop ecosystem components such as MapReduce, HDFS, Hive, HBase, Sqoop, Pig, and Oozie.
  • Good knowledge of Spark, Scala, Talend, and Kafka.
  • Certified CCA Spark and Hadoop Developer (CCA 175).
  • In-depth knowledge of Hadoop architecture and its components, including HDFS, Job Tracker, Task Tracker, Name Node, Data Node, MRv1, and MRv2 (YARN).
  • Hands-on experience in application development using Core Java and Linux shell scripting.
  • Experienced in creating MapReduce jobs in Java per business requirements.
  • Good experience in ETL (data warehouse) processes.
  • Good experience implementing multithreading and concurrency concepts using Core Java.
  • Experience importing and exporting data with Sqoop between HDFS and relational database systems; implemented data ingestion from multiple sources, including IBM mainframes and Oracle, using Sqoop and SFTP.
  • Good domain knowledge of health care.
  • Hands-on experience configuring and working with Flume to load data from multiple sources directly into HDFS.
  • Good knowledge of data warehouse concepts.
  • Strong experience in data analytics using Hive and Pig, including writing custom UDFs.
  • Hands-on experience in Spark programming with Scala, with good knowledge of Spark architecture and its in-memory processing (see the sketch after this list).
  • Knowledge of job workflow scheduling and monitoring tools such as Oozie and Talend.
  • Strong understanding of NoSQL databases like HBase.
  • Knowledge of Kafka, a publish-subscribe messaging system.
  • Extensive knowledge in using SQL Queries for backend database analysis.
  • Good development experience with mainframe technologies such as COBOL, JCL, DB2, VSAM, TSO/ISPF, REXX, OPC, SPUFI, QMF, and Endevor.
  • Experienced in creating and analyzing Software Requirement Specifications (SRS) and Functional Specification Documents (FSD).
  • Strong knowledge of Software Development Life Cycle (SDLC).
  • Excellent working experience in Scrum/Agile and Waterfall project execution methodologies.
  • Experienced in preparing and executing Unit Test Plans and Unit Test Cases after software development.
  • Experienced in working in multi-cultural environments, both within a team and individually, as project requirements demand.
  • Excellent communication and interpersonal skills; self-motivated, organized, and detail-oriented; able to work well under deadlines in a changing environment and perform multiple tasks effectively and concurrently.
  • Strong analytical skills with the ability to quickly understand clients' business needs.
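
A minimal sketch of the Spark-Scala processing summarized above. The HDFS paths and record layout are hypothetical; the aggregation simply illustrates the kind of in-memory RDD work these bullets refer to:

    import org.apache.spark.sql.SparkSession

    object ClaimCounts {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("ClaimCounts").getOrCreate()

        // Hypothetical pipe-delimited claims extract already landed on HDFS.
        val lines = spark.sparkContext.textFile("hdfs:///data/claims/input")

        // Parse, cache for reuse, and count claims per state.
        val countsByState = lines
          .map(_.split('|'))
          .filter(_.length > 2)          // drop malformed records
          .map(fields => (fields(2), 1)) // third field assumed to hold the state
          .reduceByKey(_ + _)
          .cache()

        countsByState.saveAsTextFile("hdfs:///data/claims/output/counts_by_state")
        spark.stop()
      }
    }

The cache() call keeps the parsed RDD in memory so later actions avoid re-reading from HDFS, which is the in-memory benefit the summary refers to.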

TECHNICAL SKILLS

Hadoop/Big Data: HDFS, MapReduce, Pig, Hive, HBase, Sqoop, Oozie, Spark, Scala, Kafka, ZooKeeper

Programming languages: C, Core Java, Linux shell scripting, COBOL, JCL

Databases: Oracle, MySQL, DB2

Operating Systems: Windows, UNIX, Linux, and z/OS

Web Technologies: HTML, XML.

Tools: Eclipse, File-AID, Endevor, OPC Scheduler, Xpediter

Development Approach: Agile, Waterfall

PROFESSIONAL EXPERIENCE

Hadoop Developer

Confidential - Basking Ridge, NJ

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Understood business needs, analyzed functional specifications, and mapped them to development tasks.
  • Involved in loading data from Mainframe DB2 into HDFS using Sqoop.
  • Handled delta processing and incremental updates using Hive.
  • Responsible for daily ingestion of data from the DATALAKE to the CDB Hadoop tenant system.
  • Developed Pig Latin scripts for transformations while extracting data from source systems.
  • Worked on data-issue tickets and provided fixes.
  • Monitored and fixed production job failures.
  • Reviewed team members' design documents and code.
  • Documented system processes and procedures for future reference, including design and code reviews.
  • Involved in story-driven agile development methodology and actively participated in daily scrum meetings.
  • Implemented data ingestion from multiple sources, such as IBM mainframes and Oracle.
  • Developed transformations and aggregated data for large data sets using Pig and Hive scripts.
  • Worked on partitioning and bucketing in Hive tables, running scripts in parallel to improve performance (see the sketch after this list).
  • Thorough knowledge of Spark architecture and how RDDs work internally.
  • Gained exposure to Spark SQL.
  • Experienced in the Scala programming language; used it extensively with Spark for data processing.
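
The delta-processing and Hive-partitioning bullets above, condensed into one hedged sketch in Scala with Spark SQL. Table names, columns, and the partition value are illustrative, not from the actual project; the bucketing mentioned above would be applied in the Hive DDL via CLUSTERED BY ... INTO n BUCKETS:

    import org.apache.spark.sql.SparkSession

    object HiveDeltaLoad {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("HiveDeltaLoad")
          .enableHiveSupport() // read/write Hive tables through the metastore
          .getOrCreate()

        // Illustrative partitioned target table.
        spark.sql(
          """CREATE TABLE IF NOT EXISTS member_master (
            |  member_id STRING, name STRING, status STRING, updated_ts TIMESTAMP)
            |PARTITIONED BY (load_date STRING)
            |STORED AS ORC""".stripMargin)

        // Delta processing: keep only the latest record per key from staging,
        // then overwrite the affected partition.
        spark.sql(
          """INSERT OVERWRITE TABLE member_master PARTITION (load_date = '2016-06-01')
            |SELECT member_id, name, status, updated_ts
            |FROM (
            |  SELECT s.*, ROW_NUMBER() OVER (
            |    PARTITION BY member_id ORDER BY updated_ts DESC) AS rn
            |  FROM member_staging s) ranked
            |WHERE rn = 1""".stripMargin)

        spark.stop()
      }
    }

Overwriting a single partition rather than the whole table is what keeps daily delta loads cheap; the window function picks the newest version of each record.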

Environment: HDFS, Hive, Pig, HBase, UNIX shell scripting, Talend, Spark, Scala

Hadoop Developer

Confidential

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Developed Sqoop jobs to import data from different source systems and handled incremental loads (see the sketch after this list).
  • Worked on Pig scripts to handle unstructured and semi-structured data.
  • Used Bedrock, a data management platform, for batch processing, metadata management, and workflow scheduling.
  • Handled delta processing and incremental updates using Hive tables.
  • Worked on production support activities such as job monitoring, fixing job failures, and analyzing data issues.
  • Optimized Hive and Pig scripts for better scalability, reliability, and performance.
  • Developed Pig Latin scripts for extracting data from source systems.
  • Documented system processes and procedures for future reference.
  • Implemented data ingestion from multiple sources, such as IBM mainframes and Oracle.
  • Developed transformations and aggregated data for large data sets using Pig and Hive scripts.
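
For the incremental loads above, a hedged sketch of the same watermark pattern. The project used Sqoop itself (sqoop import --incremental); since the examples in this resume are in Scala, the equivalent pull is shown through Spark's JDBC reader, with host, table, and column names as assumptions:

    import org.apache.spark.sql.SparkSession

    object IncrementalImport {
      def main(args: Array[String]): Unit = {
        val lastRun = args(0) // watermark from the previous run, e.g. "2015-07-01"

        val spark = SparkSession.builder()
          .appName("IncrementalImport")
          .enableHiveSupport()
          .getOrCreate()

        // Pull only rows changed since the last run; requires the DB2 JDBC
        // driver on the classpath.
        val delta = spark.read
          .format("jdbc")
          .option("url", "jdbc:db2://source-host:50000/SRCDB") // hypothetical source
          .option("dbtable",
            s"(SELECT * FROM CLAIMS WHERE UPDATED_DT > '$lastRun') AS delta")
          .option("user", sys.env("DB2_USER"))
          .option("password", sys.env("DB2_PASS"))
          .load()

        // Append the delta rows to the Hive staging table.
        delta.write.mode("append").insertInto("claims_staging")
        spark.stop()
      }
    }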

Environment: HDFS, MapReduce, HBase, Hive, Pig, Spark, UNIX shell scripting, Sqoop, Bedrock.

Mainframe Lead

Confidential

Responsibilities:

  • Responsible for understanding project scope and gathering requirements.
  • Performed impact analysis; designed tech specs and Unit Test Plans (UTP); and handled coding, testing, and delivery of all relevant artifacts for major requests.
  • Prepared hour estimations for the project.
  • Led the team and distributed tasks among team members.
  • Helped BAs confirm requirements.
  • Supported UAT and prepared the implementation plan.
  • Supported production jobs and fixed production issues, ensuring timely, defect-free delivery while meeting service level agreements (SLAs).
  • Involved in war rooms with various teams to resolve Priority 1 issues.
  • Involved in the HPSM audit process to ensure incident and change tickets were closed on time.
  • Handled various customer-reported data issues, tracking them through HPSM incident tickets.
  • Reviewed team members' design documents and code.
  • Attended project gate meetings involving Go/No-Go decisions.
  • Attended common Technical Review Board (TRB) meetings for design and code reviews.
  • Implemented an auto-balancing process to maintain data sync between the Cosmos and DSS systems.
  • Involved in converting several FTP job processes to SFTP.
  • Investigated performance issues with long-running jobs and suggested improvements to the development teams, resulting in significant dollar savings.
  • Held one-on-one meetings with the onshore resource manager to align on expectations.
  • Supported implementations and analyzed post-implementation issues.
  • Coordinated with project managers and with development and QA teams throughout the project.
  • Played a key architect role in the complex migration of the Galaxy mainframe system to a well-managed common platform.

Environment: COBOL, DB2, JCL, VSAM, Endevor, File-AID, Xpediter, SPUFI, WS OPC Scheduler, REXX

Mainframe Developer

Confidential

Responsibilities:

  • Participated in all phases of the Systems Development Life Cycle, including initial business analysis, functional and technical specification writing, design, and programming for all new and enhancement requirements.
  • Tasks involved creating new and modifying existing programs, PROCs, and jobs, as well as creating new DB2 tables and modifying existing components based on new requirements.
  • Developed technical and functional documentation for projects, including technical translations of business requirements, input and output layouts, and test case scenarios, working closely with the QA support group.
  • Involved in production support activities on a 24/7 model, fixing job failures within the SLA.
  • Purged DB2 data based on customer requests.
  • Extensive use of DB2 utilities (LOAD, UNLOAD, RUNSTATS, REPAIR), including removing check-pending and copy-pending states, plus performance tuning and monitoring.
  • Performed unit testing for each component, set up the SIT and UAT regions, and ran the cycles.
  • Scheduled and triggered jobs using the OPC Scheduler.
  • Handled customer data-issue HPSM tickets and provided fixes.
  • Analyzed production failures and fixed them within the SLA.
  • Worked on performance issues in the application and fixed the problems.

Environment: COBOL, DB2, JCL, VSAM, Endevor, File-AID, OPC, SPUFI, QMF, Xpediter.

Mainframe Developer 

Confidential

Responsibilities:

  • Responsible for understanding project scope and gathering requirements.
  • Performed impact analysis; designed tech specs and Unit Test Plans (UTP); and handled coding, testing, and delivery of all relevant artifacts for major requests.
  • Prepared hour estimations for the project.
  • Worked closely with the business team to finalize requirements.
  • Prepared UTPs and the implementation plan.
  • Reviewed team members' design documents and code.
  • Attended TRB meetings for design and code reviews.
  • Supported implementations and analyzed post-implementation issues.
  • Supported production jobs and fixed production issues, ensuring timely, defect-free delivery.
  • Coordinated with project managers and with development and QA teams throughout the project.

Environment: COBOL, JCL, VSAM, IMS DB, CICS, ChangeMan, File-AID.
