Hadoop Developer / Spark Developer Resume

Bentonville, AR

PROFESSIONAL SUMMARY:

  • Over 15 years of professional IT experience, with expertise in Mainframe, Java, Oracle, Unix, and Big Data (Hadoop) ecosystem technologies.
  • 3+ years of dedicated experience in Big Data technologies and Hadoop ecosystem components such as Spark, MapReduce, Hive, Pig, YARN, HDFS, HBase, Oozie, Sqoop, ZooKeeper, and Kafka.
  • Strong knowledge of distributed systems architecture and parallel processing frameworks.
  • Strong knowledge of the Cloudera distribution and its components: Hue, Cloudera Navigator, Cloudera Manager, and Sentry.
  • In-depth understanding of the MapReduce framework and the Spark execution model.
  • Strong experience with both batch and real-time processing using the Spark framework.
  • Imported and exported data between HDFS, Hive, and HBase using Sqoop.
  • Involved in designing and architecting Big Data solutions using the Hadoop ecosystem.
  • Expertise in Java Database Connectivity (JDBC), Open Database Connectivity (ODBC), FTP, and NDM.
  • Strong knowledge of performance tuning Hive queries and troubleshooting a wide range of Hive issues.
  • Experienced in writing custom MapReduce programs and UDFs in Java to extend Hive and Pig core functionality.
  • Involved in finding, evaluating, and deploying new Big Data technologies and tools.
  • Proficient in Apache Spark and Scala programming for analyzing large datasets, and in Storm and Kafka for processing real-time data.
  • Worked with Sqoop to move (import/export) data between relational databases and Hadoop.
  • Knowledge of UNIX shell scripting for automating deployments and other routine tasks.
  • Experienced in agile methodologies, including Scrum and Kanban.
  • Experience in creating Hive tables with different file formats such as Avro, Parquet, and ORC.
  • Very good understanding of partitioning and bucketing concepts in Hive; designed both managed and external tables in Hive to optimize performance (see the sketch at the end of this list).
  • Extensive working experience with the ITIL process and the ServiceNow tool for managing production support.
  • Excellent production support and batch processing knowledge using Autosys in a Unix environment.
  • Excellent knowledge of data analysis and business analysis; participated in war rooms, user story creation for sprints, and stand-up meetings.
  • Flexible, enthusiastic, and project-oriented team player with excellent communication and leadership skills for developing creative solutions to challenging client requirements.
  • Expertise in COBOL, DB2, CICS, JCL, PL/1, VSAM, IMS DB, and MQ Series.
  • Expertise in SPUFI, QMF, File Master, and File-AID as testing tools and for data manipulation and data cleansing.
  • Involved in 24x7 production support as primary support.
  • Supported maintenance of several applications, handling user requests, system enhancements, front-end and back-end interface modifications, and production installations.
  • Proficient in SQL and PL/SQL across RDBMS platforms such as DB2, SQL Server, and Oracle.
  • Proficient in mainframe utilities and tools such as Control-M, Xpediter, Endevor, ChangeMan, CA-InterTest, Abend-AID, InfoMan, Panvalet, SyncSort, File-AID for DB2, Eclipse, Platinum for DB2, and SCLM.
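
As a brief illustration of the Hive partitioning, bucketing, and managed vs. external table design mentioned above, a minimal HiveQL sketch (all table, column, and path names are hypothetical; the file format and bucket count are illustrative choices):

    -- Managed table, partitioned by date and bucketed for join and sampling efficiency
    CREATE TABLE customer_spend (
        customer_id BIGINT,
        amount      DECIMAL(10,2)
    )
    PARTITIONED BY (txn_date STRING)
    CLUSTERED BY (customer_id) INTO 32 BUCKETS
    STORED AS ORC;

    -- External table over raw files already in HDFS; dropping it leaves the data in place
    CREATE EXTERNAL TABLE customer_spend_raw (
        customer_id BIGINT,
        amount      DECIMAL(10,2),
        txn_date    STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/raw/customer_spend';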

TECHNICAL SKILLS:

Big Data Ecosystem: Hadoop, MapReduce, YARN, HDFS, HBase, ZooKeeper, Hive, Hue, Pig, Sqoop, Spark, Oozie, Cloudera Manager, Apache Ambari, Hortonworks, Impala, Phoenix

Languages: COBOL, Java, Advanced PL/SQL, Pig Latin, Python, HiveQL, Scala, SQL

Scripting Languages: Shell Scripting, JavaScript

Database: Oracle 9i/10g, Microsoft SQL Server, MySQL, DB2, PostgreSQL

NoSQL Databases: MongoDB, HBase

IDE & Build Tools: Eclipse, Ant, Jenkins, and Maven.

Version Control Systems: Subversion (SVN), Endevor

PROFESSIONAL EXPERIENCE:

Confidential, Bentonville, AR

Hadoop Developer / Spark Developer

Responsibilities:

  • Imported and exported data between MySQL and HDFS using Sqoop; responsible for managing data coming from different sources.
  • Analyzed requirements and designed the data model for Cassandra and Hive based on the existing relational databases in Oracle and Teradata.
  • Loaded customer profile, customer spending, and credit data from legacy warehouses onto HDFS using Sqoop.
  • Analyzed large data sets by running Hive queries.
  • Created Hive tables, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs (see the sketch at the end of this list).
  • Created the tasks and detailed technical documentation for each assigned user story.
  • Designed, built, tested, scheduled, and deployed Oozie workflows.
  • Built ingestion flows using Oozie for Sqoop jobs to import and export data.
  • Designed, built, tested, and deployed Spark SQL programs in Scala.
  • Designed, built, tested, and deployed Hive scripts and shell scripts.
  • Defined and established the production support process for the Big Data platform.
  • Helped the team with debugging and interacted with Cloudera and the infrastructure team on issues. Worked with Git for version control, JIRA for project tracking, and Jenkins for continuous integration.
  • Tuned Hive tables and queries to improve performance.
  • Participated in release planning, sprint ceremonies, and daily stand-ups. Responsible for the design and development of Big Data applications using Cloudera Hadoop.
  • Attended Scrum meetings, provided status to the Scrum Master, and served as backup to the Scrum Master.
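
To make the Hive work above concrete, a minimal HiveQL sketch of creating a table, loading data already landed on HDFS (e.g., by Sqoop), and running an aggregate query that Hive compiles into MapReduce stages (table, column, and path names are hypothetical):

    -- Hypothetical target table for customer spending data
    CREATE TABLE IF NOT EXISTS customer_profile (
        customer_id BIGINT,
        segment     STRING,
        spend       DECIMAL(12,2)
    )
    STORED AS PARQUET;

    -- Move staged files into the table; files must already match the table's storage format
    LOAD DATA INPATH '/user/etl/customer_profile' INTO TABLE customer_profile;

    -- Hive plans this aggregation as MapReduce stages on the cluster
    SELECT segment, COUNT(*) AS customers, SUM(spend) AS total_spend
    FROM customer_profile
    GROUP BY segment;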

Environment: Hadoop 2.x, Pig, HDFS, Scala, Spark, Sqoop, HBase, Oozie, Java, Maven, IntelliJ, Git, PuTTY, Tableau, MapReduce, Hive, Teradata, MySQL, SVN, ZooKeeper, Linux shell scripting.

Confidential, Pittsburgh, PA

Hadoop Developer

Responsibilities:

  • Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, Hive, and Sqoop.
  • Developed simple and complex MapReduce programs in Java for data analysis on different data formats.
  • Optimized MapReduce jobs to use HDFS efficiently by applying various compression mechanisms.
  • Implemented static and dynamic partitioning in Hive (see the sketch at the end of this list).
  • Customized the MapReduce framework at different levels, such as input formats, data types, and partitioners.
  • Extensively used Sqoop to import/export data between RDBMS and Hive tables, including incremental imports, and created Sqoop jobs keyed on the last saved value.
  • Involved in loading data from the Linux file system to HDFS.
  • Developed scripts and Autosys jobs to schedule an Oozie bundle (a group of coordinators) consisting of various Hadoop programs.
  • Involved in creating tables in Hive and writing scripts and queries to load data into Hive tables from HDFS.
  • Created tables in HBase and loaded data into them.
  • Created Oozie workflows to run multiple Hive jobs.
  • Used different compression techniques to save storage and optimize data transfer over the network, including Snappy in Hive tables (also covered in the sketch below).
  • Created Hive tables, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs.
  • Used ZooKeeper to provide coordination services to the cluster.
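
As an illustration of the static vs. dynamic partitioning and Snappy compression items above, a minimal HiveQL sketch (table and column names are hypothetical; the SET properties are standard Hive settings of the CDH 4 era; assumes events is partitioned by dt):

    -- Static partitioning: the partition value is fixed in the statement
    INSERT OVERWRITE TABLE events PARTITION (dt = '2014-06-01')
    SELECT event_id, payload FROM events_staging WHERE dt = '2014-06-01';

    -- Dynamic partitioning: Hive derives partition values from the SELECT output
    SET hive.exec.dynamic.partition = true;
    SET hive.exec.dynamic.partition.mode = nonstrict;
    INSERT OVERWRITE TABLE events PARTITION (dt)
    SELECT event_id, payload, dt FROM events_staging;

    -- Snappy-compress job output to cut storage and network transfer
    SET hive.exec.compress.output = true;
    SET mapred.output.compression.codec = org.apache.hadoop.io.compress.SnappyCodec;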

Environment: Hadoop, Cloudera (CDH 4), HDFS, Hive, Flume, Sqoop, Pig, Java, Eclipse, Teradata, MongoDB, Ubuntu, UNIX, and Maven.

Confidential, Milwaukee, WI

Mainframe Developer

Responsibilities:

  • Provided customer support and resolved issues within SLA.
  • Provided 24x7 production support as primary support.
  • Strictly followed the change management process to implement changes.
  • Developed new interfaces between GPT and Passport.
  • Identified issues with the existing interfaces between GPT and Passport and modified existing functionality on the legacy system.
  • Resolved SAP and Passport hypercare issues.
  • Analyzed the new functional requirements of the application.
  • Developed the test strategy for the application and developed various test plans and test cases.
  • Involved in quick analysis of production bugs.
  • Interacted with end users and provided them timely technical support.
  • Involved in weekly issue meetings with the customer.
  • Organized meetings with department heads when changes were recommended to the existing system for performance improvement.
  • Involved in coding new programs, testing, preparation of technical specs, and reviews.
  • Involved in different cycles of testing after fixing bugs and before moving programs into the production region.

Environment: IBM Z/OS, OS/390, CICS, COBOL, VSAM, JCL, DB2, IMS DB/DC, File Aid, Intertest, Panvalet, QMF, SPUFI, Platinum for DB2.

Confidential, Chicago, IL

Senior Mainframe Developer

Responsibilities:

  • Involved in the review and analysis of the business design summary.
  • Analyzed the functional requirements of the application.
  • Developed the test strategy for the application and developed various test plans and test cases.
  • Extensively conducted functional and regression testing.
  • Responsible for performing functional testing on the application.
  • Involved in 24x7 production support as primary support.
  • Coded and tested COBOL/VSAM/DB2/CICS programs.
  • Involved in quick analysis of production bugs.
  • Involved in the analysis of different applications to provide bug fixes.
  • Interacted with end users and provided them timely technical support.
  • Involved in weekly issue meetings with the customer.
  • Organized meetings with department heads when changes were recommended to the existing system for performance improvement.
  • Involved in coding new programs, testing, preparation of technical specs, and reviews.
  • Organized daily meetings with the team.
  • Involved in different cycles of testing after fixing bugs and before moving programs into the production region.

Environment: IBM Z/OS, OS/390, CICS, COBOL, VSAM, JCL, Java, DB2, File Aid, Intertest, Panvalet, QMF, SPUFI, Platinum for DB2.

Confidential, Chicago, IL

Mainframe Developer

Responsibilities:

  • Involved in the design and implementation of the module.
  • Involved in system testing, UAT, and regression testing.
  • Carried out enhancement work per the MCR given by the client.
  • Performed code reviews.
  • Coded and tested COBOL/VSAM/DB2/IMS programs.
  • Attended corporate meetings for the kick-off of major enhancements at the corporate level.
  • Organized meetings with the SMEs of dependent systems when changes were made to the existing system.
  • Involved in documentation for customer deliverables and preparation of user documentation.
  • Conducted sessions for the users.
  • Involved in weekly issue meetings with the customer.
  • Organized meetings with department heads when changes were recommended to the existing system for performance improvement.
  • Organized daily meetings with the team.
  • Refreshed the DB2 test database from the production database.
  • Involved in disaster recovery.
  • Involved in 24x7 production support.
  • Involved in the preparation of billing reports.
  • Involved in the design of CICS maps using BMS macros.

Environment: IBM Z/OS, OS/390, CICS, COBOL, VSAM, JCL, DB2, IMS DB/DC, File Aid, Intertest, Panvalet, QMF, SPUFI, Platinum for DB2.

Confidential

Mainframe Developer

Responsibilities:

  • Provided customer support and resolved issues within SLA.
  • Provided 24x7 production support as primary support.
  • Strictly followed the change management process to implement changes.
  • Developed new interfaces between GPT and Passport.
  • Identified issues with the existing interfaces between GPT and Passport and modified existing functionality on the legacy system.
  • Resolved SAP and Passport hypercare issues.
  • Analyzed the new functional requirements of the application.
  • Developed the test strategy for the application and developed various test plans and test cases.
  • Involved in quick analysis of production bugs.
  • Interacted with end users and provided them timely technical support.
  • Involved in weekly issue meetings with the customer.
  • Organized meetings with department heads when changes were recommended to the existing system for performance improvement.
  • Involved in coding new programs, testing, preparation of technical specs, and reviews.
  • Involved in different cycles of testing after fixing bugs and before moving programs into the production region.

Environment: IBM Z/OS, OS/390, CICS, COBOL, VSAM, JCL, DB2, IMS DB/DC, File Aid, Intertest, Panvalet, QMF, SPUFI, Platinum for DB2.
