Hadoop Developer Resume

New Jersey

PROFESSIONAL SUMMARY:

  • Dedicated, assertive, and qualified technology professional working as a Hadoop Developer.
  • 8+ years of overall IT experience in application development, with a focus on Big Data Hadoop.
  • 5+ years of exclusive experience and knowledge of Big Data Hadoop and its components, including HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase, Impala, Flume, Oozie, Spark, Spark Streaming, Kafka, and Scala.
  • Extensive experience in setting up Hadoop clusters.
  • Good working experience with HDFS, Hive, Pig, MapReduce jobs, Impala, Flume, Sqoop, Oozie, and Core Java.
  • Experience importing/exporting data to and from existing RDBMSs.
  • Good knowledge of Oracle 10g, MySQL, and NoSQL databases.
  • Good exposure to Hadoop's query programming models (Pig and Hive).
  • Good experience with Spark and Scala.
  • Good knowledge of HCatalog, Impala, and NoSQL databases (MongoDB and HBase).
  • In-depth understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, DataNode, NameNode, and MapReduce concepts.
  • Experience with single-node and multi-node cluster configurations, including commissioning and decommissioning of nodes in the cluster.
  • Participated in client calls to gather requirements and was involved in preparing design, mapping, and unit test case documents.
  • Good exposure to data warehousing concepts, UNIX, and Oracle.
  • Excellent communication, interpersonal, and analytical skills, with a strong ability to perform as part of a team.
  • Exceptional ability to learn new concepts.
  • Willing to go the extra mile to achieve excellence.
  • Salesforce certified; trained in Salesforce by the organization, with good knowledge of implementing POCs in Salesforce.
  • Quick to learn new technologies and ready to meet management expectations.
  • Experienced in working on and contributing to several POC implementations.

TECHNICAL SKILLS:

Languages: Core Java, Pig Latin, eScript.

Operating Systems: Windows and Linux.

Big Data Technologies: Hadoop, Hive, Pig, Sqoop, Flume, HUE, Oozie.

RDBMS: Oracle 10g and MySQL 5.5.35.

Distributed Databases (NoSQL): HBase, MongoDB.

IDEs: Eclipse 3.7.2.

Other Technologies: Salesforce, Siebel.

Domain knowledge: Siebel Communications, Call Centre, Financial Services, Banking

PROFESSIONAL EXPERIENCE:

Confidential

Hadoop Developer, New Jersey

Responsibilities:

  • Worked on enhancements, change requests, and defect fixes for the Confidential phase 1 release using Spark with Java.
  • Analyzed the source data tables from Oracle and MySQL and imported the data into Hadoop using Sqoop.
  • Involved in designing the data models, processing the data using Impala, and writing shell scripts to execute the Impala queries based on the given conditions.
  • Exported the data tables back to the MySQL database.
  • Implemented Hive tables and HQL queries for the reports.
  • Converted the MySQL stored procedures to Impala ETL jobs for faster performance.
  • Performed unit testing and defect fixing.
  • Designed the Hadoop data lake and developed a data ingestion framework supporting RDBMS sources and flat files.
  • Migrated back-dated data and ingested daily data into the data lake from multiple sources.
  • Generated canned reports on the daily data using HQL and Impala queries and handed the reports over to business users.
  • Developed reconciliation utility tools using shell scripting, Spark, and Python.
  • Categorized the huge volume of data in the data lake per business requirements using Spark, improving canned-report generation and query execution times (a sketch of this approach follows the list).
  • Supported the UAT and production environments.
  • Mapped data types between the RDBMS sources and Hive without any data loss.
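
The Spark-based categorization step can be illustrated with a minimal Java sketch: it rewrites a data-lake table partitioned by a business category column so report queries scan only the partitions they need. The database, table, and column names here are hypothetical placeholders, not the actual project schema.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class CannedReportPrep {
        public static void main(String[] args) {
            // Hive-enabled session so Spark can read the data-lake tables directly
            SparkSession spark = SparkSession.builder()
                    .appName("canned-report-prep")
                    .enableHiveSupport()
                    .getOrCreate();

            // Read the daily ingest table (database/table names are placeholders)
            Dataset<Row> daily = spark.table("datalake.daily_ingest");

            // Rewrite the data partitioned by a business category column so that
            // canned-report queries scan only the partitions they need
            daily.write()
                 .partitionBy("business_category")
                 .mode("overwrite")
                 .format("parquet")
                 .saveAsTable("datalake.daily_ingest_categorized");

            spark.stop();
        }
    }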

Confidential

Technologies: Hadoop (Spark, Hive, Impala, Sqoop), Java.

Hadoop Developer, New Jersey

Responsibilities:

  • Worked on enhancements, change requests, and defect fixes for the Confidential phase 1 release using Spark with Java.
  • Analyzed the source data tables from Oracle and MySQL and imported the data into Hadoop using Sqoop.
  • Involved in designing the data models, processing the data using Impala, and writing shell scripts to execute the Impala queries based on the given conditions.
  • Exported the data tables back to the MySQL database.
  • Implemented Hive tables and HQL queries for the reports.
  • Converted the MySQL stored procedures to Impala ETL jobs for faster performance.
  • Performed unit testing and defect fixing.
  • Implemented a POC for Neilson using Spark Streaming and Kafka (see the sketch after this list).
  • Implemented a POC on MongoDB.
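
A minimal Java sketch of the Spark Streaming with Kafka pattern used in such a POC: it consumes a topic as a direct stream and counts records per micro-batch. The broker address, topic name, and consumer group are hypothetical, and the per-batch count is a stand-in for the actual POC processing.

    import java.util.Arrays;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.kafka010.ConsumerStrategies;
    import org.apache.spark.streaming.kafka010.KafkaUtils;
    import org.apache.spark.streaming.kafka010.LocationStrategies;

    public class StreamingPoc {
        public static void main(String[] args) throws InterruptedException {
            SparkConf conf = new SparkConf().setAppName("kafka-streaming-poc");
            // 10-second micro-batches
            JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(10));

            // Kafka consumer settings (broker address and group id are placeholders)
            Map<String, Object> kafkaParams = new HashMap<>();
            kafkaParams.put("bootstrap.servers", "broker:9092");
            kafkaParams.put("key.deserializer", StringDeserializer.class);
            kafkaParams.put("value.deserializer", StringDeserializer.class);
            kafkaParams.put("group.id", "streaming-poc");

            // Direct stream from a hypothetical "events" topic
            JavaInputDStream<ConsumerRecord<String, String>> stream =
                    KafkaUtils.createDirectStream(
                            jssc,
                            LocationStrategies.PreferConsistent(),
                            ConsumerStrategies.<String, String>Subscribe(
                                    Arrays.asList("events"), kafkaParams));

            // Count records in each micro-batch and print the result
            stream.count().print();

            jssc.start();
            jssc.awaitTermination();
        }
    }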

Confidential

Technologies: Hadoop (Pig, Hive, MapReduce, and shell scripting), Java.

Hadoop Developer

Platform: Windows, Linux.

Responsibilities:

  • Involved in business client calls, requirement analysis, and design.
  • Actively involved in business requirement discussions with the onsite counterpart and in implementation discussions with the offshore team, and coordinated all project-related activities.
  • Developed Sqoop scripts to import/export data between HDFS and the MySQL database.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Supported MapReduce programs running on the cluster.
  • Analyzed large data sets by running Hive queries and Pig scripts.
  • Worked on tuning the performance of Pig queries.
  • Involved in developing Pig scripts for processing data.
  • Wrote Hive queries to transform the data into tabular format and processed the results using Hive Query Language.
  • Created Pig Latin scripts to sort, group, join, and filter enterprise-wide data.
  • Analyzed the functional specifications.
  • Implemented Pig scripts according to business rules.
  • Implemented Hive tables and HQL queries for the reports (a sketch of running such a query from Java follows the list).
  • Performed unit testing and defect fixing.
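
One way to run report-style HQL from Java is over the HiveServer2 JDBC driver; a minimal sketch under that assumption follows. The connection URL, credentials, and the sales table are hypothetical, and the hive-jdbc driver must be on the classpath.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveReportQuery {
        public static void main(String[] args) throws Exception {
            // HiveServer2 endpoint and credentials are placeholders
            String url = "jdbc:hive2://hiveserver:10000/default";
            try (Connection conn = DriverManager.getConnection(url, "hive", "");
                 Statement stmt = conn.createStatement()) {
                // Report-style aggregation over a hypothetical sales table
                ResultSet rs = stmt.executeQuery(
                        "SELECT region, COUNT(*) AS orders FROM sales GROUP BY region");
                while (rs.next()) {
                    System.out.println(rs.getString("region") + "\t" + rs.getLong("orders"));
                }
            }
        }
    }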

Confidential, New Jersey

Technologies: Hadoop, Pig, Hive, MapReduce.

Hadoop Developer

Responsibilities:

  • Involved in requirement analysis.
  • Extensively involved in the installation and configuration of the Cloudera distribution of Hadoop, including its NameNode, Secondary NameNode, JobTracker, TaskTrackers, and DataNodes.
  • Worked on analyzing the Hadoop stack and various big data analytics tools, including Pig, Hive, the HBase database, and Sqoop.
  • Created Pig Latin scripts to sort, group, join, and filter enterprise-wide data.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Supported MapReduce programs running on the cluster.
  • Analyzed large data sets by running Hive queries and Pig scripts.
  • Worked on tuning the performance of Pig queries.
  • Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, JVM tuning, and map/reduce slot configuration.
  • Involved in exploring Hadoop MapReduce programming and cluster configuration and installation.
  • Wrote Pig scripts for pre-processing customer data.
  • Developed MapReduce programs for batch processing and customer data analysis to suggest the best plan for each customer.
  • Worked on tuning the performance of Hive and Pig queries.
  • Wrote Java code for custom partitioners and Writables (a partitioner sketch follows the list).
  • Integrated Hive with Pentaho for data reports.
  • Performed unit testing of the web application and Hadoop programs, and was involved in bug fixing and testing of the application.
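
A minimal sketch of a custom MapReduce partitioner of the kind described above: it keeps all records for a given customer key on the same reducer so each customer's data is aggregated in one place. The class name and keying scheme are illustrative assumptions.

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Partitioner;

    // Routes every record for a given customer key to the same reduce task
    public class CustomerPartitioner extends Partitioner<Text, IntWritable> {
        @Override
        public int getPartition(Text key, IntWritable value, int numPartitions) {
            // Mask the sign bit so the modulo result is never negative
            return (key.toString().hashCode() & Integer.MAX_VALUE) % numPartitions;
        }
    }
    // Wired into a job with: job.setPartitionerClass(CustomerPartitioner.class);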

Confidential

Technologies: DataStage and Oracle.

Hadoop Developer

Responsibilities:

  • Involved in requirement analysis.
  • Extensively involved in the installation and configuration of the Cloudera distribution of Hadoop, including its NameNode, Secondary NameNode, JobTracker, TaskTrackers, and DataNodes.
  • Worked on analyzing the Hadoop stack and various big data analytics tools, including Hive, the HBase database, and Sqoop.
  • Created Pig Latin scripts to sort, group, join, and filter enterprise-wide data.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Analyzed large data sets by running Hive queries.
  • Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, JVM tuning, and map/reduce slot configuration.
  • Involved in exploring Hadoop MapReduce programming and cluster configuration and installation.
  • Wrote Pig scripts for pre-processing customer data.
  • Developed MapReduce programs for batch processing and customer data analysis to suggest the best plan for each customer.
  • Worked on tuning the performance of Hive queries.
  • Wrote Java code for custom partitioners and Writables (a custom Writable sketch follows the list).
  • Integrated Hive with Pentaho for data reports.
  • Performed unit testing of the web application and Hadoop programs, and was involved in bug fixing and testing of the application.
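
A minimal sketch of a custom Hadoop Writable of the kind described above, carrying two usage fields between the map and reduce stages; the class and field names are illustrative assumptions, not the project's actual types.

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;

    import org.apache.hadoop.io.Writable;

    // Carries a customer's call minutes and data usage between map and reduce
    public class UsageWritable implements Writable {
        private long callMinutes;
        private long dataMb;

        public UsageWritable() { }                  // no-arg constructor required by Hadoop

        public UsageWritable(long callMinutes, long dataMb) {
            this.callMinutes = callMinutes;
            this.dataMb = dataMb;
        }

        @Override
        public void write(DataOutput out) throws IOException {
            out.writeLong(callMinutes);
            out.writeLong(dataMb);
        }

        @Override
        public void readFields(DataInput in) throws IOException {
            callMinutes = in.readLong();
            dataMb = in.readLong();
        }

        public long getCallMinutes() { return callMinutes; }
        public long getDataMb() { return dataMb; }
    }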

Confidential

Configurator/Upgrade and Support

Responsibilities:

  • Worked on pre-upgrade and post-upgrade tasks.
  • Fixed defects in line with Siebel 7.8 functionality.
  • Tested and fixed defects in the List Management module and worked on Event Management performance tuning.
  • Tested inbound and outbound web services using SoapUI.

Confidential

Siebel Developer

Responsibilities:

  • Team member in the Account Management module (Customer/Billing Management).
  • Involved in requirement analysis and clarification with the client team.
  • Solution design (low-level design).
  • Siebel configuration.
  • Understood the design requirements and contributed to the design with POCs.
  • Configured applets, views, and screens.
  • Configured and customized Siebel application screens (Service Request, Activity, etc.) using Siebel Tools in accordance with the business requirements.
  • Involved in customizing the User Interface, Business Objects, and Data Objects layers.
  • Modified business components to allow updates and added the necessary controls to the applets.
  • Designed and developed new picklist components, MVG lists, links, and joins.
  • Created and customized drilldowns (static and dynamic).
  • Modified applets and special-purpose applets such as association applets, MVG applets, pick applets, and toggle applets.
  • Created and modified workflows.
  • Collated risks, issues, and assumptions at the design level.
  • Prepared test cases and performed unit testing.
  • Fixed defects (assembly and integration system testing).
  • Performed analysis and retrofitting/code merges to meet the requirements.
  • Analyzed and assigned defects to the team and sent status reports to management.
  • Conducted KT sessions about the project for new joiners and entry-level trainees.
  • Conducted Siebel sessions for the ELTs.
  • Took on the additional responsibility of gathering the HLD documents for all 18 modules from the client and keeping the team up to date.
  • Took on the additional responsibility of preparing the release notes for 12 drops and conducting KT sessions for the entire team on the basic guidelines for preparing LLDs (low-level designs).
  • Peer-reviewed all LLDs and successfully delivered them to the client along with the release notes.
  • Took on the additional responsibility of coordinating with the team (60+), collating LLD and development status, and reporting to management on a weekly basis.
