
Hadoop Lead / Sr Developer Resume

Quincy, MA

PROFESSIONAL SUMMARY:

  • 10+ years of experience in all phases of the SDLC, including application design, development, and production support and maintenance projects.
  • 5+ years of experience with the Hadoop stack: HDFS, MapReduce, Sqoop, Pig, Hive, HBase, Storm, Spark, Scala, Parquet, and Kafka.
  • Expertise in designing and architecting Hadoop applications and recommending the right solutions and technologies for each application.
  • Good expertise with Hadoop tools such as MapReduce, HiveQL, Pig, and Sqoop.
  • Sound knowledge of data warehousing concepts and tools such as Teradata and Base SAS.
  • Expertise in Teradata utilities such as FastLoad, FastExport, MultiLoad, TPump, and TPT.
  • Good expertise in the reporting tool Base SAS 9.1.3.
  • Good expertise with DB2 SQL performance tools such as Apptune and Visual Explain.
  • Good experience as a Tech/Project Lead.
  • Excellent understanding and knowledge of NoSQL databases such as MongoDB, HBase, and Cassandra.
  • Sound exposure to the retail market, including the Retail Delivery System.
  • Hands-on experience in application development using Java, RDBMS, and Linux shell scripting.
  • Hands-on experience with VPN, PuTTY, WinSCP, VNC Viewer, etc.
  • Extensive experience working with Oracle, DB2, SQL Server, and MySQL databases.
  • Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology; good knowledge of J2EE and core Java design patterns.
  • Experience in Java, JSP, Servlets, EJB, WebLogic, WebSphere, Hibernate, Spring, JBoss, JDBC, RMI, JavaScript, Ajax, jQuery, XML, and HTML.
  • Ability to adapt to evolving technology and strong sense of responsibility and accomplishment.

TECHNICAL SKILLS:

  • Cloudera Distribution for Hadoop (CDH), MapReduce, HDFS, YARN, Hive, Pig, Sqoop, Storm, Spark, Scala, Elasticsearch, Kibana, Parquet, Flume, AWS
  • Core Java
  • LINUX, UNIX, Windows
  • ORACLE, MySQL
  • Eclipse
  • Teradata, Base SAS
  • Waterfall, Agile

WORK EXPERIENCE:

Hadoop Lead / Sr Developer

Confidential, Quincy, MA

Responsibilities:

  • Work on a Hadoop cluster with a current size of 56 nodes and 896 TB of capacity.
  • Write MapReduce jobs, HiveQL queries, Pig scripts, and Spark applications.
  • Import data using Sqoop into Hive and HBase from an existing SQL Server database.
  • Support code/design analysis, strategy development and project planning.
  • Create reports for the BI team using Sqoop to export data into HDFS and Hive.
  • Develop multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Involve in Requirement Analysis, Design, and Development.
  • Export and import data into HDFS, HBase, and Hive using Sqoop.
  • Create Hive tables, load them with data, and write Hive queries that run internally as MapReduce jobs.
  • Work closely with the business and analytics teams in gathering system requirements.
  • Load and transform large sets of structured and semi-structured data.
  • Load data into HBase tables using Java MapReduce.
  • Load data into Hive partitioned tables.
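The Sqoop imports described above typically take the following shape. This is an illustrative sketch only: the host, database, table, and user names are hypothetical placeholders, not details from this resume, and actually executing the command requires a Hadoop cluster with Sqoop installed.

```shell
# Sketch of a Sqoop import from SQL Server into a Hive table.
# All identifiers below are hypothetical placeholders.
SQOOP_CMD="sqoop import \
  --connect jdbc:sqlserver://dbhost:1433;databaseName=sales \
  --username etl_user -P \
  --table orders \
  --hive-import \
  --hive-table staging.orders \
  --num-mappers 4"

# Print the assembled command instead of executing it,
# so the sketch can be inspected without a cluster.
echo "$SQOOP_CMD"
```

The `--hive-import` flag creates the Hive table and loads the imported files in one step; a parallel HBase import would use `--hbase-table` and `--column-family` instead.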

Confidential, Quincy, MA

Lead / Sr. Hadoop Developer

Technologies: CDH, HDFS, MapReduce, Hive, Pig, Flume, Spark, Scala, Elasticsearch, Kibana, Shell scripting, UNIX.

Responsibilities:

  • Provide design recommendations and thought leadership to sponsors/stakeholders, improving review processes and resolving technical problems.
  • Coordinate between the business and the offshore team.
  • Gather requirements and prepare the design.
  • Export and import data into HDFS, HBase, and Hive using Sqoop.
  • Create Hive tables, load them with data, and write Hive queries.
  • Bulk-load HBase tables using Pig.
  • Implement solutions using Hadoop, HBase, Hive, Sqoop, the Java API, etc.
  • Work closely with the business and analytics teams in gathering system requirements.
  • Load and transform large sets of structured and semi-structured data.
  • Load data into HBase tables using Java MapReduce.
  • Load data into Hive partitioned tables.
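The "Bulk-load HBase tables using Pig" bullet corresponds to a Pig script of roughly this shape, built on Pig's `HBaseStorage` storage handler. The input path, table name, column family, and field names are hypothetical placeholders; running the script itself requires Pig with HBase connectivity, so the sketch only generates and displays it.

```shell
# Write a sketch Pig script for loading an HBase table.
# All identifiers below are hypothetical placeholders.
cat > /tmp/hbase_load.pig <<'EOF'
-- Load delimited records from HDFS; the first field becomes the HBase row key.
users = LOAD '/data/users' USING PigStorage('\t')
        AS (user_id:chararray, name:chararray, city:chararray);

-- Store into the HBase table 'users'; remaining fields map to column family 'info'.
STORE users INTO 'hbase://users'
      USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('info:name info:city');
EOF

# Show the generated script for review.
cat /tmp/hbase_load.pig
```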

Confidential, Quincy, MA

Lead / Sr. Hadoop Developer

Technologies: HDFS, Core Java, MapReduce, Hive, Pig, Sqoop, Shell scripting, UNIX.

Responsibilities:

  • Work on a Hadoop cluster with a current size of 56 nodes and 896 TB of capacity.
  • Write MapReduce jobs, HiveQL queries, and Pig scripts.
  • Import data using Sqoop into Hive and HBase from an existing SQL Server database.
  • Support code/design analysis, strategy development and project planning.
  • Create reports for the BI team using Sqoop to export data into HDFS and Hive.
  • Develop multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Involve in Requirement Analysis, Design, and Development.
  • Export and import data into HDFS, HBase, and Hive using Sqoop.
  • Create Hive tables, load them with data, and write Hive queries that run internally as MapReduce jobs.
  • Work closely with the business and analytics teams in gathering system requirements.
  • Load and transform large sets of structured and semi-structured data.
  • Load data into HBase tables using Java MapReduce.
  • Load data into Hive partitioned tables.
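The BI reporting hand-off mentioned above is usually the reverse direction: a Sqoop export pushing Hive-produced data back to a relational database. A hedged sketch follows; the connection string, table, and warehouse path are hypothetical placeholders, and execution requires a cluster with Sqoop, so the command is only assembled and printed.

```shell
# Sketch of a Sqoop export from a Hive warehouse directory to SQL Server.
# All identifiers below are hypothetical placeholders.
SQOOP_EXPORT="sqoop export \
  --connect jdbc:sqlserver://dbhost:1433;databaseName=reports \
  --username etl_user -P \
  --table daily_summary \
  --export-dir /user/hive/warehouse/daily_summary \
  --input-fields-terminated-by '\001'"

# Print the assembled command instead of executing it.
echo "$SQOOP_EXPORT"
```

`\001` (Ctrl-A) is Hive's default field delimiter, which is why it appears in the export's field-terminator option.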

Confidential, Quincy, MA

Hadoop Developer

Technologies: CDH, HDFS, Core Java, MapReduce, Hive, Pig, HBase, Sqoop, Shell scripting, UNIX.

Responsibilities:

  • Supported code/design analysis, strategy development and project planning.
  • Created reports for the BI team using Sqoop to export data into HDFS and Hive.
  • Involved in Requirement Analysis, Design, and Development.
  • Exported and imported data into HDFS, HBase, and Hive using Sqoop.
  • Created Hive tables, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs.
  • Worked closely with the business and analytics teams in gathering system requirements.
  • Loaded and transformed large sets of structured and semi-structured data.
  • Loaded data into Hive partitioned tables.
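The "Hive partitioned tables" bullets that recur across these roles follow a standard pattern: a table partitioned on a load-date column, with each batch written into its own partition. The sketch below generates an illustrative HiveQL script; table, column, and date values are hypothetical placeholders, and executing the HQL needs a Hive cluster.

```shell
# Write a sketch HiveQL script for a partitioned-table load.
# All identifiers below are hypothetical placeholders.
cat > /tmp/load_partition.hql <<'EOF'
-- Partitioned target table; data for each load_date lands in its own partition.
CREATE TABLE IF NOT EXISTS sales_by_day (
  order_id  STRING,
  amount    DOUBLE
)
PARTITIONED BY (load_date STRING)
STORED AS PARQUET;

-- Static-partition load from a staging table.
INSERT OVERWRITE TABLE sales_by_day PARTITION (load_date = '2015-06-01')
SELECT order_id, amount
FROM staging_sales
WHERE ds = '2015-06-01';
EOF

# Show the generated script for review.
cat /tmp/load_partition.hql
```

Queries that filter on `load_date` then read only the matching partition directories instead of scanning the whole table.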

Confidential, Quincy, MA

Project Lead

Technologies: COBOL, Easytrieve, IMS DB, DB2, JCL, Dump Master, Isync, Visual Explain, FTP/PGP/TIBCO transfers, Endevor
