
Sr. Hadoop Developer Resume


New York, NY

SUMMARY

  • 8 years of IT experience, including 3 years of hands-on experience with Big Data Hadoop technologies such as MapReduce, Hive, HBase, Pig, Sqoop, Oozie, ZooKeeper, and HDFS.
  • Experience in building and maintaining multiple Hadoop clusters (production, development, etc.) of different sizes and configurations, and in setting up rack topology for large clusters.
  • Extensive experience in HDFS, MapReduce, Pig, Hive, Sqoop, Flume, Oozie, ZooKeeper, Maven, HBase, and Cassandra.
  • Good experience with core and advanced Java concepts.
  • Extensive experience with ETL and big data query tools such as Pig Latin and HiveQL.
  • Hands-on experience with big data ingestion tools such as Flume and Sqoop.
  • Experience in tuning and troubleshooting performance issues in Hadoop clusters.
  • Implemented proofs of concept on the Hadoop stack and various big data analytics tools.
  • Migrated data from various databases (e.g., VSAM, DB2, Oracle PL/SQL, and MySQL) to Hadoop.
  • Hands-on NoSQL database experience with HBase and Cassandra.
  • Good understanding of data lakes.
  • Experience in data management and implementation of big data applications using Hadoop frameworks.
  • Good knowledge of Spark and Spark SQL.
  • Good database experience using SQL Server, including stored procedures, cursors, constraints, and triggers.
  • Experience in designing, sizing, and configuring Hadoop environments.
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
  • Extensive experience in documenting requirements, functional specifications, and technical specifications.
  • Highly motivated, adaptive, and a quick learner.
  • Good functional knowledge of financial and capital markets.
  • Exhibited excellent communication and leadership capabilities.
  • Excellent analytical, problem-solving, and technical skills.
  • Strong ability to handle multiple priorities and workloads, with the ability to quickly understand and adapt to new technologies and environments.
  • Cloudera Certified Developer for Apache Hadoop (CCD-410).

TECHNICAL SKILLS

Operating System: MVS, Windows, Linux

Databases & Tools: VSAM, MySQL, Oracle, NoSQL, Talend, Tableau

Big Data: Apache Hadoop, HBase, Hive, Pig, Sqoop, Oozie, ZooKeeper, Flume, Kafka, Storm, Spark

Languages: Java, Pig Latin, HiveQL, COBOL, CICS, JCL, Easytrieve

Domain: Credit card processing (VisionPLUS™ package)

Web Technologies: Servlets, JSP, XML, Tomcat, HTML, JavaScript, Prime Faces, JSF

IDEs: Eclipse, NetBeans

Web Server: Tomcat

Development Methodologies: Agile/Scrum, Waterfall

PROFESSIONAL EXPERIENCE

Confidential, New York, NY

Sr. Hadoop Developer

Responsibilities:

  • Developed data cleansing features such as schema validation, row counts, and data profiling using MapReduce jobs.
  • Created Hive tables to store the logs generated whenever a MapReduce job is executed.
  • Created a Hive aggregator to update the Hive table after running the data profiling job.
  • Extracted data from Teradata to HDFS using Sqoop.
  • Analyzed the data by running Hive queries.
  • Implemented partitioning, dynamic partitioning, and bucketing in Hive.
  • Developed Hive queries to process the data and generate data cubes for visualization.
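The partitioning and bucketing work described above can be sketched in HiveQL. This is a minimal illustration only; the table, column, and database names are hypothetical, not the actual project schema.

```sql
-- Hypothetical example: a partitioned, bucketed Hive table
-- (all identifiers are illustrative).
CREATE TABLE IF NOT EXISTS tx_logs (
    txn_id     BIGINT,
    account_id STRING,
    amount     DOUBLE
)
PARTITIONED BY (load_date STRING)              -- partition key
CLUSTERED BY (account_id) INTO 32 BUCKETS      -- bucketed for sampling/joins
STORED AS ORC;

-- Enable dynamic partitioning so the partition value comes from the data.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- Load from a staging table, creating one partition per load_date value.
INSERT OVERWRITE TABLE tx_logs PARTITION (load_date)
SELECT txn_id, account_id, amount, load_date
FROM staging_tx;
```

Partition pruning then lets queries such as `SELECT ... WHERE load_date = '2015-01-01'` scan only the relevant partition directory rather than the full table.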

Confidential, Minneapolis, MN

Sr. Hadoop Developer

Responsibilities:

  • Moved crawl data flat files generated by various retailers into HDFS for further processing.
  • Wrote Apache Pig scripts to process the HDFS data.
  • Created Hive tables to store the processed results in tabular format.
  • Developed Sqoop scripts to move data between HDFS and a MySQL database.
  • Involved in requirements gathering, design, development, and testing.
  • Wrote script files for processing data and loading it into HDFS.
  • Wrote HDFS CLI commands.
  • Developed UNIX shell scripts to create reports from Hive data.
  • Fully involved in the requirements analysis phase.
  • Analyzed requirements to set up the cluster.
  • Created two separate users (hduser for HDFS operations and mapred for MapReduce operations only).
  • Processed data using MapReduce and stored the results in HBase, displayed per user requirements as pie charts, bar charts, or both.
  • Set up passwordless SSH for Hadoop.
  • Set up cron jobs to delete old Hadoop logs, local job files, and cluster temp files.
  • Set up Hive with MySQL as a remote metastore.
  • Moved log/text files generated by various products into HDFS.
  • Wrote MapReduce code that takes log files as input, parses the logs, and structures them in tabular format to facilitate effective querying of the log data.
  • Created external Hive tables on top of the parsed data.
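The last two bullets (MapReduce-parsed logs exposed through an external Hive table) can be sketched in HiveQL. The path, table name, and column layout below are hypothetical illustrations of the pattern, assuming the MapReduce job emits tab-separated text into an HDFS directory.

```sql
-- Hypothetical example: an external Hive table over MapReduce output
-- already sitting in HDFS (path and schema are illustrative).
-- EXTERNAL means Hive does not own the files: dropping the table
-- removes only the metadata, leaving the parsed data in place.
CREATE EXTERNAL TABLE IF NOT EXISTS parsed_logs (
    log_ts    STRING,
    log_level STRING,
    component STRING,
    message   STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
LOCATION '/data/parsed_logs';

-- Querying works as for any managed table:
-- SELECT log_level, COUNT(*) FROM parsed_logs GROUP BY log_level;
```

Because the table is external, the MapReduce job can keep appending new files under `/data/parsed_logs` and they become queryable without any reload step.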

Confidential, FL

Data Analyst

Responsibilities:

  • Set up a distributed cluster.
  • Involved in requirements gathering and analysis.
  • Understood and designed the architecture.
  • Stored data in an RDBMS.
  • Processed data using SQL and stored the results in Oracle.
  • Responsible for delivery and review of all tasks.
  • Prepared weekly and monthly status reports.
  • Attended defect calls to provide the latest status to the client.

Confidential, Tampa, FL

Software Developer

Responsibilities:

  • Worked as part of the project team at the client location.
  • Performed coding and unit testing.
  • Analyzed and resolved batch and online production tickets.
  • Responsible for delivery and review of all tasks.
  • Tracked defect fixes.
  • Responsible for release upgrade implementation in production.
  • Provided business analysis documents covering the functionality and system flow of each incident raised.
  • Handled code management for a multi-region architecture.
  • Prepared weekly and monthly status reports.
  • Attended defect calls to provide the latest status to clients.
  • Involved in designing the environment, including CICS regions and scheduler design.
  • Handled code management for multi-region operation, enabling parallel development of code.
  • Provided timely solutions to issues faced, which helped meet project deadlines.
  • Ensured the quality of deliverables.
  • Attended defect calls and provided updates to clients at regular intervals.
