Hadoop Developer Resume

PROFESSIONAL SUMMARY:

  • Over 9 years of experience in the IT industry: 4 years as a Certified Hadoop Developer and 5 years as a SQL Developer, across the Utility, Entertainment, and Manufacturing industries.
  • Excellent understanding of Hadoop architecture and components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Expertise in development on the Hadoop ecosystem: Hive, Sqoop, Pig, HBase.
  • Excellent understanding of YARN architecture.
  • Used Ambari to monitor workflows and performance issues of production runs.
  • Worked extensively in writing PL/SQL procedures, functions, triggers, and cursors.
  • Experience developing advanced object-relational database objects, packages, triggers, and subprogram units in PL/SQL.
  • Experience in data migration using SQL*Loader and in day-to-day DBA activities such as import/export, user maintenance, and performance tuning.
  • Experienced in working with Agile and Waterfall methodologies.
  • In-depth understanding of Software Development Life Cycle, Methodologies and strategies.

TECHNICAL SKILLS:

Hadoop Framework: HDFS, MapReduce, Java, Hive, Pig, HBase, Sqoop, Flume, Oozie

Databases: Oracle 10g/9i, MySQL, DB2

Languages: SQL, Core Java, Shell scripting, HiveQL, Pig Latin

Tools: TOAD, Oracle SQL Developer, HP ALM, SOAP UI

Operating Systems: Windows, Linux

IDE Tools: Eclipse

Web Technologies: HTML, JavaScript, CSS

Servers: Apache, Tomcat

PROFESSIONAL EXPERIENCE:

Confidential

Hadoop Developer

Responsibilities:

  • Developed applications specific to business needs in the Hadoop environment.
  • Responsible for application migration from Teradata to the Hadoop environment.
  • Processed data in the Hadoop environment based on business logic, requirements, and compatibility.
  • Allocated Hadoop resources to applications based on the volume of data being processed.
  • Made extensive use of shell scripts to pass parameters to HQL and Pig scripts.
  • Used Pig Latin to route intermediate data to different applications.
  • Migrated processed data to MySQL using Sqoop.
  • Tuned Hive performance parameters to handle the large volumes of data being processed.
  • Implemented partitioning and bucketing schemes in Hive for easier data access.
  • Partitioned Hive tables by run to store data and compare results after metadata and business-logic changes.
  • Transformed structured, semi-structured, and unstructured data.
  • Implemented compression techniques on Hive tables for efficient disk usage.
  • Generated end-user reports in Hadoop and exported them to FTP locations as required, using shell scripting.
  • Moved data efficiently between clusters using DistCp.
  • Monitored workflows and performance issues of production runs using Ambari.
  • Participated in estimation activities and facilitated project scoping, requirements meetings.
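The shell-to-HQL parameter passing described above can be sketched as a small wrapper. This is a minimal, illustrative sketch: the database, table, column, and path names are placeholder assumptions, not details from this resume, and the `hive` command is echoed rather than run since no cluster is assumed here.

```shell
#!/bin/sh
# Sketch: generate a parameterized HQL file and the hive command a
# scheduler would run. All names below are illustrative placeholders.
RUN_DT="2016-01-01"                         # normally supplied by the scheduler
HQL_FILE="/tmp/daily_load_${RUN_DT}.hql"

# Quoted heredoc keeps ${hivevar:run_dt} literal, so Hive (not the
# shell) substitutes the parameter at execution time.
cat > "$HQL_FILE" <<'EOF'
-- Partitioning by run date keeps each run's output separate, so results
-- can be compared after metadata or business-logic changes.
INSERT OVERWRITE TABLE sales_db.daily_sales PARTITION (run_dt='${hivevar:run_dt}')
SELECT order_id, amount
FROM sales_db.staging_sales;
EOF

# Echoed rather than executed, since no Hive cluster is assumed here.
HIVE_CMD="hive --hivevar run_dt=${RUN_DT} -f ${HQL_FILE}"
echo "$HIVE_CMD"
```

The `--hivevar` flag is the standard Hive CLI mechanism for injecting shell-side values into an HQL script.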

Confidential, New York

Oracle SQL Developer

Responsibilities:

  • Developed an operational plan for ETL, data loading, and data cleaning, and wrote shell scripts to automate the process.
  • Worked on data modeling (both physical and logical design).
  • Wrote stored procedures, functions, and other PL/SQL blocks to automate processing and loading of data into tables.
  • Wrote UNIX shell scripts to monitor Oracle instance performance, tablespaces, users, and objects.
  • Performed Oracle performance tuning; involved in designing, modeling, and maintaining the database.
  • Involved in monitoring, tuning, auditing users, assigning roles and privileges, and backup and recovery.
  • Involved in partition design, and in tablespace and table partitioning for data warehousing.
  • Created models for database structures and star/snowflake schemas; sized schemas including tables, indexes, and tablespaces, specifying extent sizes for database objects.
  • Wrote SQL*Loader scripts for bulk load operations on tables.
  • Developed shell scripts to execute different module procedures for batch processing.
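A SQL*Loader bulk load of the kind mentioned above is driven by a control file. The sketch below generates one from a shell script; the table, column, and file names are illustrative assumptions, and the `sqlldr` command is echoed rather than executed since no Oracle client is assumed here.

```shell
#!/bin/sh
# Sketch: generate a SQL*Loader control file for a bulk CSV load.
# Table, column, and file names are illustrative placeholders.
CTL_FILE="/tmp/load_employees.ctl"

cat > "$CTL_FILE" <<'EOF'
LOAD DATA
INFILE 'employees.csv'
APPEND INTO TABLE employees
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(emp_id, emp_name, hire_date DATE 'YYYY-MM-DD', salary)
EOF

# Echoed rather than executed; in practice the userid would come from
# a secured credentials store, not be hard-coded in the script.
SQLLDR_CMD="sqlldr userid=app_user control=${CTL_FILE} log=/tmp/load_employees.log"
echo "$SQLLDR_CMD"
```

`APPEND` adds rows to an already-populated table; `TRUNCATE` or `REPLACE` would be used instead for full reloads.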

Confidential, Orlando, Florida

PL/SQL Developer

Responsibilities:

  • Defined database structures and mapping/transformation logic; created external table scripts to load data from source systems.
  • Wrote UNIX shell scripts to run database jobs on the server side.
  • Developed new packages, database triggers, stored procedures, and other PL/SQL code modules, and modified existing ones, in support of business requirements.
  • Worked with functional experts to translate their domain knowledge into business rules implemented as working code modules such as procedures and functions.
  • Involved in functional and technical documentation for report development per business needs; developed stored procedures for reporting and converted regular reports into HTML, Excel, and PDF formats.
  • Used TOAD and SQL Navigator extensively; supported production by writing UNIX shell automation scripts to generate business-critical reports (daily, weekly, and monthly).
  • Operated in an Agile environment with daily scrum and stand-up meetings, burn-down charts, presentations, and reviews.
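A server-side shell wrapper for a scheduled PL/SQL report job, as described in the bullets above, typically drives `sqlplus` with a small generated script. This is a hedged sketch: the package, procedure, user, and log names are illustrative assumptions, and the `sqlplus` command is echoed rather than executed since no Oracle instance is assumed here.

```shell
#!/bin/sh
# Sketch: server-side wrapper for a nightly PL/SQL reporting job.
# Package, procedure, user, and path names are illustrative placeholders.
SQL_SCRIPT="/tmp/run_daily_report.sql"
LOG_FILE="/tmp/daily_report.log"

cat > "$SQL_SCRIPT" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
SET SERVEROUTPUT ON
EXEC report_pkg.generate_daily_report
EXIT
EOF

# Echoed rather than executed, since no Oracle instance is assumed here.
# -s suppresses the banner so only job output reaches the log.
SQLPLUS_CMD="sqlplus -s app_user @${SQL_SCRIPT} >> ${LOG_FILE} 2>&1"
echo "$SQLPLUS_CMD"
```

`WHENEVER SQLERROR EXIT FAILURE` makes the wrapper's exit status reflect PL/SQL errors, so a cron or batch scheduler can detect a failed run.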
