Hadoop Administrator Resume

Salt Lake City, UT

SUMMARY:

  • 7+ years of professional experience in Information Technology.
  • 2+ years of experience in Hadoop and its ecosystem components.
  • Experience in working with large data warehousing environments.
  • Experience in setting up in-house Hadoop clusters.
  • Experience in using Ambari, Hive, Sqoop, HBase and Cloudera Manager.
  • Experience in data movement using Sqoop from HDFS to relational database systems and vice versa.
  • Experienced in maintenance of Single instance and Clustered databases.
  • Involvement in various stages of Software Development Life Cycle (SDLC).
  • 4 years of experience in the design and development of small-scale applications, batch processes and scripts using Sybase, DB2 UDB, Oracle, UNIX, Perl and shell scripts.
  • 1.8 years of experience in implementing procedures/functions/triggers in Oracle PL/SQL, along with database administration.
  • 4 years of experience in managing high-volume 24x7 production batches (L2 and L3 support).
  • 4 years of experience in Treasury-related financial applications and handling escalations in Appbank Operations for Confidential.
  • Possess excellent knowledge of SWIFT messaging, SWIFT message structure, and payment and cheque processing.
  • Possess excellent query-writing and optimization techniques in Sybase, DB2 and Oracle.
  • Good knowledge of the Autosys job scheduler; have worked with various ticketing tools.
  • Possess excellent knowledge of debugging techniques across multiple technologies such as Java, shell scripting, databases and Linux/UNIX.
  • Led disaster recovery projects (BCP/production swaps) and system upgrades for Treasury to ensure production system stability and keep the software in use up to date.
  • Worked closely with Treasury operations to ensure seamless operation of the business.
  • 1 year of experience managing a 5-member cross-functional team located in India (offshore).
  • Organized, dependable and possess excellent communication and interpersonal skills.
  • Possess excellent problem solving and leadership skills.
  • Possess strong analytical and client interfacing skills.
  • Strong ability in multitasking and prioritizing workload.
  • Highly motivated and detail-oriented, with the ability to work independently as well as part of a team, combined with excellent technical, analytical and communication skills.

PROFESSIONAL EXPERIENCE:

Hadoop Administrator

Confidential, Salt Lake City, UT

Responsibilities:

  • Served as a Hadoop cluster environment administrator.
  • Performed cluster capacity planning, performance tuning, cluster monitoring and troubleshooting.
  • Configured Pseudo-distributed and Fully-distributed Hadoop clusters.
  • Used service monitoring tools such as Ambari, Nagios and Fabric.
  • Commissioned and decommissioned nodes on a running cluster.
  • Installed Hadoop ecosystem components and Hadoop daemons.
  • Installed and configured Sqoop and Hive.
  • Configured property files such as core-site.xml, hdfs-site.xml and mapred-site.xml based on requirements.
  • Performed cluster monitoring, maintenance and troubleshooting.
  • Imported and exported data between HDFS and relational databases using Sqoop; a representative set of commands is sketched after this list.
  • Experienced in defining job flows with Autosys.
  • Experienced in managing and reviewing Hadoop log files.
  • Set up alerts with Ambari for memory and disk usage on the cluster.
  • Used Hive to aggregate the raw data received from clients.
  • Worked with Kerberos security on Hadoop clusters.
  • Handled ad-hoc requests from developers to copy non-PII (Personally Identifiable Information) data from the production cluster to the non-prod cluster to test various scenarios.
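
The following is a minimal sketch of the kind of Sqoop commands used for this data movement; the connection string, credentials, table names and HDFS paths are illustrative placeholders rather than values from this environment.

#!/bin/bash
# Illustrative Sqoop data movement (placeholder connection details and names).

# Import a relational table into HDFS; -P prompts for the database password.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table PAYMENTS \
  --target-dir /data/raw/payments \
  --num-mappers 4

# Export aggregated results from HDFS back to the database.
sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table PAYMENT_SUMMARY \
  --export-dir /data/processed/payment_summary \
  --num-mappers 4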

Principal Consultant

Confidential

Responsibilities:

  • Maintenance, development and enhancement of applications and batch scripts using SQL, shell scripts and Perl scripts; these are used for processing incoming payments and for processing outgoing and incoming bank SWIFT messages.
  • Setting up and managing the Autosys batch for all of the above processes.
  • Re-engineering the production Autosys batch.
  • Tuning the Autosys batch for quicker delivery.
  • Automating manual data sanity checks; a sketch of such a check appears after this list.
  • Providing CSAW scripts so that L1 can run SQL queries by supplying input parameters through the CSAW application.
  • Rewriting/decommissioning legacy processes.
  • Regression and UAT testing of new models/releases.
  • Writing tables, views, triggers and stored procedures.
  • Fine-tuning long-running stored procedures and SQL queries.
  • Creating ad-hoc and automated reports for business users using complex queries and stored procedures.
  • Analyzing data quality issues and user queries/concerns.
  • Code debugging for production issues in Java/J2EE, Scripts or stored procedures etc.
  • L2/L3 support of the production batch, which includes monitoring the batch on a rotational basis, on-call support, job failure analysis and bug fixing, batch tuning, and basic UNIX system and process management.
  • Liaising with client managers on process-flow design to reduce support costs, establish strategies for quick turnaround during production issues, and drive automation of repetitive manual work.
  • Coordinating with risk managers, margin lending ops and client reps to understand areas for improvement and provide feedback to application developers.
  • User support for issues related to the Treasury application and reconciliation, as well as infrastructure issues.
  • Code release review and change management.
  • Coordination between the onsite and offshore teams.
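
Below is a minimal sketch of an automated data sanity check of the kind wired into the Autosys batch, assuming an Oracle back end for illustration; the connection details, table and column names are placeholders, not values from the actual environment.

#!/bin/bash
# Illustrative data sanity check run as an Autosys command job (placeholder names).
# A non-zero exit status fails the Autosys job so that support is alerted.

ROWS=$(sqlplus -s "app_user/${APP_PWD}@TREASURYDB" <<'SQL'
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SELECT COUNT(*) FROM payments
WHERE  status = 'PENDING'
AND    value_date < TRUNC(SYSDATE);
EXIT
SQL
)
ROWS="${ROWS//[[:space:]]/}"   # strip whitespace from the sqlplus output

if ! [[ "$ROWS" =~ ^[0-9]+$ ]]; then
    echo "Sanity check could not be run: $ROWS"
    exit 2   # technical failure
elif [ "$ROWS" -gt 0 ]; then
    echo "Sanity check failed: $ROWS stale pending payments found."
    exit 1   # data issue
fi
echo "Sanity check passed."
exit 0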

Software Engineer

Confidential

Programming Languages/Tools used: Oracle SQL, PL/SQL, Core Java, Toad for Oracle, Windows

Responsibilities:

  • Involved in the installation and configuration of the Oracle database and Toad for Oracle software.
  • Involved in the design and development of a new proposal system for Confidential using SQL and PL/SQL.
  • Involved in database performance tuning, analysis and administration.
  • Responsible for logical and physical database design, implementation, transforming logical data models into physical databases, and defining strategies for database implementation, high performance, replication & failover.
  • Wrote scripts to move data from the old system to the new system.
  • Wrote queries to verify that all data had been moved to the new system (manual testing); a sketch of such a check appears after this list.
  • Supported clients on errors they encountered while inserting deals into the newly proposed system.
  • Provided 24x7 on-call application support.
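
A minimal sketch of the kind of verification run after the data move is shown below; the legacy and new schema, table and column names are hypothetical placeholders used only for illustration.

#!/bin/bash
# Illustrative post-migration verification (schema and table names are placeholders).

sqlplus -s "migr_user/${MIGR_PWD}@PROPDB" <<'SQL'
SET PAGESIZE 100 LINESIZE 200

-- Compare row counts between the legacy and the new proposal tables.
SELECT 'OLD' AS source_system, COUNT(*) AS row_count FROM legacy.proposals
UNION ALL
SELECT 'NEW', COUNT(*) FROM newsys.proposals;

-- List deals present in the legacy system but missing from the new one.
SELECT deal_id FROM legacy.proposals
MINUS
SELECT deal_id FROM newsys.proposals;

EXIT
SQL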
