
Hadoop Consultant Resume


NJ

OBJECTIVE:

  • To pursue a rewarding and challenging position with a company that offers a high level of professional satisfaction and the opportunity to achieve my goals while contributing to the growth and success of the company.

PROFESSIONAL SUMMARY:

  • 10 years of overall IT experience in software development across a variety of industries, including hands-on experience in Oracle PL/SQL and Big Data (Hadoop).
  • Over 2 years of comprehensive experience as a Hadoop Developer in all phases of Hadoop and HDFS development.
  • Good knowledge of the Hadoop stack, cluster architecture and cluster monitoring.
  • Well versed in developing and implementing MapReduce programs on Hadoop to work with Big Data.
  • Hands-on experience with Hadoop, HDFS, MapReduce and the Hadoop ecosystem (Pig, Hive, Oozie, Flume, Storm and Sqoop), with good knowledge of Spark and Scala.
  • Experience with NoSQL databases such as HBase.
  • Experience in analyzing data using HiveQL, Pig Latin and custom MapReduce programs in Java.
  • Experience in Hadoop administration activities such as installation and configuration of clusters using Apache and Cloudera distributions.
  • Wrote custom UDFs to extend Hive and Pig core functionality.
  • Developed Pig UDFs to pre-process data for analysis.
  • Experience in setting up standards and processes for Hadoop-based application design and implementation.
  • Experienced in creating Oozie workflows for scheduled (cron-style) jobs.
  • Experience in importing and exporting data using Sqoop between HDFS and relational database systems, including Teradata (a sketch follows this summary).
  • Experience in managing Hadoop clusters using Cloudera Manager.
  • Hands-on experience in application development using RDBMS and Linux shell scripting.
  • Involved in converting business requirements into System Requirements Specification (SRS) documents.
  • Extensive experience in development and maintenance using Tableau 8.1/7, Oracle SQL, PL/SQL, SQL*Loader and OBIEE.
  • Experienced with version control systems such as CVS and SVN.
  • Experience in the Investment Banking and Insurance domains.
  • Excellent global exposure to various work cultures and client interaction with diverse teams.
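
The Sqoop import/export pattern referenced above can be shown as a minimal shell sketch. The connection string, credentials, table names and HDFS paths below are hypothetical placeholders, not details from any actual engagement.

    #!/bin/sh
    # Illustrative Sqoop round trip between an RDBMS and HDFS; all names are placeholders.

    # Import a relational table into HDFS as delimited text
    sqoop import \
      --connect jdbc:mysql://dbhost/sales \
      --username etl_user -P \
      --table orders \
      --target-dir /data/raw/orders \
      --num-mappers 4

    # Export processed results from HDFS back into a relational table
    sqoop export \
      --connect jdbc:mysql://dbhost/sales \
      --username etl_user -P \
      --table order_summary \
      --export-dir /data/processed/order_summary \
      --input-fields-terminated-by '\t'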

TECHNICAL SKILLS:

Hadoop/Big Data/NoSQL Technologies: HDFS, MapReduce, Hive, Pig, Sqoop, Flume, HBase, Oozie, Zookeeper

Programming Languages: C, SQL, PL/SQL, Python, HTML, XML, Shell Script

IDE Tools: Eclipse

Software Tools: Remedy, PuTTY, Toad, SQL Developer, DB Symphony

Banking Tools: EDTS, ZTS, CMT, EMMA, ESALES, OMGEO, DREAM, GCE, PrimoClient Simulator, COMET

Banking Areas: FO, MO, BO; experience in OG, FIX and SAS allocation techniques

Databases: Oracle 11g/10g/9i, MySQL, DB2, MS SQL Server

Operating Systems: AIX (UNIX), Windows 2000 Server, Windows 95/98/NT

Reporting Tools: Tableau 8.1, OBIEE

PROFESSIONAL EXPERIENCE:

Confidential, NJ

Hadoop Consultant

Responsibilities:

  • Moved crawl-data flat files generated by various customers into HDFS for further processing.
  • Wrote Apache Pig scripts to process the data in HDFS.
  • Created Hive tables to store the processed results in tabular format.
  • Developed Sqoop scripts to exchange data between the Pig output in HDFS and the MySQL database.
  • Involved in requirements gathering, design, development and testing.
  • Wrote script files for processing data and loading it into HDFS.
  • Worked with HDFS CLI commands.
  • Developed UNIX shell scripts to create reports from Hive data.
  • Fully involved in the requirement analysis phase.
  • Analyzed the requirements for setting up the cluster.
  • Installed and configured Hive, Pig, Sqoop and Oozie on the Hadoop cluster.
  • Created two separate users (hduser for HDFS operations and mapred for MapReduce operations only).
  • Set up cron jobs to delete old Hadoop logs, local job files and cluster temp files.
  • Set up Hive with MySQL as a remote metastore.
  • Moved log/text files generated by various products into HDFS.
  • Wrote MapReduce code that takes log files as input, parses the logs and structures them in tabular format to facilitate effective querying of the log data.
  • Created external Hive tables on top of the parsed data (a sketch of this pipeline follows this list).
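
A minimal sketch of the pipeline described in this list, assuming hypothetical paths, script names and table definitions (the real crawl-data layout and Pig logic are not reproduced here): raw flat files are loaded into HDFS, processed with a Pig script, and exposed through an external Hive table. The exchange with MySQL follows the Sqoop pattern sketched in the professional summary above.

    #!/bin/sh
    # Illustrative end-to-end run; all paths and names are placeholders.

    # 1. Move the raw crawl flat files into HDFS
    hdfs dfs -mkdir -p /data/raw/crawl
    hdfs dfs -put /staging/crawl/*.txt /data/raw/crawl

    # 2. Process the raw data with a Pig script
    pig -param INPUT=/data/raw/crawl -param OUTPUT=/data/processed/crawl -f process_crawl.pig

    # 3. Expose the processed output through an external Hive table
    hive -e "CREATE EXTERNAL TABLE IF NOT EXISTS crawl_results (
               source_site STRING, record_count INT, load_date STRING)
             ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
             LOCATION '/data/processed/crawl';"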

Environment: Cloudera CDH 5.0/4.3, MySQL, MapReduce, Hive, Pig, Oozie, Sqoop

Confidential, NJ

PL/SQL developer /Technical Lead

Responsibilities:

  • Gathered requirements and created the Technical Specifications and Detailed Design documents for the OneSource application.
  • Understood priorities and sought the resources needed; when faced with a roadblock, explored alternative options to bring deliverables back on track.
  • Proactively identified unhandled exceptions and modified the code to keep deliverables on track.
  • Wrote new programs of moderate complexity and scope, working with basic application systems.
  • Developed code to handle the taxation and rounding rules of the OneSource trading application.
  • Worked closely with different automation value-adds and the performance area to stabilize the system.
  • Enhanced system performance.
  • Involved in performance testing, volume testing and query optimization (a sketch follows this list).
  • Prepared UTCs and ITCs, reviewed QA test cases, prepared test plans and executed test cases with test logs.
  • Performed load testing.
  • Prepared release notes; provided QA, UAT and post-production support.
  • Provided application and user support and performed troubleshooting.
  • Provided complete production support as necessary.
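
A minimal sketch of the kind of timing harness used for the performance and volume testing mentioned above, assuming a hypothetical TNS alias, account and test script; it is not the actual OneSource test suite.

    #!/bin/sh
    # Run a volume-test script several times and capture elapsed timings.
    # APP_PW is assumed to be supplied via the environment; names are placeholders.

    for run in 1 2 3; do
      echo "=== Run $run ==="
      sqlplus -s app_user/"$APP_PW"@ONESRC <<'EOF'
    SET TIMING ON
    @tax_rounding_volume_test.sql
    EOF
    done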

Environment: Tableau 7, Oracle 10g, SQL, Oracle PL/SQL, UNIX, Shell Scripting

Confidential

Senior Oracle Developer

Responsibilities:

  • Handled all trade-related issues as trades moved from EDTS/COMET (FO) to ZTS (MO).
  • Sent the daily checkout mail to business users covering all the tools used in the trading life cycle.
  • Developed an automation program for the morning checkouts (a sketch follows this list).
  • Performed server maintenance.
  • Checked logs whenever exceptions occurred.
  • Configured ITRS and performed daily server health checks.
  • Checked for hung queries, database log issues, etc. through ITRS.
  • Participated in team meetings.
  • Attended Citi technical sessions.
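
A minimal sketch of the morning-checkout automation mentioned above, assuming hypothetical log locations and a hypothetical distribution list; the actual checkout covers each tool in the trading life cycle.

    #!/bin/sh
    # Scan the overnight application logs for exceptions and mail a summary.
    # Log directory and recipient address are placeholders.

    LOG_DIR=/apps/mo/logs
    REPORT=/tmp/morning_checkout_$(date +%Y%m%d).txt

    {
      echo "Morning checkout - $(date)"
      echo "Exception counts per log file:"
      grep -ic "exception" "$LOG_DIR"/*.log 2>/dev/null
    } > "$REPORT"

    mailx -s "Daily checkout report" business-users@example.com < "$REPORT"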

Environment: Oracle 10g, SQL, Oracle PL/SQL, UNIX, Shell Scripting

Confidential

Senior Oracle Developer

Responsibilities:

  • Involved in the requirements study.
  • Developed middle-tier components as per various business rules.
  • Designed the user interface.
  • Tested modules and performed code reviews.
  • Monitored performance (CPU, memory, paging, network latency, HTTP response).
  • Wrote shell scripts to automate manual tasks on UNIX.
  • Generated reports for the client as and when required.
  • Performed root cause analysis, including code investigation and problem management.
  • Maintained scripts and batch schedules.
  • Performed maintenance driven by hardware, software, OS or layered product upgrades.
  • Processed jobs manually when required.
  • Analyzed incidents and provided solutions, including code changes where required.
  • Balanced the ticket load across team members.
  • Monitored the incident queue to ensure urgent attention to critical issues.
  • Supervised and escalated critical issues.
  • Worked with global support teams on a continuous basis to resolve critical issues.
  • Communicated outages to relevant stakeholders in a timely and accurate manner.
  • Identified recurring problems within L3 scope for a permanent fix.
  • Performed connectivity testing with upstream systems.
  • Resolved incidents within SLA.
  • Performed proactive monitoring and ad hoc process monitoring.
  • Managed deployments and ran deployment scripts.
  • Participated in post-deployment dry runs and release sign-off.
  • Restarted applications, handled ad hoc jobs and automated support tasks.
  • Monitored technical resources such as memory, temp space and CPU (a sketch follows this list).
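
A minimal sketch of the resource-monitoring automation mentioned in the last item, assuming hypothetical paths, a hypothetical threshold and alert address; column positions in df output differ between platforms (e.g. AIX vs. Linux), so the parsing is illustrative only.

    #!/bin/sh
    # Append a daily snapshot of disk, memory and CPU usage to a health-check log
    # and raise an alert when any filesystem crosses a usage threshold.

    LOG=/var/log/app/healthcheck_$(date +%Y%m%d).log
    THRESHOLD=90

    {
      echo "=== Health check: $(date) ==="
      df -kP
      vmstat 1 3
    } >> "$LOG"

    # POSIX df layout assumed: column 5 is capacity, column 6 the mount point
    df -kP | awk -v t="$THRESHOLD" 'NR>1 && $5+0 > t {print $6, $5}' |
    while read fs pct; do
      echo "Filesystem $fs is at $pct used" \
        | mailx -s "Resource alert: disk usage" support-team@example.com
    done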

Environment: Oracle 9i, SQL, Oracle PL/SQL, UNIX, Shell Scripting

Confidential

Software Engineer/Developer

Responsibilities:

  • Involved in the requirements collection stage, meeting with clients to collect information about the system, and prepared the functional requirements document.
  • Worked on UML design models and developed prototypes.
  • Designed the system and documented the design.
  • Prepared unit test cases.
  • Executed unit test cases.
  • Managed the month-end cycle.
  • Analyzed incidents and provided solutions, including code changes where required; balanced the ticket load across team members.
  • Monitored the incident queue to ensure urgent attention to critical issues and appropriate attention to aged tickets.
  • Supervised and escalated critical issues; assumed the role of situation manager during critical outages.
  • Worked with global support teams on a continuous basis to resolve critical issues.
  • Communicated outages to relevant stakeholders in a timely and accurate manner.
  • Conducted daily and periodic
  • Identified recurring problems within L3 scope for a permanent fix.
  • Performed connectivity testing with upstream systems.
  • Resolved incidents within SLA.

Environment: Oracle 9i, SQL, Oracle PL/SQL, UNIX, Shell Scripting
