Hadoop Administrator Resume
SUMMARY
- More than 5 years of experience as a Technology Analyst, working on database administration, application development, automation and maintenance, coding, and selected project management activities.
TECHNICAL SKILLS
Big Data Ecosystem: Hadoop, HBase, Hive, Sqoop, Oozie, Flume, ZooKeeper
Databases: Oracle 9i, 10g, 11g; IBM Netezza
Software, Applications and Tools: BEA WebLogic 8.1, Apache Tomcat, Subversion, Toad, SQL*Plus, Amdocs Clarify CRM 13 SR 108
Operating Systems: HP-UX 11.31, 11.23, 11.11, 11.0; Solaris 8, 9, 10; Red Hat Linux 7.2, 8, and 9, ES/AS 2.x/3.x/4.x; Windows NT/2000/2003; MS-DOS
Software Languages: C++, C, Java, SQL, PL/SQL, Korn shell, Bash, Perl, AWK, sed
Other Software: Microsoft Office (MS Excel, MS Word, MS PowerPoint), MS Project, PuTTY, Cygwin
PROFESSIONAL EXPERIENCE
Confidential
Hadoop Administrator
Responsibilities:
- Installed and configured Hadoop across multiple nodes in AWS EC2
- Set up and optimized standalone, pseudo-distributed, and fully distributed clusters
- Built, tuned, and maintained HiveQL and Pig scripts for reporting purposes
- Managed and reviewed Hadoop log files
- Supported and troubleshot MapReduce programs running on the cluster
- Loaded data from the Linux/UNIX file system into HDFS
- Installed and configured Hive
- Created tables, loaded data, and wrote queries in Hive
- Developed Linux/UNIX shell scripts to automate routine DBA tasks (e.g., database refreshes, backups, and monitoring)
- Tuned and modified SQL for batch and online processes
- Monitored the cluster with monitoring tools and optimized the system based on job performance metrics
- Managed the cluster through performance tuning and enhancements
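As an illustration, the HDFS loading step above can be sketched as a small shell wrapper. All paths are hypothetical, and the script defaults to a dry run that only prints the commands:

```shell
#!/bin/sh
# Sketch: load files from the local Linux/UNIX file system into HDFS.
# Paths are illustrative; DRY_RUN=1 (the default) only prints the commands.
# Set DRY_RUN=0 on a real cluster to execute them.
set -eu

DRY_RUN="${DRY_RUN:-1}"

run() {
  echo "+ $*"
  if [ "$DRY_RUN" = "0" ]; then "$@"; fi
}

load_to_hdfs() {
  src="$1"; dest="$2"
  run hdfs dfs -mkdir -p "$dest"        # create the target directory
  run hdfs dfs -put -f "$src" "$dest"   # copy (overwriting) the local file
  run hdfs dfs -ls "$dest"              # verify the upload
}

load_to_hdfs /var/log/app/events.log /data/raw/events
```

The dry-run guard makes the same script safe to review, schedule, and test before pointing it at a live cluster.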
Confidential
Hadoop Administrator
Responsibilities:
- Expertise in HDFS architecture and cluster concepts
- Strong shell scripting and programming skills
- Solid understanding of Apache Hadoop
- Proficient in creating scalable databases
- Solid understanding of agile SDLC environments
- Deep knowledge of Pig, Hive, Sqoop, HBase, Flume, and Oozie
- Good experience in SQL performance tuning
- Installed and configured a multi-node, fully distributed Hadoop cluster
- Involved in installing Hadoop Ecosystem components.
- Responsible for managing data from multiple sources
- Responsible for implementation and ongoing administration of Hadoop infrastructure
- Maintenance of Clusters including installing and configuring new clusters or nodes
- Involved in Installation and configurations of patches and version upgrades
- Involved in Hadoop cluster administration, including adding and removing cluster nodes, capacity planning, performance tuning, monitoring, and troubleshooting.
- Supported MapReduce programs running on the cluster
- Involved in HDFS maintenance, administering it through the Hadoop Java API.
- Configured Fair Scheduler to provide service-level agreements for multiple users of a cluster.
- Maintained and monitored clusters; loaded data into the cluster from dynamically generated files using Flume and from relational database management systems using Sqoop.
- Responsible for Resource management and monitoring.
- Managed Hadoop cluster node connectivity and security.
- Used HBase as the data store
- Developed Sqoop scripts to enable interaction between Pig and the MySQL database.
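The Sqoop work above can be sketched as a shell wrapper around `sqoop import`. The JDBC URL, credentials file, and table names are hypothetical assumptions, and the script defaults to a dry run that only prints the command:

```shell
#!/bin/sh
# Sketch: Sqoop import from MySQL into HDFS for downstream Pig/Hive jobs.
# The JDBC URL, credentials, and table name are illustrative assumptions.
# DRY_RUN=1 (the default) only prints the command; set DRY_RUN=0 to execute.
set -eu

DRY_RUN="${DRY_RUN:-1}"

run() {
  echo "+ $*"
  if [ "$DRY_RUN" = "0" ]; then "$@"; fi
}

sqoop_import() {
  table="$1"; target="$2"
  run sqoop import \
    --connect jdbc:mysql://dbhost:3306/sales \
    --username etl_user \
    --password-file /user/etl/.dbpass \
    --table "$table" \
    --target-dir "$target" \
    --num-mappers 4   # parallel map tasks for the import
}

sqoop_import orders /data/raw/orders
```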
Confidential
Database Administrator
Responsibilities:
- Installation, creation, and maintenance of 20+ Oracle (10g, 11g) databases for the Confidential order management and fulfillment application
- Tuned databases at the instance, database, and server level by adjusting the SGA and other memory-related parameters.
- Implemented proactive measures in production databases, monitoring alert logs and user trace files to identify and resolve ORA- errors and maximize database availability.
- Administered all database objects, including schemas, tablespaces, data files, indexes, and procedures.
- Involved in Oracle upgrade/patch evaluation and installation to ensure database patch levels are up to date.
- Worked with System Administrators and Backup Administrators in configuration, maintenance and monitoring of RMAN backups.
- Performed logical backups of databases using the Export and Import utilities as well as Data Pump.
- Configuration & Maintenance of Oracle Data guard for data protection and disaster recovery.
- Verified data file consistency during backups, refreshes, and recovery
- Resolved database connectivity issues between the application and the database.
- Performed performance tuning for clients using ADDM and AWR, troubleshooting databases, developing PL/SQL queries, and tuning SQL.
- Maintained registers documenting important database activities, such as server downtime, using Oracle utilities.
- Refreshed database schemas with the Export/Import and Data Pump utilities
- Recognized by the employer as a key resource in the team and given a SPOT award for excellence in projects.
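The Data Pump backup and refresh work above can be sketched as a small wrapper around `expdp`. The connect string, schema, and directory object are illustrative assumptions, and the script defaults to a dry run that only prints the command:

```shell
#!/bin/sh
# Sketch: nightly logical backup of a schema with Oracle Data Pump (expdp).
# The connect string, schema, and DATA_PUMP_DIR directory object are
# illustrative assumptions. DRY_RUN=1 (the default) only prints the command.
set -eu

DRY_RUN="${DRY_RUN:-1}"

run() {
  echo "+ $*"
  if [ "$DRY_RUN" = "0" ]; then "$@"; fi
}

export_schema() {
  schema="$1"
  stamp="$(date +%Y%m%d)"
  run expdp system@ORCL \
    schemas="$schema" \
    directory=DATA_PUMP_DIR \
    dumpfile="${schema}_${stamp}.dmp" \
    logfile="${schema}_${stamp}.log"
}

export_schema hr
```

Date-stamped dump and log file names keep successive nightly exports from overwriting each other.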
Confidential
Application and Database Administrator
Responsibilities:
- Developed and automated build and code delta deployments using Ant, Perl, shell, and JSP scripts for the wBC sub-stream of BTW.
- Administered WebLogic and the database, along with application deployments, through UNIX.
- Maintained environments hosted on WebLogic domains and the Clarify CRM applications running on them.
- Maintained all database objects, including schemas, tablespaces, data files, indexes, and procedures.
- Configured the Anchor manager for the BTW LOB (Subversion CM tool).
- Involved in the development of Delta deployment tool and desktop deployment tool of all instances.
- Track the releases and ensure the code is dropped on agreed dates.
- Point of contact for all configuration issues related to BT Wholesale & Open Reach Module.
- Built the code and deployed it across the Dev, Testing, and Pre-prod environments.
- Analyzed and resolved critical WebLogic issues
- Worked with clients on upgrading Clarify from SR 13.108 to SR 6.0.1.44, which added functionality and reduced bugs in the application.
- Logical Backup of Databases using Export and Import Utility as well as using Data pump.
- Debugged complicated issues that would otherwise block required testing on the application.
- Provided support during critical timelines and deliveries.
- Received client appreciation for the support provided during critical deliveries, avoiding blockers.
- Performed the production Clarify upgrade with no application issues.
- Also developed an “Issue Repository” tool using JSP to store and track issues.
- Shared knowledge actively with team members and proactively mentored them.
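The automated build-and-deploy flow above can be sketched as a shell wrapper; the Ant targets, host names, and paths are assumptions (the real pipeline combined Ant, Perl, and shell), and the script defaults to a dry run that only prints each command:

```shell
#!/bin/sh
# Sketch: automated build and delta deployment across environments.
# Ant targets, hosts, and paths are illustrative assumptions.
# DRY_RUN=1 (the default) only prints the commands; set DRY_RUN=0 to execute.
set -eu

DRY_RUN="${DRY_RUN:-1}"

run() {
  echo "+ $*"
  if [ "$DRY_RUN" = "0" ]; then "$@"; fi
}

build_and_deploy() {
  env="$1"                               # dev, test, or preprod
  run ant -f build.xml clean package     # build the delta archive
  run scp dist/delta.jar "deploy@${env}-host:/opt/app/releases/"
  run ssh "deploy@${env}-host" /opt/app/bin/restart.sh
}

for env in dev test preprod; do
  build_and_deploy "$env"
done
```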