Hadoop Admin Resume
Washington, DC
SUMMARY:
Over 12 years of IT experience as a Cloudera Hadoop Admin and Oracle Developer/DBA across Production, Development, and Testing environments, with hands-on experience in Hadoop/Big Data, Java, Python, and migration to AWS. Also developed ETL jobs and performed automation and manual/backend testing. Maintained the Nexidia Audio application, which runs on MS SQL Server.
SKILL:
- Talend Open Studio
- Oracle 10g and 11g/12c Database Administration
- Oracle RAC & ASM Administration/Installation
- Performance Tuning
- Backup and Recovery
- Primary/Standby/Failover Administration (Data Guard)
- Data Modeling
- PL/SQL
- Disaster Recovery
- EC2
- RDS
- VPC
- Redshift
- S3
- CloudFront
- MapReduce
- Pig
- Hive
- Sqoop
- Oozie
- Flume
- Big Data Infrastructure
- HBase
- Spark
- Python
PROFESSIONAL EXPERIENCE:
Confidential, Washington, DC
Hadoop Admin
Responsibilities:
- Involved in all phases of the Software Development Life Cycle (SDLC) and worked on all activities related to the development, implementation, administration, and support of Hadoop.
- Installed and configured Apache Hadoop clusters for application development, along with Hadoop tools such as Hive, Pig, HBase, ZooKeeper, and Sqoop.
- Implemented multiple MapReduce jobs in Java for data cleansing and pre-processing.
- Worked with the team to expand the cluster; additional DataNodes were configured through the Hadoop commissioning process.
- Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, and managing and reviewing data backups and log files.
- Worked with systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
- Managed and scheduled jobs on the Hadoop cluster.
- Involved in defining job flows, managing and reviewing log files.
- Collected log data from web servers and integrated it into HDFS using Flume.
- Involved in HDFS maintenance and administered it through the Hadoop Java API.
- Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
- Experience in managing and reviewing Hadoop log files.
- Worked on setting up and installing Kerberos.
- Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
- Analyzed large data sets to determine the optimal way to aggregate and report on them.
- Supported in setting up QA environment and updating configurations for implementing scripts with Pig and Sqoop.
- Worked on tuning the performance of Pig queries.
Environment: Hadoop, MapReduce, Spark, Kafka, HDFS, ZooKeeper, Hive, Pig, Core Java, Eclipse, HBase, Sqoop, Flume, Oracle 11g, Knox, SQL, SharePoint, UNIX Shell Scripting.
Confidential, Washington, DC
Data Analyst
Responsibilities:
- Installed Oracle in a CentOS environment
- Applied Oracle patches in a CentOS environment
- Lead role in Oracle DBA administration
- Imported and exported Oracle dump files
- Nexidia Audio Ingestion and metadata configuration
- Developed PL/SQL to convert BLOB data into documents of the appropriate file type
- Developed Python code to ingest and extract data from an Oracle database
- Developed a Java program to read data from multiple files and publish a report
- Developed ETL jobs to merge thousands of files into a single file and process that file as input for another system.
Confidential
Hadoop Admin
Responsibilities:
- Experience in installing, configuring, supporting, and managing Hadoop clusters using the Cloudera distribution.
- Experience in cloud environments such as Amazon AWS and its services, including EC2 and S3.
- Managed 100 GB of daily-growing data (data in motion) from endpoints such as applications, logs, sensors, and RDBMS sources, with capacity planning.
- Loaded data at rest from tape drives, NAS, SAN, and S3.
- Worked with the data integration, data ingestion, and data pipelining team using StreamSets.
- Ingested RDBMS data onto the cluster in a secured environment, in compliance with governance principles such as authentication, authorization, and accounting, using Kerberos, data encryption, JVM encryption, Sentry, ACLs, and Active Directory.
- Worked in an HA environment for HDFS and YARN with concurrent jobs running.
- Set up, installed, configured, maintained, and monitored HDFS, YARN, Sqoop, Hive, and Hue.
- Capacity and Cluster planning for Hadoop cluster.
- Setting up, installing, and configuring Hadoop Security using Kerberos.
- Installing and configuring Sentry for Authorization.
- Enabled High Availability for the NameNode and ResourceManager.
- Commissioning and Decommissioning the nodes in the Hadoop cluster.
- Experience in Performance tuning, cluster monitoring and troubleshooting.
- Answering service requests from users through ticketing tool.
- Configured policies for trash, replication, metadata backup and snapshots, data retention, storage, and user quotas; configured data compression (e.g., Snappy); rebalanced data on the cluster.
- Performed backups, snapshots, and recovery from node failures.
- Importing / Exporting data to/from RDBMS.
- Loading data from S3 to HDFS and vice versa.
- Performing Distributed copy between clusters.
- Set job priorities on request from particular teams.
- Identified and killed stuck, non-progressing jobs to maintain cluster performance.
- Troubleshot, diagnosed, and tuned the cluster, reporting to the senior administrator.
- Set up alerts for services in the Hadoop ecosystem.
- Generated reports on running nodes using various benchmarking operations.
- Provided support in creating a POC for the Hadoop deployment decision.
- Communicated all issues and participated in weekly strategy meetings.
Confidential, Chevy Chase, MD
Senior IT Specialist
Responsibilities:
- Played a lead role in NOC data services management, multi-node cluster management, and distributed data center management
- Designed, implemented, and participated in the disaster recovery process and database backups on Solaris.
- Responsible for capacity planning, space management, problem resolution, database monitoring and performance tuning of Production Databases.
- Installed, upgraded, and configured Oracle (8i, 9i, 10g).
- Upgraded Oracle servers to 8i, 9i, and 10g.
- Used Oracle9i ETL, which supplies robust ETL infrastructure directly in the database.
- Storage management: managed space, tablespaces, segments and extents, rollback segments, and the data dictionary.
- Played the key DBA role in conversion of data from legacy systems to Oracle database.
- Supported and troubleshot applications running on WebLogic and OLTP systems.
- Used Transparent Application Failover (TAF) to configure type and method of failover for each Oracle Net client.
- Synchronized Transparent Application Failover (TAF) with Oracle 11g RAC.
Confidential, Washington, DC
Senior IT Specialist
Responsibilities:
- Installed and configured Recovery Manager.
- Completed migration of Oracle databases from IBM AIX machines to Sun SPARC servers as well as to Linux.
- Expertise in scheduling jobs with Control-M
- Security: user management, privileges, roles, auditing, profiling, and authentication.
- Backup and restore: online and offline backup planning and procedures, exp/imp, and restore processes.
- Used Oracle Internet Directory to leverage the scalability, high availability, and security features of the Oracle database
- Performed extensive performance tuning and memory (SGA) tuning.
- Configured Oracle Data Guard and Oracle Internet Directory (OID).
- Worked with BLOBs and CLOBs, materialized views with query rewrite, and bitmapped indexes.
- Applied patches on both the server side and the client side.
- Cloned databases.
- Set up new environments (Development, Test, and Production)
- Set up advanced replication.
- Took hot and cold backups, and exports.
- Installed and set up Oracle Enterprise Manager and Enterprise Backup Utility.
Confidential, Bethesda, MD
Senior Automation Tester
Responsibilities:
- Involved with the NESS application to automate the whole process.
- Involved with backend testing
- Involved with Automation Framework
- Involved with Manual Testing
Confidential, Cleveland
Oracle Backend Tester
Responsibilities:
- Conducted Manual testing, Test Validation. Supported various QA Test Cycles and Production Cycles independently.
- Extensively developed SQL queries and performed backend testing for web/Internet applications.
- Received HIPAA training.
- Used TestDirector for maintaining Defect Cycle (creation, reporting, fixing, retesting and closing).
Environment: Windows XP Professional, SQL, Oracle, TestDirector
Confidential, Washington, DC
Software Quality Analyst
Responsibilities:
- Tested the application manually and through automation with QTP
- Executed and scheduled test cases using TestDirector.
- Created parameterization, checkpoints, and data-driven tests using QTP.
- Performed smoke, GUI, functional, and regression testing of the application.
- Generated test-metrics reports for each build for upper management.
- Performed manual testing of the entire application and reported defects to developers through TestDirector
- Developed additional automation scripts using QTP and managed defects using TestDirector
Environment: Windows NT, Zope Server, Plone, HTML, XML, TestDirector and QTP