Sr. Oracle DBA Resume
Houston, TX
PROFESSIONAL SUMMARY:
- 15+ years of IT experience, including around 5 years of solid experience in Hadoop administration of Cloudera and HDP distributions in enterprise Linux environments, with knowledge of server and application security.
- Experience in Hadoop architecture: designing, building, and implementing end-to-end solutions.
- 10 years of extensive domain expertise in performance tuning, backup, cloning, remote replication, and disaster recovery solutions for Oracle 12c/11gR2/10g/9i/8i cluster database deployments.
- Expertise in Hadoop solution architecture, cluster builds, upgrades, administration, performance tuning, and benchmark testing on Cloudera and Hortonworks distributions.
- Expertise in Hadoop DR solutions: replication, switchover, and switchback.
- Improved Hadoop cluster performance by tuning disk I/O, networking, memory, reducer buffers, mapper tasks, JVM tasks, and HDFS through appropriate configuration parameters.
- Experience in monitoring cluster performance using Splunk and Cloudera Manager to ensure the availability, integrity, and confidentiality of applications.
- Experience in deploying and managing multi-node development, testing, and production Hadoop clusters with different Hadoop components (HBase, Hive, Pig, Sqoop, Oozie, Flume, ZooKeeper) using Cloudera Manager and Hortonworks Ambari.
- Expertise in HBase Administration and HBase Replication
- Experience in setup, configuration and management of security for Hadoop clusters using Kerberos and integration with LDAP/AD at an Enterprise level
- Experience in EDB Postgres Solution, Installing, Administration, DR Solution, Backup and Recovery
- Built proactive monitoring of all Splunk systems, forwarders, data velocity, and license utilization
- Working with testing teams to identify potential problems and their appropriate solutions.
- Investigated and escalated potential security incidents
- Completed several POCs in Hadoop cluster environments to monitor and improve performance across the cluster.
- Completed several POCs in a Pivotal Tomcat environment for monitoring, reporting on live data streams, and business analytics.
- Architected, designed, built, and implemented end-to-end proven/validated solutions using EMC SAN/NAS storage products/applications with Oracle 10g/11g technology along with VMware vSphere offerings.
- Experienced in planning and delivering Oracle solutions for mission-critical applications on the Linux platform and Confidential products.
- Sound knowledge of configuring and administering database performance benchmarking tools such as Benchmark Factory and Swingbench.
- Expertise in the most advanced features of Oracle 10g/11g, such as RAC, Data Guard, ASM, dNFS, and RMAN, including installation, configuration, and tuning of the database for optimal performance.
- Expertise in configuring and deploying features of VMware vSphere 4.0 and Oracle VM.
- Created validation test result documents, reference architecture guides, best practices guides, step-by-step applied technology guides, and videos for easy deployment of solutions.
- Experience in designing and optimizing the performance of production and development databases.
- Excellent analytical and troubleshooting skills
TECHNICAL SKILLS:
Hadoop / Big Data: Cloudera & Hortonworks with HDFS, MapReduce, HBase, Hue, Hive, Pig, Flume, Sqoop, Oozie, Zookeeper
Splunk: Splunk Enterprise Edition V 6.5.5
Database Versions: 10g, 11gR1, 11gR2 and 12cR1
Database Technologies: Oracle Clusterware, RAC, Automatic Storage Management (ASM), Data Guard, RAC One Node, RMAN
Operating Systems: RHEL 4/5/6/7, OEL 4/5/6/7
Scripting Language: Ansible, Shell script
Storage Technologies: MirrorView Sync and Async, SnapView Snapshot, SnapView Clone, SnapSure Checkpoint, SnapSure Writeable Checkpoint
Storage Arrays: EMC Clariion (CX-3 and CX-4 series) and EMC Celerra (NS-40) (NS-480) (NS-960)
Storage Testing Knowledge: Validation Testing for EMC Clariion and Celerra on Oracle Database 10g and 11g on both VMware Infrastructure and Physical Server.
Storage Application Suite: Navisphere Manager, Celerra Manager, Replication Manager.
Virtualization: VMware ESX 3.0/3.5 vSphere 4; Virtual Center 2.0/2.5/4; Oracle-VM
Benchmarking Tools: Benchmark factory
PROFESSIONAL EXPERIENCE:
Confidential
Responsibilities:
- Responsible for planning, installing, configuring and maintaining CDH Cluster.
- Design and Implement Hadoop Solutions, Upgrades, Performance tuning on Cloudera Distributions
- Responsible for production system capacity planning for future growths
- Installation of Hadoop Ecosystem products through Cloudera manager
- Upgrading minor and major Cloudera Hadoop versions
- Benchmarking Hadoop cluster to identify performance issues
- Administer and maintain cluster data, ensuring the availability, integrity, and confidentiality of applications
- Monitor Hadoop environment and ensure that Hadoop cluster is available for business continuity
- Ensure availability of all Hadoop services, including HDFS, Flume, HBase, Hive, Hue, Oozie, and ZooKeeper
- Perform planned and emergency failover from active environment to standby environment both for Hadoop and EDB Postgres as part of system high availability.
- Configure HBase replication; administer and manage HBase.
- Configure HBase replication between production and Disaster Recovery environment
- Responsible for HDFS Data Archive and restore across environments.
- Implement change requests according to application team requirements
- Enterprise DB Installation, Configuration, Administration, Replication, Backup and Recovery
- Installing and configuring EDB Postgres in multiple environments
- Configuring Replication between Production and Disaster Recovery site.
- Ensure proper switchover between master and standby databases on production
- Implement master-standby relation between production and standby environments
- Real-time data analytics using Splunk.
- Perform installation of Splunk on multiple environments
- Implement log rotation and enable real-time analysis of production data
- Apply patches in coordination with Vendor against the vulnerabilities identified by Cyber team.
- Responsible for availability of Non-Hadoop applications like Tomcat Servers, RabbitMQ
- Install and configure Tomcat Servers and deploy customer specific application code.
- Perform continuous tuning and apply best practices on application servers to meet the growing demand of data from the mobile devices
- Collaborate with cross functional teams like Cyber, Firewall, Network teams and ensure high availability of Tomcat servers
- Install and configure RabbitMQ messaging services and ensure all the incoming data is routed to appropriate client applications
- Manage RMQ channels, connections and mirroring
- Responsible for reporting production metrics
- Responsible for generating daily, weekly, and monthly production metrics that enable business teams to make informed decisions.
- Responsible for change management across multiple environments
- Identify the potential issues and prepare change requests
- Publish the changes and seek approvals in CAB meetings
- Schedule the changes and ensure the change life cycle is followed to successful implementation of the change.
- Involved in setup, configuration and management of security for Hadoop clusters using Kerberos and integration with LDAP/AD at an Enterprise level
- Managing and reviewing Hadoop log files and supporting MapReduce programs running on the cluster.
- Set up automated processes to analyze the System and Hadoop log files for predefined errors and send alerts to appropriate groups.
- Developed automated Unix shell scripts for running the HDFS Balancer and file system health checks.
- Working with data delivery teams to setup new Hadoop users. This job includes setting up Linux users, setting up Kerberos principals and testing HDFS, Hive, and MapReduce access for the new users.
- Participated in evaluation and selection of new technologies to support system efficiency.
- Worked with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
- Worked with the data center planning groups, assisting with network capacity and high availability requirements.
- Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
- Collaborating with application teams to install operating system and Hadoop updates, patches, version upgrades when required.
- Possess good Linux and Hadoop System Administration skills, networking, shell scripting and familiarity with open source configuration management and deployment tools such as Ansible.
Environment: Hadoop Cloudera distribution CDH 5.11.2, Red Hat Linux 7.x, Kerberos, LDAP/AD, Tomcat Server, and RMQ Server.
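The automated log-analysis process described above can be sketched as a small shell routine. The directory layout, error patterns, and alert action below are illustrative assumptions, not the original environment's configuration:

```shell
# Sketch of an automated scan of system/Hadoop logs for predefined errors.
# Patterns, paths, and the alert action are assumptions for illustration.
scan_logs() {
    # $1 = directory containing log files; prints an ALERT or OK summary line
    dir="$1"
    patterns="FATAL|ERROR|OutOfMemoryError|Connection refused"
    hits=$(grep -rhE "$patterns" "$dir" 2>/dev/null | wc -l | tr -d ' ')
    if [ "$hits" -gt 0 ]; then
        echo "ALERT: $hits suspicious log lines found under $dir"
        # A production version would mail the matching lines to the on-call group.
    else
        echo "OK: no predefined errors found under $dir"
    fi
}

# Example run against a sample log directory:
mkdir -p /tmp/hadoop-log-demo
echo "2019-04-01 10:00:01 ERROR datanode: Connection refused" > /tmp/hadoop-log-demo/dn.log
scan_logs /tmp/hadoop-log-demo
```

In practice a script like this would be scheduled via cron and its ALERT output routed to the appropriate support group.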
Confidential, Houston, TX
Responsibilities:
- Responsible for installing, configuring and monitoring Hadoop cluster on Cloudera distribution
- Worked closely with the Cloudera technical team on design and implementation, and completed a couple of POCs
- Extensively involved in Cluster Capacity planning, Hardware planning, Installation, Performance Tuning of the Hadoop Cluster.
- Responsible for Installing Splunk and Integrating with Hadoop cluster environments
- Configured Data lake which serves as a base layer to store and do analytics on data flowing from multiple sources into Hadoop Platform
- Integrated the Hadoop cluster with Kerberos authentication infrastructure: KDC server setup, creating the realm/domain, managing principals, generating a keytab file for each service, and managing keytabs using keytab tools.
- Monitored multiple Hadoop clusters environments using Splunk
- Completed a Splunk POC for ingesting business data logs into Splunk servers and created several business-related dashboards.
- Completed a POC for Splunk alerting, triggering on business-critical events found in the application logs.
- Completed a POC for detecting and reporting Linux system and network vulnerabilities using Splunk-generated metrics.
- Monitored workload, job performance and capacity planning using Splunk
- Involved in initiating and successfully completing a POC on Sqoop to prove reliability and ease of scalability over a traditional database
- Built Hadoop clusters using Cloudera and Hortonworks for POCs to choose the right distribution for the enterprise solution.
- Configure and setup multiple nodes using Ansible Playbooks.
- Building Virtual Linux servers for POC environments
- Worked with the infrastructure team and various groups on lease-rolling servers, and coordinated building new servers for setting up the POC environment.
- Aligned with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
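Multi-node setup with Ansible, as mentioned above, is typically expressed as a playbook. The sketch below is illustrative only; the host group, package names, and kernel setting are assumptions, not the original playbook:

```yaml
# Illustrative node-preparation playbook; hadoop_nodes, package list,
# and tuning values are assumptions for this sketch.
- hosts: hadoop_nodes
  become: yes
  tasks:
    - name: Ensure OS prerequisites are installed
      yum:
        name: [java-1.8.0-openjdk, ntp, wget]
        state: present

    - name: Lower swappiness, a common Hadoop tuning step
      sysctl:
        name: vm.swappiness
        value: '1'
        state: present

    - name: Ensure ntpd is running for cluster time sync
      service:
        name: ntpd
        state: started
        enabled: yes
```

Running such a playbook against the inventory group brings every new node to a consistent baseline before the Hadoop services are deployed.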
Confidential, Aurora, CO
Oracle SME
Responsibilities:
- Designed and implemented disaster recovery solutions for 28 application databases.
- Building RAC for Disaster Recovery Servers (Standby Server).
- Building Data Guard and establishing Replication.
- Completed around 28 POCs for RAC and Data Guard solutions on different platforms.
- Successfully completed around 32 cluster and database upgrades.
- Developed Monitoring Scripts for Data Guard.
- Automated special database administration activities such as Flashback Database, cloning, and snapshot databases, along with many day-to-day activities.
- Database refresh using Flashback Database and restore point technologies.
- Designed the DR drill implementation plan and conducted it successfully.
- Tuned slow-running queries using Profiler and statistics I/O, evaluating joins and indexes, updating statistics, and modifying code.
- Maintained compliance with Sarbanes-Oxley and PHI requirements for protecting patient information.
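DR drills of the kind described above are commonly driven through the Data Guard broker. A representative DGMGRL session might look like the following; the database name is hypothetical, and the exact commands depend on the broker version in use:

```
DGMGRL> show configuration
DGMGRL> validate database 'PROD_STBY'
DGMGRL> switchover to 'PROD_STBY'
DGMGRL> show configuration
```

The first `show configuration` confirms both databases report SUCCESS, `validate database` checks the standby is ready for a role change, and the final `show configuration` verifies the new roles and redo transport status after the switchover.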
Confidential, Minneapolis, MN
Oracle Solution Architect
Responsibilities:
- Worked on Database consolidation Project and successfully completed Database consolidation.
- Migrated 10g databases to 11g R2 Databases
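A 10g-to-11gR2 migration of this kind is often performed with Data Pump export/import. The command sketch below is illustrative; the schema name, directory object, and connect strings are hypothetical:

```
# On the 10g source: export the schema to a dump file
expdp system@src10g schemas=APPUSER directory=DP_DIR dumpfile=appuser.dmp logfile=exp.log

# On the 11gR2 target: import the schema from the same dump file
impdp system@tgt11g schemas=APPUSER directory=DP_DIR dumpfile=appuser.dmp logfile=imp.log
```

After the import, gathering fresh optimizer statistics on the migrated schema is a typical follow-up step before opening the database to the application.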
Confidential, San Francisco, CA
Sr. Oracle DBA
Responsibilities:
- Installation, setup, and configuration of Data Guard.
- Application Tuning by tuning queries, identifying problematic code, creation of indexes, re-writing queries using an optimized path, creation of materialized Views and gathering necessary statistics on objects etc.
- Implementing Database Security, Management of role-based security.
- Write and modify UNIX shell scripts to manage Oracle environment.
- Performed Data Migration as a part of enterprise data warehouse project.
- Extensive performance tuning, memory (SGA) tuning, and application tuning.
- Established database backup/recovery strategy for different databases.
- Applied complete and incomplete recovery procedures depending on the type of failure.
Confidential, Oklahoma City, OK
Sr. Oracle DBA
Responsibilities:
- Provided Primary architectural and administrative support for new projects.
- Installed 10g RAC and 11g RAC database
- Installed RAC on Oracle 10g and worked on ASM feature of Oracle.
- Performance Tuning i.e., Tuning RAC, tuning applications, shared pool, I/O distribution, rollback segments, buffer cache, redo mechanisms.
- Configured, administered and monitored Streams environment using various tools like OEM Console.
- Oracle Grid Control Management Server Installation.
- Performance tuning of Queries by maintaining Aggregates, Compression, partition, indexing and use of Hints, Stored outlines, Statistics for the same.
- Implemented backups using RMAN (Recovery Manager) with a media management layer.
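An RMAN backup routed through a media management layer typically takes the shape of a run block like the one below. The channel type and format string are illustrative; the actual values depend on the media manager in use:

```
RMAN> run {
  allocate channel t1 device type sbt;   # SBT channels go through the media manager
  backup database plus archivelog format 'db_%d_%U';
  delete noprompt obsolete;              # enforce the configured retention policy
  release channel t1;
}
```

The `sbt` device type hands the backup pieces to the media management layer, while `delete obsolete` keeps the backup catalog within the retention policy.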
Confidential
Responsibilities:
- Basic backup solution using RMAN on both Physical Booted and VMware Infrastructure.
- Advanced backup solution using SnapView Snapshot on both Physical Booted and VMware Infrastructure.
- Advanced backup solution using Celerra Checkpoint on both Physical Booted and VMware Infrastructure.
- Test/Dev solution using SnapView Clone on both Physical Booted and VMware Infrastructure.
- Test/Dev solution using Writeable Checkpoint on both Physical Booted and VMware Infrastructure.
- Basic protection solution using Data Guard on both Physical Booted and VMware Infrastructure.
- Advanced protection solution using MirrorView on both Physical Booted and VMware Infrastructure.
- Test/Dev solution using Confidential Replication Manager on Physical Booted Infrastructure.
- Installation of Oracle 11g R2 Grid Infrastructure on VMware vSphere4.0
- Oracle RAC: ON and OFF (converting RAC binaries to single-instance binaries)
- DRS Solution on VMware Infrastructure.
Confidential
Responsibilities:
- Monitoring of Production database server.
- Performed SQL tuning and Oracle Applications performance tuning.
- Monitored long-running programs, identified programs impacting the server, and fixed them.
- Discoverer reports Performance review.
- Performance review of Package, concurrent program and forms with developers of Oracle Applications.
- Tuned SQL code extracted from Oracle Applications concurrent programs and packages, as requested by developers through the tuning request process.
- Expertise in troubleshooting Oracle Applications performance issues.