Big Data DevOps Engineer Resume
High Point, NC
SUMMARY
- 10 years of experience in the Information Technology industry, providing companies with technical solutions to support business goals.
- Extensive background combining both systems integration and operations procedures, along with a keen understanding of customer service and requirements.
- Highly efficient at problem solving and process improvement; can adapt well to changing priorities while managing scope, time, and budget constraints.
TECHNICAL SKILLS
Operating Systems: RHEL, CentOS, Ubuntu, SUSE Linux
Hadoop Distributions: Cloudera, Pivotal HD, Hortonworks, IBM BigInsights
Web Servers: Apache2, Apache Tomcat, Jetty
Database: MySQL, PostgreSQL, Oracle
VoIP: Asterisk, FreePBX
Firewall: iptables, SonicWall
Virtualization: Docker, OpenVZ, KVM, Citrix XenServer
Public Cloud: Windows Azure, Rackspace Cloud, Amazon AWS, TATA InstaCompute
Private Cloud: CloudStack, Eucalyptus, OpenStack
Mail Server: Postfix, Sendmail, VMware Zimbra
Scripting: Bash, Python, PHP
Monitoring Tools: Ganglia, Nagios, Wireshark, tcpdump
Automation Tools: Puppet, Jenkins
Version Control: SVN, Git
BI Tools: Pentaho Server, Datameer, Informatica
Data Visualization: Spotfire, Tableau, D3.js
Other Services: LDAP, MIT Kerberos, Red Hat IPA
PROFESSIONAL EXPERIENCE
Confidential, High Point, NC
Big Data DevOps Engineer
Responsibilities
- Maintaining Hortonworks Data Platform (HDP) and Hortonworks DataFlow (HDF) environments.
- Implementing Hadoop security on the Hortonworks cluster (Kerberos, two-way SSL, Knox, Ranger).
- Hadoop performance tuning and optimization.
- SAML SSO integration with a CAS server (Knox SSO, Tableau, Zeppelin).
- Automating day-to-day operations (see the sketch after this list).
- Designing and implementing high availability in the Hadoop environment.
- Implementing data visualization tools (Pentaho, Tableau).
- Installing and configuring Hadoop ecosystem components (HDFS, YARN, MapReduce, Spark, Kafka, HBase, Hive, Oozie, ZooKeeper, etc.).
- Running benchmark tests, analyzing system bottlenecks, and preparing solutions to eliminate them.
- Active Directory integration for Hadoop.
- Capacity planning and screening of Hadoop job performance.
- Documenting installation steps, use cases, solutions, and recommendations.
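A minimal sketch of the kind of day-to-day automation referred to above: a Python check that parses `hdfs dfsadmin -report` output and flags high HDFS utilization or under-replicated blocks. The 80% alert threshold and the assumption that the `hdfs` CLI is on the PATH are illustrative, not a record of the production script.

```python
#!/usr/bin/env python
"""Illustrative HDFS health check for day-to-day automation (assumed thresholds)."""
import re
import subprocess
import sys

USED_PCT_THRESHOLD = 80.0  # assumed alert threshold


def main():
    # Cluster-wide summary; assumes the `hdfs` CLI is available on this node.
    report = subprocess.check_output(["hdfs", "dfsadmin", "-report"],
                                     universal_newlines=True)
    problems = []

    # The summary section contains a line such as "DFS Used%: 42.17%".
    used = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if used and float(used.group(1)) > USED_PCT_THRESHOLD:
        problems.append("HDFS usage at %s%%" % used.group(1))

    # It also reports "Under replicated blocks: <n>".
    under = re.search(r"Under replicated blocks:\s*(\d+)", report)
    if under and int(under.group(1)) > 0:
        problems.append("%s under-replicated blocks" % under.group(1))

    if problems:
        print("WARNING: " + "; ".join(problems))
        sys.exit(1)
    print("OK: HDFS healthy")


if __name__ == "__main__":
    main()
```

A check like this can be scheduled from cron or wrapped as a Nagios plugin, since the non-zero exit code doubles as an alert signal.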
Confidential
Hadoop Administrator
Responsibilities
- Designing and implementing Hadoop architecture.
- Implementing Hadoop security on the Hortonworks cluster (Kerberos, two-way SSL, Knox, Ranger).
- Hadoop deployment on the Microsoft Azure environment (Python, PowerShell, Bash scripting).
- Created a Spotfire dashboard to visualize HDFS utilization and cluster performance.
- Creating automation scripts for installing Hadoop with Ambari Blueprints (see the sketch after this list).
- Designing and implementing high availability in the Hadoop environment.
- Performance tuning and optimization.
- Installing Alteryx and creating workflows for data analytics.
- Data visualization with D3.js and JavaScript.
- Implementing data visualization tools (Spotfire, Tableau, Informatica, Datameer).
- Installing and configuring Hadoop ecosystem components (HDFS, YARN, MapReduce, Spark, Kafka, HBase, Hive, Oozie, ZooKeeper, etc.).
- Creating automation scripts using Java, shell, PowerShell, and Python.
- Running benchmark tests, analyzing system bottlenecks, and preparing solutions to eliminate them.
- Active Directory integration for Hadoop.
- Implementing Hadoop clusters on top of virtualized environments.
- Implementing NTLM single sign-on for Spotfire.
- Capacity planning and screening of Hadoop job performance.
- Hadoop infrastructure administration.
- Designing and implementing new technologies.
- Documenting installation steps, use cases, solutions, and recommendations.
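A minimal sketch of the Ambari Blueprints automation mentioned above: registering a blueprint and creating a cluster through the Ambari REST API from Python. The Ambari URL, the admin credentials, and the `blueprint.json` / `hostmapping.json` files are assumptions for illustration, not the actual deployment artifacts.

```python
#!/usr/bin/env python
"""Illustrative Ambari Blueprint deployment via the Ambari REST API."""
import json

import requests

AMBARI_URL = "http://ambari.example.com:8080/api/v1"  # assumed Ambari server
AUTH = ("admin", "admin")                              # assumed credentials
HEADERS = {"X-Requested-By": "ambari"}                 # header Ambari requires


def register_blueprint(name, blueprint_file):
    """Register a blueprint definition with the Ambari server."""
    with open(blueprint_file) as f:
        body = json.load(f)
    r = requests.post("%s/blueprints/%s" % (AMBARI_URL, name),
                      auth=AUTH, headers=HEADERS, json=body)
    r.raise_for_status()


def create_cluster(cluster_name, mapping_file):
    """Create a cluster from the blueprint using a host-group mapping."""
    with open(mapping_file) as f:
        body = json.load(f)
    r = requests.post("%s/clusters/%s" % (AMBARI_URL, cluster_name),
                      auth=AUTH, headers=HEADERS, json=body)
    r.raise_for_status()
    # The response contains a request href that can be polled for progress.
    return r.json()


if __name__ == "__main__":
    register_blueprint("hdp-blueprint", "blueprint.json")
    print(create_cluster("hdp-cluster", "hostmapping.json"))
```

The same two calls can be driven from Jenkins or a shell wrapper, which keeps cluster builds repeatable across environments.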
Confidential
Hadoop Administrator
Responsibilities
- Hadoop infrastructure design and implementation.
- Installing and configuring Hortonworks Hadoop clusters and ecosystem components (HDFS, YARN, MapReduce, Spark, Kafka, HBase, Hive, Oozie, ZooKeeper, etc.).
- Monitoring cluster status using Ganglia, Nagios, and custom monitoring scripts.
- Monitoring HDFS integrity.
- MIT Kerberos and Apache Knox security implementation.
- Hadoop performance tuning based on system resource utilization.
- Running benchmark tests with different Hadoop configuration parameter values to arrive at an optimized system configuration (see the sketch after this list).
- Implementing Hadoop security.
- Automation script development for cluster installation and configuration.
- Performed the role of Hadoop administrator on the TCS Big Data team, which involved installing and configuring different Hadoop distributions (Cloudera, Apache, Pivotal, IBM BigInsights, Hortonworks), writing automation scripts, and handling common maintenance activities.
- Implementing Hadoop high availability.
- Capacity planning and screening of Hadoop job performance.
- Active Directory integration for Hadoop.
- Experience in implementing data visualization tools (Spotfire, Tableau, Informatica, Datameer).
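One way to run the benchmark sweeps described above is to time TeraGen/TeraSort while varying a single configuration parameter per run. The sketch below assumes the HDP-style path to the MapReduce examples jar, an illustrative data size, and the reducer counts being tested; the real runs were sized to the cluster.

```python
#!/usr/bin/env python
"""Illustrative TeraSort benchmark sweep over candidate reducer counts."""
import subprocess
import time

# Jar location differs by distribution; this is the HDP-style path (assumed).
EXAMPLES_JAR = "/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar"
ROWS = 10000000  # assumed TeraGen size (~1 GB)


def timed(cmd):
    """Run a command and return its wall-clock duration in seconds."""
    start = time.time()
    subprocess.check_call(cmd)
    return time.time() - start


def benchmark(reducers):
    """Regenerate input data, then time TeraSort with the given reducer count."""
    in_dir, out_dir = "/benchmarks/teragen", "/benchmarks/terasort"
    # Clean up previous runs; ignore the error if the paths do not exist yet.
    subprocess.call(["hdfs", "dfs", "-rm", "-r", "-skipTrash", in_dir, out_dir])
    timed(["hadoop", "jar", EXAMPLES_JAR, "teragen", str(ROWS), in_dir])
    return timed(["hadoop", "jar", EXAMPLES_JAR, "terasort",
                  "-D", "mapreduce.job.reduces=%d" % reducers, in_dir, out_dir])


if __name__ == "__main__":
    for reducers in (4, 8, 16, 32):
        print("reducers=%-3d terasort=%.1fs" % (reducers, benchmark(reducers)))
```

Comparing the timings across runs shows which setting actually helps before it is promoted into the cluster configuration.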
Confidential
Linux System Administrator
Responsibilities
- Linux and cloud system administration, maintaining fault tolerance.
- Maintain a high level of stability and availability for all supported systems.
- Implementation of CloudStack and Eucalyptus private cloud environments.
- Managing Rackspace Cloud and Amazon cloud environments.
- Managing XenServer and OpenVZ virtualization environments.
- Creating and maintaining up-to-date documentation for all supported systems to provide insight and allow continuous improvement.
- Creating production and testing environments.
- Shell and Python scripting.
- Server monitoring.
- Server installation and L3 support for clients.
- Implementation of a Samba/OpenLDAP domain controller.
- Implementation of an Asterisk VoIP system.
- Implementation of a Zimbra mail server.
- Data backup (see the sketch after this list).
- Administration and management of servers in the data center (Government of Kerala).
- Installing and managing Apache and Tomcat servers.
- Installing and managing Windows Server 2003, IIS, and FTP servers.
- Installation and maintenance of Linux servers.
- Installation and maintenance of Windows 2000, 2003, and 2008 servers.
- Remote administration using Terminal Services, VNC, and pcAnywhere.
- Installing and servicing PCs and servers at the Secretariat (Government of Kerala).
- Administration and management of a Windows/Linux-based network of over 3,500 desktops.
- Monitoring all network-related issues on Linux and Windows servers.
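A minimal sketch of the kind of backup job run on these servers: archive a set of directories to a backup mount and prune archives past a retention window. The source paths, the `/backup` destination, and the 14-day retention are illustrative assumptions.

```python
#!/usr/bin/env python
"""Illustrative nightly backup with simple retention-based pruning."""
import os
import subprocess
import time

SOURCES = ["/etc", "/var/www"]  # assumed directories to protect
DEST = "/backup"                # assumed backup mount point
RETENTION_DAYS = 14             # assumed retention window


def make_archive(source):
    """Create a timestamped gzipped tar archive of one source directory."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    base = source.strip("/").replace("/", "_")
    name = os.path.join(DEST, "%s-%s.tar.gz" % (base, stamp))
    subprocess.check_call(["tar", "-czf", name, source])
    return name


def prune_old_archives():
    """Delete archives older than the retention window."""
    cutoff = time.time() - RETENTION_DAYS * 86400
    for entry in os.listdir(DEST):
        path = os.path.join(DEST, entry)
        if entry.endswith(".tar.gz") and os.path.getmtime(path) < cutoff:
            os.remove(path)


if __name__ == "__main__":
    for src in SOURCES:
        print("archived %s" % make_archive(src))
    prune_old_archives()
```

Scheduled from cron, a job like this keeps a rolling window of restorable archives without manual cleanup.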