Database Administrator (Hadoop) Resume
Irvine, CA
PROFESSIONAL SUMMARY:
- Seeking a challenging Hadoop Administration / Systems Administration / Electrical Engineering / Kronos Business Analyst career with a progressive, results-oriented organization that offers ample opportunity to prove, improve, and grow professionally, in a challenging and rewarding technical environment where I can expand my existing skill base while contributing to a lively team.
- Experience in installing, configuring, and administering Hadoop clusters for major Hadoop distributions such as Cloudera CDH 5.7, CDH 5.9, and CDH 5.10
- Experience in setting up automated monitoring and escalation infrastructure for Hadoop clusters using Ganglia and Nagios
- Experience with Hadoop infrastructure, including MapReduce, Hive, Oozie, Sqoop, HBase, Pig, HDFS, YARN, HUE, Spark, Kafka, and the Key-Value Store Indexer, in direct client-facing roles
- Strong experience in Linux/UNIX administration, with expertise in Red Hat Enterprise Linux 4, 5, and 6; familiar with Solaris 9 and 10 and IBM AIX 6
- Excellent knowledge of NoSQL databases such as HBase and Cassandra.
- Experience in monitoring and troubleshooting issues with Linux memory, CPU, OS, storage and network
- Strong knowledge of Hadoop platforms and other distributed data processing platforms
- Worked with business users to extract clear requirements to create business value
- Investigated new technologies such as Spark to keep pace with industry developments.
- Exceptionally well organized; demonstrates self-motivation, continuous learning, creativity, and initiative; extremely dedicated, with a proven ability to pick up new technologies within a short span of time
- Manage support for Analytics, Attendance, Advanced Scheduling, time clocks, day-to-day operations, and pay issues.
- Heavy involvement in configuration & customizations, including WIM, Navigator processes, Attestation, Time-Off Requests, Delegation, Advanced Scheduling, SDM (Record Manager), Accruals, Cutover Plans, System documentation.
- Strong experience with Splunk configuration files and RegEx; comfortable using the Linux CLI and Windows; experienced with Splunk real-time processing architecture, deployment, and dashboard design
- Experience with SOAP, REST APIs, web-based technologies, and scripting languages, including JavaScript, Python, Perl, shell scripting, XML, and HTML.
TECHNICAL SKILLS:
Big Data Technologies: HDFS, Hive, MapReduce, Cassandra, Pig, HCatalog, Phoenix, Falcon, Sqoop, Flume, ZooKeeper, Mahout, Oozie, Avro, HBase, Storm, CDH 5.3, CDH 5.5
Monitoring Tools: Cloudera Manager, Ambari, Nagios, Ganglia
Scripting Languages: Shell scripting, Puppet, Python, Bash, CSH.
Programming Languages: C, Python, SQL, and PL/SQL.
Front End Technologies: HTML, XHTML, XML.
Application Servers: Apache Tomcat, WebLogic Server, WebSphere
Databases: Oracle 11g, MySQL, MS SQL Server, IBM DB2.
NoSQL Databases: HBase, Cassandra, MongoDB
Operating Systems: Linux, UNIX, Mac OS, Windows NT/98/2000/XP/Vista, Windows 7, Windows 8.
Networks: HTTP, HTTPS, FTP, UDP, TCP/IP, SNMP, SMTP.
Security: Kerberos, Ranger.
Kronos Skills: Labor Levels, Labor Accounts, Pay Codes, Work Rules, Holiday & Overtime, Access Profiles, Display Profiles, Accruals
Workforce Management Tools: Kronos (Workforce Central 6.7 and 8.0)
Kronos Version Navigator: Workforce Timekeeper, Leave, Activities, Mobile and Tablets, Advanced Reports, Advanced Schedule, Process Designer, Integrated Manager
WORK EXPERIENCE:
Database Administrator (Hadoop)
Confidential, Irvine, CA
Responsibilities:
- Installed and configured Hadoop clusters and ecosystem components such as Spark, Hive, Scala, YARN, MapReduce, and HBase.
- Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration (see the decommissioning sketch after this list).
- Collaborated with the infrastructure, network, database, application and BI teams to ensure data quality and availability.
- Installed security components such as Kerberos authentication
- Supported Cloudera Hadoop infrastructure and other services in its ecosystem
- Provided subject matter expertise in the management of Cloudera Hadoop infrastructure
- Maintained and provided day-to-day administration of Cloudera Hadoop infrastructure
- Helped with ongoing Splunk workload and LASR migration
- Managed regular patching and upgrades of CDH
- Experience managing Microsoft Azure cloud-based and on-premise Cloudera clusters
- Used the Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive (a spark-submit sketch follows this list).
- Developed Spark scripts using Scala shell commands as required.
- Excellent command of implementing high availability, backup and recovery, and disaster recovery procedures.
- Currently handling four on-premise (CDH 5.13.3) and four Microsoft Azure (CDH 5.15) Cloudera clusters
- Worked closely with the Informatica BDM and AutoSys teams
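
A minimal sketch of the DataNode decommissioning flow referenced above, assuming a plain HDFS setup where dfs.hosts.exclude in hdfs-site.xml points at an exclude file (on CDH clusters, Cloudera Manager drives these same steps from its UI); the hostname and file path are placeholders:

```bash
# Add the node being retired to the HDFS exclude file
# (path is an assumption; it must match dfs.hosts.exclude in hdfs-site.xml)
echo "datanode05.example.com" >> /etc/hadoop/conf/dfs.exclude

# Tell the NameNode to re-read its include/exclude lists; the node moves to
# "Decommission In Progress" while its blocks re-replicate elsewhere
hdfs dfsadmin -refreshNodes

# Watch the node's status until it reports "Decommissioned"
hdfs dfsadmin -report | grep -A 2 "datanode05"
```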
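
And a hedged sketch of submitting a Spark analytics job over YARN, as in the bullets above; the script name, queue, and resource sizes are illustrative assumptions, not values from this cluster:

```bash
# Submit a Spark job to YARN in cluster mode; analytics.py and the
# queue name are hypothetical placeholders
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --queue analytics \
  --num-executors 8 \
  --executor-memory 4g \
  analytics.py
```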
Business Analyst
Confidential, Glendale, CA
Responsibilities:
- Experience with various Kronos modules such as Workforce Timekeeper, Accruals, Activities, Leave, Attendance, Advanced Scheduler, and Forecaster.
- Managed support for Analytics, Attendance, Advanced Scheduling, time clocks, day-to-day operations, and pay issues.
- Module experience includes Timekeeper, Accruals, Attendance, Leave, and Connect.
- Worked on the environment configurations necessary to successfully develop, deploy, and maintain Kronos interfaces integrating HRMS and clocks such as the Kronos 4500.
- Installed, supported, and configured Kronos applications, including Workforce Central Scheduler for Healthcare, Optimized Scheduler, and Workforce Central Timekeeper.
- Configured new functionality within Kronos to support changing business needs, such as pay rules, customer reporting, attendance rules, and automated alerts.
- Experienced in analysis, business requirement gathering, planning, and deployment of solutions in support of corporate strategies and business initiatives. Adept at creating and maintaining Kronos pay rule configurations, interpreting timecard calculations, and basic scheduling.
- Worked with Kronos Workforce Central 7.x, Kronos Workforce Central 6.x, and Kronos Connect 6.x.
- Tracked employees' attendance and schedules, as well as their benefit time off, through the Kronos timekeeping system.
- Advanced understanding of and experience with Kronos WFC 6.0-8.0 products, including implementation and configuration of Workforce Timekeeper, Workforce Connect 6.0, Advanced Scheduler, Leave & Attendance, Accruals, Analytics, and Device Manager (4500 terminals); experienced in Kronos upgrades from 6.2 to 6.3, 6.3 to 7.0, and 7.0 to 8.0.
Hadoop Administrator
Confidential, Sunnyvale, CA
Responsibilities:
- Cluster maintenance, including adding and removing cluster nodes, cluster monitoring, and troubleshooting
- Managed and reviewed data backups and Hadoop log files on Cloudera clusters
- Architected Hadoop clusters with Cloudera CDH 5.7 and CDH 5.9; built UAT and production Cloudera clusters on CDH 5.9; commissioned and decommissioned DataNodes in the cluster when problems arose.
- Debugged and resolved major Cloudera Manager issues by working with Cloudera's support team; continuously monitored and managed the Hadoop cluster through Ganglia and Nagios; gave presentations to teams and managers on new ecosystem components to be implemented in the cluster.
- Helped users with production deployments throughout the process; resolved user-submitted tickets and P2/P3 issues, troubleshooting, documenting, and resolving errors; onboarded new users to the Hadoop cluster (creating home directories and providing access to datasets).
- Installed the Oozie workflow engine to run multiple Hive and Pig jobs, which run independently based on time and data availability (see the submission sketch after this list).
- Performed major and minor upgrades to the Hadoop cluster; monitored the cluster through Cloudera Manager and implemented alerts based on error messages.
- Provided reports to management on cluster usage metrics; benchmarked and stress-tested the Hadoop cluster with TeraGen, TeraSort, TeraValidate, and TestDFSIO (see the benchmarking sketch after this list)
- Experience with Splunk real-time processing architecture and deployment
- Comfortable using the Linux CLI and Windows
- Experience with Splunk configuration files (see the btool sketch after this list)
- Experience with Python, HTML, and XML
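
A minimal sketch of submitting an Oozie workflow from the CLI, as referenced above; the Oozie server URL, properties file, and job ID are placeholder assumptions:

```bash
# Submit and start a workflow; job.properties points at the workflow.xml
# in HDFS (host, port, and file names are assumptions)
oozie job -oozie http://oozie-host.example.com:11000/oozie \
  -config job.properties -run

# Check the status of a running workflow by its job ID
oozie job -oozie http://oozie-host.example.com:11000/oozie \
  -info 0000001-200101000000001-oozie-oozi-W
```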
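
A hedged sketch of the TeraSort-family benchmarking mentioned above; jar locations vary by distribution, and the paths below assume a CDH parcel install:

```bash
# Jar paths are assumptions for a CDH parcel layout
EXAMPLES_JAR=/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar

# Generate 100 million 100-byte rows (~10 GB), sort them, then validate the output
hadoop jar "$EXAMPLES_JAR" teragen 100000000 /benchmarks/teragen
hadoop jar "$EXAMPLES_JAR" terasort /benchmarks/teragen /benchmarks/terasort
hadoop jar "$EXAMPLES_JAR" teravalidate /benchmarks/terasort /benchmarks/teravalidate

# TestDFSIO write test: 10 files of ~1 GB each (fileSize is in MB here)
TEST_JAR=/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-tests.jar
hadoop jar "$TEST_JAR" TestDFSIO -write -nrFiles 10 -fileSize 1000
```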
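
And a small sketch of inspecting merged Splunk configuration files with btool, per the Splunk bullets above; $SPLUNK_HOME is the usual install root:

```bash
# Show the effective, merged inputs.conf across all apps; --debug
# annotates which file each setting came from
$SPLUNK_HOME/bin/splunk btool inputs list --debug

# Sanity-check all .conf files for invalid stanzas and settings
$SPLUNK_HOME/bin/splunk btool check
```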
Python Developer
Confidential, San Jose, CA
Responsibilities:
- Used the Django framework to develop web applications implementing the model-view-controller architecture.
- Used Django configuration to manage URLs and application parameters; designed and developed a data management system using MySQL; developed frontend and backend modules using Python on the Django web framework (see the project-setup sketch after this list).
- Wrote SQL queries, stored procedures, triggers, and functions in MySQL databases; worked with various Python integrated development environments, including NetBeans, PyCharm, PyStudio, PyDev, and Sublime Text; used MySQL as the backend database, with Python's MySQLdb as the database connector to interact with the MySQL server.
- Used Python and Django for creating graphics, XML processing of documents, data exchange, and business logic implementation between servers.
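
A minimal sketch of the Django project workflow described above, assuming Django and a MySQL driver are already installed; the project and app names are hypothetical:

```bash
# Scaffold a project and an app (names are placeholders)
django-admin startproject datamgmt
cd datamgmt
python manage.py startapp inventory

# After pointing DATABASES in datamgmt/settings.py at MySQL
# (ENGINE django.db.backends.mysql) and wiring routes in urls.py,
# create the schema and start the development server
python manage.py migrate
python manage.py runserver 0.0.0.0:8000
```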