- Contribute to an organization’s mission by applying my technical training and real-world experience in Systems Administration and Systems Engineering.
- Relationship- and team-building
- Decision-making and problem-solving
- Technically competent in Linux and Windows
- Strong verbal, written, and customer-service skills
- Strong aptitude for troubleshooting hardware, network, and application issues
HARDWARE and SOFTWARE:
- Cloudera Enterprise
- Cloudera Manager
- Hadoop (CDH3/4/5)
- Linux (Red Hat, CentOS)
- Lotus Notes
- Red Hat Satellite (5/6/7)
- Windows (server/desktop)
Senior Systems Engineer
Confidential, Washington, DC
- Design content for delivery in online environments and complete instructional design documentation to ensure quality and consistency.
- Test and implement emerging technologies in courses and learning experiences.
- Perform market research of big data platform solutions.
- Provide input into the procurement of hardware, software, and professional services for analytic environment.
- Physically install racks and servers for Hadoop environment.
- Install and configure RHEL 6.4 on all Hadoop nodes using a Kickstart file and Cobbler.
- Collaborate with information security team to ensure security scans are performed and remedial actions are completed.
- Install and configure master and slave Confidential nodes.
- Administer Hadoop environment via Linux CLI and Cloudera Manager.
- Manage Confidential configurations on production, analytic, and development Hadoop clusters.
- Build and maintain single-node pseudo-Hadoop clusters for end-users.
- Upgrade clusters from CDH4.x to CDH5.x.
- Transfer data between clusters.
- Configure Kerberos authentication for the Hadoop environment.
- Puppetize OS-level configurations.
- Update and patch RHEL systems using Red Hat Satellite Server.
- Maintain Windows-to-Linux UID settings and user access to Linux machines using Centrify.
- Create virtual machines (VMs) using vCenter 5.5.
- Increase virtual system resources (e.g., memory, disk space, cores) on the VM and at the operating-system level.
- Communicate with vendors for replacement hardware, major version upgrades, technical issues and best practice advice.
- Customize the Hadoop environment to ensure best functionality, with system security as a priority and scalability as a must.
- Provide analytics platform and ecosystem component training and person-to-person guidance to end-users.
- Conduct code reviews of Pig, Python, and various MapReduce code submitted by end-users.
- Provide input regarding code optimization, cluster balancing and configuration.
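The RHEL installs above were driven by a Kickstart file served through Cobbler. A minimal RHEL 6 Kickstart fragment in that style is sketched below; the Cobbler URL, partition sizes, and package list are all illustrative assumptions, not the actual file used.

```
# Hypothetical RHEL 6.4 Kickstart fragment served by Cobbler over PXE.
install
url --url=http://cobbler.example.com/cblr/links/rhel-6.4-x86_64
lang en_US.UTF-8
keyboard us
timezone --utc America/New_York
bootloader --location=mbr
clearpart --all --initlabel
part /boot --fstype=ext4 --size=500
part swap --size=8192
part / --fstype=ext4 --size=20480
# Hadoop data partition takes the remaining disk.
part /data --fstype=ext4 --grow --size=1
reboot
%packages
@core
ntp
%end
```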
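Cluster-to-cluster transfers on a Kerberos-secured CDH deployment typically pair `kinit` with `hadoop distcp`. A minimal Python sketch of assembling those commands follows; the NameNode hostnames, the 8020 RPC port, and the principal/keytab names are illustrative assumptions.

```python
# Sketch only: command builders for a Kerberized DistCp run.
# Hostnames, the 8020 RPC port, and keytab paths are assumptions.

def kinit_cmd(principal, keytab):
    # Obtain a Kerberos ticket non-interactively from a keytab
    # before invoking any Hadoop client command on a secured cluster.
    return ["kinit", "-kt", keytab, principal]

def distcp_cmd(src_nn, dst_nn, path, map_tasks=20):
    # DistCp copies HDFS data between clusters using parallel map
    # tasks; -m caps how many mappers run at once.
    return [
        "hadoop", "distcp", "-m", str(map_tasks),
        "hdfs://%s:8020%s" % (src_nn, path),
        "hdfs://%s:8020%s" % (dst_nn, path),
    ]

if __name__ == "__main__":
    # In practice these lists would be handed to subprocess.run().
    print(" ".join(kinit_cmd("hdfs@EXAMPLE.COM", "/etc/hdfs.keytab")))
    print(" ".join(distcp_cmd("nn-prod", "nn-analytic", "/data/raw")))
```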
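Much of the end-user code reviewed on a cluster like this follows the MapReduce pattern. As a reference point, here is word count in pure Python, the shape a Hadoop Streaming mapper/reducer would take minus the stdin/stdout plumbing; this is an illustrative sketch, not code from the role.

```python
from itertools import groupby

def mapper(line):
    # A Streaming mapper would print "word\t1" per token;
    # here we return the (key, value) pairs directly.
    return [(word.lower(), 1) for word in line.split()]

def reducer(word, counts):
    # Hadoop delivers all values for one key together; sum them.
    return (word, sum(counts))

def word_count(lines):
    # The shuffle/sort phase is modelled by sorting and
    # grouping the intermediate pairs on their key.
    pairs = sorted(p for line in lines for p in mapper(line))
    return [reducer(key, [v for _, v in grp])
            for key, grp in groupby(pairs, key=lambda kv: kv[0])]

if __name__ == "__main__":
    print(word_count(["the quick fox", "the lazy dog"]))
```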
Senior Systems Engineer
Confidential, Ft. Belvoir, VA
- Deployed, integrated, and configured custom Apache Hadoop software components, Open-Source Software (OSS), Government Open-Source Software (GOSS), Government off-the-shelf (GOTS), and Commercial off-the-shelf (COTS) software packages within a large Hadoop cloud-computing environment.
- Designed and developed systems for integrating in-house and third-party software packages into the Hadoop cloud-computing system.
- Conducted technical analysis of complex infrastructure and integration challenges, taking into account all technical, financial, programmatic, and operational constraints.
- Maintained a stable Hadoop cloud-computing environment and ensured all software components were integrated properly into the cloud-computing solution.
- Worked with the Systems Engineering, Development, and Test teams, as well as external organizations.
- Utilized Puppet to provision and manage infrastructure.
- Created and modified Puppet automation scripts to support deployments and upgrades.
- Integrated Hadoop ecosystem analytic products into the baseline cloud environment.
- Puppetized external products being integrated into the environment.
- Tested, debugged, and troubleshot Java programs and scripts for various functions.
- Modified existing Java code to add new features and fix bugs.
- Installed RPMs, updated repositories, and ensured all dependencies were resolved.
- Ran MapReduce jobs against large datasets in HDFS.
- Provided detailed written reports of problems, and developed and delivered verbal presentations of design approach, status, tools, techniques, processes, procedures, and methodologies.
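The Puppet work described above centered on manifests for deploying and upgrading Hadoop components. A hypothetical manifest in that style is sketched below; the class, package, and file names are illustrative assumptions, not the actual modules.

```puppet
# Hypothetical module: keep a DataNode installed, configured, running.
class hadoop::datanode {
  package { 'hadoop-hdfs-datanode':
    ensure => installed,
  }

  file { '/etc/hadoop/conf/hdfs-site.xml':
    ensure  => file,
    owner   => 'hdfs',
    group   => 'hadoop',
    mode    => '0644',
    source  => 'puppet:///modules/hadoop/hdfs-site.xml',
    require => Package['hadoop-hdfs-datanode'],
    # Restart the service whenever the config file changes.
    notify  => Service['hadoop-hdfs-datanode'],
  }

  service { 'hadoop-hdfs-datanode':
    ensure => running,
    enable => true,
  }
}
```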
Confidential, Ft. Meade, MD
- Supported Tier-2 and day-to-day administration of fielded production Hadoop systems.
- Deployed, configured and maintained Apache Hadoop clusters.
- Responded to and diagnosed incidents not resolved at Tier-1 level, 24x7.
- Restored HDFS service; proactively and reactively identified and resolved systemic problems.
- Performed hardware maintenance in conjunction with site, mission, or partner representative.
- Patched and upgraded Linux OS and applications on fielded systems.
- Facilitated emergency software and/or configuration changes to correct operational incidents with fielded systems.
- Escalated issues to and coordinated with Tier-3 personnel to return systems to full operational capability.
- Provided input on whether Apache Hadoop was delivering the intended results and how it could be improved.
- Provided technical direction to design and development teams, and monitored progress and productivity through the use of metrics.
- Provided detailed technical support for software development programs.
- Applied working knowledge of the software development lifecycle and of object-oriented design (OOD) and associated tools.
- Facilitated translation of operational requirements into system requirements.
- Integrated open-source and COTS products into system architectures.
- Maintained and modified existing Puppet scripts.
- Utilized Puppet to model the cloud infrastructure in code.
Confidential, Ft. Meade, MD
- Installed and configured new Linux servers in preparation for transition from physical to virtual infrastructure.
- Maintained and supported existing Solaris systems housing Oracle.
- Wrote and maintained documentation within ClearCase for day-to-day processes and procedures.
- Managed workload problem requests and change requests through ClearQuest.
- Created and managed virtual machines in Virtual Center.
- Analyzed and troubleshot system performance issues.
- Performed Solaris filesystem backups to Confidential tape.