
Devops Engineer/ Hadoop Admin Resume


Alpharetta, GA

SUMMARY:

  • IT professional in Development Operations (DevOps), end-to-end code configuration, Build & Release Management, and Hadoop administration.
  • Extensive experience as a DevOps engineer.
  • Extensive experience with infrastructure management tools and data center automation.
  • Strong Knowledge on Amazon Web Services (AWS) administration
  • Experience in Linux and Windows systems
  • Experience in administering and automation using Puppet, Ansible and Chef
  • Experience in administering Subversion (SVN), GIT, CVS
  • Experience in setting up the Ant and Maven build scripts for JAVA and J2EE applications
  • Strong knowledge of CI tools (Jenkins, Bamboo and Build Forge) and Release Management for automated builds and deployments
  • Experience in maintenance of uDeploy servers.
  • Experience in installation, configuration and deployment of Big Data solutions.
  • Experience in installation, configuration and management of Hadoop Clusters.
  • Good understanding of NoSQL databases such as HBase and Cassandra.
  • Experience in analyzing data on HDFS through MapReduce, Hive and Pig
  • Extensive experience with big data ETL and query tools such as Pig Latin and HiveQL
  • Experience in setting up workflows and scheduling the workflows using Oozie
  • Experience in integrating unit tests and code quality analysis tools such as JUnit, Cobertura, Clover, PMD, FindBugs and Checkstyle
  • Experience writing Ruby, Python, Shell (Bash), Perl and Batch/PowerShell scripts to automate the deployments
  • Managed environments DEV, INT, QA, UAT and PROD
  • Experience in Installing Firmware Upgrades, kernel patches, systems configuration, performance tuning on UNIX/Linux systems.
  • Installed and configured Nagios, Splunk and AppDynamics (AppD)
  • Vast knowledge of utilizing cloud technologies including Amazon Web Services (AWS), Microsoft Azure and Pivotal Cloud Foundry (PCF)
  • Experience in SDLC, Agile and Scrum Methodologies

PROFESSIONAL EXPERIENCE:

Confidential, Alpharetta, GA

DevOps Engineer/ Hadoop Admin

Responsibilities:

  • Designing an automated deployment process using a release management tool.
  • Monitoring cluster instances by application, artifact and host, and maintaining server and application configurations.
  • Created build and release definitions to deploy application MSI packages to target servers.
  • Created release templates for different applications to deploy them to target servers using the release management tool.
  • Developed a PowerShell framework to automate data-fix deployments (UPDATE, INSERT, SELECT and DELETE statements).
  • Setting up workflows and scheduling the workflows using Oozie.
  • Interacting with business analysts and developers to analyze user requirements, functional specifications and system specifications.
  • Developed automation framework for Application Deployments to the cloud environments.
  • Worked on Managing the Private Cloud Environment using Chef.
  • Performed Branching, Tagging, Release Activities on Version Control Tools: SVN, GIT.
  • Developed Perl and shell scripts for automation of the build and release process, developed Custom Scripts to monitor repositories, Server storage.
  • Used Maven as build tool on Java projects for the development of build artifacts on the source code.
  • Installing and configuring Splunk instances on Windows, Linux and cloud hosts.
  • Using the Splunk monitoring console to check environment health, logs, etc.
  • Setting up AppDynamics machine agents and app agents in Windows and Linux environments.
  • Implemented the CI/CD with Microsoft Azure.
  • Strong experience using Ambari to administer large Hadoop clusters (100+ nodes)
  • Installed/Configured/Maintained Apache Hadoop clusters for application development and Hadoop tools like Hive, Pig, HBase, Zookeeper, Cassandra and Sqoop.
  • Experience in scheduling jobs in Talend.
  • Handled importing of data from various data sources, performed transformations using Hive, MapReduce, and loaded data into HDFS.
  • Used GitHub for source code management and set up projects in TFS for deployments.
  • Configured TFS builds with continuous integration and build notifications.
  • Configured users and permissions for TFS, SharePoint portals and Source Control Explorer folders
  • Enabled HA for NameNode, ResourceManager, YARN configuration and Hive Metastore.
  • Worked on SSL/TLS implementation.
  • Created Talend Mappings to populate data into dimensions and fact tables
  • Registered all the RHEL servers in Red Hat Satellite 5.x, 6.x and applied necessary patches.
  • Experience with Docker for different infrastructure setup and testing of code.
  • Worked with the build team to troubleshoot and fix day-to-day problems of the applications in production on a 24/7 schedule.
  • Installed and configured Spotfire.
  • Maintained the Spotfire application, including import/export of reports.
  • Deployed the IX and MTK components on the Spotfire resource servers and core servers.
  • Integrating and Maintenance of the Hadoop clusters on all environments.
  • Configured JournalNodes and ZooKeeper services for the cluster using Hortonworks.
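The data-fix framework mentioned above was written in PowerShell; as an illustration, here is a minimal Python sketch of the same idea, separating destructive statements (UPDATE/INSERT/DELETE) from read-only ones behind a dry-run gate. Function and rule names are hypothetical, not taken from the original framework.

```python
import re

# Illustrative sketch only: the real framework was PowerShell-based.
# The dry-run rule and statement classification are assumptions.
DESTRUCTIVE = {"UPDATE", "INSERT", "DELETE"}

def classify(statement):
    """Return the leading SQL verb of a statement, upper-cased."""
    match = re.match(r"\s*(\w+)", statement)
    return match.group(1).upper() if match else ""

def plan_data_fix(statements, dry_run=True):
    """Split a data-fix script into queued and held-back batches.

    In dry-run mode destructive statements are reported but not queued,
    mirroring a review step before a production data fix.
    """
    queued, skipped = [], []
    for stmt in statements:
        if classify(stmt) in DESTRUCTIVE and dry_run:
            skipped.append(stmt)
        else:
            queued.append(stmt)
    return queued, skipped

queued, skipped = plan_data_fix([
    "SELECT count(*) FROM orders",
    "UPDATE orders SET status = 'fixed' WHERE id = 42",
], dry_run=True)
```

Running in dry-run mode queues only the SELECT and holds the UPDATE for review.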

Environment: Maven, Release Management, Microsoft Azure, Chef, Puppet, Python, AWS, GIT, Apache Webserver, SVN, Windows, MapReduce, HDFS, HBase, Hive, SQL, Oozie, Sqoop, Talend, UNIX Shell Scripting, Spotfire, IX components, MTK, Splunk and AppD.

Confidential, Los Angeles, CA

Build Release Engineer / DevOps Engineer

Responsibilities:

  • Developed and implemented Software Release Management strategies for various applications according to the agile process.
  • Troubleshot agent problems on the uDeploy server
  • Maintained disk usage of the uDeploy server
  • Installed, Configured and Administered Hudson/Jenkins Continuous Integration Tool.
  • Developed build and deployment scripts using ANT and MAVEN as build tools in Jenkins to promote builds from one environment to another.
  • Developed automation framework for Application Deployments to the cloud environments.
  • Worked on Managing the Private Cloud Environment using Chef.
  • Implemented AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancer and Auto Scaling groups; optimized volumes and EC2 instances.
  • Performed Branching, Tagging, Release Activities on Version Control Tools: SVN, GIT.
  • Developed Perl and shell scripts for automation of the build and release process, developed Custom Scripts to monitor repositories, Server storage.
  • Automated cloud deployments using Chef, Python (boto and Fabric) and AWS CloudFormation templates.
  • Created deployment tickets using Jira for build deployment in Production.
  • Developed some Python Scripts for Diagnosis module.
  • Python scripting implementation for processing database records.
  • Created Linux virtual machines using VMware Virtual Center and AIX LPARs on P550 and P570.
  • Configured TFS builds with continuous integration and build notifications.
  • Configured users and permissions for TFS, SharePoint portals and Source Control Explorer folders
  • Managed enterprise database environments, both physical and virtual, consisting of a mixture of SQL Server databases and Amazon Redshift.
  • Experience with Docker and Vagrant for different infrastructure setup and testing of code.
  • Experienced in building and maintaining Docker infrastructure for SOA applications in agile environment.
  • Deployed Docker Engines in Virtualized Platforms for containerization of multiple apps.
  • Implemented a Nagios monitoring system to notify of system issues.
  • Worked on the design and deployment of a national data center using OpenStack.
  • Proficient with multiple databases, including MongoDB, Cassandra, MySQL, Oracle and MS SQL Server.
  • Installed Pivotal Cloud Foundry on EC2 to manage the containers created by PCF. Used Docker to virtualize deployment containers and push the code to EC2 cloud using PCF.
  • Participated in installing and configuring of UNIX/Linux based Oracle 10g products
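The CloudFormation-driven deployments above can be illustrated with a small sketch that builds a template as plain JSON using only the standard library. The resource name, AMI ID and instance type here are hypothetical placeholders, not values from the actual stacks (which were driven by Chef, boto and Fabric).

```python
import json

# Hypothetical sketch: generating a minimal AWS CloudFormation template.
# "WebServer", the AMI ID and the instance type are placeholder values.
def make_ec2_template(instance_type="t2.micro", ami="ami-12345678"):
    """Build a CloudFormation template dict for a single EC2 instance."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "WebServer": {
                "Type": "AWS::EC2::Instance",
                "Properties": {
                    "InstanceType": instance_type,
                    "ImageId": ami,
                },
            }
        },
    }

# Serialize for upload via the AWS CLI, boto, or the console.
template_json = json.dumps(make_ec2_template(), indent=2)
```

Keeping templates as generated data rather than hand-edited JSON makes per-environment parameterization (instance sizes, AMIs) a function argument instead of a copy-paste edit.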

Environment: DevOps, Java, Ant, Maven, Jenkins, Hudson, Chef, Puppet, Python, Perl, AWS, GIT, SVN, Apache Webserver, JBoss, Apache JMeter, MetaCase, Windows.

Confidential, Atlanta, GA

DevOps Engineer

Responsibilities:

  • Wrote and maintained build scripts for E2Open cloud platform SaaS applications modules like supplier services and admin portal using Maven
  • Responsible for the proper functioning of DEV/TEST/STG/PROD environments for these applications
  • Maintained Jenkins continuous integration infrastructure and automated releases to DEV/TEST/STG/PROD environments
  • Participated in after-hours on-call rotation to support Ops performing deployments in the PROD environment
  • Performed server configuration management via Puppet or Chef
  • Setup various Team Projects into different Team Project Collections in TFS 2010/2012.
  • Configured TFS 2010 Environment along with Default Project Collections, Build Definitions, Work Items, Share point Services, and Reporting Services.
  • Worked with multiple development teams to troubleshoot and resolve issues
  • Integrated Selenium automation regression test suite in Jenkins build pipeline
  • Worked with Puppet and Chef, primarily to manage Linux hosts; later versions also support Microsoft Windows
  • Set up Jenkins Maven build automation with uploads to Pivotal Cloud Foundry
  • Rapidly developed RESTful services and used Pivotal Cloud Foundry to deploy them in a hybrid cloud.
  • Implemented microservices using the Pivotal Cloud Foundry platform built on Amazon Web Services
  • Implemented a comprehensive cloud monitoring and incident management solution using Cloudkick and Datadog
  • Planned, deployed and tuned Elasticsearch for Linux-based infrastructure
  • Maintained Chef and Puppet servers and a management application that can use ServiceNow (CI) data to bring computers into a desired state by managing files, services or packages installed on physical or virtual machines
  • Involved in migrating SQL Server databases to SQL Azure Database using the SQL Azure Migration Wizard.
  • Perform maintenance activities in a large environment composed of a mix of different UNIX/Linux, platforms and configurations. Trouble-shoot hardware, software and network related problems.
  • Responsible for backup and restoration of UNIX and VMware servers by using VERITAS NetBackup software.
  • Deployed application to Azure Cloud.
  • Wrote Python scripts for pushing data from MongoDB to a MySQL database
  • Understanding and usage of Atlassian tools (i.e. Bamboo, JIRA, Nexus)
  • Converted .Net application to Microsoft Azure Cloud Service Project as part of cloud deployment.
  • Migrated SQL Server 2008 databases to Windows Azure SQL Database and updated connection strings accordingly.
  • Managed virtual machine setup and maintenance with Chef.
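The MongoDB-to-MySQL push scripts mentioned above can be sketched as follows. Only the document-to-row transformation is shown so the example stays self-contained; the actual pymongo and MySQL client calls are omitted, and the table and column names are hypothetical.

```python
# Hedged sketch of a Mongo-to-MySQL push: flatten documents into
# parameter tuples for a parameterized INSERT. Column names below
# are placeholders, not the production schema.
COLUMNS = ("user_id", "name", "email")

def doc_to_row(doc):
    """Flatten a Mongo-style document into a tuple ordered by COLUMNS."""
    return tuple(doc.get(col) for col in COLUMNS)

def build_insert(table, docs):
    """Return a parameterized INSERT statement and its parameter rows."""
    placeholders = ", ".join(["%s"] * len(COLUMNS))
    sql = "INSERT INTO {} ({}) VALUES ({})".format(
        table, ", ".join(COLUMNS), placeholders)
    return sql, [doc_to_row(d) for d in docs]

sql, rows = build_insert("users", [
    {"user_id": 1, "name": "a", "email": "a@x"},
])
```

The resulting `sql` and `rows` would typically be handed to a MySQL cursor's `executemany`, keeping the transformation logic testable without a live database.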

Environment: SaaS applications, SVN, Maven, Bamboo, Nexus, Bash Scripting, Chef, SoapUI, Selenium WebDriver, JIRA, Tomcat, Java, JUnit, Docker, XML, XPATH, SAN

Confidential

Internship/Build & Release Engineer

Responsibilities:

  • Worked closely with the Development Team in the design phase and developed use case diagrams using Rational Rose.
  • Worked with the Architects on SDLC process being the owner of post development environments
  • Coordinated resources by working closely with project managers for releases, and carried out deployments and builds on various environments using a continuous integration tool
  • Developed and implemented the software release management for the release of web applications.
  • Maintaining the build environment, the source code control system and managing build packages using TFS.
  • Administering and Monitoring TFS Servers.
  • Wrote ANT and MAVEN Scripts to automate the build process.
  • Used Shell/Perl scripts to automate the deployment process.
  • Performed load/functional testing using Mercury LoadRunner and HP QTP.
  • Designed a customized Status Reporting tool used currently, based on the specific requirements using J2EE/ Struts and WebSphere Application Server with DB2 as Database.
  • Coordinated instance refreshes and environment re-baselining.
  • Coordinated all the IT projects successfully by resolving release interdependencies and planning release.
  • Planning, scheduling and documenting releases at code Freeze Periods.
  • Worked with teams of more than 30 members and managed 2 release/build engineers.
  • Standby administrator for ClearCase and PVCS.
  • Performed functional and stress testing for a few applications using Mercury LoadRunner
  • Worked with HP QTP for functional testing.

Environment: Subversion, ClearCase, Hudson, Java/J2EE, JDK, ANT, MAVEN, DB2, UNIX, Windows Server 2003, Windows XP, WebSphere, Perl Scripting, HP Quality Center.
