Build and Release Engineer Resume

Washington, DC

SUMMARY

  • 7+ years of experience as a Software Engineer with a major focus on DevOps/AWS, including continuous integration, continuous delivery, continuous deployment, automation of configuration management, and security.
  • Knowledge of Big Data technologies: the Hadoop ecosystem (HDFS, MapReduce framework), NoSQL databases (HBase), Hive, Sqoop, Kafka, and Oozie.
  • Experience with AWS services including EC2, DynamoDB, S3, VPC, and IAM.
  • Experience setting up instances behind an Elastic Load Balancer in AWS for high availability (a sketch follows this list).
  • Hands-on experience with Terraform for infrastructure automation and AWS Lambda for serverless automation.
  • Experience building CI/CD pipelines with tools such as Jenkins and Chef.
  • Worked on Jenkins for continuous integration and end-to-end automation of builds and deployments, managing plugins for Maven and Ant.
  • Wrote Chef cookbooks to automate system operations.
  • Hands-on experience with SCM tools such as Git and SVN for merging and branching.
  • Knowledge of continuous deployment tools such as Chef, Puppet, and Ansible.
  • Hands-on experience with Ansible servers and workstations to manage deployments.
  • Created Docker containers and Docker consoles to manage the application lifecycle.
  • Good understanding of the OpenShift platform and of managing Docker containers with Docker Swarm and Kubernetes clusters.
  • Implemented Terraform modules to deploy applications across multiple cloud providers and to manage infrastructure.
  • Excellent communication, configuration, and technical documentation skills.
  • Ability to work closely with teams to ensure high-quality, timely delivery of builds and releases.
  • Excellent relationship management skills and the ability to conceive efficient solutions using technology.
  • Industrious individual who thrives on a challenge, working effectively with all levels of management.
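
As an illustration of the Elastic Load Balancer setup mentioned above, here is a minimal boto3 sketch that registers running EC2 instances with an Application Load Balancer target group. The region, tag filter, and target group ARN are placeholders, not values from any actual engagement.

```python
"""Hypothetical sketch: attach running EC2 instances to an ALB target group
for high availability. All names and ARNs below are placeholders."""
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
elb = boto3.client("elbv2", region_name="us-east-1")

# Placeholder target group ARN (assumption, not a real resource)
TARGET_GROUP_ARN = "arn:aws:elasticloadbalancing:us-east-1:123456789012:targetgroup/web/abc123"

def register_web_instances(tag_value="web"):
    """Find running instances tagged Role=web and register them as ALB targets."""
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:Role", "Values": [tag_value]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]
    instance_ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
    if instance_ids:
        elb.register_targets(
            TargetGroupArn=TARGET_GROUP_ARN,
            Targets=[{"Id": iid} for iid in instance_ids],
        )
    return instance_ids
```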

TECHNICAL SKILLS

Cloud Platforms: AWS, Microsoft Azure, and OpenStack.

Big Data Technologies: Hadoop, HDFS, Pig, Hive, MapReduce, Cassandra, Kafka.

Configuration Management: Chef, Puppet, Vagrant, Maven, Ansible, Docker, Gradle, Splunk, AWS OpsWorks.

Databases: Oracle, DB2, MySQL, MongoDB, MS SQL Server.

Build Tools: Ant, Maven, Makefiles, Hudson, Jenkins, Bamboo, AWS CodeDeploy.

Version Control Tools: Subversion (SVN), ClearCase, Git, GitHub, Perforce, AWS CodeCommit.

Web Servers: Apache, Tomcat, WebSphere, Nginx, JBoss.

Languages/Scripts: C, HTML, Shell, Bash, PHP, Python, Ruby, Perl, PowerShell.

SDLC: Agile, Scrum.

Web Technologies: HTML, CSS, JavaScript, jQuery, Bootstrap, XML, JSON, XSD, XSL, XPath.

Operating Systems: Red Hat, Ubuntu, CentOS, SUSE Linux, and Windows.

PROFESSIONAL EXPERIENCE

Confidential, Washington DC

Hadoop/Kafka Developer

Environment: Hadoop, HDFS, Pig, Hive, MapReduce, Agile, Cassandra, Kafka, Storm, AWS, YARN, Spark, ETL, Teradata, NoSQL, Oozie, Java, Talend, Linux.

Responsibilities:

  • Involved in importing real-time data into Hadoop using Kafka and implemented an Oozie job for the daily loads.
  • Loaded data from Teradata into HDFS using the Teradata Hadoop connectors.
  • Imported data from different sources such as HDFS and HBase into Spark RDDs.
  • Developed Spark scripts in Python as per the requirements (see the sketch after this list).
  • Issued SQL queries via Impala to process the data stored in HDFS and HBase.
  • Used the Spark-Cassandra Connector to load data to and from Cassandra.
  • Used RESTful web service APIs to connect to the MapR table and the database.
  • Developed Hive DDLs to create, alter, and drop Hive tables, and worked with Storm and Kafka.
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
  • Migrated data from RDBMS to Cassandra and created data models for customer data using the Cassandra Query Language (CQL).
  • Responsible for building scalable distributed data solutions on a Hadoop cluster running the Hortonworks distribution.
  • Developed Spark scripts for data analysis in both Python and Scala.
  • Designed and developed various modules of the application with J2EE design architecture.
  • Implemented modules using core Java APIs and Java collections, and integrated the modules.
  • Transferred data from different data sources into HDFS using Kafka producers, consumers, and brokers.
  • Involved in creating Hive tables and in loading and analyzing data using Hive queries.
  • Developed multiple MapReduce jobs in Java for data cleaning and pre-processing.
  • Loaded data from different sources (databases and files) into Hive using Talend.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
  • Used Apache Mesos to run many applications on a dynamically shared pool of nodes.
  • Used Oozie and ZooKeeper operational services to coordinate the cluster and schedule workflows.
  • Implemented Flume, Spark, and Spark Streaming frameworks for real-time data processing.
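
For illustration, a minimal PySpark sketch in the spirit of the Spark work above: read event data from HDFS, aggregate it, and write the result to Cassandra through the Spark-Cassandra Connector. The HDFS path, keyspace, table, column names, and host are placeholders, not details from the actual project.

```python
"""Illustrative sketch (not production code): HDFS -> aggregate -> Cassandra.
Assumes the spark-cassandra-connector package is on the Spark classpath."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("hdfs-to-cassandra")
    .config("spark.cassandra.connection.host", "cassandra.example.internal")  # placeholder host
    .getOrCreate()
)

# Read raw JSON events landed on HDFS (placeholder path)
events = spark.read.json("hdfs:///data/events/dt=2017-01-01/")

# Count events per customer per day (placeholder column names)
daily_counts = (
    events.groupBy("customer_id", F.to_date("event_ts").alias("event_date"))
          .agg(F.count("*").alias("event_count"))
)

# Append the aggregates to a Cassandra table (placeholder keyspace/table)
(daily_counts.write
    .format("org.apache.spark.sql.cassandra")
    .options(keyspace="analytics", table="daily_event_counts")
    .mode("append")
    .save())
```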

Confidential, San Jose, CA

DevOps/AWS Engineer

Environment: Amazon Web Services, Chef, Vagrant, Scrum, Subversion (SVN), Ant, uDeploy, DB2, JIRA, Confluence, shell scripts, WebSphere

Responsibilities:

  • Planned, deployed, monitored, and maintained Amazon AWS cloud infrastructure consisting of multiple EC2 nodes and VMware VMs, as required by the environment.
  • Used security groups, network ACLs, internet gateways, NAT instances, and route tables to ensure a secure zone for the organization in the AWS public cloud.
  • Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
  • Implemented and maintained Chef configuration management spanning several environments in VMware and the AWS cloud.
  • Worked on multiple AWS instances, setting up security groups, Elastic Load Balancers, AMIs, and Auto Scaling to design cost-effective, fault-tolerant, and highly available systems.
  • Created S3 buckets and managed their policies; used S3 and Glacier for archival storage and backups on AWS.
  • Created public and private subnets within the VPC and attached EC2 instances to them based on requirements.
  • Used the AWS CLI to automate backups of ephemeral data stores to S3 buckets and EBS, and created nightly AMIs of mission-critical production servers as backups (see the sketch after this list).
  • Virtualized servers using Docker for the needs of test and dev environments.
  • Configured inbound and outbound traffic access for RDS database services, DynamoDB tables, and EBS volumes, and set alarms for notifications or automated actions.
  • Expert knowledge of Bash shell scripting and automation of cron jobs.
  • Implemented a Git mirror for the SVN repository, enabling users to use both Git and SVN.
  • Implemented continuous integration using Jenkins and Git.
  • Developed build and deployment scripts using Ant and Maven as build tools in Jenkins to move artifacts from one environment to another.
  • Configured and ensured connectivity to RDS databases running on MySQL engines.
  • Responsible for plugin management, user management, regular incremental backups, and regular maintenance for recovery.
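
The nightly-AMI backups mentioned above were driven by the AWS CLI; the sketch below shows the same idea in boto3 form. The tag convention, region, and naming scheme are assumptions, not values from the actual environment.

```python
"""Hedged sketch: create a nightly AMI for every running instance tagged
Backup=nightly. Tag key/value and region are placeholders."""
import datetime
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def create_nightly_amis(tag_key="Backup", tag_value="nightly"):
    """Image each tagged, running instance without rebooting it."""
    stamp = datetime.datetime.utcnow().strftime("%Y-%m-%d")
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": f"tag:{tag_key}", "Values": [tag_value]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]
    for reservation in reservations:
        for instance in reservation["Instances"]:
            iid = instance["InstanceId"]
            ec2.create_image(
                InstanceId=iid,
                Name=f"{iid}-backup-{stamp}",
                NoReboot=True,  # avoid rebooting production servers
            )
```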

Confidential, Santa Ana, CA

Build & Release Engineer / DevOps Engineer

Environment: TFS 2008, TFS 2010, Sun Solaris, ClearCase, UNIX, Windows, CVS, Perforce, .NET, C#, Java, Eclipse, Ant, Jenkins, Maven, VBScript, InstallAnywhere, Visual Studio, Oracle, Apache Tomcat application server

Responsibilities:

  • Installed and configured AnthillPro/Jenkins to automate deployments and provide a complete automation solution.
  • Responsible for managing the code repository in TFS 2010.
  • Implemented TFS branching and merging operations for .NET source code under Agile development methodologies.
  • Created a continuous integration system using Ant and JetBrains TeamCity, achieving full automation and faster, more reliable deployments.
  • Set up Ant and Maven scripts for Java and J2EE builds.
  • Involved in the release and deployment of large-scale C# websites and web applications using the TFS repository.
  • Designed release plans while coordinating with stakeholders, including the project management lead, development lead, QA team lead, and ClearCase administrator.
  • Extensive experience as an Oracle E-Business Suite consultant and technical developer in the design, development, and implementation of Oracle Applications R12/11i.
  • Configured and installed Git with TFS as VSTS.
  • Maintained security-related activities in TFS. Used Build Forge for enterprise-scale infrastructure configuration and application deployments.
  • Integrated Subversion into AnthillPro/Jenkins to automate the code checkout process.
  • Wrote scripts to deploy to Tomcat web servers and WebSphere application servers (a sketch follows this list).
  • Proposed and implemented a branching strategy suitable for Agile development in Subversion.
  • Imported and managed multiple corporate applications in Subversion (SVN).
  • Knowledge of all platforms of Microsoft Azure cloud technology.
  • Provided end-user training for all Subversion (SVN) users to use the tool effectively.
  • Owned the build farm and produced effective multiple-branch builds to support parallel development.
  • Managed the entire release communication and coordination process.
  • Maintained Shell and Perl scripts for automation purposes.
  • Deployed dynamic content to application servers such as WebSphere and WebLogic.
  • Involved in editing existing Ant/Maven files in case of errors or changes in project requirements.
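
As an illustration of the Tomcat deploy scripts mentioned above, here is a hypothetical Python sketch that pushes a WAR through Tomcat's manager text API. The host, credentials, and context path are placeholders, not details from the actual project.

```python
"""Hypothetical deploy helper: upload a WAR via Tomcat's /manager/text API.
Host, credentials, and context path below are placeholders."""
import requests

TOMCAT = "http://tomcat.example.internal:8080"  # placeholder host
AUTH = ("deploy_user", "secret")                # placeholder credentials

def deploy_war(war_path: str, context: str = "/myapp"):
    """Upload a WAR and (re)deploy it at the given context path."""
    with open(war_path, "rb") as war:
        resp = requests.put(
            f"{TOMCAT}/manager/text/deploy",
            params={"path": context, "update": "true"},
            data=war,
            auth=AUTH,
            timeout=300,
        )
    resp.raise_for_status()
    # The manager API reports success as a body starting with "OK"
    if not resp.text.startswith("OK"):
        raise RuntimeError(f"Deploy failed: {resp.text.strip()}")
```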

Confidential

Build and Release Engineer

Environment: IBM WebSphere, IBM WebSphere MQ Explorer, Jenkins, BladeLogic Server Automation, Crystal Reports Server, IBM HTTP Server, Oracle SQL Developer, Microsoft Visio, WinSCP, SVN; O/S: SuSE Linux 10/11

Responsibilities:

  • Installed and configured WebSphere Application Server and IBM HTTP Server web servers.
  • Monitored the performance of application servers, applications, and web servers.
  • Deployed applications to the IFT, DIT, FIT, Integrated Performance, and Field environments.
  • Created SSL certificates.
  • Administered SuSE 10/11 servers in local environments.
  • Resolved SVN merge conflicts, created branches, and managed client access control.
  • Set up Jenkins to automate the continuous build and deployment integration process.
  • Set up development environments for associates to perform regression testing of applications before IFT.
  • Automated the creation of component templates, BL packages, BL deploy jobs, properties, and configuration objects using Python, NSH, and the BLCLI.
  • Automated the hot-fix process for an enterprise application using a combination of wsadmin and shell scripting (see the sketch after this list).
  • Wrote compliance scripts to check WebSphere configurations, installed applications, and application servers.
  • Migrated all application modules' Ant build scripts to Maven.
  • Prepared readme files for IAT and production deployments.
  • Worked on BladeLogic web services.
  • Provided IAT/production support for configuration issues.
  • Configured and used HP Operations Orchestration-based deployments.
  • Managed a team of six members, all direct reports.
  • Configured Splunk forwarders and dashboards.
  • Configured SiteScope alerts for non-production environments.
  • Designed and developed a configuration live-view page using CGI and Perl.
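
The hot-fix automation above combined wsadmin with shell scripting; the sketch below shows the same idea as a small Python wrapper around wsadmin. The install path, application name, and connection options are assumptions, not values from the actual environment.

```python
"""Loose sketch of a hot-fix wrapper: call wsadmin to update a deployed
application in place, then persist the configuration change.
Paths and names below are placeholders."""
import subprocess

WSADMIN = "/opt/IBM/WebSphere/AppServer/bin/wsadmin.sh"  # placeholder install path

def apply_hotfix(app_name: str, ear_path: str):
    """Update an installed application with a new EAR and save the config."""
    script = (
        f"AdminApp.update('{app_name}', 'app', "
        f"['-operation', 'update', '-contents', '{ear_path}']); "
        "AdminConfig.save()"
    )
    subprocess.run(
        [WSADMIN, "-lang", "jython", "-conntype", "SOAP", "-c", script],
        check=True,  # raise if wsadmin exits non-zero
    )
```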
