
DevOps/Infrastructure Engineer Resume


Charlotte, NC

SUMMARY

  • 8+ years of experience in DevOps, build automation, software configuration, build and release engineering, Linux administration, and OpenStack, gained in large and small software development organizations working with cloud computing platforms such as Amazon Web Services (AWS), Azure, and Google Cloud Platform (GCP).
  • 4+ years of work with DevOps and OpenStack tools such as Puppet, Chef, Docker, OpenShift, and Red Hat CloudForms.
  • Transformed traditional environments into virtualized environments using AWS (EC2, S3, EBS, EMR, ELB, Kinesis, Redshift), Matillion, Chef, Puppet, Jenkins, Jira, Docker, Vagrant, OpenStack (Nova, Neutron, Swift, Cinder), and VMware.
  • Experienced in all phases of the software development lifecycle (SDLC) with specific focus on the build and release of quality software. Experienced in Waterfall, Agile/Scrum, Lean and most recently Continuous Integration (CI) and Continuous Deployment (CD) practices.
  • Experience in branching, merging, tagging, and maintaining versions across environments using SCM tools such as Subversion (SVN), Git (GitHub), and ClearCase.
  • Strong experience with C, multi-threading, Boost, STL, SQLite, GDB, Purify, Quantify, Fortify, and Makefiles on Unix/Windows platforms.
  • Experience configuring Azure App Services, Azure Application Insights, Azure Application Gateway, Azure DNS, and Azure Traffic Manager; analyzing Azure networks with Azure Network Watcher; and implementing Azure Site Recovery, Azure Stack, Azure Backup, and Azure Automation.
  • Experience in managing and reviewing Hadoop log files.
  • Experience with Snowflake Multi-Cluster and Virtual Warehouses.
  • In-depth knowledge of Data Sharing in Snowflake.
  • In-depth knowledge of Snowflake database, schema, and table structures.
  • Experience in using Snowflake Clone and Time Travel.
  • Experience in Python, Perl, Bash, Ruby, Groovy, and shell scripting.
  • Experience in software methodologies like Waterfall model, Agile Methodology, Scrum, and TDD.
  • Extensive experience using Continuous Integration tools such as CruiseControl, Jenkins/Hudson, Build Forge, TeamCity, and Bamboo.
  • Extensive experience creating data marts and entering market data, trades, and securities into Murex systems.
  • Hands-on experience performing extractions, margin calculations, and validations on data via Java and re-entering the data into the Murex system.
  • Expertise in using build tools such as Maven and Ant to build deployable artifacts such as WAR and EAR files from source code.
  • Highly proficient in cloud provisioning tools such as Terraform and CloudFormation.
  • Developed processes, tools, and automation for UrbanCode-based build systems and software build delivery.
  • Good working experience in client-side development with HTML, XHTML, CSS, JavaScript, jQuery, and AJAX.
  • Developed user interfaces in JSP, JavaScript, and HTML with the Backbone.js framework.
  • Experience with Spring modules such as MVC, AOP, JDBC, ORM, JMS, and Web Services using the Eclipse and STS IDEs.
  • Good knowledge of connecting one VNet to another using virtual network peering or an Azure VPN Gateway.
  • Ability to act as Murex API architect, Java developer, and analyst across many project areas.
  • Experience with AWS instances spanning Dev, Test, and Pre-production environments and with cloud automation through open-source DevOps tools such as Ansible, Jenkins, OpenShift, and Kubernetes.
  • Working knowledge of Visual Studio Professional builds, NAnt, and MSBuild.
  • Experience with SonarQube and JUnit for testing and reviewing code and code quality in CI/CD processes.
  • Gained extensive experience in RPM deployment via Chef, build automation through Jenkins, and server management on RHEL.
  • Experience in designing and working with Amazon Web Services such as EC2, S3, Route 53, ELB, VPC, Auto Scaling, AMI, EBS, IAM, CloudFormation, and CloudWatch.
  • Wrote various Chef modules and Python and Bash scripts to automate deployment of OpenStack components, Linux components, and many other tools.
  • Engineered OpenStack (Grizzly and Havana) private/public clouds on RHEL 6.x/7.x for the Kraft client.
  • Expertise in scanning and remediating application vulnerabilities using SSAP scan analysis and Black Duck scanning.
  • Worked on the infrastructure team on installation, configuration, and administration of CentOS 5.x/6.x/7, Red Hat Linux 8/9, RHEL 5.x/6.x/7, Windows Server, and SUSE Linux 10.x/11.
  • Experienced in configuring and integrating servers with different environments to automatically provision and create new machines using CM/provisioning tools such as Ansible, Chef, and Puppet.
  • Automated deployment of builds to different environments using TeamCity, Jenkins.
  • Involved in design, development, and testing of web application and integration projects using object-oriented technologies such as Core Java, J2EE, Struts, JSP, JDBC, Spring Framework, Hibernate, Java Beans, REST/SOAP web services, XML, XSLT, XSL, and Ant.
  • Experienced in managing DNS, LDAP, FTP, JBoss, Tomcat, and Apache web servers on Linux servers.
  • Experience with machine learning and GPU utilization for general-purpose computing.
  • Experience in designing and developing applications in Spark using Scala to compare the performance of Spark with Hive and SQL/Oracle.
  • Experience in Designing, Architecting and implementing scalable cloud-based web applications using AWS and GCP.
  • Developed data pipelines for machine learning models to improve the functionality and user experience of the Google test cloud platform and its data.
  • Areas of expertise include analysis, design, and development of software involving technologies such as Java, J2EE, Servlets, JSP, JDBC, JSTL, Spring 3.0/2.5, JPA, Hibernate 3.0, Struts 2.0, Web Services, WSDL, JMS, EJB, XML, XSLT, JNDI, HTML, JavaScript, AJAX, and JSF PrimeFaces.
  • Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configurations, and used GCP Cloud CDN (content delivery network) to deliver content from GCP cache locations, drastically improving user experience and latency.
  • Implemented and managed Docker and Kubernetes infrastructure and worked in a DevOps group running Jenkins in a Docker container with EC2 slaves in an Amazon AWS cloud configuration.
  • Worked on maintaining Docker Images and containers.
  • Performance-tuned tables in Redshift and performed data validation and quality checks in Redshift using Python.
  • Strong working knowledge of developing RESTful web services and microservices using Golang.
  • Experienced in handling big data systems using NoSQL DB, Cassandra & data streaming tools like Kafka in multi-data center cluster.
  • Experience in using Tomcat, JBoss, WebLogic, and WebSphere application servers for deployment.
  • Performed automation tasks on various Docker components such as Docker Hub, Docker Engine, Docker Machine, Compose, and Docker Registry; deployed and maintained microservices using Docker.
  • Monitored major metrics such as network packets, CPU utilization, and load balancer latency (see the CloudWatch sketch after this list).
  • Excellent communication, interpersonal, and analytical skills to work efficiently in both independent and teamwork environments.
  • Excellent experience in Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Have sound exposure to Retail market including Retail Delivery System.
  • Hands-on experience installing, configuring, and using Hadoop ecosystem components such as Hadoop MapReduce, HDFS, HBase, Hive, Sqoop, Pig, ZooKeeper, and Flume.
  • Good exposure to Apache Hadoop MapReduce programming, Pig scripting, distributed applications, and HDFS.
  • Good knowledge of Hadoop cluster architecture and cluster monitoring.
  • In-depth understanding of Data Structure and Algorithms.
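
The following is a minimal, hypothetical sketch of the kind of CloudWatch metric monitoring referenced in the bullets above; the instance ID, region, and time window are placeholder assumptions, not values from any actual environment.

    # Hypothetical sketch: pull average EC2 CPU utilization for the last hour
    # with boto3. Instance ID and region are placeholders.
    from datetime import datetime, timedelta, timezone
    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")
    now = datetime.now(timezone.utc)

    response = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        StartTime=now - timedelta(hours=1),
        EndTime=now,
        Period=300,                # 5-minute datapoints
        Statistics=["Average"],
    )

    for point in sorted(response["Datapoints"], key=lambda p: p["Timestamp"]):
        print(point["Timestamp"], round(point["Average"], 2), "%")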

TECHNICAL SKILLS

Version Control (SCM): SVN, Git, ClearCase, GitHub, Bitbucket, TFS.

Scripting: Perl, Ant, Maven, Shell Scripting, JMS, JavaScript and Python

CI/CD Tools: Jenkins, Hudson, AnthillPro, Build Forge, uBuild, Bamboo

Build Tools: MAVEN, Gradle, ANT, Make and MSBuild

Container Technologies: Docker, Kubernetes

Configuration Mgmt: Chef, Puppet, Ansible and Vagrant

Deployment Tools: uDeploy, Octopus Deploy, Rundeck

Testing Tools: SonarQube, JUnit, Fortify

Tracking Tools: IBM ClearQuest, Perforce, JIRA

Databases: Oracle 9i/8i/10g, IBM DB2/UDB, Cassandra, MongoDB

Platforms: Windows, UNIX, Ubuntu and Linux

Servers: Apache Web Server, WebSphere, WebLogic, Tomcat, and JBoss

Cloud Technologies: AWS (VPC, EC2, S3, CloudWatch, Lambda, RDS, EBS, IAM), GCP (IaaS, PaaS, SaaS, XaaS), Murex, Snowflake, Hadoop, Golang, Spark, Terraform, Cloudant, Redis, RabbitMQ, PostgreSQL, Kafka, OpenShift

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

DevOps/Infrastructure Engineer

Responsibilities:

  • Develop tools to automate the deployment, administration, and monitoring of a large-scale AWS Linux environment.
  • Analyze the business, technical, functional, performance and infrastructure requirements needed to access and process large amounts of data.
  • Experience using Git/GitHub integrated with Jenkins (CI), Groovy, and Nexus.
  • Extensive experience using Groovy-based build tooling to produce deployable artifacts (JAR, WAR, and EAR) from source code.
  • Experience building CI/CD pipelines in Jenkins to provide end-to-end automation for all builds and deployments.
  • Automated machine learning DNN training and optimized it on a GPU with CUDA.
  • Experience in Git, environment management, and build/release engineering for automating builds, releases, go-lives, and configuration changes from one environment to another.
  • Developed Vagrantfiles for updating, upgrading, deploying, and configuring existing and new infrastructure applications.
  • Wrote Terraform templates for AWS infrastructure as code to build staging and production environments and set up build automation for Jenkins.
  • Developed new RESTful API services in Golang that work as middleware between our application and third-party APIs.
  • Experience writing data APIs and multi-server applications to meet product needs using Golang.
  • Create and maintain highly scalable and fault-tolerant multi-tier AWS and Azure environments spanning multiple availability zones using Terraform and CloudFormation.
  • Experience in system administration, system and server builds, upgrades, patches, migration, troubleshooting, security, backup, disaster recovery, performance monitoring, and fine-tuning on Linux servers.
  • Worked on deployment procedures using middleware such as Tomcat, creating deploy scripts and settings for production releases.
  • Expertise in scanning and remediating application vulnerabilities using static code analysis and Black Duck scanning.
  • Strong production experience in consulting on, architecting/designing, and implementing virtual environments for continuous delivery systems and their methodologies.
  • Managed DNS, LDAP, FTP, JBoss, Tomcat, and Apache web servers on Linux servers.
  • Developed automation scripts for various configuration management tools, including AWS Lambda functions, to reduce day-to-day repetitive work using Python/Shell.
  • Responsible for ensuring that team delivers projects that are technically sound and comply with defined standards and procedures.
  • Experience in writing Shell, Perl, Python and JSON scripts .
  • Automated RabbitMQ cluster installations and configuration using Python/Bash.
  • Installed Puppet/Chef/Docker for the OpenStack and OpenShift environments, along with scripting in Perl, Ruby, and Python.
  • Configured and maintained Jenkins to implement the CI process, integrated it with Ant and Maven to schedule builds, and automated deployment to the application servers using the CodeDeploy plugin for Jenkins.
  • Wrote cron jobs to automate daily scripts.
  • Performed continuous integration with Jenkins and continuously evaluated and recommended improvements to CI/CD processes.
  • Provided access to data necessary to perform analysis on scheduling, pricing, bus bunching and performance. Queries in the Redshift environment performed 100-1000x faster than in legacy environments.
  • Created a Python process hosted on Elastic Beanstalk to load the Redshift database daily from several sources (see the load sketch after this list).
  • Successfully migrated the website's main database from MySQL to PostgreSQL.
  • Worked on Continuous Integration (CI) workflows using virtualized environments such as Docker and Kubernetes to build containers and deploy microservices-oriented environments for scalable applications; implemented Twistlock for container and application security.
  • Designed, built, configured, tested, installed, managed, and supported all aspects and components (Chef) of the application development environments in AWS.
  • Wrote Chef recipes and cookbooks and uploaded them to the Chef server, managing on-site OS, applications, services, and packages using Chef.
  • Maintained Chef Configuration Management spanning several environments in VMWare and AWS.
  • Used Docker, OpenShift, and Amazon cloud architecture to best utilize our existing technology patents to serve real-time needs and deployments.
  • Experience in dealing with Windows Azure IaaS - Virtual Networks, Virtual Machines, Cloud Services, Resource Groups, Express Route, Traffic Manager, VPN, Load Balancing, Application Gateways, and Auto-Scaling.
  • Built AWS infrastructure using VPC, EC2, S3, Route 53, EBS, Security Groups, Auto Scaling, RDS, CloudFormation, AMI, IAM, and CloudWatch.
  • Experience designing Datadog monitoring for Docker containers, RDS storage, and CPU usage.
  • Experience in Agile/Scrum methodologies and in recent Continuous Integration (CI) and Continuous Deployment (CD) practices.
  • Deployed web and enterprise Java components and messaging components involving concurrency and multi-threading.
  • Worked with Docker, OpenShift, and Kubernetes as a container security engineer, implementing monitoring/auditing of security events on containers and container network security detection.
  • Designed and implemented applications using JSP, Spring MVC, JNDI, Spring IoC, Spring Annotations, Spring AOP, Spring Transactions, Hibernate 3.0, SQL, Ant, JMS, Oracle, and the Oracle WebLogic application server.
  • Developed custom consumers and producers for Apache Kafka in Go (Golang) for a car monitoring system.
  • Designed the real-time analytics and ingestion platform using Storm and Kafka. Wrote Storm topology to accept the events from Kafka producer and emit into Cassandra DB.
  • Provided Infrastructure support and user support for AWS .
  • Experience setting up build and deployment automation for Terraform scripts using Jenkins.
  • Provisioned highly available EC2 instances using Terraform and CloudFormation and wrote new plugins to support new functionality in Terraform.
  • Used Bash and Python (including Boto3) to supplement automation provided by Ansible and Terraform for tasks such as encrypting the EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks (see the snapshot-encryption sketch after this list).
  • Managed, developed, and designed a dashboard control panel for customers and Administrators using Django, Oracle DB, PostgreSQL, and VMWare API calls.
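
A minimal, hypothetical sketch of the Boto3 snapshot-encryption task referenced above; the snapshot ID, KMS key alias, and region are placeholder assumptions rather than values from the actual environment.

    # Hypothetical sketch: copy an existing EBS snapshot into an encrypted copy
    # with boto3, then wait for it so a new AMI can be registered from it.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    copy = ec2.copy_snapshot(
        SourceRegion="us-east-1",
        SourceSnapshotId="snap-0123456789abcdef0",   # placeholder snapshot
        Encrypted=True,
        KmsKeyId="alias/example-ebs-key",            # placeholder KMS key
        Description="Encrypted copy for AMI rebuild",
    )

    waiter = ec2.get_waiter("snapshot_completed")
    waiter.wait(SnapshotIds=[copy["SnapshotId"]])
    print("Encrypted snapshot ready:", copy["SnapshotId"])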
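
Similarly, a hypothetical sketch of the daily Redshift load referenced above, using a Redshift COPY of staged S3 files via psycopg2; the cluster endpoint, credentials, table, bucket, and IAM role are placeholders.

    # Hypothetical sketch: load staged S3 exports into a Redshift table daily.
    import psycopg2

    COPY_SQL = """
        COPY analytics.daily_trips
        FROM 's3://example-bucket/exports/trips/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy'
        FORMAT AS CSV
        IGNOREHEADER 1
        TIMEFORMAT 'auto';
    """

    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
        port=5439,
        dbname="analytics",
        user="loader",
        password="********",
    )
    try:
        with conn, conn.cursor() as cur:   # commits the COPY on success
            cur.execute(COPY_SQL)
    finally:
        conn.close()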

Environment: Linux, Shell, Python, Java, Git, Gradle, Chef, Vagrant, Tomcat, JBoss, AWS services (EC2, VPC, S3, IAM, RDS, SNS, CloudWatch, Elastic Beanstalk, Route 53, EBS, ELB), Lambda, Datadog, Docker, Kubernetes, Jenkins, Maven, Bamboo, Nexus, JUnit, Black Duck, JMS, Twistlock, Terraform, RabbitMQ, PostgreSQL, Kafka, Redshift, OpenShift.

Confidential, Denver, CO

Cloud DevOps Engineer

Responsibilities:

  • Worked on agile development life cycle.
  • Installed and configured virtual machines, storage accounts, virtual networks, and the Azure Load Balancer in the Azure cloud.
  • Responsible for implementing, designing, and architecting solutions for Azure cloud and network infrastructure and for data center migrations to public, private, and hybrid clouds.
  • Performed assessments of the existing environment (applications, servers, databases) using various tools such as the MAP Toolkit, the Azure Website Migration Assistant, and manual assessment.
  • Developed a migration approach to move workloads from on-premises to Windows Azure for Windows machines and to AWS for Linux/Solaris machines; administered RHEL, CentOS, Ubuntu, UNIX, and Windows servers.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Developed data integration scripts using Anaplan Connect to tie in Oracle as a data source for the Anaplan EPM tool for CIOX.
  • Worked on Google Cloud Platform (GCP) services such as Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, Stackdriver Monitoring, and Cloud Deployment Manager.
  • Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configurations, and used GCP Cloud CDN (content delivery network) to deliver content from GCP cache locations, drastically improving user experience and latency.
  • Experience in installing, configuring, and using Hadoop ecosystem components.
  • Designed and developed the Dynamics AAA (Access, Authorize & Audit) portal, which provides secure access to Azure resources and assigns custom roles; this portal became a standard for granting access while maintaining compliance with MSIT standards.
  • Responsible for implementing monitoring solutions in Ansible, Terraform, Docker, Openshift and Jenkins.
  • Developed new RESTful API services in Golang that work as middleware between our application and third-party APIs.
  • Experience writing data APIs and multi-server applications to meet product needs using Golang.
  • Deployed tools on Microsoft Azure Cloud Services (PaaS, IaaS) and Web Apps.
  • Used SQL Azure extensively for database needs in CustomerLookup & //AzNot.
  • Migrated the Azure CXP Tools to HTTPS based authentication using SSL encryption.
  • Worked with Nagios for Azure Active Directory & LDAP and Data consolidation for LDAP users. Monitored system performance using Nagios, maintained Nagios servers and added new services & servers.
  • Worked with applications based on JSP, JavaScript, Struts 2.0, JSF 2.0, Cloudant, and Hibernate 3.0, using Service-Oriented Architecture, system analysis and design methodology, and object-oriented design.
  • Expertise in scanning and remediating application vulnerabilities using static code analysis and Black Duck scanning.
  • Data Profiling, Mapping and Integration from multiple sources to AWS S3/RDS/Redshift.
  • Automation of ETL loads into Redshift Database using Windows Batch Scripts.
  • Performed Murex CVA desk tasks.
  • Secured data is stored in MySQL. Vault (by HashiCorp) secures, stores, and tightly controls access tokens and passwords used by the overall platform; it started in the AWS cloud and currently integrates with several services such as AWS IAM, Amazon DynamoDB, Amazon SNS, and Amazon RDS.
  • Deployed and managed containerized applications with Azure Kubernetes Service (AKS), a fully managed Kubernetes offering with serverless Kubernetes, an integrated continuous integration and continuous delivery (CI/CD) experience, and enterprise-grade security and governance, uniting development and operations teams on a single platform to rapidly build, deliver, and scale applications.
  • Experience in dealing with Windows Azure IaaS - Virtual Networks, Virtual Machines, Cloud Services, Resource Groups, Express Route, Traffic Manager, VPN, Load Balancing, Application Gateways, and Auto-Scaling.
  • Designed and created the database tables and wrote SQL queries to access PostgreSQL.
  • Worked on Continuous Integration (CI) workflows using virtualized environments such as Docker and Kubernetes to build containers and deploy microservices-oriented environments for scalable applications; implemented Twistlock for container and application security.
  • Configured RBAC and Azure Monitor for adding security in Azure Cloud.
  • Installed and configured SCM tools, Chef on Azure.
  • Coordinated with continuous integration to ensure that all applicable environment issues were resolved in advance of production implementation.
  • Automated RabbitMQ cluster installations and configuration using Python/Bash.
  • Designed and deployed applications utilizing the full AWS stack (including EC2, Route 53, S3, ELB, EBS, VPC, RDS, DynamoDB, SNS, SQS, IAM, KMS, Lambda, and Kinesis), focusing on high availability, fault tolerance, and auto scaling with AWS CloudFormation, deployment services (OpsWorks and CloudFormation), and security practices (IAM, CloudWatch, CloudTrail).
  • Configured AWS IAM and Security Groups in public and private subnets in the VPC.
  • Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT to ensure successful deployment of web applications and database templates.
  • Created AWS Route53 to route traffic between different regions.
  • Performed SVN to Git/Bitbucket migration and managed branching strategies using the Gitflow workflow; managed user access control, triggers, workflows, hooks, security, and repository control in Bitbucket.
  • Implemented multiple CI/CD pipelines as part of the DevOps role for on-premises and cloud-based software using Jenkins, Chef, OpenShift, and AWS/Docker.
  • Experience in Configuration Management, Cloud Infrastructure, and Automation like Amazon Web Services ( AWS ), Ant, Maven, Jenkins, Chef, SVN, GitHub, Clear Case, Tomcat, and Linux .
  • Used JIRA as the defect tracking system, configured various workflows, customizations, and plugins for the JIRA bug/issue tracker, and integrated Jenkins with JIRA and GitHub.
  • Extensively experienced in Bash, Perl, Python, Ruby scripting on Linux.
  • Experienced in performance tuning of Spark Applications for setting right Batch Interval time, correct level of Parallelism and memory tuning.
  • Optimized existing algorithms in Hadoop using SparkContext, Spark SQL, DataFrames, and pair RDDs.
  • Implemented the ELK (Elasticsearch, Logstash, Kibana) stack to collect and analyze the logs produced by the Spark cluster.
  • Performed advanced procedures like text analytics and processing, using the in-memory computing capabilities of Spark.
  • Experienced in handling large datasets using partitions, Spark in-memory capabilities, Spark broadcasts, effective and efficient joins, transformations, and other techniques during the ingestion process itself.
  • Installed Puppet/Chef/Docker for the OpenStack and OpenShift environments, along with scripting in Perl, Ruby, and Python.
  • Carried out automated deployments and builds on various environments using the continuous integration (CI) tool Jenkins.
  • Implemented RESTful web services to interact with the Redis cache framework.
  • Worked on developing RESTful endpoints to cache application-specific data in in-memory data stores such as Redis and exposed that cached data through those endpoints (see the caching sketch after this list).
  • Installing, configuring and administering Jenkins CI tool on Linux machines.
  • Built scripts using the Ant and Maven build tools in Jenkins to move builds from one environment to other environments.
  • Maintained the Elasticsearch cluster and Logstash nodes to process around 5 TB of data daily from various sources such as Kafka and Kubernetes.
  • Wrote Chef Recipes to automate our build/deployment process and do an overall process improvement to any manual processes.
  • Wrote multiple cookbooks in Chef and implemented environments, roles and Data Bags in Chef for better environment management.
  • Implemented Chef Knife and cookbooks with Ruby scripts for deployment on internal data center servers and reused the same Chef recipes to deploy directly to EC2 instances.
  • Created PostgreSQL and Oracle databases on AWS and worked on modifying their settings.
  • Created and managed multiple instances of Apache Tomcat and deployed several test applications in those instances in QA environment.
  • Managed merging and automation processes across environments using SCM tools such as Git, Octopus, Stash, and TFS on Linux and Windows platforms.
  • Investigation of issues found in the production environment, Apache Tomcat configuration and support for other teams within IT.
  • Deployed and maintained production environment using AWS EC2 instances and Elastic Container Services with Docker .
  • Good knowledge of container management using Docker and of creating Docker images.
  • Worked on Docker components like Docker Engine and creating Docker images.
  • Implemented a Continuous Delivery pipeline with Docker, OpenShift, Jenkins, GitHub, and AWS AMIs.
  • Wrote Jenkins DSL scripts for code quality analysis using the SonarQube and HP Fortify tools.
  • Monitoring the environments with Sensu monitoring tool.
  • Reduced build and deployment times by designing and implementing a Docker workflow.
  • Experience in creating Docker containers and Docker consoles for managing the application life cycle.
  • Documented release builds and source control procedures and plans.
  • Develop scalable build, test and deployment systems in virtualized environments.
  • Resolved the issues on Amazon web services by capturing the snapshots of build boxes.
  • Developed microservice onboarding tools leveraging Python and Jenkins, allowing for easy creation and maintenance of build jobs and Kubernetes deployments and services (see the Jenkins onboarding sketch after this list).
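
A minimal, hypothetical sketch of the Redis-backed REST caching pattern referenced above; Flask is assumed here as the web framework, and the route, key format, TTL, and database lookup are placeholders.

    # Hypothetical sketch: REST endpoint that checks a Redis cache first,
    # falls back to a database lookup, then caches the result with a TTL.
    import json

    import redis
    from flask import Flask, jsonify

    app = Flask(__name__)
    cache = redis.Redis(host="localhost", port=6379, db=0)

    def fetch_profile_from_db(user_id: str) -> dict:
        # Placeholder for the real database lookup.
        return {"id": user_id, "name": "example"}

    @app.route("/users/<user_id>")
    def get_user(user_id):
        key = f"user:{user_id}"
        cached = cache.get(key)
        if cached is not None:
            return jsonify(json.loads(cached))
        profile = fetch_profile_from_db(user_id)
        cache.setex(key, 300, json.dumps(profile))   # 5-minute TTL
        return jsonify(profile)

    if __name__ == "__main__":
        app.run(port=8080)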
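
And a hypothetical sketch of a Python/Jenkins onboarding helper like the one referenced above, using the python-jenkins client (the Jenkins URL, credentials, and job config template are placeholder assumptions, not details from the actual tooling).

    # Hypothetical sketch: create a Jenkins build job for a newly onboarded
    # service if it does not already exist, then queue a first build.
    import jenkins

    JOB_CONFIG_XML = """<?xml version='1.0' encoding='UTF-8'?>
    <project>
      <description>Onboarded service build</description>
      <scm class="hudson.scm.NullSCM"/>
      <builders/>
      <publishers/>
      <buildWrappers/>
    </project>
    """

    server = jenkins.Jenkins("http://jenkins.example.com:8080",
                             username="automation",
                             password="api-token")

    job_name = "example-service-build"
    if not server.job_exists(job_name):
        server.create_job(job_name, JOB_CONFIG_XML)
    server.build_job(job_name)
    print("Queued:", job_name)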

Environment: Azure (Web Roles, Worker Roles, SQL Azure, Azure Storage, Azure AD, Resource Groups, Office 365, RBAC), GCP, SVN, Git, GitHub, Bitbucket, DSL (Groovy), SonarQube, Sensu, Ant, Maven, PostgreSQL, AWS, Docker, Kubernetes, JIRA, Shell Scripts, Chef, Python, Ruby, Jenkins, Groovy, Octopus, WebLogic, Tomcat, WebSphere, Golang, Black Duck, Twistlock, Fortify, Spark, Terraform, Cloudant, Redis, RabbitMQ, OpenShift.
