
Cloud DevOps Engineer Resume


SUMMARY

  • AWS Certified DevOps Engineer - Professional with more than 13 years of experience, focused on cloud computing services in the healthcare vertical. Hands-on experience building, configuring, monitoring and scaling distributed applications in AWS, along with setting up and configuring a variety of continuous integration and delivery systems and code quality automation tools.
  • Experience in Infrastructure automation, code migration, configuration management on Linux and Windows based environments.
  • Involved in and responsible for the support, maintenance, setup and development of CI/CD pipelines, build and release processes, and Linux and Windows server environments.
  • Contributing to open source projects in the Baltimore community. Worked on provisioning AWS resources using Terraform for a Node.js application.
  • Experience using ECR, ECS, VPC, Auto Scaling, Load Balancing, IAM, Security Groups, AWS CLI, CloudWatch, CodePipeline, CodeBuild, CodeDeploy, CloudFormation and Terraform
  • Experience in scripting languages such as PowerShell, Bash and Python
  • Experience in CI/CD technologies for both cloud-native and non-cloud applications, including Git, Jenkins, CircleCI and Nexus
  • Work with development and test teams to understand and complete tasks as described in the ServiceNow ticketing system.
  • Involved in all phases of DevOps implementation for large programs, identifying, tracking and resolving issues in the build and deployment process
  • Manage all aspects of testing and verification ensuring all tasks are performed for all interfaces of a DevOps solution
  • Hands-on experience in installing, configuring and using Apache Hadoop ecosystem components like HDFS, Hadoop MapReduce, Sqoop, Hive and Pig.
  • Executed software projects for Healthcare, Auto Insurance and Aviation industries.
  • Able to assess business rules, collaborate with stakeholders and perform source to target data mapping, design and review.
  • Experience in code migration from Oracle to Hive scripts, data migration, data preprocessing, validation and data analysis in HDFS.
  • Expertise in writing Hadoop jobs for analyzing data using Hive and Pig.
  • Experience in importing and exporting data using Sqoop between HDFS and relational database systems (RDBMS).
  • Strong Experience and expertise in working with ETL.
  • Experience in development, enhancement and team leadership. Handled multiple roles: Hadoop Developer, DB2/Mainframe Programmer and Subject Matter Expert.
  • Dimensional data modeling experience using ERwin (4.5/4.0) and Oracle Designer, including star schema/snowflake modeling and fact and dimension tables.
  • Experience in developing Test Strategies, Test Plans and Test Cases and expertise in Debugging of existing ETL processes.
  • Involved in the complete life cycle of several data warehouse implementations, including functional/technical design, development and coding, testing (unit testing, system integration testing and user acceptance testing), implementation and production support.
  • Efficient team player with excellent communication, team-building and mentoring skills and good interaction with users.

TECHNICAL SKILLS

  • EC2, EBS, ECR, ECS, EKS, Lambda, S3, S3 Glacier
  • RDS, DynamoDB, Amazon Redshift
  • AWS Migration Hub
  • VPC, CloudFront, Route 53
  • CodeCommit, CodeBuild, CodeDeploy, CodePipeline
  • CloudWatch, AWS Auto Scaling, CloudFormation
  • IAM, Cognito
  • Git, Jenkins, Groovy, Maven, Terraform, Docker, Kubernetes
  • Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, Flume, Oozie with Hue, and YARN
  • QlikView 11
  • Blockchain (Solidity), Python, PL/SQL, IBM Mainframes
  • Apache Oozie, Dollar Universe, CA 7, AutoSys, Informatica Scheduling
  • TOAD, PL/SQL Developer, Eclipse, SVN, Rational ClearCase

PROFESSIONAL EXPERIENCE

Confidential

Cloud DevOps Engineer

Responsibilities:

  • Build and maintain CI pipelines using build and release orchestration tools (Jenkins, Docker, Artifactory, AWS, etc.)
  • Define the continuous delivery strategy for deploying applications to AWS ECS
  • Collaborate with engineering teams to devise code branching strategies and application deployment and rollback strategies, and implement the development workflow in the CI pipeline
  • Troubleshoot build and deployment related issues
  • Manage environments to ensure all testing and development needs are covered.
  • Communicating issues to the development team.
  • Working with developers and QA to ensure builds are stable upon release.
  • Utilize AWS VPC, IAM, S3, EBS, EC2, EC2 auto scaling, ELB, Code Pipeline, Code Build, Code Deploy, ECR, ECS, DynamoDB, Glacier, Terraform, SNS, SQS, Lambda, Jenkins and Nexus
  • Utilized scripting languages including Python and shell scripting.
  • Experienced in DEV, TEST, QA, performance and UAT environments
  • Setting up CloudWatch alerts and alarms for EC2 instance state changes and using them with Auto Scaling to provide elasticity (see the CloudWatch alarm sketch after this list)
  • Implemented S3 versioning and lifecycle policies to archive files in Glacier (see the S3 lifecycle sketch after this list).
  • Created pipelines in Jenkins as part of continuous integration and managed plugins
  • Adopted Terraform to spin up the servers in AWS as per environment requirement.
  • Provisioned highly available EC2 instances using Terraform and wrote new plug-ins to support new functionality in Terraform.
  • Created and managed IAM users and groups, defined roles, policies and identity providers, and used them to restrict access to specific S3 buckets (see the IAM policy sketch after this list).
  • Experienced working on Development, Test, QA, Performance, UAT environment builds & deployments and knowledge on production deployments.
  • Administrated GIT Source code tools and ensured the reliability of the application as well as designed the Branching strategies for GIT.
  • Perform daily operational responsibilities, including incidents, problems and customer requests, in a timely manner.
  • Developing the deployment & delivery pipelines in Jenkins
  • Integrating pipelines with various application-specific testing tools (Selenium, Cucumber and SonarQube) for code quality during continuous delivery.
  • Engaging with different application teams to understand their current CI/CD process in production and devise deployment strategies to successfully migrate their applications.
  • Implementing CI/CD strategies to advance DevOps methodologies in the cloud.
  • Building and maintaining the CI/CD toolkit (SVN, Maven, Jenkins, SonarQube, Nexus, JFrog, Terraform) for monitoring, troubleshooting and resolving issues in dev.
  • Utilize Docker, Kubernetes, AWS EC2 and other technologies to implement Continuous Integration and Continuous Deployment (CI/CD) Agile DevOps solutions, converting existing AWS infrastructure to a serverless architecture (AWS Lambda) deployed via Terraform and AWS CloudFormation (see the Lambda handler sketch after this list).
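
The sketch below illustrates the kind of CloudWatch alarm wiring described in the Auto Scaling bullet above, using boto3; the alarm name, Auto Scaling group and scaling-policy ARN are placeholders rather than values from the actual project.

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Alarm on sustained high CPU across an Auto Scaling group and trigger a
    # scale-out policy; the group name and policy ARN below are placeholders.
    cloudwatch.put_metric_alarm(
        AlarmName="app-asg-high-cpu",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "AutoScalingGroupName", "Value": "app-asg"}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=75.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["<scale-out-policy-arn>"],  # placeholder ARN
    )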
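
A minimal boto3 sketch of the S3 versioning and Glacier lifecycle setup mentioned above; the bucket name, prefix and retention periods are illustrative assumptions, not the project's actual values.

    import boto3

    s3 = boto3.client("s3")
    bucket = "example-artifact-bucket"  # placeholder bucket name

    # Turn on object versioning for the bucket.
    s3.put_bucket_versioning(
        Bucket=bucket,
        VersioningConfiguration={"Status": "Enabled"},
    )

    # Transition older objects to Glacier, then expire them after a year.
    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket,
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "archive-then-expire",
                    "Filter": {"Prefix": "logs/"},
                    "Status": "Enabled",
                    "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                    "Expiration": {"Days": 365},
                }
            ]
        },
    )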
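
The IAM bucket restriction described above can be expressed as an inline group policy; the group, policy and bucket names below are hypothetical.

    import json
    import boto3

    iam = boto3.client("iam")

    # Allow a single group read/write access to one bucket only.
    policy_document = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:ListBucket", "s3:GetObject", "s3:PutObject"],
                "Resource": [
                    "arn:aws:s3:::example-artifact-bucket",
                    "arn:aws:s3:::example-artifact-bucket/*",
                ],
            }
        ],
    }

    iam.put_group_policy(
        GroupName="deploy-engineers",               # placeholder group
        PolicyName="restrict-artifact-bucket-access",
        PolicyDocument=json.dumps(policy_document),
    )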
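
As an illustration of the serverless conversion noted above, a Lambda function replacing an always-on service can be as small as the handler below; the event shape and response are assumptions, and deployment itself was handled through Terraform/CloudFormation as described in the bullet.

    import json

    def handler(event, context):
        """Minimal Lambda entry point standing in for a formerly
        EC2-hosted service; the event fields here are hypothetical."""
        body = json.loads(event.get("body") or "{}")
        # Application logic would run here, triggered per event
        # instead of on an always-running instance.
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"received": body}),
        }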

Confidential

Hadoop Developer

Responsibilities:

  • Provide the time estimates for development effort on Hadoop.
  • Managing the code in version control system (SVN/Git).
  • Development of MapReduce programs for parsing and loading data into HDFS (a streaming-style mapper sketch follows this list).
  • Develop Sqoop scripts to import data from Oracle and create external and managed tables for further processing.
  • Creating Hive tables and loading and analyzing data using Hive queries.
  • Develop automated scripts for all jobs, starting with pulling data from mainframes and electronic transfer folders.
  • Develop and design applications and automation using the Dollar Universe scheduling tool, Oozie workflows and shell scripting with error handling and email notifications.
  • Monitoring the workflows in Jenkins pipelines.
  • Develop Sqoop scripts to export data into Oracle tables for QlikView dashboards.
  • Project implemented using Agile methodology.
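
The mapper below sketches the parsing step from the MapReduce bullet above in Hadoop Streaming form with Python; the pipe-delimited layout and field positions are assumptions for illustration, and the production jobs may have been implemented differently. It would be submitted with the standard hadoop-streaming jar alongside a matching reducer.

    #!/usr/bin/env python3
    """Hadoop Streaming mapper sketch: parse pipe-delimited records from
    stdin and emit key/value pairs for a downstream aggregation reducer.
    The field layout (member id in column 0, amount in column 2) is a
    hypothetical example, not the project's actual record format."""
    import sys

    for line in sys.stdin:
        fields = line.rstrip("\n").split("|")
        if len(fields) < 3:
            continue  # skip malformed records
        member_id, amount = fields[0], fields[2]
        print(f"{member_id}\t{amount}")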

Tools/Environment: GitHub, Git, Hadoop MapReduce, Hive, Sqoop, Pig, Oozie, Dollar Universe scheduler, Informatica PowerCenter 9.x, QlikView, Oracle SQL, Unix, TortoiseSVN, VersionOne requirements management tool
