
DevOps/AWS Engineer Resume


San Jose, CA

SUMMARY

  • 8+ years of IT industry experience in Linux administration, software configuration management, change management, build automation, and release management, with Azure/AWS/DevOps experience in large and small software development organizations.
  • Experience with build automation and continuous integration using tools such as Ant, Jenkins, and Maven.
  • Experience in Big Data processing with Apache Hadoop and the broader Hadoop stack.
  • Experience in using Configuration Management tools like Puppet, Chef, Ansible.
  • Developed Puppet modules to automate application installation and configuration management.
  • Expertise in all aspects of Chef: server, workstations, nodes, clients, and components such as Ohai, push jobs, and Supermarket.
  • Extensively worked on Vagrant- and Docker-based container deployments to create environments for dev teams and to containerize environment delivery for releases.
  • Experienced in provisioning IaaS and PaaS cloud services on AWS, Azure, and Google Cloud Platform, with an understanding of software configuration management (SCM) principles in Agile, Scrum, and Waterfall methodologies.
  • Managed Windows Azure infrastructure for customers according to their requirements. Worked with Microsoft Azure Storage: storage accounts, blob storage, and managed and unmanaged storage. Responsible for web application deployments to cloud services (web and worker roles) on Azure using Visual Studio and PowerShell.
  • Experience in working on Docker Hub, creating Docker images, and handling multiple images primarily for middleware installations and domain configuration.
  • Experience migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Azure Databricks, and Azure SQL Data Warehouse; controlling and granting database access; and migrating on-premises databases to Azure Data Lake Store using Azure Data Factory.
  • Hands-on experience with the GNU toolchain (gcc, gdb, make, arm-gcc).
  • Enterprise-level experience working with large teams on the architecture of highly scalable, secure, distributed applications, aligned with company standards, processes, methodologies, and best practices.
  • Responsible for overall technical quality, migration, deployment standards, and application architecture.
  • Planned and implemented data and storage management solutions in Azure (Azure SQL, Azure Files, Queue storage, Blob storage). Implemented PowerShell scripts for runbooks.
  • Knowledge of Docker components such as Docker Hub, Machine, Compose, and Docker Registry.
  • Experience with VSTS, TFS, gated check-ins, build definitions, release management, PowerShell, and Power BI.
  • Maintained Jenkins masters with over 80 jobs for more than 10 applications; supported several quarterly and project releases in parallel.
  • Experience in Performance Monitoring & Analysis using AppDynamics, Splunk, Dynatrace, Windows PerfMon, SQL Profiler, VMStat, IOStat, etc.
  • In-depth knowledge of data sharing in Snowflake.
  • Experience writing and maintaining Big Data Pipelines using Hadoop, Hive, Kafka etc.
  • Efficient at building and configuring cloud infrastructure using Terraform as IaC, with the ability to modify Terraform modules to meet development project requirements.
  • Delivered desktop-based UI projects.
  • Experience with Snowflake Multi-Cluster Warehouses.
  • Worked with Apache Kafka for high throughput in both publishing and subscribing, with disk structures that provide constant performance even with many terabytes of stored messages.
  • Strong knowledge of TCP/IP, UDP, DNS, load balancing, firewalls, and enterprise monitoring tools such as Splunk.
  • Created Web API methods for three adapters to pull data from various systems, including databases, BizTalk, and SAP.
  • Proficient in Python, shell scripting, SQL, and build utilities such as OpenMake, Ant, and CruiseControl.
  • Administered and configured TFS within multi-platform environments.
  • Management and administration of AWS services: CLI, EC2, VPC, S3, ELB, Glacier, Route 53, CloudTrail, IAM, and Trusted Advisor.
  • Experience monitoring and capturing browser-side performance metrics using browser DevTools, AppDynamics EUM (End User Monitoring), and Dynatrace UEM (User Experience Management), with good knowledge of PageSpeed, sitespeed.io, and WebPageTest for evaluating client-side performance from real browsers.
  • Experienced with GitLab CI and Jenkins for CI and end-to-end automation of all builds and CD.
  • Excellent SQL query development and optimization skills (SQL, PL/SQL).
  • Proficiently administer and implement Professional Services technologies: Salesforce.com, FinancialForce.com, Marketo, Jitterbit, cloud computing, PaaS, and SaaS.
  • Expertise in using Nexus and Artifactory repository servers for Maven and Gradle builds.
  • Ability to create deployment scripts, build scripts, and automated solutions using shell scripting.
  • Experience in using monitoring tools like Icinga, Nagios.
  • In Depth experience in Oracle/SQL Server: DDL / DML statements, complex SQL queries, Stored Procedures, Functions and Triggers.
  • Good knowledge of and experience with Elasticsearch, Kibana, Fluentd, CloudWatch, Nagios, Splunk, Prometheus, and Grafana for logging and monitoring.
  • Experienced in branching, tagging, and maintaining versions across environments using software configuration management tools such as Git, GitHub, Subversion (SVN), and Team Foundation Server (TFS) on Linux and Windows platforms.
  • Experienced in migrating SVN repositories to Git.
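As an illustration of the Terraform-as-IaC scripting described above, here is a minimal Python sketch that renders a terraform.tfvars file from a dictionary. All variable names and values are hypothetical, not drawn from any actual project:

```python
def render_tfvars(variables: dict) -> str:
    """Render a dict as HCL terraform.tfvars content (strings, numbers, bools)."""
    lines = []
    for name, value in variables.items():
        # bool must be checked before int: bool is a subclass of int in Python
        if isinstance(value, bool):
            rendered = "true" if value else "false"
        elif isinstance(value, (int, float)):
            rendered = str(value)
        else:
            rendered = f'"{value}"'
        lines.append(f"{name} = {rendered}")
    return "\n".join(lines) + "\n"

# Hypothetical variables for a sketch module
print(render_tfvars({"region": "us-west-2", "instance_count": 3, "enable_ssm": True}))
```

The output can be written to a `.tfvars` file and consumed by `terraform apply -var-file=...` as part of a pipeline step.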

TECHNICAL SKILLS

DevOps Tools: Nexus Repository, SonarQube, Jenkins, Puppet, Chef, Ansible, Docker, Nagios, PING, GIT.

Infrastructure as a Service: AWS, Azure, Azure Databricks, Azure SQL Database, Azure SQL Data Warehouse, OpenStack (basic understanding).

Virtualization Platforms: Virtual Box, VMware, Vagrant.

Operating Systems: UNIX, Linux, Windows, FreeBSD.

Scripting Languages: Bash, Perl, Python, Ruby.

Version Control Software: Subversion, GIT, Perforce.

CD Tools: CruiseControl, UrbanCode Deploy, UrbanCode Release/Build.

Logging: Sumo Logic, Splunk, Salesforce.

Monitoring 24/7: Nagios, PagerDuty.

PROFESSIONAL EXPERIENCE

Confidential, San Jose, CA

DevOps/AWS Engineer

Responsibilities:

  • Leveraged various AWS services such as EC2, S3, IAM, EBS, Elastic Load Balancer (ELB), Security Groups, Auto Scaling, and RDS in CloudFormation JSON templates.
  • Defined AWS Lambda functions for making changes to Amazon S3 buckets and updating Amazon DynamoDB tables.
  • Created snapshots and Amazon Machine Images (AMIs) of the instances for backup, and created Identity and Access Management (IAM) policies for delegated administration within AWS.
  • Experience with cloud infrastructure management and implementation; working experience with various Azure services such as Compute (web roles, worker roles), Azure Websites, Caching, SQL Azure, NoSQL, Storage, Network services, Azure Active Directory, Scheduling, Auto Scaling, and PowerShell automation.
  • Extensive working experience with different SDLC methodologies such as Agile and Waterfall, with the ability to be creative and take self-initiative to execute and manage multiple projects in parallel during time-critical situations.
  • Deployed Azure IaaS virtual machines (VMs) and PaaS role instances (cloud services) into secure VNets and subnets; designed VNets and subscriptions to conform to Azure network limits.
  • Acted as build and release engineer, deployed the services by VSTS (Azure DevOps) pipeline. Created and Maintained pipelines to manage the IAC for all the applications.
  • Migrated 9 microservices from Skava to Google Cloud Platform, with one more major release of 4 additional microservices planned.
  • Experienced in processing Big data on the Apache Hadoop framework using MapReduce programs.
  • Experience in working with Windows, UNIX/LINUX platform with different technologies such as Big Data, SQL, XML, HTML, Core Java, Shell Scripting etc.
  • Designed, set up, maintained, and administered Azure SQL Database, Azure Analysis Services, Azure SQL Data Warehouse, and Azure Data Factory.
  • Maintained user accounts, IAM roles, Route 53 (CNAME), VPC, RDB, MongoDB, SQS, and SNS services in the AWS cloud.
  • Integrated Kafka with Flume in a sandbox environment using a Kafka source and Kafka sink. Implemented various resources in Azure using the Azure portal and PowerShell on Azure Resource Manager deployment models. Experience deploying infrastructure-as-code applications using ARM templates (JSON).
  • Experience developing low latency SQL queries and stored procedures.
  • Monitoring various Performance metrics using AppDynamics, Splunk, Windows Perfmon, VMStat, IOStat, etc.
  • Very strong in C# as a development language, .NET Framework concepts and implementation, and OOP concepts.
  • Strong ASP.NET MVC design and development experience, including unit testing.
  • Created Python scripts to fully automate AWS services including ELB, CloudFront distributions, EC2, Security Groups, and S3. These scripts create stacks and single servers, and join web servers to stacks.
  • Enabled the Amazon IAM service to grant users permissions to resources; managed user roles and permissions with the help of AWS.
  • Experience in building and deploying solutions to big data problems with various technologies.
  • Developed complex SQL queries and procedures for routine and ad-hoc reporting.
  • Coordinating with DevOps/TechOps team in instrumenting various Dashboards & Reports for Performance statistics in AppDynamics & Splunk and diagnosing the identified Performance issues using AppDynamics and Splunk.
  • Experience with Big Data tools and technologies including working in a Production environment of a Hadoop Project
  • Wrote Python scripts to manage AWS resources via API calls using the Boto SDK; also worked with the AWS CLI.
  • Used AWS Route 53 to route traffic between different availability zones. Deployed and supported Memcached/AWS ElastiCache, then configured Elastic Load Balancing (ELB) to route traffic between zones.
  • Used IAM to create new accounts, roles, groups, and policies, and developed critical modules such as generating Amazon Resource Names and integration points with DynamoDB and RDS.
  • Wrote Chef cookbooks to install and configure IIS 7 and Nginx, and maintained legacy Bash scripts that configure environments, then converted them to Ruby scripts.
  • Involved in migrating objects from Teradata to Snowflake.
  • Set up a CI/CD pipeline integrating various tools with CloudBees Jenkins to build and run Terraform script templates that create infrastructure in Azure.
  • Heavily involved in testing Snowflake to understand the best possible way to use the cloud resources.
  • Designed and implemented AWS cloud infrastructure by creating templates for the AWS platform; also used Terraform to deploy the infrastructure needed to create development, test, and production environments.
  • Worked on PowerShell scripts to automate the Azure cloud system, including creation of resource groups, web applications, Azure Storage blobs and tables, and firewall rules; used Python scripts to automate day-to-day administrative tasks.
  • Deployed a Windows Kubernetes cluster with Azure Container Service (ACS) from the Azure CLI, and utilized Kubernetes and Docker as the runtime environment of the CI/CD system to build, test, and deploy.
  • Worked with GitHub to store code and integrated it with Ansible Tower to deploy playbooks.
  • Automated various infrastructure activities such as continuous deployment, application server setup, and stack monitoring using Ansible playbooks, and integrated Ansible with Jenkins.
  • Wrote CI/CD pipelines in Groovy scripts to enable end-to-end build and deployment setup using Jenkins.
  • Wrote Ansible playbooks using Python SSH as a wrapper to manage configurations of servers and nodes, and tested playbooks on AWS instances using Python.
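A minimal sketch of the kind of CloudFormation JSON templating described above, in Python. The resource names, AMI ID, and port are illustrative placeholders, not values from the actual project:

```python
import json

def ec2_template(instance_type: str = "t3.micro", ami_id: str = "ami-12345678") -> dict:
    """Build a minimal CloudFormation template: one EC2 instance plus a security group."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "WebSecurityGroup": {
                "Type": "AWS::EC2::SecurityGroup",
                "Properties": {
                    "GroupDescription": "Allow HTTP",
                    "SecurityGroupIngress": [
                        {"IpProtocol": "tcp", "FromPort": 80,
                         "ToPort": 80, "CidrIp": "0.0.0.0/0"}
                    ],
                },
            },
            "WebServer": {
                "Type": "AWS::EC2::Instance",
                "Properties": {
                    "InstanceType": instance_type,
                    "ImageId": ami_id,  # placeholder AMI ID
                    "SecurityGroupIds": [{"Ref": "WebSecurityGroup"}],
                },
            },
        },
    }

print(json.dumps(ec2_template(), indent=2))
```

The rendered JSON can then be passed to the CloudFormation create-stack API, which is the pattern the bullets above describe.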

Environment: AWS, Azure, S3, EC2, ELB, IAM, RDS, VPC, SES, SNS, EBS, Cloud Trail, Auto Scaling, Chef, Jenkins, Maven, JIRA, Linux, Java, Kubernetes, Terraform, Docker, AppDynamics, Nagios, ELK, SonarQube, Nexus, JaCoCo, JBOSS, Nginx, PowerShell, Bash, Ruby and Python.

Confidential, Chicago, IL

Azure DevOps Engineer

Responsibilities:

  • Worked closely with developers in building Angular application and Troubleshooting UI build issues.
  • Administering and supporting the company's Azure Kubernetes infrastructure, ensuring it is secure, resilient, and performant; responsible for complete DevOps activities and coordinating with the development team.
  • Working as Kubernetes administrator, involved in configuration for web apps, Azure App Services, Azure Application Insights, Azure Application Gateway, Azure DNS, and Azure Traffic Manager.
  • Configured VNet integration, Active Directory, encryption, and security on Azure using ARM templates and PowerShell scripts.
  • Involved in designing, implementing, and modifying Python code.
  • Developed website both frontend and backend modules using Python Django Web Framework.
  • Developed Python and shell scripts for automation of the build and release process.
  • Developed PowerShell scripts and ARM templates to automate the provisioning and deployment process.
  • Strong experience with CI/CD tools, Azure DevOps, IaC pipelines, agents, and build/deployment scripts.
  • Expertise in building DevOps pipelines for custom apps as well as packaged products, CMS apps, and microservices-driven apps. Experience integrating surrounding tools such as testing, monitoring, security testing, and IaC tools.
  • Ability to write SQL queries.
  • Strong automation experience. Good experience with Docker, scripting, package manager tools, code quality and security assessment tools, Azure Monitor, and Application Insights.
  • Good experience with Azure cloud PaaS, IaaS, and SaaS architecture from technical and non-technical perspectives.
  • Worked on the CI/CD setup for all the microservices: Profile, Service Availability, Cart, Pricing and Promotion, Product Configuration, Foundation Framework, and Product Listing.
  • Worked on the microservices for unit testing, integration test cases, status-check foundation framework, status-check TDD contracts, compile, package, verify, quality scan, publish artifact, deploy to development, and start functional testing.
  • Developed a stream filtering system using Spark streaming on top of Apache Kafka.
  • Designed a system using Kafka to auto-scale the backend servers based on event throughput.
  • Recreated existing application logic and functionality in the Azure Data Lake, Data Factory, SQL Database, and SQL Data Warehouse environment.
  • Design and implement database solutions in Azure SQL Data Warehouse, Azure SQL.
  • Performed advanced administrative tasks, from role design to architecture improvements, to work effectively with the Salesforce reporting module.
  • Efficiently handled all Salesforce-related matters, including complex reporting and analytics, Apex coding, and Jitterbit data integrations and SFTP drops to a wide variety of endpoints.
  • Also worked on Apache Hadoop, using Kafka as a messaging system and Spark for processing large data sets.
  • Responsible for performance tuning of PL/SQL packages and SQL queries.
  • Set up the various Jenkins CI/CD pipeline configurations for all the microservices.
  • Worked on the build activities for all the existing microservices.
  • Conducted post-implementation analysis, reviewed lessons learned, and made recommendations for implementing continuous improvement.
  • Aligned Azure and Google Cloud Platform capabilities and services with workload requirements.
  • Set up alerting and monitoring using Stackdriver in GCP.
  • Created automation and deployment templates for relational and NOSQL databases including MongoDB and Redis.
  • Experience migrating infrastructure and applications from on premises to Azure, and from cloud to cloud, such as AWS to Microsoft Azure and GCP.
  • Prepared capacity and architecture plan to create the Azure Cloud environment to host migrated IaaS VMs and PaaS role instances for refactored applications and databases.
  • Responsible for POMs, unit test cases, and the Sonar dashboard for Selenium test cases for all the microservices.
  • Worked on microservice logs and Sonar dashboard monitoring for all the existing microservices.
  • Recommended DevOps tools for continuous integration, deployment, and DevOps enablement.
  • Provided recommendations for building the automated lifecycle for DevOps.
  • Experience managing the project lifecycle end to end.
  • Moved all Kubernetes container logs, application logs, event logs, cluster logs, activity logs, and diagnostic logs into Azure Event Hubs and then into Splunk for monitoring.
  • Created Splunk Search Processing Language (SPL) queries, reports, and dashboards.
  • Improved speed, efficiency, and scalability of the continuous integration environment, automating wherever possible using Python, Ruby, shell, and PowerShell scripts.
  • Implemented Azure Storage and Azure SQL services, and developed an Azure web role.
  • Monitored production servers daily using Grafana and Prometheus integrated with Kubernetes, and reported exceptions to the team during standups.
  • Managed Azure DevOps build and release pipelines; set up new repos and managed permissions for various Git branches. Deployed microservices, including provisioning the Azure environment.
  • Used Make as build orchestration tool, configured targets for the build artifacts.
  • Built Docker images and published artifacts to the AWS cloud platform using a Jenkinsfile.
  • Wrote a Jenkinsfile to standardize multiple stages of the CI pipeline.
  • Has experience working with Source code management tools like Git, Gerrit.
  • Development-level experience in Microsoft Azure, ASP.NET, ASP, C#.NET, Web Services, WCF, ASP.NET Web API, ADO.NET, JavaScript, jQuery, AngularJS, Bootstrap, PowerShell, CSS, HTML, UML, and XML.
  • Worked on implementation of PingFederate for cross-platform authentication for multiple applications.
  • Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics). Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks.
  • Designed new databases and data schemas for the high-profile customer-facing portal, with strong attention to data integrity and query execution efficiency, applying knowledge of MariaDB and Azure Databricks.
  • Responsible for estimating cluster size, monitoring, and troubleshooting of the Spark Databricks cluster.
  • Experience with Azure transformation projects and Azure architecture decision making; architected and implemented ETL and data movement solutions using Azure Data Factory (ADF) and SSIS.
  • Worked on cloud computing using Microsoft Azure with various BI technologies, and explored NoSQL options for the current backend using Azure Cosmos DB (SQL API).
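The Kafka-driven auto-scaling design described above can be sketched as a pure scaling-policy function. The per-replica capacity and replica bounds here are assumptions for illustration; the real system's numbers are not given in the resume:

```python
import math

def desired_replicas(events_per_sec: float,
                     capacity_per_replica: float = 500.0,  # assumed capacity
                     min_replicas: int = 2,
                     max_replicas: int = 20) -> int:
    """Map observed Kafka event throughput to a backend replica count,
    clamped to [min_replicas, max_replicas]."""
    needed = math.ceil(events_per_sec / capacity_per_replica)
    return max(min_replicas, min(max_replicas, needed))

# Low traffic stays at the floor; a burst scales out proportionally.
print(desired_replicas(100))    # floor of 2 replicas
print(desired_replicas(2600))   # ceil(2600 / 500) = 6 replicas
```

In practice a controller would read the throughput from consumer-lag or broker metrics and apply the result to the backend's replica count (e.g., a Kubernetes scale call), with hysteresis to avoid flapping.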

Environment: AWS, Azure, Google Cloud (GCP), Angular UI, Microservices, Azure Databricks, Git, Python, Terraform, Gerrit, Jenkins, Make, Bazel, PingFederate, Docker, VM, Linux, Windows.

Confidential, Allentown, PA

DevOps Developer / Engineer

Responsibilities:

  • Interacted with client teams to understand client deployment requests.
  • Coordinate with Development, Database Administration, QA, and IT Operations teams to ensure there are no resource conflicts.
  • Worked closely with project management to discuss code/configuration release scope and how to confirm a successful release.
  • Created multiple Python, Bash, shell, and Ruby scripts for various application-level tasks.
  • Built, managed, and continuously improved the build infrastructure for global software development engineering teams, including implementation of build scripts, continuous integration infrastructure, and deployment tools.
  • Managed the code migration from TFS, CVS, StarTeam, and Subversion repositories.
  • Set up a CI/CD pipeline integrating various tools with CloudBees Jenkins to build and run Terraform script templates that create infrastructure in Azure.
  • Administered the TFS and VSS Repositories for the Code check in and checkout for different Branches.
  • Provisioned EC2 instances in AWS using Terraform scripts written from scratch to pull images from Docker, and performed AWS S3 bucket creation, IAM role-based policies, and customization of the JSON template.
  • Implemented continuous integration using Jenkins.
  • Automated setup of server infrastructure for the DevOps services using Ansible, shell, and Python scripts.
  • Installed, configured, and managed monitoring tools such as Splunk, Nagios, and Graphite for resource monitoring, network monitoring, and log trace monitoring.
  • Used Jira and Confluence as project management tools.
  • Configured AWS Multi-Factor Authentication in IAM to implement two-step authentication of user access using Google Authenticator and AWS Virtual MFA.
  • Monitored and tracked Splunk performance problems and administration, and opened tickets with Splunk.
  • Moved all Kubernetes container logs, application logs, event logs, cluster logs, activity logs, and diagnostic logs into Azure Event Hubs and then into Splunk for monitoring.
  • Successfully collaborated with cross-functional teams in the design and development of software features for enterprise satellite networks using C/C++, leading to a senior role in the organization.
  • Created repositories according to the structure required with branches, tags and trunks.
  • Attended sprint planning sessions and daily sprint stand-up-meetings.
  • Scheduled different Snowflake jobs using NiFi.
  • Configured application servers (Apache Tomcat) to deploy the code.
  • Set up Splunk monitoring on Linux and Windows systems.
  • Installed, configured, and set up the Docker container environment.
  • Created a Docker image for a complete stack, created a mechanism via a Git workflow to push code into the container, and set up a reverse proxy to access it.
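A small sketch of the Python-side automation style described above: a helper that renders an INI-style Ansible inventory from a group-to-hosts mapping. Group and host names are hypothetical:

```python
def render_inventory(groups: dict) -> str:
    """Render {group: [hosts]} as an INI-style Ansible inventory string."""
    sections = []
    for group, hosts in groups.items():
        # each section: a [group] header followed by one host per line
        sections.append(f"[{group}]\n" + "\n".join(hosts))
    return "\n\n".join(sections) + "\n"

# Hypothetical hosts; in practice these might come from a cloud API query.
print(render_inventory({
    "web": ["web1.example.com", "web2.example.com"],
    "db": ["db1.example.com"],
}))
```

Writing the result to a file and passing it via `ansible-playbook -i inventory.ini` is one common way such scripts feed dynamic host lists into playbook runs.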

Environment: Chef, Apache Tomcat, GIT, Python, Ruby, Bamboo, Perl, Shell, Maven, Jenkins, JIRA, Kubernetes, Docker.

Confidential

Unix/Linux Administrator

Responsibilities:

  • Manage daily builds and deployments to multiple Dev, QA, SIT and PROD environments.
  • Coordinated database drops and debugged build and deployment issues.
  • Implemented software engineering best practices around software release tools and Release management.
  • Installed and configured SSH servers on Red Hat/CentOS Linux environments. Managed VMs for Solaris x86 and Linux on VMware ESX 3.5 and administered them with the VI Client.
  • Involved in implementing and administering enterprise-level data backups and data recovery.
  • Maintained maximum uptime and maximum performance capacity for Enterprise Production, QA, and UAT/Staging.
  • Configured network services such as DNS/NFS/NIS/NTP for UNIX/Linux servers. Performed user administration of UNIX/Linux accounts using SeOS.
  • Automated tasks using Korn, Bash Shell scripts.
  • Performed patch management using native commands on Linux, following change control procedures.
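The automation above was written in Korn and Bash; as a rough Python equivalent of one such routine task, here is a sketch that flags filesystems over a usage threshold from `df -P` output. The threshold and sample data are illustrative:

```python
def overfull_filesystems(df_output: str, threshold: int = 90) -> list:
    """Parse `df -P` output; return (mount_point, use_pct) pairs at or above threshold."""
    alerts = []
    for line in df_output.strip().splitlines()[1:]:  # skip the header row
        fields = line.split()
        use_pct = int(fields[4].rstrip("%"))  # "Capacity" column, e.g. "95%"
        if use_pct >= threshold:
            alerts.append((fields[5], use_pct))
    return alerts

# Sample `df -P` output for demonstration only
sample = """Filesystem 1024-blocks Used Available Capacity Mounted on
/dev/sda1 10000 9500 500 95% /
/dev/sdb1 20000 4000 16000 20% /data"""

print(overfull_filesystems(sample))
```

In a real cron job the input would come from `subprocess.run(["df", "-P"], capture_output=True)` and the alerts would be mailed or pushed to a monitoring system such as Nagios.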

Environment: RedHat Linux 3/4, VERITAS Cluster Server 5.0, VERITAS NetBackup 6.5, VMware ESX 3.5, Virtual Center.
