
AWS/DevOps Engineer Resume


Temple Terrace, FL

SUMMARY

  • Over seven years of experience in the IT industry, spanning delivery management, design, development, release and deployment, and cloud implementation.
  • Good experience with DevOps tools such as Chef, Vagrant, VirtualBox, Tomcat, WebLogic, WebSphere, Bitbucket, GitHub, Puppet, Ansible, Jenkins, Maven, Ant, Git, and Docker.
  • Experience in infrastructure development and operations on the AWS cloud platform and its dimensions of scalability: EC2, EBS, S3, VPC, RDS, SES, ELB, Auto Scaling, CloudFront, CloudFormation, ElastiCache, CloudWatch, SNS, and Route 53.
  • Automated, configured, and deployed instances on AWS and Rackspace.
  • Implemented Puppet modules to automate the installation and configuration of a broad range of services.
  • Ability to identify and gather requirements to define a solution to be built and operated on AWS.
  • Extensively worked on Jenkins for continuous integration and for end-to-end automation of all builds and deployments; set up continuous integration for major releases in Jenkins.
  • Knowledge and experience in creating Jenkins Pipeline.
  • Experienced with build automation tools like Ant and Maven, and with the Docker container service.
  • Built Docker applications by creating Docker images from Dockerfiles.
  • Experienced in branching, merging, tagging and maintaining versions across environments using SCM tools like Git on Linux platforms.
  • Experienced in setting up AWS relational databases such as MySQL and MS SQL, and the NoSQL database DynamoDB.
  • Integration, deployment and automation of application servers Tomcat, Web Logic across Linux platforms.
  • 3 years' experience with firewalls, IPS, VPN, and other edge and network security components: Vyatta GW, Fortigate, Checkpoint, Cisco ASA, and Juniper JunOS firewalls.
  • Interpreted business goals, communicated them to engineering and operations teams, and helped identify opportunities for the assigned Cloud Security Product Portfolio to achieve results.
  • Set up databases in AWS using RDS, storage using S3 buckets, and configured instance backups to S3.
  • Experience in using Bug tracking tools like JIRA and HP Quality center.
  • Extensively experienced in Bash, Perl, Python, Ruby scripting on Linux.
  • Used Python fabric for AWS instance provisioning.
  • Experience in setting up baselines, branching, merging, and automation processes using Shell, Perl, Ruby, Python, and Bash scripts.
  • Designed distributed processing architecture to monitor and maintain continuous security & compliance by using AWS Services and Python.
  • Strong experience with cloud security strategy, cloud provider ecosystems (Amazon AWS) & migrating Enterprise from traditional data center Infrastructure, Application and Data designs to hybrid or fully-cloud enabled practices.
  • 2 years of experience designing and implementing network security solutions, including firewalls, intrusion detection, encryption, monitoring, vulnerability scanning, and authentication.
  • Automation, Integration, Deployment and Operations of Security Solutions
  • Responsible for strategic financial planning within assigned Security Product Portfolio, including budgeting, forecasting, and financial planning of new and existing Security Products & Services
  • Own all aspects of cloud security product definition including vendor integration, platform integration and monitoring for cloud platforms including but not limited to AWS.
  • Develop, maintain, and communicate the vision of the Cloud Security Product Portfolio.
  • Expertise in Agile testing methodologies and the Software Test Life Cycle (STLC).
  • Code development using Eclipse, HTML, Java, JSP, Swing, Servlets, and SQL.
  • Strong in building Java applications and writing shell scripts on UNIX.
  • Experience in deploying system stacks for different environments such as Dev, UAT, SIT, and Prod on both on-premises and cloud infrastructure.
  • Strong experience in working in Linux and Windows environments.
  • Worked with bug-tracking tools such as Jira (bug tracking and monitoring) and with web-based tools like Confluence (collecting thoughts and knowledge), Crucible (peer review of the codebase), HipChat (secure team chat), and Fisheye (source browsing) for issue tracking and project management.
  • Good knowledge of the ITIL process; coordinated releases across projects.
  • Good analytical, problem-solving, and communication skills, with the ability to work independently with little or no supervision or as a member of a team.
  • Excellent written and verbal communication skills, strong organizational skills, and a hard-working team player.
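The Python-based AWS instance provisioning mentioned above (the Fabric work in particular) might be sketched with boto3 along these lines; the AMI ID, key pair, and security-group ID are placeholders, not values from any real project:

```python
# Sketch of EC2 provisioning in Python. All identifiers below (AMI ID,
# key name, security group) are illustrative placeholders.

def build_run_params(ami_id, instance_type, key_name, sg_ids, name):
    """Compose keyword arguments for ec2.run_instances()."""
    return {
        "ImageId": ami_id,
        "InstanceType": instance_type,
        "KeyName": key_name,
        "SecurityGroupIds": sg_ids,
        "MinCount": 1,
        "MaxCount": 1,
        "TagSpecifications": [{
            "ResourceType": "instance",
            "Tags": [{"Key": "Name", "Value": name}],
        }],
    }

def provision(params, region="us-east-1"):
    """Launch the instance; only this function touches AWS."""
    import boto3  # imported here so the builder stays testable offline
    ec2 = boto3.client("ec2", region_name=region)
    resp = ec2.run_instances(**params)
    return resp["Instances"][0]["InstanceId"]

# Build the request without calling AWS:
params = build_run_params("ami-0abcdef1234567890", "t3.micro",
                          "devops-key", ["sg-0123456789abcdef0"], "web-01")
```

Separating the parameter builder from the API call keeps the request logic testable without AWS credentials.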

TECHNICAL SKILLS

Operating Systems: RHEL/CentOS 5.x/6.x/7, Ubuntu/Debian/Fedora, Sun Solaris, Windows Server.

Cloud Technologies: Amazon Web Services (IAM, S3, EC2, VPC, ELB, Route 53, RDS, Auto Scaling, CloudFront), Jenkins, Git, Chef, Consul, Docker, and Rackspace

DevOps Tools: UrbanCode Deploy, Jenkins (CI), Puppet, Chef, Ansible, AWS.

Build Tools: Ant, Maven, Gradle, NAnt, MSBuild, Control-M, Korn/Bash shell, DataPower.

Languages/Scripts: Java/J2EE, C, C++, SQL, JavaScript; scripting in Shell, Bash, Perl, Ruby, and Python.

Databases: MySQL, MongoDB, Cassandra, PostgreSQL, SQL Server, Aurora.

Web/App Server: Apache, IIS, HIS, Tomcat, WebSphere Application Server, JBoss.

Bug Tracking Tools: JIRA, Rally, Remedy, IBM ClearQuest, Bugzilla, HP Quality Center.

Versioning Tools: RTC, Git, TFS, ClearCase, Perforce, CVS, VSS.

CI Tools: Hudson, Jenkins, Bamboo, CruiseControl.

DevOps/Build & Release Engineering: Jenkins, Perforce, Docker, uDeploy, AWS, Chef, Puppet, Ant, Vagrant, Atlassian Jira, GitHub, Bitbucket, TeamCity, Ansible, OpenStack, SaltStack, Splunk, Zabbix, Nexus

PROFESSIONAL EXPERIENCE

AWS/DevOps Engineer

Confidential, Temple Terrace, FL

Responsibilities:

  • Leveraged AWS cloud services such as EC2, auto-scaling and VPC to build secure, highly scalable and flexible systems that handled expected and unexpected load bursts.
  • Built and configured a virtual data center in the Amazon Web Services cloud to support Enterprise Data Warehouse hosting, including Virtual Private Cloud (VPC), public and private subnets, security groups, route tables, and Elastic Load Balancers.
  • Continuously managed and improved the build infrastructure for global software development engineering teams including implementation of build scripts, continuous integration infrastructure and deployment tools.
  • Used AWS Elastic Beanstalk for deploying and scaling web applications and services developed with Java, PHP, Node.js, Python, Ruby, and Docker on familiar servers such as Apache and IIS.
  • Strong development and design experience with various Java and JEE frameworks like Spring, Spring Boot, Groovy, Grails, JAX-RS, JAX-WS, Apache CXF, Jersey, Apache Axis, JPA, Hibernate, MyBatis, Struts, JSF, EJB 3.1, EJB 2.1, and JMS.
  • Designed AWS Cloud Formation templates to create custom sized VPC, subnets, NAT to ensure successful deployment of Web applications and database templates.
  • Built S3 buckets, managed S3 bucket policies, and used S3 and Glacier for storage and backup on AWS.
  • Developed and managed cloud VMs with the AWS EC2 command-line clients and management console. Implemented DNS service through Route 53 on ELBs to achieve secure connections via HTTPS.
  • Good knowledge in creating and maintaining various DevOps tools for the team, such as provisioning scripts, deployment tools, and staged virtual environments using Docker and Vagrant. Great exposure to network protocols like TCP/IP, UDP, DNS, SMTP, FTP, TELNET, and HTTP, and frameworks like Struts, Spring, and Hibernate.
  • Implemented automated local user provisioning for instances created in the AWS cloud.
  • Set up and built AWS infrastructure resources (VPC, EC2, S3, IAM, EBS, security groups, Auto Scaling, and RDS) in CloudFormation JSON templates.
  • Hands-on experience with cloud automation, containers, and PaaS (Cloud Foundry), provisioned using Terraform.
  • Experience in using the Cloud Foundry (CF) CLI for deploying applications and other CF management activities.
  • Provided highly durable and available data using the S3 data store, versioning, and lifecycle policies, and created AMIs of mission-critical production servers for backup.
  • Maintained user accounts (IAM) and the RDS, Route 53, VPC, DynamoDB, SES, SQS, and SNS services in AWS.
  • Experience in implementing data warehouse solutions in AWS Redshift; worked on various projects to migrate data from on-premises databases to Redshift, RDS, EMR, DynamoDB, and S3.
  • Stored and retrieved data from data warehouses using Amazon Redshift.
  • Optimized and tuned the Redshift environment, enabling queries to perform up to 100x faster for Tableau and SAS Visual Analytics.
  • Wrote various data-normalization jobs for new data ingested into Redshift.
  • Advanced knowledge of Amazon Redshift and MPP database concepts.
  • Migrated on-premises database structures to the Amazon Redshift data warehouse.
  • Provided build, deploy, automation test control, generating reports and notification services with an end goal of continuous integration in a data center and AWS environment.
  • Deployed, configured and installed multiple test (QA) servers on AWS.
  • Developed Puppet manifests for different application and web servers such as Apache, Tomcat, and WebSphere.
  • Designed and developed ETL processes in AWS Glue to migrate campaign data from external sources such as S3 (ORC/Parquet/text files) into AWS Redshift.
  • Performed data extraction, aggregation, and consolidation of Adobe data within AWS Glue using PySpark.
  • Set up Python for use with AWS Glue, called AWS Glue APIs from Python, and used Python libraries with AWS Glue.
  • Implemented rapid-provisioning and life-cycle management for Ubuntu Linux using Amazon EC2, Chef, and custom Ruby/Bash scripts.
  • Designed Splunk Enterprise 6.5 infrastructure to provide high availability by configuring clusters across two different data centers.
  • Installed, Configured, Maintained, Tuned and Supported Splunk Enterprise server 6.x/5.x.
  • Architected and implemented Splunk solutions in highly available, redundant, distributed computing environments.
  • Performed field extractions and transformations using regex in Splunk.
  • Responsible for installing, configuring, and administering Splunk Enterprise on Linux and Windows servers.
  • Supported the upgrade of Splunk Enterprise server and Splunk Universal Forwarder from 6.5 to 6.6.
  • Installation and implementation of the Splunk App for Enterprise Security and documented best practices for the installation and performed knowledge transfer on the process.
  • Worked on installing Universal Forwarders and Heavy Forwarders to bring any kind of data fields into Splunk.
  • Built a continuous integration environment with Jenkins and a continuous delivery environment.
  • Coordinated stakeholders and select customers to specify and validate a club-membership system by combining Agile Scrum with Acceptance Test Driven Development (ATDD) techniques.
  • Hands-on experience with JIRA tool. Used different Agile methodologies such as SCRUM
  • Install, configure, modify, test & deploy applications on Apache Webserver, Tomcat, JBoss App Servers.
  • Responsible for the design, creation, and quality of instruction manuals for on-site installation of core- and shell-type transformers.
  • Used Docker to automate the deployment of Linux applications inside software containers; Docker provides an additional layer of abstraction and automation of operating-system-level virtualization on Linux.
  • UNIX administration, file system support, new system installation, and performance monitoring for AIX … and Red Hat and CentOS servers.
  • Created virtual machines and performed installation, configuration, administration, troubleshooting, tuning, security, backup, recovery, and upgrades of RHEL/CentOS on them.
  • Installed Red Hat Linux using Kickstart and applied security policies to harden the servers based on company policies; built and supported VMs and ESXi servers for Westinghouse, Boeing, and others.
  • Patched ETS Red Hat servers through a Solaris jump server.
  • Worked with Docker and Kubernetes on multiple cloud providers, from helping developers build and containerize their applications (CI/CD) to deploying on public or private clouds.
  • Installed Docker using Docker Toolbox and created Dockerfiles for different environments.
  • Experience in migration and deployment of applications with upgraded application and hardware versions; MSBuild, batch scripts, IIS, and Jenkins administration.
  • Worked on the transition project involving migration activities from Ant to Maven to standardize the build across all applications.
  • Cloud migration of Confidential's existing Surveillance Pattern application from on-premises Netezza/Greenplum/Oracle to the AWS cloud using AWS services such as VPC, EMR, and S3, and big data analytics tools.
  • Rewrote several slow-performing HQL queries and applications to run faster and more cost-efficiently post-migration.
  • Plan and implement huge data migration activities across different data warehouse appliances. Familiar with different native tools that are available at the appliance level to move terabytes of data daily.
  • Developed scripts for Data migration from legacy applications to New applications
  • Configured plugins for the integration tools to the version control tools.
  • Manage source code, software builds, software versioning, & defect tracking on software maintenance tasks/projects.
  • Experience with defining and creating usable and customer friendly, intuitive interfaces to the JIRA tool in a fast-paced evolving environment.
  • Used AWS Data Pipeline for creating complex data processing workloads.
  • Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB, and deployed AWS Lambda code from Amazon S3 buckets. Created a Lambda deployment function and configured it to receive events from an S3 bucket.
  • Used Ant and Maven as build tools on Java projects to produce build artifacts from the source code.
  • Writing Splunk Queries, Expertise in searching, monitoring, analyzing and visualizing Splunk logs.
  • Experience in alert handling, standard availability and performance report generation. Experience in root cause analysis of post-production performance related issues through Splunk tool.
  • Designing, optimizing and executing Splunk-based enterprise solutions.
  • Automated the build and deployment of all internal Java & SC environments using various continuous integration tools and scripting languages (Python, Shell, PowerShell).
  • Implemented server- and client-side validations using ASP.NET validation controls and JavaScript.
  • Experience using cloud providers and APIs for Amazon (AWS).
  • Experience in developing Korn, Bash, Perl, and Python shell scripts to automate cron jobs and system maintenance.
  • Experienced in deploying applications on Apache web server, Nginx, JBoss, WebLogic, and WebSphere application servers.
  • Responsible for moving data between different AWS compute and storage services by using AWS Data Pipeline.
  • Created a Python process hosted on Elastic Beanstalk to load the Redshift database daily from several sources.
  • Manage AWS EC2 instances utilizing Auto Scaling, Elastic Load Balancing and Glacier for our QA and UAT environments as well as infrastructure servers for GIT
  • Administered and Engineered Jenkins for managing weekly Build, Test and Deploy chain, GIT with Dev/Test/Prod Branching Model for weekly releases.
  • Created monitors, alarms and notifications for EC2 hosts using Cloud Watch.
  • Responsible for building out and improving the reliability and performance of Cloud applications deployed on Amazon Web Services.
  • Used Kubernetes on AWS to provide a platform for automating deployment, scaling, and operation of application containers across clusters of hosts. Worked closely with development teams and the performance test engineer on EC2 size optimization and Docker build containers.
  • Experience with the cloud versions of JIRA, Confluence, and Bitbucket Pipelines.
  • Experience translating Confluence documentation into JIRA.
  • Performed tasks and resolved issues in the project integration stream when developers faced problems delivering.
  • Strong working knowledge of developing RESTful web services and microservices using Golang.
  • Implemented REST services in Golang with a microservices architecture.
  • Developed new RESTful API services in Golang that work as middleware between our application and the third-party APIs we consume.
  • Using Go, developed a microservice for reading large volumes of data (millions of records) from a PostgreSQL database.
  • Experience writing data APIs and multi-server applications to meet product needs using Golang.
  • Migrated applications to the AWS cloud. Involved in DevOps processes for build and deploy systems.
  • Created Python scripts to fully automate AWS services including web servers, ELB, CloudFront distributions, databases, EC2 and database security groups, S3 buckets, and application configuration; these scripts create stacks, single servers, or join web servers to stacks.
  • Planned release schedules with agile methodology & coordinated releases with engineering & SQA for timely delivery.
  • Troubleshot the automation of installing and configuring applications in the test environments.
  • Deployed code to the lower environments daily and automated the deployment.
  • Responsible for coordinating the Offshore and Onsite team and resolve all the issues faced by the team
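The serverless pattern described above (S3 events triggering a Lambda that writes to DynamoDB) could look roughly like the sketch below; the `uploads` table name is hypothetical, and the `table` parameter is injected only so the handler can be exercised without AWS credentials:

```python
# Hedged sketch of an S3-event Lambda writing to DynamoDB. The "uploads"
# table name is an assumption, not the project's real resource.
import json
import urllib.parse

def extract_s3_records(event):
    """Pull (bucket, key) pairs out of an S3 event notification."""
    pairs = []
    for rec in event.get("Records", []):
        s3 = rec.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = urllib.parse.unquote_plus(s3.get("object", {}).get("key", ""))
        if bucket and key:
            pairs.append((bucket, key))
    return pairs

def handler(event, context, table=None):
    """Lambda entry point; `table` is injectable for offline testing."""
    if table is None:
        import boto3  # resolved inside the Lambda runtime
        table = boto3.resource("dynamodb").Table("uploads")  # hypothetical
    pairs = extract_s3_records(event)
    for bucket, key in pairs:
        table.put_item(Item={"pk": f"{bucket}/{key}"})
    return {"statusCode": 200, "body": json.dumps({"processed": len(pairs)})}
```

Keeping the event parsing in a pure function makes the Lambda unit-testable with a fake table object.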

Environment: AWS Cloud (EC2, VPC, ELB, S3, RDS, CloudTrail, and Route 53), CloudWatch, Chef, Perl, Python, AWS EC2, Ant, CI/CD, Bash scripts, Jira, Maven, Git, SQL, J2EE, Nagios, Subversion, Jenkins, Unix/Linux, shell scripting, WebSphere.

AWS/DevOps Engineer

Confidential - MA

Responsibilities:

  • Developed build and deployment processes for Pre-production environments.
  • Developed Shell/Python Scripts for automation purpose.
  • Resolved merging issues during rebasing and re-integrating branches by conducting meetings with Development Team Leads.
  • Ability to build deployment and build scripts and automated solutions using various scripting languages such as Shell, PowerShell, Python, and Ruby.
  • Created and wrote shell scripts (Bash), Ruby, Python, and PowerShell for automating tasks.
  • Involved in designing and deploying large applications utilizing almost the entire AWS stack (including EC2, Route 53, S3, RDS, DynamoDB, SNS, SQS, IAM), focusing on high availability, fault tolerance, and auto-scaling in AWS CloudFormation.
  • Working on migration project of moving current applications in traditional datacenter to AWS by using AWS services.
  • Launched Amazon EC2 cloud instances using Amazon Web Services (Linux/Ubuntu/RHEL) and configured the launched instances with respect to specific applications.
  • Installed applications on AWS EC2 instances and configured storage on S3 buckets; assisted the team with deploying on AWS and the cloud platform.
  • Managed IAM policies, providing access to different AWS resources, design and refine the workflows used to grant access.
  • Created AWS Route53 to route traffic between different regions.
  • Implemented and maintained the monitoring and alerting of production and corporate servers/storage using AWS CloudWatch.
  • Implemented ITIL process while pushing builds and deployments to prod and pre-prod environments.
  • Designed AWS Cloud Formation templates to create custom sized VPC, subnets, NAT to ensure successful deployment of Web applications and database templates.
  • Launched compute (EC2) and DB (Aurora, Cassandra) instances from the Amazon Management Console and CLI.
  • Installed and configured Splunk Universal Forwarders on both UNIX (Linux, Solaris, and AIX) and Windows Servers.
  • Hands on experience in customizing Splunk dashboards, visualizations, configurations using customized Splunk queries.
  • Used Splunk for Application Log, Security Log and Performance monitoring.
  • Configured Splunk multisite indexer cluster for data replication.
  • Developed Splunk infrastructure and related solutions as per automation tool sets.
  • Implemented the Docker for wrapping up the final code and setting up development and testing environment using Docker Hub, Docker Swarm and Docker Container Network.
  • Elasticsearch experience, including capacity planning and cluster maintenance; continuously looked for ways to improve and set a very high bar for quality.
  • Implemented a real-time log analytics pipeline using Elasticsearch.
  • Facilitated common access to Elasticsearch.
  • Set up and configured Elasticsearch 2.1.0 in a POC test environment to ingest over a million records from an Oracle DB.
  • Configured and setup Kubernetes Cluster environment with a master and 3 minions
  • Deployed microservices in Docker containers and scaled the deployments using Kubernetes.
  • Designed the data models to be used in data intensive AWS Lambda applications which are aimed to do complex analysis creating analytical reports for end-to-end traceability, lineage, definition of Key Business elements from Aurora.
  • Integrated Docker container orchestration framework using Kubernetes by creating pods, configMaps and deployments
  • Deployed applications on AWS by using Elastic Beanstalk. Integrated delivery (CI and CD) using Jenkins and puppet.
  • Implemented Workload Management (WLM) in Redshift to prioritize basic dashboard queries over more complex, longer-running ad hoc queries. This allowed for a more reliable and faster reporting interface, giving sub-second query response for basic queries.
  • Responsible for designing logical and physical data models for various data sources on Amazon Redshift.
  • Designed and developed ETL jobs to extract data from a Salesforce replica and load it into a data mart in Redshift.
  • Wrote scripts and an indexing strategy for a migration to Amazon Redshift from SQL Server and MySQL databases.
  • Worked on AWS Data Pipeline to configure data loads from S3 into Redshift.
  • Used a JSON schema to define table and column mappings from S3 data to Redshift.
  • Manage source code, software builds, software versioning, & defect tracking on software maintenance tasks/projects.
  • Experience with defining and creating usable and customer friendly, intuitive interfaces to the JIRA tool in a fast-paced evolving environment.
  • Experience in writing HTTP RESTful web services and SOAP APIs in Golang.
  • Introduced to Golang while working on a project.
  • Wrote microservices using Golang.
  • Developed microservices using Golang and developed the corresponding test cases.
  • Created PDF reports using Golang and XML documents to send to all customers at month end, with international language support.
  • Automated the build and deployment of all internal Java & SC environments using various continuous integration tools and scripting languages (Python, Shell, PowerShell).
  • Implemented server- and client-side validations using ASP.NET validation controls and JavaScript.
  • Experience using cloud providers and APIs for Amazon (AWS).
  • Experience in developing Korn, Bash, Perl, and Python shell scripts to automate cron jobs and system maintenance.
  • Experienced in deploying applications on Apache web server, Nginx, JBoss, WebLogic, and WebSphere application servers.
  • Responsible for moving data between different AWS compute and storage services by using AWS Data Pipeline.
  • Designed and developed ETL processes in AWS Glue to migrate campaign data from external sources such as S3 (ORC/Parquet/text files) into AWS Redshift.
  • Performed data extraction, aggregation, and consolidation of Adobe data within AWS Glue using PySpark.
  • Experience in Setting up databases in AWS using RDS, storage using S3 bucket and configuring instance backups to S3 bucket.
  • Experience with the build management tools Ant and Maven, writing build.xml and pom.xml files.
  • Experienced in build and deployment of Java applications to different environments such as QA, UAT, and Production.
  • Branching, tagging, and release activities on version control tools: GitHub and Bitbucket.
  • Implemented continuous integration using Jenkins and Hudson.
  • Configured and deployed OpenStack Enterprise master hosts and OpenStack node hosts.
  • Experienced in deployment of applications on Apache web server and Nginx, and on application servers like Tomcat and JBoss.
  • Created Shell Scripts to install Splunk Forwarders on all servers and configure with common configuration files such as Bootstrap scripts, Outputs.conf and Inputs.conf files.
  • Extensively used Splunk Search Processing Language (SPL) queries, Reports, Alerts and Dashboards.
  • Installation and implementation of the Splunk App for Enterprise Security and documented best practices for the installation and performed knowledge transfer on the process.
  • Used DB Connect for real-time data integration between Splunk Enterprise and databases.
  • Masked customer-sensitive data at the forwarder level and managed distributed search across a set of indexers.
  • Responsible for filtering unwanted data at the heavy forwarder level, thereby reducing license cost.
  • Worked with administrators to ensure Splunk is actively, accurately running, and monitoring on the current infrastructure implementation.
  • Used Spring IOC to define all the workflows as beans and load the corresponding dependencies for the workflows.
  • Set up Python for use with AWS Glue, called AWS Glue APIs from Python, and used Python libraries with AWS Glue.
  • Virtualized servers using Docker for test and development environment needs, and performed configuration automation using Docker containers.
  • Experienced in setting up MongoDB, MySQL, SQL Server, and Aurora on AWS.
  • Experience with Bug tracking tool like JIRA, Bugzilla and Remedy.
  • Worked with various scripting languages like Bash, Perl, Shell, Ruby, PHP and Python.
  • Coordinated with the Offshore and Onshore teams for Production Releases.
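A minimal sketch of the S3-to-Redshift load step referenced above (Data Pipeline and Glue feeding Redshift): composing a Redshift COPY statement in Python. The table name, bucket path, and IAM role are illustrative placeholders, not the project's real resources:

```python
# Sketch: building a Redshift COPY command for an S3 load. Identifiers
# below are assumptions for illustration only.
def build_copy_sql(table, s3_path, iam_role, fmt="PARQUET"):
    """Return a Redshift COPY statement loading `table` from `s3_path`."""
    options = {"PARQUET": "FORMAT AS PARQUET",
               "CSV": "FORMAT AS CSV IGNOREHEADER 1"}
    return (f"COPY {table} FROM '{s3_path}' "
            f"IAM_ROLE '{iam_role}' {options[fmt]};")

sql = build_copy_sql("campaign_facts",
                     "s3://example-bucket/campaign/",
                     "arn:aws:iam::123456789012:role/RedshiftCopyRole")
```

The generated statement would then be executed against the cluster by whichever orchestrator (Data Pipeline, Glue, or a Lambda) owns the load.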

Environment: Amazon Web Services, IAM, S3, RDS, EC2, VPC, CloudWatch, Nginx, Ant, Maven, Tortoise, Jenkins, GitHub, Bitbucket, Chef, Puppet, Ansible, Docker, Java, Agile, Apache HTTPD, Apache Tomcat, JBoss, JUnit, Cucumber, JSON, Bash, Shell, Perl, Python, PHP, Ruby.

DevOps Engineer / Linux Administrator

Confidential - Charlotte, NC

Responsibilities:

  • Involved in designing and deploying large applications utilizing almost the entire AWS stack (including EC2, Route 53, S3, RDS, DynamoDB, SNS, SQS, IAM), focusing on high availability, fault tolerance, and auto-scaling in AWS CloudFormation.
  • Created end-user Excel/Google Sheets applications with dynamic connections to back-end AWS tables.
  • Managed daily sales feeds and compilation processing.
  • Created end-product reports in AWS using PostgreSQL, managing the associated online reporting with drill-down and email notifications.
  • Created, maintained, and troubleshot daily file-transfer processes.
  • Elasticsearch experience, including capacity planning and cluster maintenance; continuously looked for ways to improve and set a high bar for quality.
  • Implemented a real-time log analytics pipeline using Elasticsearch.
  • Facilitated common access to Elasticsearch.
  • Developed multiple high-exposure projects simultaneously, while ensuring quality products, and initiated releases within the required timelines.
  • Created, maintained, and troubleshot audits of AWS PostgreSQL table performance, file loads, file transfers, and data accuracy.
  • Set up and configured Elasticsearch 2.1.0 in a POC test environment to ingest over a million records from an Oracle DB.
  • Ran daily audits, emails, and alerts to monitor all incoming feeds, files, and updates.
  • Created, developed, and maintained an AWS PostgreSQL process for channel-performance data warehousing.
  • Developed new processes while streamlining, improving, and maintaining existing ones.
  • Implemented data feeds using a multitude of sources and platforms, for both internal and external clients.
  • Resolved any issues encountered, providing full product support along with prompt, accurate communications.
  • Took complete responsibility for Jenkins, GitHub, and Maven to automate the deployment process.
  • Worked on the AWS cloud platform and its features, including EC2, VPC, EBS, AMI, SNS, RDS, CloudWatch, CloudTrail, CloudFormation, AWS Config, Auto Scaling, CloudFront, IAM, and S3. Worked in AWS using the EC2, DynamoDB, S3, VPC, and IAM services.
  • Created microservices using AWS Lambda and API Gateway with a REST API.
  • Knowledge of creating buckets in AWS and storing files; enabled versioning and security for the stored files.
  • Used AWS Lambda to run code in AWS. Deployed Spring Boot based microservices in Docker containers using Amazon EC2 Container Service and the AWS admin console.
  • Experience with administration, maintenance, and operations of different AWS cloud resources and management services such as Elastic Beanstalk, S3, RDS, Lambda, SQS, CloudFront, SNS, CloudWatch, and CloudFormation templates.
  • Created and maintained a Lambda in AWS for the RDS snapshot restoration process.
  • In-depth experience in Amazon Cloud (AWS), including EC2, VPC, Identity and Access Management (IAM), Container Service, Elastic Beanstalk, Lambda, S3, CloudFront, Glacier, RDS, DynamoDB, ElastiCache, Redshift, Direct Connect, Route 53, CloudWatch, CloudFormation, CloudTrail, AWS IoT, SNS, API Gateway, SES, SQS, and SWF.
  • Worked in Log Insight and CloudWatch services to diagnose problems and configure alerts.
  • Configured Log Insight on all virtual machines where critical applications are installed; created dashboards for various applications and monitored the logs.
  • Maintained auto-scaling AWS stacks (preferring CloudFormation and scripting).
  • Deployed applications and coordinated with the core development team and the Level 2 support team.
  • Installed and configured an automated tool Puppet that included the installation and configuration of the Puppet master, agent nodes and an admin control workstation
  • Created and updated Puppet manifests and modules, files, and packages stored in the GIT repository.
  • Used Kubernetes to deploy, scale, load-balance, and manage Docker containers across multiple namespaces.
  • Developed a CI/CD system with Jenkins on Google's Kubernetes container environment, utilizing Kubernetes and Docker as the runtime environment for the CI/CD system to build, test, and deploy.
  • Experience with Container based deployments using Docker, Working with Docker images, Docker hub and Docker registries.
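The RDS snapshot-restoration Lambda mentioned above might be sketched as follows; the snapshot and instance identifiers and the instance class are assumed placeholder values, and the boto3 call is confined to the handler so the argument builder can be checked offline:

```python
# Sketch of an RDS snapshot-restore Lambda. Identifiers and instance
# class below are illustrative assumptions, not real resources.
def build_restore_kwargs(snapshot_id, new_instance_id,
                         instance_class="db.t3.medium"):
    """Compose arguments for rds.restore_db_instance_from_db_snapshot()."""
    return {
        "DBInstanceIdentifier": new_instance_id,
        "DBSnapshotIdentifier": snapshot_id,
        "DBInstanceClass": instance_class,
        "PubliclyAccessible": False,
    }

def handler(event, context):
    """Lambda entry point: restore the snapshot named in the event."""
    import boto3  # resolved in the Lambda runtime
    rds = boto3.client("rds")
    kwargs = build_restore_kwargs(event["snapshot_id"],
                                  event["new_instance_id"])
    return rds.restore_db_instance_from_db_snapshot(**kwargs)
```

The event payload (here `snapshot_id` and `new_instance_id`) would typically come from a CloudWatch scheduled rule or an operator-triggered invocation.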
