Data Reliability Engineer Resume
FL
SUMMARY
- Around 8 years of experience in the IT industry with a primary focus on cloud design and management, DevOps, automation, Linux administration, and build and release management.
- Strong knowledge of and hands-on experience with AWS services: EC2, S3, Auto Scaling, RDS, CloudWatch, IAM, and CodeCommit, CodeBuild, CodeDeploy, and CodePipeline for CI/CD.
- Installation, configuration, troubleshooting, load balancing, clustering, application deployment, and performance tuning of WebLogic Server, with excellent hands-on experience on other middleware servers such as Tomcat and Apache.
- Expertise in CI/CD tools such as Jenkins.
- Skilled in Application performance monitoring tools, Requirements Analysis, Tool Migrations and Quality Assurance.
- Thorough knowledge of Jenkins installation and configuration: plugin installation, pipeline configuration, manual gates, security, SSH keys, etc.
- Involved in setting up CI/CD pipelines using Jenkins and created notifications and alarms for EC2 instances using CloudWatch (a minimal alarm sketch appears after this summary).
- Experienced in day-to-day user administration, such as adding/deleting users in local and global groups on the Red Hat Linux platform, and in handling user queries.
- Upgraded and integrated JIRA and Confluence instances, along with all plugins and applications.
- Maintained security groups and set up rules for the instances associated with them.
- Experienced in working with version control systems like Git and source code management client tools such as Git Bash, GitHub, and GitLab.
- Proficient in using code quality tools like SonarQube.
- Installed and configured Splunk and New Relic to monitor applications deployed on application servers by analyzing application and server log files; set up various dashboards, reports, and alerts in Splunk.
- Willing to learn new things and grow.
- Capable of playing multiple roles: team member, independent contributor, or team lead.
- Effective oral/written communication skills and strong analytical problem-solving capabilities.
- Passionate about taking on challenging roles.
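As referenced in the CloudWatch bullet above, the sketch below shows one minimal way such an EC2 alarm could be wired to an SNS notification using the AWS CLI; the alarm name, instance ID, topic ARN, and threshold are placeholder assumptions, not values from an actual engagement.

```sh
# Hypothetical example: alarm on sustained high CPU for one EC2 instance,
# notifying an SNS topic (instance ID, topic ARN, and threshold are placeholders).
aws cloudwatch put-metric-alarm \
  --alarm-name "ec2-high-cpu" \
  --namespace AWS/EC2 \
  --metric-name CPUUtilization \
  --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
  --statistic Average \
  --period 300 \
  --evaluation-periods 2 \
  --threshold 80 \
  --comparison-operator GreaterThanThreshold \
  --alarm-actions arn:aws:sns:us-east-1:123456789012:ops-alerts
```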
TECHNICAL SKILLS
Operating Systems: Linux, UNIX, Windows
Versioning Tools: GIT, CVS
CI Tools, Configuration Management: Jenkins
Infrastructure as Code: CloudFormation
Issue Tracking: JIRA
Monitoring Tools: Splunk, CloudWatch, New Relic, Datadog
Scripting: Shell scripting, Python
Web/App Servers: Apache Tomcat, JBoss, WebLogic, WebSphere
Databases: SQL Server, MySQL, Snowflake
PROFESSIONAL EXPERIENCE
Confidential, FL
Data Reliability Engineer
Responsibilities:
- Implemented application monitoring and synthetic and real-user monitoring solutions following best practices across the enterprise.
- Involved in DevOps migration/automation processes for build and deploy systems.
- Day-to-day work included, but was not limited to, handling tickets, monitoring, troubleshooting, and maintenance.
- Implemented the Build automation process for all the assigned projects.
- Involved in designing and deploying a multitude of applications utilizing much of the AWS stack (including EC2, Route 53, S3, RDS, DynamoDB, SNS, SQS, IAM), focusing on high availability, fault tolerance, and auto-scaling in AWS CloudFormation.
- Managed AWS infrastructure as code using Terraform and wrote Terraform templates for configuring EC2 instances, security groups, and subnets (see the workflow sketch after this section).
- Used AWS services such as IAM, Route 53, EC2, EBS, AMI, Auto Scaling, VPC, Elastic Load Balancing, RDS, ECS, CloudWatch, CloudFormation, SNS, etc.
- Monitored production applications for performance, assisted in troubleshooting application performance issues, and produced ad hoc reports.
- Supported the code builds by integrating with a continuous integration tool (Jenkins).
- Performed the merging and tagging needed after the code went live in the environment.
- Maintained a live-like environment to test any production issues on the setup before pushing fixes into production.
- Build and release management: Git and Jenkins administration.
- Maintained the QA environment, resolved QA issues, and kept DB versions aligned with releases; also published code and DB changes to production and staging per business requirements.
- Created extensive documentation, including straightforward how-to procedures for common administrative tasks.
Environment: JIRA, Jenkins, AWS, Nexus, Git, SonarQube, Shell Scripting, UNIX/Linux.
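As referenced in the Terraform bullet above, a minimal sketch of the plan/apply cycle used to roll out such templates is shown below; the directory layout is an illustrative assumption, not a detail from the actual project.

```sh
#!/usr/bin/env bash
# Illustrative Terraform workflow for the EC2/security-group/subnet templates
# mentioned above; the directory path is a placeholder.
set -euo pipefail

cd infrastructure/aws            # hypothetical directory holding the .tf files
terraform init -input=false      # download providers and set up the backend
terraform fmt -check             # fail if templates are not formatted
terraform validate               # catch syntax and reference errors
terraform plan -out=tfplan       # preview the EC2/SG/subnet changes
terraform apply tfplan           # apply exactly what was reviewed
```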
Confidential
Site Reliability and Release Management Engineer
Responsibilities:
- Gathered all stakeholder approvals and necessary sign-offs while acting as DevOps/release manager for two development teams.
- Supported and managed monitoring tools such as AWS CloudWatch and logging tools such as Splunk and New Relic.
- Created the automated build and deployment process for applications, re-engineered the setup for a better user experience, and led up to building a continuous integration system.
- Responsible for Continuous Integration (CI) and Continuous Delivery (CD) process implementation using Jenkins along with Python and shell scripts to automate routine jobs.
- Implemented a build framework for new projects using Jenkins.
- Implemented AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancing, and Auto Scaling groups; optimized volumes and EC2 instances.
- Performed branching, tagging, and release activities using the version control tool Git (see the release sketch after this section).
- Performed and deployed builds for various environments such as QA, Integration, UAT, and Production.
- Worked on the installation and configuration of the monitoring tool Nagios.
- Worked on Apache and Firewalls in both development and production.
- Deployed and configured Atlassian Jira, both hosted and local instances for issue tracking, workflow collaboration, and tool-chain automation.
- Troubleshot and resolved build failures caused by infrastructure issues, reducing them by 95% and stabilizing the build process; set up and executed an effective code review process.
Environment: Jenkins, SQL, Ansible, AWS, Git, JIRA, Linux.
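As referenced in the branching and tagging bullet above, the sketch below outlines one plausible Git release sequence; the branch names, version number, and main/develop layout are assumptions made for illustration.

```sh
# Illustrative Git release flow for the branching/tagging bullet above;
# branch and version names are placeholders.
git checkout -b release/1.4.0 develop                  # cut a release branch from develop
git push -u origin release/1.4.0                       # share it for stabilization
git checkout main && git merge --no-ff release/1.4.0   # merge once approved
git tag -a v1.4.0 -m "Release 1.4.0"                   # annotated tag for the release
git push origin main --tags                            # publish the merge and the tag
```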
Confidential
Data Analyst
Responsibilities:
- Worked on large datasets using advanced Excel features such as PivotTables and VLOOKUP to extract useful information and analyze data.
- Understood the capabilities of RapidMiner and executed different models.
- Utilized Tableau to generate visual reports and presented information to the management and the business teams.
- Assisted in identifying data trends using advanced Excel features such as VLOOKUP and PivotTables, contributing to a 30% increase in investments.
- Built and maintained SQL scripts and complex queries for data analysis and extraction on various projects; exported data to MS Access and Excel for periodic and exception reporting (see the sample query sketch after this section).
- Managed projects by generating reports, identifying risks, and setting up KPIs (Key Performance Indicators).
- Performed Extraction, Transformation, and Loading (ETL), converting a huge number of transactional process records into meaningful analytical data.
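As referenced in the SQL bullet above, the sketch below shows one minimal way such a periodic reporting query could be run and exported through the MySQL client listed in the skills section; the host, database, table, and columns are hypothetical.

```sh
# Hypothetical periodic report: monthly order totals exported as a
# tab-separated file that Excel or Access can open; schema and host are placeholders.
mysql --batch --host=reporting-db --user=analyst -p \
  --execute="SELECT DATE_FORMAT(order_date, '%Y-%m') AS month,
                    COUNT(*)                          AS orders,
                    SUM(order_total)                  AS revenue
             FROM   orders
             GROUP  BY month
             ORDER  BY month;" \
  sales_db > monthly_orders.tsv    # prompts for the password, then writes the TSV
```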