Site Reliability Engineer Resume
SUMMARY
- Over 5 years of programming experience in the IT industry using Google Kubernetes Engine (GKE), GCP, Jenkins, Java, Microsoft Azure, RStudio 3.6.1, Ab Initio 3.2.5, Teradata (TD 15), Oracle 11g, MongoDB, SQL, Python, Unix shell, and Bash scripting.
- Application design and development using programming languages, tools, and related technologies such as Google Cloud Platform (GCP), Terraform scripts, Jenkins CI/CD pipelines, Java, Google Kubernetes Engine, Jira boards, macOS, and cluster design & infrastructure.
- Took backups of workloads using Velero and used them during GKE cluster upgrades across multiple regions and environments such as NPE, UAT, and prod
- Experienced in implementing OPA policies and the Twistlock agent
- Involved in developing YAML manifests that meet the requirements to deploy virtual services into GKE clusters globally
- Experience in using Microsoft Azure, FileZilla and WinSCP as a part of deploying websites
- Involved in User Acceptance Testing and usability testing.
- Experienced in developing front ends in HTML5, XHTML, and CSS for both Internet and intranet websites
- Experience in writing the UNIX shell scripts as per the client requirements.
- Involved in working with Machine Learning and Big Data Analytics, with a focus on deep learning, kernel methods, and ensemble learning.
- Experience in Agile methodology of project implementation.
- Good understanding of Ab Initio architecture and architectural components like the GDE, the Co>Operating System, and Ab Initio parallelism techniques.
- Professional experience in scheduling jobs in IBM Tivoli Workload Scheduler.
- Over 2 years of experience as a developer implementing data science projects such as data analytics and interactive simulations with R.
- Experience implementing regression methods such as Linear Regression, Ridge Regression, LASSO Regression, Random Forest Regression, Decision Tree Regression, and Gradient Boosting Regression, evaluated using RMSE.
TECHNICAL SKILLS
Tools: Kubernetes, GitHub, Bitbucket, Terraform 0.14.3, Ab Initio 3.2.5 and Co>Operating System 3.1.5 & 3.2.5, Informatica
Programming Languages: Java, Groovy, R 3.5.3, Python 3.7.0, SQL, PL/SQL, Teradata SQL, Unix shell
Cloud Technologies: GCP cloud, Microsoft Azure, Jenkins
Web Technologies: HTML/HTML5, CSS2/CSS3, DHTML, XML, XHTML.
Web Services: HTTP web server (Jigsaw), SOAP (JAX-WS), WSDL, REST (JAX-RS).
Databases: Oracle 11g/10g, Microsoft SQL Server, MongoDB, DB2.
Tools/IDEs: Visual Studio, Oracle, PL/SQL Developer, PuTTY, Notepad++, Sublime Text, FileZilla, WinSCP.
Debugging Tools: Grafana, ELK, Stackdriver, Splunk, Chrome Developer Tools, Firebug, IE Developer Toolbar, Safari, Visual Studio, Toad for SQL debugger, Visual Basic (VB6), Ansible.
Operating Systems: macOS, Windows 7/8/10, Windows XP, UNIX/Linux.
PROFESSIONAL EXPERIENCE
Confidential
Site Reliability Engineer
Responsibilities:
- Worked closely with senior technical consultants and Architect on IAM authentication process and development
- Implemented GKE upgrades and Istio upgrades for all the clusters globally
- Responsible for setting up OpenSSL secrets for Istio, along with continuous monitoring of expiration and regular updates as required
- Implemented PagerDuty alerts for pod monitoring to better support segment SREs
- Involved in continuous monitoring of C3M reports and remediation to improve service availability and clean up the environments
- Fulfilled tasks as Support person of the week and worked on core SRE support tasks
- Involved in automating repetitive work through Jenkins and Groovy scripts
- Responsible for maintaining and expanding our Google Cloud infrastructure
- Automated repetitive work with cron jobs scheduled to run periodically at fixed times
- Identifying the problems and integration of technical skills to address and solve issues.
- Authenticated and virtualized REST APIs for better performance.
- Responsible for setting up some of the CI/CD process pipeline using Jenkins to deploy the application to Docker containers in Kubernetes.
- Used the JIRA ticketing system to keep track of issues and tasks assigned to individuals
- Involved in successful code migration from Bitbucket to GitHub
- Used Git as a version control tool for effectively managing code changes; experienced with Scrum methodology.
- Analyzed the existing system's processes and functionality and designed the new system with appropriate techniques
- Created effective test data and unit test cases to ensure successful execution on the cluster servers
- Coordinated with the testing team to write and execute system tests
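As an illustrative sketch (not taken from the actual tooling), the certificate-expiration monitoring described above could be automated with a small Python helper; `days_until_expiry` is a hypothetical name, and the timestamp format is the one printed by `openssl x509 -enddate`:

```python
from datetime import datetime, timezone

def days_until_expiry(not_after, now=None):
    """Days remaining before a certificate expires, given the notAfter
    timestamp as printed by `openssl x509 -enddate` (assumed GMT)."""
    expiry = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expiry = expiry.replace(tzinfo=timezone.utc)
    now = now or datetime.now(timezone.utc)
    return (expiry - now).days
```

A cron job could run such a check daily and page (e.g. via PagerDuty) when the remaining days drop below a threshold.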
Confidential
Cloud Engineer - Microsoft Azure / R Programmer
Responsibilities:
- Implemented identification of purchase patterns as a part of Data Analytics.
- Conducted research on development and designing of sample methodologies and analyzed data for pricing of client’s products.
- Generated graphs and reports using the ggplot2 package in RStudio for analytical models.
- Developed and implemented R and Shiny application which showcases machine learning for business forecasting.
- Experience using Microsoft Azure, FileZilla, and WinSCP as part of deploying and maintaining websites
- Performed time series analysis using Tableau 8.1
- Created dashboards and visualizations using Tableau desktop.
- Performed analysis using JMP.
- Created dashboards in QlikView to visualize data.
- Involved in data cleaning steps like converting raw data to technically correct data by implementing concepts like type checking and normalizing on existing data.
- Experienced in implementing fixing and imputation techniques that help transform technically correct data into consistent data.
- Implemented correlation calculations among different variables to check for any independent features highly correlated with Purchase.
- Implemented VectorAssembler to combine all independent features into a single feature vector.
- Involved in performing MinMax scaling, regression methods, calculating RMSE value.
Environment: RStudio 3.6.1, XML, HTML5, Java, CSS, Microsoft Azure, FileZilla, WinSCP, QlikView, Oracle (Toad for Oracle 12.1), PL/SQL 11g, MS SQL Server 20, Windows 10.
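The min-max scaling and RMSE evaluation mentioned above can be sketched in a few lines of plain Python; these are hand-rolled stand-ins for the library routines, shown only to illustrate the calculations:

```python
import math

def min_max_scale(xs):
    """Scale a sequence of values linearly to the [0, 1] range."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def rmse(actual, predicted):
    """Root mean squared error between two equal-length sequences."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

print(min_max_scale([2, 4, 6]))  # [0.0, 0.5, 1.0]
print(rmse([3, 5], [1, 5]))      # sqrt((4 + 0) / 2) ≈ 1.414
```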
Confidential
Cloud Engineer - Microsoft Azure / R Programmer
Responsibilities:
- Implemented different data input methods like read.table, read.csv, read.csv2, read.delim, read.delim2.
- Involved in using Microsoft Azure, FileZilla, and WinSCP as part of deploying websites
- Involved in using packages like rworldmap and functions like setMapExtents, which allow map extents to be set from area names.
- Implemented joinData2Map to join user polygon attribute data to a map.
- Implemented data visualizations with plotting functions like ggplot, geom_line, geom_point, geom_bar, and ggplotly.
- Implemented packages like RColorBrewer, grDevices, colorRamps, and colorspace, in which the brewer.pal, heat.colors, and rainbow_hcl functions were used extensively.
- Experienced in animating plots into .gif and .mp4 files using libraries like magick and gganimate with the image_join, image_animate, and image_write functions.
- Experienced in saving the plots as jpeg(), png(), svg() or pdf().
- Developed various workbooks in Tableau from multiple data sources.
- Performed validation on machine learning output from R.
- Used Toad to verify the counts and results of the visualizations.
Environment: RStudio 3.6.1, XML, HTML5, Java, CSS, Microsoft Azure, FileZilla, WinSCP, Oracle (Toad for Oracle 12.1), PL/SQL 11g, MS SQL Server 2008/2016, Windows 10
Confidential
Senior ETL Developer
Responsibilities:
- Implemented Ab Initio graphs with multiple transformation queries to load data from different sources, such as data files and tables, into the Teradata data warehouse.
- Experienced in implementing initial full loads and incremental loads (CDC) to keep information available 24x7.
- Wrote Unix Korn shell wrapper scripts to accept parameters, scheduled the processes using crontab and the job scheduler, and implemented database load interfaces and denormalization using Ab Initio.
- Used Parallelism techniques to partition and process large data simultaneously
- Developed shell scripts to automate file manipulation and data loading.
- Involved in replicating operational tables into staging tables, and transforming and loading data into warehouse tables using Ab Initio GDE.
- Involved in designing, developing, and testing the software (Ab Initio, Teradata, UNIX shell scripts) to maintain the data marts (extract/transform/load).
- Designed and developed complex Aggregate, Join, Router, Look up and Update transformation rules (business rules).
- Used various components of Ab Initio graphs, like Partition by Key, Sort, Reformat, Join, and Dedup, to impose business logic on incoming data for loading and maintaining dimensional tables (insert/update), such that the history of records could be retained across recurring loads.
- Developed schedules to automate the update processes and Ab Initio sessions/batches using IBM Tivoli Workload Scheduler.
- Designed documents for the enhancements, which are applied as part of the Formulary Management Project.
Environment: Ab Initio 3.2.5 and Co>Operating System 3.1.5 & 3.2.5, Unix, Tivoli Workload Scheduler, Toad for SQL debugger, Windows 10, Tableau.
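The partition-by-key parallelism described above can be approximated in plain Python; `partition_by_key` is an illustrative stand-in for the Ab Initio component, not its actual implementation:

```python
import zlib
from collections import defaultdict

def partition_by_key(records, key, n_partitions):
    """Deterministically hash-partition records by a key field so that all
    records sharing a key land in the same partition, which is what lets
    per-key operations (sort, dedup, rollup) run on partitions in parallel."""
    parts = defaultdict(list)
    for rec in records:
        # crc32 gives a stable hash across runs, unlike Python's hash().
        parts[zlib.crc32(str(rec[key]).encode()) % n_partitions].append(rec)
    return parts
```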
Confidential
ETL Developer
Responsibilities:
- Implemented ad hoc multifiles to retrieve data from multiple flat files through a single Input File component.
- Expertise and well versed with various Ab Initio Transform, Partition, Departition, Dataset, and Database components.
- Experience with the Ab Initio Co>Operating System, application tuning, and debugging strategies.
- Expert in writing UNIX shell scripts, including Korn shell and Bourne shell scripts.
- Wide usage of Lookup Files when retrieving data from multiple sources where the data size is limited.
- Worked with the project team on optimizing and tuning SQL statements; used phases/checkpoints to avoid deadlocks and improve efficiency.
- Used sandbox parameters to check in and checkout of graphs from repository Systems.
- Worked in a highly parallelized (MPP) Ab Initio environment to process 1+ terabytes of data daily.
- Experienced in handling the errors by connecting Reject, Error and Log ports of Reformat component to a file.
Environment: Ab Initio 3.2.5 and Co>Operating System 3.1.5 & 3.2.5, Unix, Tivoli Workload Scheduler, Toad for SQL debugger, Windows 10, Tableau.
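The reject/error handling described above (wiring a Reformat component's Reject and Error ports to a file) can be sketched in Python; `reformat` here is an illustrative analogue, not the real component:

```python
def reformat(records, transform):
    """Apply a transform to each record; records that fail are routed to a
    reject list together with the error message, loosely mirroring a
    Reformat component's out, reject, and error ports."""
    out, rejects = [], []
    for rec in records:
        try:
            out.append(transform(rec))
        except Exception as err:
            rejects.append((rec, str(err)))  # would be written to a reject file
    return out, rejects
```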
Confidential
ETL Developer
Responsibilities:
- Involved in replicating the copy of data using Normalize component.
- Developed Ab Initio code to process XML files as well as database retrievals in an agile environment.
- Involved in requirement analysis, development, and testing of applications under UNIX platforms, i.e., unit and regression testing.
- Involved in the creation of PSET (parameter set) files and analyzing graph dependencies as part of the Ab Initio process.
- Implemented versioning in the EME (the Ab Initio metadata environment) for future use.
- Involved in fixing repository related, Reader process and External procedure problems.
- Developed parameterized graphs using formal parameters.
- Created a sandbox and edited sandbox parameters according to the repository.
- Wide usage of Lookup Files while getting data from multiple sources and size of the data is limited.
- Worked with Departition components like Concatenate, Gather, Interleave, and Merge to departition and repartition data from multifiles accordingly.
- Involved in writing several Shell scripts, to remove old files and move raw logs to the archives.
- Developed dynamic graphs to load data from data sources into tables and to parse records.
- Extensive usage of Multifile system where data is partitioned into four partitions for Parallel processing.
Environment: Ab Initio 3.2.5 and Co>Operating System 3.1.5 & 3.2.5, Unix, Tivoli Workload Scheduler, Toad for SQL debugger, Windows 10, Tableau.
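The Merge departition mentioned above combines sorted partitions into one sorted stream (unlike Concatenate or Gather, which do not preserve a global order). A minimal Python analogue, assuming each partition is already sorted:

```python
import heapq

def merge_departition(*partitions, key=None):
    """Combine several sorted partitions into a single sorted stream,
    analogous to the Merge departition component."""
    return list(heapq.merge(*partitions, key=key))
```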
Confidential
R Programmer/Intern
Responsibilities:
- Involved in data cleaning, wrangling and visualization with R.
- Extensive usage of functions for calculating ROC, AUC and Confusion Matrix.
- Implemented dashboards with R shiny web application.
- Involved in retrieving spatial and non-spatial information from given sources.
- Involved in providing the simulations, data analyses and data visualizations with the R package.
- Experienced in visualizing the categorical data and quantitative data.
- Involved in finding the relationship between any two variables using the concepts of covariance and correlation.
- Implemented scatter plots to plot quantitative data using ggplot, geom_point, scale_color_manual, and geom_smooth.
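The covariance/correlation relationship mentioned above can be illustrated with a hand-rolled Pearson correlation in Python (a sketch of the definition, not production statistics code):

```python
import math

def pearson(xs, ys):
    """Pearson correlation: covariance of x and y divided by the product
    of their standard deviations (population formulas)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)
```

A perfectly linear increasing relationship gives 1.0, a decreasing one gives -1.0.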
