Cloud Engineer - Lead Resume
Wilmington, DE
PROFESSIONAL SUMMARY:
- 10+ years of experience in all phases of the Software Development Life Cycle (SDLC), including requirements analysis, design specification, coding, and testing of enterprise applications.
- Strong experience working in Linux and Windows environments.
- Strong experience with OpenStack and with migrations to Confidential Web Services (AWS); experienced in cloud automation using AWS CloudFormation templates, Python, Ruby, Chef (DevOps), and Puppet (DevOps).
- Experienced in build tools such as Apache Ant, Maven, Atlassian Bamboo and Jenkins.
- Strong hands-on experience with scripting languages such as Python, Ruby, and PowerShell.
- Expertise in using version control tools: VSS, Subversion, and Git.
- Experienced in Tomcat, Apache, Splunk, New Relic.
- Strong experience in Chef cookbook development and in Ruby, Bash, and PowerShell scripting.
- Strong experience with Continuous Integration (CI) and Continuous Deployment (CD) using tools such as Jenkins.
- Experience with the Docker, Docker Compose, and Docker Machine tools; familiar with containerization technologies.
- Worked on integration of Pivotal Cloud Foundry with OpenStack.
SKILL:
Languages: Core Java, Python, MapReduce, Apache Spark, Perl, HiveQL.
Operating systems: Windows, UNIX, Linux, CentOS, AIX.
Databases: Oracle, Teradata, MS SQL Server, HBase, MongoDB, Netezza, DB2.
Data modeling/reporting tools: Erwin, ER/Studio, Visio, Tableau.
Other pursuits: data analytics, data mining, Lean Six Sigma, cost-benefit analysis, regression analysis. Learning new concepts in business analytics using data.
EXPERIENCE:
Cloud Engineer - Lead
Confidential, Wilmington, DE
Responsibilities:
- Led the migration of on-site solutions to cloud-based offerings such as Confidential's AWS
- Work extensively in the AWS cloud platform, including services such as EC2, RDS, S3, Route 53, VPC, Lambda, and CloudFormation
- Plan and lead the deployment of cloud solutions in production environments
- Develop scripts and glue code to integrate multiple software components. Worked on CI/CD in Jenkins Confidential to automate the build and deployment of the entire application; used GitHub and an Artifactory repository to push code and scripts.
- Automate the provisioning of environments with Chef recipes or Terraform, and deploy those environments using CloudFormation templates and containers such as Docker
- Worked with Splunk and New Relic for application log monitoring; deployed New Relic via automated CloudFormation templates
- Design and develop automation workflows, perform unit tests, and conduct reviews to ensure work is rigorously designed, elegantly coded, and effectively tuned for platform performance; assess the overall quality of delivered components
- Automated AWS components such as EC2 instances, security groups, ELB, RDS, and IAM through AWS CloudFormation templates; experienced in using the AWS CLI.
- Experience working with IAM to create new accounts, roles, and groups.
- Provisioned EC2 instances, RDS (MySQL) instances, VPC, S3, IAM, Route 53, and STS.
- Monitor real-time transactions for PCE (Pro Classic/Enterprise) using the New Relic application monitoring tool.
- Experience writing Dockerfiles and using Kitematic with Docker.
- Involved in POCs with Kubernetes and Docker for container orchestration; implemented Docker to deploy multiple microservices
- Used Marathon and Mesos (Mesosphere) to manage containers across AWS in multiple environments
- Strong verbal and written communication skills while working with dynamic collaborative engineering teams in different geographical locations, solving complex business problems together
- Self-driven; actively looks for ways to contribute and knows how to get things done
- Provided technical guidance to other engineers on the team
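The CloudFormation automation above (EC2 instances provisioned behind security groups from templates) can be sketched with a small template generator; this is a minimal, hypothetical illustration, and the AMI ID, resource names, and CIDR are placeholders, not values from the actual project.

```python
import json

def make_template(ami_id: str, instance_type: str = "t2.micro") -> dict:
    """Build a minimal CloudFormation template: one EC2 instance
    behind one security group. All identifiers are placeholders."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "WebSecurityGroup": {
                "Type": "AWS::EC2::SecurityGroup",
                "Properties": {
                    "GroupDescription": "Allow inbound HTTP",
                    "SecurityGroupIngress": [
                        {"IpProtocol": "tcp", "FromPort": 80,
                         "ToPort": 80, "CidrIp": "0.0.0.0/0"}
                    ],
                },
            },
            "WebInstance": {
                "Type": "AWS::EC2::Instance",
                "Properties": {
                    "ImageId": ami_id,
                    "InstanceType": instance_type,
                    "SecurityGroups": [{"Ref": "WebSecurityGroup"}],
                },
            },
        },
    }

# Render the template as JSON, ready to hand to the AWS CLI or console.
template = make_template("ami-0123456789abcdef0")
print(json.dumps(template, indent=2))
```

A template generated this way could be deployed with `aws cloudformation create-stack --stack-name web --template-body file://web.json`.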
Sr. DevOps Engineer
Confidential, Columbus, OH
Responsibilities:
- Worked on different projects inside Nationwide as a subject matter expert for Confidential, Confidential, and the underwriting desktop.
- Worked with relational databases (Oracle, SQL Server, Teradata, DB2, and Netezza) as an application data engineer responsible for application development under fast-paced Agile development methodologies.
- Worked on automating Confidential with Git, fetching the repo to execute the Confidential job for daily/weekly download jobs.
- Configured the Jenkins system, including adding the JDK and Gradle installations in Manage Jenkins.
- Created OpenStack components such as instances, networks, and volumes using the CLI.
- Automated the creation of OpenStack components with Heat templates written in YAML.
- Worked as a data engineer for Data & Analytics projects inside Nationwide; actively involved in implementing Hadoop, including administering Hadoop, designing, developing, and deploying big data applications on the Hadoop ecosystem, and resolving issues.
- Involved in extracting data from various sources into Hadoop HDFS for processing.
- Identified cross-functional dependencies; monitored and tracked release milestones.
- Developed PL/SQL packages, dynamic SQL, DML, DDL, Oracle tables, stored procedures, functions, cursors, triggers, and UNIX shell scripts.
- Worked with BI tools like Tableau for visual analysis and report sharing.
- Worked on a POC of Apache Spark for data processing and streaming using Python and big data on AWS and the Confidential Bluemix cloud.
- Worked on CA Technologies ESP for batch jobs and fixed performance issues on the databases, implementing industry best practices and standardizing processes to help business users.
- Worked ticket queues per SLA for the application customers.
- Worked on creating the release plan and on the definition, collection, analysis, and presentation of release project metrics on a weekly basis.
- Coordinated release efforts among various teams (Development, QA, Testing, and Business Analysis) in geographically separated environments.
- Presented the project's weekly status to senior management during weekly status meetings.
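The Heat-based OpenStack automation described above can be sketched as a minimal template; this is a hypothetical fragment, and the image name, flavor, and CIDR are placeholders rather than values from the actual environment.

```yaml
heat_template_version: 2016-04-08

description: Minimal sketch, one server on one network (all names are placeholders)

resources:
  app_net:
    type: OS::Neutron::Net
    properties:
      name: app-net

  app_subnet:
    type: OS::Neutron::Subnet
    properties:
      network: { get_resource: app_net }
      cidr: 10.0.0.0/24

  app_server:
    type: OS::Nova::Server
    properties:
      image: centos-7        # placeholder image name
      flavor: m1.small       # placeholder flavor
      networks:
        - subnet: { get_resource: app_subnet }
```

A stack like this could then be launched with `openstack stack create -t app.yaml app-stack`.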
Data Engineer & Release Engineer
Confidential, Columbus, OH
Responsibilities:
- Set up and configured Hudson/Jenkins to build, package, and deploy releases to development and QE servers; created and managed the CI build process.
- Supported Oracle, Teradata, and SQL Server databases and worked as a SQL developer.
- Worked on POC to migrate databases from SQL server to Oracle.
- Analyzed tables and indexes for performance tuning; optimized database objects and streamlined applications.
- Analyzed and recommended corrective actions and/or resolved release delivery failures.
- Reviewed log files to identify and resolve bottlenecks, ensuring optimal availability and performance. Managed deployment tasks, including creating audit, compliance, and deployment jobs; patching servers; automating builds and deployments; and engineering/updating packages and scripts.
- Coordinated Release process and Reverse Demand Management (RDM) activities; organized meetings and interactions with impacted teams and led development of RDM game plan.
Database Engineer
Confidential, Memphis, TN
Responsibilities:
- Supported Oracle and Teradata databases.
- Extensively used advanced PL/SQL features such as records, tables, object types, and dynamic SQL.
- Administered Oracle networking using SQL*Net, TCP/IP, and listener setup for Oracle databases.
- Created and modified SQL*Plus, PL/SQL and SQL*Loader scripts for data conversions.
- Performed daily health checks of the database using STATSPACK, AWR, and ADDM.
- Analyzed tables and indexes for performance tuning; optimized database objects and streamlined applications.
- Took periodic backups of the databases and software using RMAN.
PL/SQL Developer
Confidential
Responsibilities:
- Extensively used advanced PL/SQL features such as records, tables, object types, and dynamic SQL.
- Worked and developed PL/SQL packages.
- Created tables, views, constraints, and indexes (B-tree, bitmap, and function-based).
- Developed materialized views for data replication in distributed environments.
- Created PL/SQL scripts to extract data from the operational database into flat text files using the UTL_FILE package.
- Wrote conversion scripts using SQL, PL/SQL, stored procedures, functions and packages to migrate data from SQL server database to Oracle database.
- Created and modified SQL*Plus, PL/SQL and SQL*Loader scripts for data conversions.
- Analyzed tables and indexes for performance tuning; optimized database objects and streamlined applications.
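The flat-file extraction described above used PL/SQL's UTL_FILE; the same idea can be sketched in Python, with SQLite standing in for the operational database. The table and column names here are hypothetical, chosen only for illustration.

```python
import csv
import sqlite3

def export_to_flat_file(conn, table: str, out_path: str) -> int:
    """Dump every row of `table` to a pipe-delimited flat text file,
    with a header row of column names. Returns the row count."""
    cur = conn.execute(f"SELECT * FROM {table}")
    rows = cur.fetchall()
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f, delimiter="|")
        writer.writerow([col[0] for col in cur.description])  # header row
        writer.writerows(rows)
    return len(rows)

# Demo against an in-memory stand-in database (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (policy_id INTEGER, holder TEXT)")
conn.executemany("INSERT INTO policies VALUES (?, ?)",
                 [(1, "Smith"), (2, "Jones")])
print(export_to_flat_file(conn, "policies", "policies.txt"))  # prints 2
```

In the Oracle original, the open/write/close cycle would instead be UTL_FILE.FOPEN, UTL_FILE.PUT_LINE, and UTL_FILE.FCLOSE inside a PL/SQL block.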