AWS Cloud Developer Resume
SUMMARY
- Around 12 years of experience in the analysis, design, development, and support of various applications and environments, including AWS Cloud, DevOps, Hyperion EPM, Ab Initio, Orchestra EBX, and Collibra.
- Strong experience with AWS cloud resources, including EC2, VPC, security groups, subnets, EMR clusters, load balancers, Auto Scaling groups, SQS, Lambda, EBS volumes, Route 53, S3 buckets, and One Lake.
- Good experience with database systems such as AWS RDS and Redshift.
- Involved in gathering requirements, designing architecture diagrams, and writing development road map technical documentation.
- Migrated, installed, and configured various applications in on-premises and AWS cloud environments.
- Created security groups with inbound/outbound port rules and subnets under specific VPCs to build EC2 instances with controlled access.
- Used Jenkins and GitHub to automate the deployment process for setting up and building environments and applications.
- Used GitHub to store and maintain code changes.
- Used CloudFormation templates and Terraform to build Linux and Windows AWS instances; involved in creating the related JSON template files.
- Wrote PowerShell, shell, and JSON files to deploy and automate routine tasks.
- Created and supported S3 buckets/One Lake; assigned bucket policies and access, and enabled versioning and cross-region replication.
- Created and assigned policies to IAM profiles to control access from EC2 to resources such as S3, Lambda, and RDS.
- Strong experience creating EBS volumes, taking snapshots, and copying snapshots from one region to another (a minimal sketch follows this list).
- Good knowledge of setting up ELBs, ALBs, and Auto Scaling groups as needed.
- Good experience creating and setting up RDS databases, Lambda functions, Route 53, and SQS.
- Good experience in Python, PowerShell and Shell scripting.
- Excellent communication skills and a versatile team player with good analytical, presentation, and interpersonal skills.
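A minimal sketch of the cross-region EBS snapshot copy mentioned above, assuming boto3 and a hypothetical snapshot ID and region names:

```python
import boto3

# Hypothetical values for illustration; actual snapshot IDs and regions vary.
SOURCE_REGION = "us-east-1"
DEST_REGION = "us-west-2"
SOURCE_SNAPSHOT_ID = "snap-0123456789abcdef0"

# copy_snapshot is issued against the destination region's EC2 endpoint.
ec2_dest = boto3.client("ec2", region_name=DEST_REGION)

response = ec2_dest.copy_snapshot(
    SourceRegion=SOURCE_REGION,
    SourceSnapshotId=SOURCE_SNAPSHOT_ID,
    Description="Cross-region snapshot copy for DR",
)
print("New snapshot ID:", response["SnapshotId"])
```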
TECHNICAL SKILLS
AWS Cloud & DevOps: AWS Cloud, S3, EC2, RDS, EBS volumes, IAM, security groups, VPC, subnets, EMR, Jenkins, PowerShell, Python, GitHub, CloudFormation, CloudWatch, SNS, ELB, ASG, Databricks, Athena, One Lake
Financial & ETL tools: Hyperion Essbase, Planning, HFM, TM1, Cognos, OBIEE, Collibra, Ab Initio/MDH, Oracle EBS, EBX, PeopleSoft, Informatica PowerCenter
Databases: MySQL, Oracle, AWS RDS, Redshift, PostgreSQL
Reporting Tools: Hyperion Financial Reports, Smart View, Excel Add-In
PROFESSIONAL EXPERIENCE
Confidential
AWS Cloud Developer
Responsibilities:
- Involved in implementing/migrating Collibra applications into AWS Cloud and upgrading across versions from 4.6 to 5.3 and then to 5.6.
- Set up the AWS cloud environment using EC2, VPC, security groups, subnets, load balancers, Auto Scaling groups, SQS, Lambda, Route 53, S3 buckets, and EBS volumes.
- Installed the DGC, Console, Rep, and Connect servers on four AWS EC2 Linux instances and configured all four servers to be accessible from the Console web URL.
- Involved in creating various JSON and Terraform files to build Linux servers.
- Worked with the Redshift team to create clusters for the reporting mart that stores data from different source systems, and maintained the cluster configuration, the endpoint applications use to connect to the database, backups, and the read replica in the west region.
- Rehydrated AWS cloud applications every 60 days with the latest AMI to pick up OS updates, plugins, and patches.
- Set up security groups, subnets, EC2, and IAM under specific VPCs to build EC2 instances with access to other source and target systems.
- Strong experience creating EBS volumes, taking snapshots, copying snapshots from one region to another, attaching/detaching volumes, and extending volumes for Linux and Windows instances.
- Used Git and Jenkins to store code and deploy code changes.
- Involved in all environment tasks, such as increasing CPUs, adding memory, adding disk space, installing SSL certificates, performance tuning, and automating wherever possible.
- Involved in the complete disaster recovery process for the AWS cloud environment from the east region to the west region.
- Monitored Control-M and AROW jobs to make sure all source system data was copied to S3 buckets and then moved to Redshift.
- Validated the data in Redshift database tables using SQL Workbench.
- Used Python scripting in Lambda functions to automate processes and deploy APIs (see the Lambda sketch after this list).
- Maintained and supported Docker containers in the Linux environment to deploy APIs.
- Set up CloudWatch Events and Datadog to monitor and store logs for analysis and troubleshooting.
- Worked closely with the business team to fix data load issues in Redshift for analysis and report generation.
- Involved in setting up One Lake and assigning read/write access to PUT/GET files to/from the lake.
- Set up cross-region and cross-account S3 bucket access by assuming roles in other accounts.
- Involved end to end in the disaster recovery process to recover applications in another region without data loss.
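A minimal sketch of the kind of Lambda automation described above (cross-account role assumption plus an S3 copy), assuming boto3 and hypothetical role ARN, bucket names, and event shape:

```python
import boto3

# Hypothetical ARN and bucket names for illustration only.
CROSS_ACCOUNT_ROLE_ARN = "arn:aws:iam::111122223333:role/s3-access-role"
SOURCE_BUCKET = "source-data-bucket"
TARGET_BUCKET = "target-data-bucket"


def lambda_handler(event, context):
    """Assume a role in another account and copy an object between buckets."""
    # Obtain temporary credentials for the cross-account role.
    sts = boto3.client("sts")
    creds = sts.assume_role(
        RoleArn=CROSS_ACCOUNT_ROLE_ARN,
        RoleSessionName="lambda-s3-copy",
    )["Credentials"]

    # Build an S3 client that uses the temporary credentials.
    s3 = boto3.client(
        "s3",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

    # The object key is expected in the triggering event (hypothetical shape).
    key = event.get("object_key", "sample/file.csv")
    s3.copy_object(
        Bucket=TARGET_BUCKET,
        Key=key,
        CopySource={"Bucket": SOURCE_BUCKET, "Key": key},
    )
    return {"status": "copied", "key": key}
```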
Confidential
AWS Cloud Developer
Responsibilities:
- Involved in production support of Hyperion and ETL tools and migrated them into the AWS cloud environment.
- Reviewed the current Oracle Hyperion server setup and configuration details in the on-premises environment to plan the migration into AWS Cloud.
- Designed the architecture diagram and road map design document for the AWS cloud environment.
- Created a VPC and a public subnet within the VPC to host a single-node EBS instance.
- Created new security groups with inbound and outbound port rules to control/restrict public access to the application.
- Set up a new IAM role with the required policies to allow access to resources such as S3 and RDS.
- Installed Hyperion Essbase, Planning, and HFM on three different EC2 instances and configured them to communicate with each other.
- Created the RDS database with cross-region replication in case of disaster.
- Selected the right AWS AMI based on the required/supported configurations for the Oracle EBS software.
- Chose an AMI with the CPUs and memory required to run the application.
- Attached root and additional EBS volumes to the AWS EC2 instance to install the OS and the Oracle Applications file system.
- Reserved an Elastic IP (a public IPv4 address reachable from the Internet) and assigned it to the EC2 instance.
- Installed required RPMs and performed OS prerequisite steps such as creating the necessary OS groups and users, modifying /etc/hosts, and setting the required kernel parameters.
- Performed a single-node install of the EBX 12.2 software on an AWS EC2 instance.
- Migrated the PeopleSoft system into AWS Cloud.
- Registered a domain using Route 53.
- Made the required changes to Oracle Applications to use the domain registered in Route 53.
- Created record sets in Route 53 and launched the Oracle Applications URL (see the Route 53 sketch after this list).
- Copied database snapshots to S3 buckets as backups to restore in case of disaster.
- Assigned DNS to access the front-end application page through a web browser.
- Migrated application artifacts (data, metadata, configuration, and security files) from Development to UAT/Production environments using LCM.
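A minimal sketch of creating the Route 53 record set described above, assuming boto3 and a hypothetical hosted zone ID, domain name, and Elastic IP:

```python
import boto3

# Hypothetical hosted zone ID, record name, and Elastic IP for illustration.
HOSTED_ZONE_ID = "Z0123456789ABCDEFGHIJ"
RECORD_NAME = "apps.example.com"
ELASTIC_IP = "203.0.113.10"

route53 = boto3.client("route53")

# UPSERT an A record so the application URL resolves to the instance's Elastic IP.
route53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={
        "Comment": "Point the application URL at the EC2 Elastic IP",
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": RECORD_NAME,
                    "Type": "A",
                    "TTL": 300,
                    "ResourceRecords": [{"Value": ELASTIC_IP}],
                },
            }
        ],
    },
)
```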
Confidential
Technical Lead
Responsibilities:
- Provided 24/7 support for the existing LEVIS production environment across the APD, LSE, and LSA regions to keep the business running without interruption.
- Maintained the existing EC2 instances, S3 buckets, EBS volumes, and snapshots.
- Completed daily/monthly/quarterly/yearly tasks as scheduled.
- Standardized financial data and metrics across the three regions and Global.
- Loaded data from different source systems such as SAP, SQL Server, and Teradata.
- Loaded metadata from source systems into the staging (SQL Server) area and then built it in Essbase.
- Provided production support for the PeopleSoft source system that generates data files for downstream Hyperion applications.
- Used ASO cubes to load data and transfer it to the BSO cube for the forecast process.
- Created users and groups and provisioned them using Shared Services.
- Troubleshot LEVIS production support issues within the SLA timelines.
- Implemented MaxL and batch scripts for cube builds and data loads.
- Implemented calc scripts as per requirements
- Performed manual data loads for non-ECC cubes such as Russia, LFA, and LEVIS XX using templates.
- Ran the manual restatement process based on user requests to copy data from the source cube to the target cube for specific periods.
- Used CloudFormation templates to create/build instances by passing parameters and a JSON template file (see the CloudFormation sketch after this list).
- Maintained the RDS database on AWS Cloud.
- Started and stopped servers on AWS Cloud and rebuilt instances if terminated.
- Involved in enhancements to the existing system based on new requirements.
- The Hyperion system receives financial data from the PeopleSoft system.
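A minimal sketch of launching a CloudFormation stack by passing parameters and a JSON template, as referenced above; the stack name, template file, and parameter keys are hypothetical:

```python
import boto3

# Hypothetical stack name, template file, and parameters for illustration.
STACK_NAME = "hyperion-ec2-stack"
TEMPLATE_PATH = "ec2-instance.json"

with open(TEMPLATE_PATH) as f:
    template_body = f.read()

cloudformation = boto3.client("cloudformation", region_name="us-east-1")

# Create the stack, passing values for parameters defined in the JSON template.
cloudformation.create_stack(
    StackName=STACK_NAME,
    TemplateBody=template_body,
    Parameters=[
        {"ParameterKey": "InstanceType", "ParameterValue": "t3.large"},
        {"ParameterKey": "KeyName", "ParameterValue": "prod-keypair"},
    ],
)

# Block until the stack finishes creating before using its resources.
waiter = cloudformation.get_waiter("stack_create_complete")
waiter.wait(StackName=STACK_NAME)
```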