
AWS Cloud Developer Resume

PROFESSIONAL SUMMARY:

  • Around 12 years of experience in the analysis, design, development, and support of applications and environments including AWS Cloud, DevOps, Hyperion EPM, Ab Initio, Orchestra EBX, and Collibra.
  • Very good experience with a range of AWS resources: EC2, VPC, security groups, subnets, EMR clusters, load balancers, Auto Scaling groups, SQS, Lambda, EBS volumes, Route 53, S3 buckets, and One Lake.
  • Good experience with database systems such as AWS RDS and Redshift.
  • Involved in gathering requirements, designing architecture diagrams, and producing road-map technical documentation.
  • Migrated, installed, and configured applications across on-premises and AWS cloud environments.
  • Created security groups with inbound/outbound port rules and subnets under specific VPCs to build EC2 instances with controlled access.
  • Used Jenkins and GitHub to automate deployments and to set up and build environments and applications.
  • Used GitHub to store and maintain code changes.
  • Used CloudFormation templates and Terraform to build Linux and Windows AWS instances; involved in creating the related JSON files.
  • Wrote PowerShell, JSON, and shell scripts for deployment and automation.
  • Created and supported S3 buckets/One Lake: assigned bucket policies and access, and enabled versioning and cross-region replication.
  • Created and assigned policies to IAM profiles to control access from EC2 to resources such as S3, Lambda, and RDS.
  • Very good experience creating EBS volumes, taking snapshots, and copying snapshots from one region to another.
  • Good knowledge of setting up ELBs, ALBs, and Auto Scaling groups as needed.
  • Good experience creating and setting up RDS databases, Lambda functions, Route 53, and SQS.
  • Good experience in Python, PowerShell, and shell scripting.
  • Excellent communication skills; a versatile team player with strong analytical, presentation, and interpersonal skills.
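As an illustration of the cross-region snapshot work above, a minimal Python sketch (snapshot IDs, regions, and descriptions are placeholders, and the actual boto3 call is shown commented out):

```python
def snapshot_copy_request(snapshot_id, source_region, description):
    """Build the parameter set for copying an EBS snapshot into another
    region. All values passed in are placeholders for illustration."""
    return {
        "SourceSnapshotId": snapshot_id,
        "SourceRegion": source_region,
        "Description": description,
        "Encrypted": True,  # re-encrypt the copy in the target region
    }

# With boto3, the copy is issued from a client in the *target* region, e.g.:
# import boto3
# ec2 = boto3.client("ec2", region_name="us-west-2")
# ec2.copy_snapshot(**snapshot_copy_request("snap-0abc123", "us-east-1", "DR copy"))
```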

SKILLS:

AWS Cloud & DevOps: AWS Cloud, S3, EC2, RDS, EBS volumes, IAM, security groups, VPC, subnets, EMR, Jenkins, PowerShell, Python, GitHub, CloudFormation, CloudWatch, SNS, ELB, ASG, Databricks, Athena, One Lake.

Financial & ETL tools: Hyperion Essbase, Planning, HFM, TM1, Cognos, OBIEE, Collibra, Ab Initio/MDH, Oracle EBS, EBX, PeopleSoft, Informatica PowerCenter

Databases: MySQL, Oracle, AWS RDS, Redshift, PostgreSQL

Reporting Tools: Hyperion Financial Reports, Smart View, Excel Add-In

PROFESSIONAL EXPERIENCE:

Confidential

Environment: AWS Cloud, EC2, S3, JSON, EBS volumes, One Lake, Route 53, GitHub, Jenkins, Redshift, AWS RDS MySQL, Linux, Collibra 4.6/5.3.2/5.6, AWS Glue, Docker, Lambda

AWS Cloud Developer

Responsibilities:

  • Involved in implementing/migrating Collibra applications into AWS Cloud and upgrading them from version 4.6 to 5.3 and then 5.6.
  • Set up the AWS cloud environment using EC2, VPC, security groups, subnets, load balancers, Auto Scaling groups, SQS, Lambda, EBS volumes, Route 53, and S3 buckets.
  • Installed the DGC, Console, Repository, and Connect servers on four AWS EC2 Linux instances and configured all four to be accessible from the Console web URL.
  • Involved in creating JSON and Terraform files to build Linux servers.
  • Worked with the Redshift team to create clusters for the reporting mart storing data from different source systems; maintained cluster configurations, application connection endpoints, backups, and a read replica in the west region.
  • Rehydrated AWS cloud applications every 60 days with the latest AMI for updated OS, plugins, and patches.
  • Set up security groups, subnets, EC2, and IAM under specific VPCs to build EC2 instances with access to other source and target systems.
  • Created EBS volumes and snapshots, copied snapshots between regions, attached/detached volumes, and extended volumes on Linux and Windows instances.
  • Used Git and Jenkins to store code and deploy code changes.
  • Handled environment issues such as adding CPUs, memory, and disk space, installing SSL certificates, performance tuning, and automating wherever possible.
  • Involved in the complete disaster recovery process on the AWS cloud environment from the east region to the west region.
  • Monitored Control-M and AROW jobs to ensure all source-system data was copied to S3 buckets and then moved to Redshift.
  • Validated data in Redshift database tables using SQL Workbench.
  • Used Python in Lambda functions to automate processes and deploy APIs.
  • Maintained and supported Docker containers on Linux to deploy APIs.
  • Set up CloudWatch Events and the Datadog tool to monitor and store logs for analysis and troubleshooting.
  • Worked closely with the business team to fix data load issues in Redshift for analysis and report generation.
  • Involved in setting up One Lake and assigning read/write access to PUT/GET files from/to the lake.
  • Set up cross-region and cross-account S3 bucket access by assuming roles in other accounts.
  • Involved in end-to-end disaster recovery to restore applications in another region without data loss.
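A Lambda handler of the kind described above (the event shape follows the standard S3 notification format; the downstream Redshift COPY step is omitted as a placeholder) could be sketched as:

```python
def lambda_handler(event, context):
    """Collect (bucket, key) pairs from an S3 PUT notification so a
    downstream step can COPY the objects into Redshift. The Redshift
    step itself is omitted in this sketch."""
    objects = [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in event.get("Records", [])
        if "s3" in r
    ]
    return {"count": len(objects), "objects": objects}
```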

Confidential

Environment: AWS Cloud, EC2, S3, EMR, Python, GitHub, Jenkins, Redshift, Linux, Oracle Hyperion EPM modules (HFM, Planning, Essbase), EBX, PeopleSoft, One Lake, Terraform, JSON

AWS Cloud Developer

Responsibilities:

  • Involved in production support of Hyperion and ETL tools and migrated them into the AWS cloud environment.
  • Reviewed the current on-premises Oracle Hyperion server setup and configuration details to plan the migration to AWS Cloud.
  • Designed the architecture diagram and road-map design document for the AWS cloud environment.
  • Created a VPC and a public subnet within it to host a single-node Oracle EBS instance.
  • Created new security groups with inbound and outbound port rules to control/restrict public access to the application.
  • Set up a new IAM role with the required policies to allow access to resources such as S3 and RDS.
  • Installed Hyperion Essbase, Planning, and HFM on three separate EC2 instances and configured them to communicate with each other.
  • Created the RDS database and cross-region replication in case of disaster.
  • Selected the right AWS AMI based on the required/supported configurations for Oracle EBS software.
  • Chose an AMI with the CPUs and memory required to run the application.
  • Attached root and additional EBS volumes to the AWS EC2 instance to install the OS and the Oracle application file system.
  • Reserved an Elastic IP (a public IPv4 address reachable from the Internet) and assigned it to the EC2 instance.
  • Installed required RPMs and performed OS prerequisite steps: creating the necessary OS groups and users, modifying /etc/hosts, and setting required kernel parameters.
  • Performed a single-node install of EBX 12.2 software on an AWS EC2 instance.
  • Migrated the PeopleSoft system into AWS Cloud.
  • Registered a domain using Route 53.
  • Made the required changes to Oracle Applications to use the domain registered in Route 53.
  • Created a record set in Route 53 and launched the Oracle Applications URL.
  • Copied database snapshots to S3 buckets as backups for restore in case of disaster.
  • Assigned DNS to access the front-end application page through a web browser.
  • Migrated application artifacts (data, metadata, configuration, security files) from Development to UAT/Production environments using LCM.
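The Route 53 record-set step above amounts to submitting a change batch; a minimal sketch (the domain, target, and hosted-zone ID are hypothetical, with the boto3 call shown commented out):

```python
def alias_change_batch(domain, target, action="UPSERT"):
    """Build the ChangeBatch payload that points `domain` at `target`
    with a CNAME record. Names here are placeholders."""
    return {
        "Comment": f"Point {domain} at the application front end",
        "Changes": [
            {
                "Action": action,
                "ResourceRecordSet": {
                    "Name": domain,
                    "Type": "CNAME",
                    "TTL": 300,
                    "ResourceRecords": [{"Value": target}],
                },
            }
        ],
    }

# Submitted via boto3 (hosted-zone ID is hypothetical):
# import boto3
# boto3.client("route53").change_resource_record_sets(
#     HostedZoneId="Z0EXAMPLE",
#     ChangeBatch=alias_change_batch("apps.example.com",
#                                    "lb-123.us-east-1.elb.amazonaws.com"),
# )
```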

Confidential

Environment: Hyperion Essbase 11.1.2.3, Shared Services 11.1.2.3, AWS Cloud, S3, EC2, Python, SQL Server, GitHub, PeopleSoft HR and Finance (SQL Server), BSO & ASO

Technical Lead

Responsibilities:

  • Provided 24/7 production support for the existing Confidential production environment across the APD, LSE, and LSA regions to keep the business running without interruption.
  • Maintained the existing EC2 instances, S3 buckets, EBS volumes, and snapshots.
  • Completed daily/monthly/quarterly/yearly tasks as scheduled.
  • Standardized financial data and metrics across the three regions and Global.
  • Loaded data from source systems such as SAP, SQL Server, and Teradata.
  • Loaded metadata from source systems into the staging (SQL Server) area and then built it in Essbase.
  • Provided production support for the PeopleSoft source system generating data files for downstream Hyperion applications.
  • Used ASO cubes to load data and transfer it to BSO cubes for the forecast process.
  • Created users and groups and provisioned them using Shared Services.
  • Troubleshot Confidential production support issues within SLA timelines.
  • Implemented MaxL and batch scripts for cube builds and data loads.
  • Implemented calc scripts per requirements.
  • Performed manual data loads for non-ECC cubes such as Russia, LFA, and Confidential XX using templates.
  • Performed manual restatement, copying data from source to target cubes for specific periods based on user requests.
  • Used CloudFormation templates to create/build instances by passing parameters and a JSON file.
  • Maintained the RDS database on AWS Cloud.
  • Started and stopped servers on AWS Cloud; rebuilt instances if terminated.
  • Involved in enhancements to the existing system based on new requirements.
  • The Hyperion system receives financial data from the PeopleSoft system.
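Passing parameters to a CloudFormation build, as in the bullet above, means converting plain key/value pairs into the Parameters list the API expects; a small sketch (stack name, template, and parameter names are placeholders):

```python
def cfn_parameters(values):
    """Convert a plain dict into the Parameters list accepted by
    CloudFormation's create_stack/update_stack calls."""
    return [
        {"ParameterKey": key, "ParameterValue": str(value)}
        for key, value in sorted(values.items())
    ]

# Example boto3 usage (template body and stack name are placeholders):
# import boto3
# boto3.client("cloudformation").create_stack(
#     StackName="hyperion-ec2",
#     TemplateBody=open("instance.json").read(),
#     Parameters=cfn_parameters({"InstanceType": "t3.large", "KeyName": "prod-key"}),
# )
```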

Confidential

Environment: Hyperion Essbase 11.1.1.3, Shared Services 11.1.1.3, SQL, ASO, BSO, Control-M, EIS, PeopleSoft

Developer

Responsibilities:

  • Designed and developed Essbase cubes and provided production support for the existing Essbase system.
  • Monitored Control-M daily, weekly, and monthly jobs for cube builds and data loads.
  • Performed cube builds and data loads manually, and with EIS when required.
  • Troubleshot production support issues within SLA timelines.
  • Implemented calc scripts, MaxL, and batch scripts for cube builds and data loads.
  • Created users and groups and provisioned them using Shared Services.
  • Performed day-to-day activities such as data, log, and outline backups, restoring them into the system when required.
  • Worked closely with end users to resolve their access and data issues.
  • Migrated the Essbase system from Oracle database to Exadata.
  • Involved in year-end rollover changes and DR activities per organization norms.
  • Migrated applications from Development to UAT/Production environments using LCM.
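Batch wrappers for MaxL jobs like those above typically shell out to the MaxL interpreter; a minimal Python sketch (the `essmsh -l user password -s server script` invocation form is assumed, and all credentials, server names, and paths are placeholders):

```python
def maxl_command(script_path, user, password, server):
    """Assemble the essmsh invocation a batch wrapper would run for a
    cube build or data load. All arguments are placeholders."""
    return ["essmsh", "-l", user, password, "-s", server, script_path]

# To actually execute (requires an Essbase client install):
# import subprocess
# subprocess.run(maxl_command("nightly_load.mxl", "admin", "secret", "essbase01"),
#                check=True)
```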

Confidential

Environment: Hyperion Essbase 11.1.2.3, Planning 11.1.2.3, Smart View, SQL Server, EPMA, DRM

Team Member

Responsibilities:

  • Gathered requirements from the customer.
  • Created the Planning application based on customer requirements.
  • Loaded metadata and data into EPMA using ADS files and interface tables.
  • Created Oracle schemas for configuring Hyperion products and creating the Planning application.
  • Implemented Planning web forms for users to enter actual adjustments and forecast data.
  • Implemented business rules per customer requirements.
  • Used the copy-version function to copy data from version to version.
  • Involved in security administration through Shared Services.
  • Created users and groups and provisioned access based on customer requests.
  • Assigned member-level security to the application.
  • Performed Hyperion services maintenance.
  • Migrated applications from Development to UAT/Production environments using LCM.
  • Involved in testing of migrated Hyperion Essbase and Planning applications.
  • Migrated Essbase and Planning applications from Classic to EPM System 11.1.2.
  • Implemented the shared-library concept in EPMA.
  • Used DRM to maintain hierarchies shared across Planning and Essbase applications.

Confidential

Environment: Hyperion Essbase 6.2, Essbase Add-Ins, Web Analyzer

Team Member

Responsibilities:

  • Provided application maintenance support for the Hyperion Essbase production environment.
  • Provided production support for existing cubes and ensured Support Central tickets were updated per the SLA.
  • Monitored auto-scheduled jobs and loaded data manually from client data files for some regions.
  • Enhanced and implemented new Essbase cubes based on customer requirements.
  • Optimized and tuned Essbase cubes to improve application performance.
  • Developed and modified Analyzer reports and set user/group access rights.
  • Created, modified, and supported Essbase calculation scripts.
  • Involved in security administration: creating users/groups and filters.
  • Maintained and monitored scheduled jobs.
  • Performed manual testing.
