AWS Engineer Resume
SUMMARY:
- AWS-certified cloud engineer with 6+ years of experience in the IT industry spanning systems administration, change management, software configuration management (SCM), build and release management, and process engineering.
- 5 years of hands-on experience with Amazon Web Services, including EC2, VPC, S3, and ELB.
- 2 years of experience designing and architecting serverless web applications using AWS Lambda and API Gateway (see the sketch at the end of this summary section).
- Experienced in automating, configuring, and deploying instances in AWS, Azure, and data center environments; familiar with EC2, CloudWatch, CloudFormation, and managing security groups on AWS.
- Private cloud environments: leveraged AWS and Puppet to rapidly provision internal computer systems for various clients.
- Demonstrated understanding of AWS data migration tools and services, including Storage Gateway, Database Migration Service, and Import/Export.
- Hands-on experience with Microsoft Azure cloud services (PaaS and IaaS), Storage, Web Apps, Active Directory, Application Insights, Internet of Things (IoT), Azure Search, Key Vault, and SQL Azure.
- Good understanding of networking concepts including TCP, VPN, VPC, NAT, subnets, DNS, gateways, and routers.
- Around 9 years of experience developing and supporting middleware applications using TIBCO/EAI components such as BusinessWorks, Designer, Hawk, Administrator, Integration Manager, BusinessConnect, Smart Mapper, ADB Adapter, Adapter for Oracle Apps, File Adapter, and MQ Series Adapter.
- Expertise in business process management (BPM) solution development.
- Solid working knowledge of business analysis and modeling, process design, performance analysis, XML, SQL, and networking protocols.
- Expertise in business process modeling, simulation, and implementation using TIBCO Business Studio, BusinessEvents, and TIBCO iProcess Modeler.
- Working knowledge of TIBCO General Interface.
- Experience using industry messaging products such as TIBCO Rendezvous and EMS for developing distributed applications.
- Good working knowledge of user management systems and domain monitoring.
- Experience developing user-friendly web GUIs using Servlets, JSP, JavaScript, HTML, and XHTML.
- Proficient in programming and data modeling in relational databases such as SQL Server.
- Hands-on experience using Mule connectors such as FTP, File, SFTP, IMAP, Salesforce, and NetSuite as part of integration work.
- Knowledge of developing middle-tier applications using an Enterprise Service Bus (ESB).
- Good working knowledge of the Mule DataWeave component; worked on exposing APIs using RAML.
- Deployed Mule ESB applications into MMC (Mule Management Console).
- Experience with various Mule connectors/adapters and with developing APIs.
- Experience with the MuleSoft Anypoint Platform for designing and implementing Mule APIs.
- Integrated Mule ESB systems using MQ Series, HTTP, file system, and SFTP transports.
- Developed applications in the Anypoint Studio 5.4.3 IDE and used RAML 0.8 for API definitions.
- Extensively used Mule components that include File, SMTP, FTP, SFTP, JDBC Connector, and Transaction Manager.
- Working knowledge of API management using Anypoint API management tools.
- Expertise in SOA and ESB; involved in integrations with Salesforce.
- Experience using FindBugs and PMD tools to write efficient code.
- Experience in tracking defects, issues, risks using HP Quality Center.
- Experience in using version control systems like SVN, CVS and IBM ClearCase.
- Experience performing impact analysis and creating design documents for various implementations using Microsoft Visio and Rational Rose.
- Strong communication and presentation skills to work at all levels of the organization, with clients, management, and across teams and time zones.
- Excellent reputation for delivering innovative on-time, within-budget solutions resulting in increased efficiency, better business processes, and growth.
- Good experience in the banking domain.
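Illustrative sketch of the serverless pattern referenced above: a minimal AWS Lambda handler behind an API Gateway proxy integration, written in Python with boto3. The table name ("orders") and the path parameter are hypothetical placeholders rather than details of any particular project.

    import json
    import boto3

    # Hypothetical DynamoDB table; replace with a real resource name.
    TABLE = boto3.resource("dynamodb").Table("orders")

    def lambda_handler(event, context):
        # API Gateway (proxy integration) passes path parameters in the event.
        order_id = (event.get("pathParameters") or {}).get("id")
        if not order_id:
            return {"statusCode": 400, "body": json.dumps({"error": "missing id"})}

        item = TABLE.get_item(Key={"id": order_id}).get("Item")
        if item is None:
            return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

        # Proxy integrations expect a statusCode/body response shape.
        return {"statusCode": 200, "body": json.dumps(item, default=str)}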
TECHNICAL SKILLS:
Public cloud: Amazon Web Services (VPC, EC2, Lambda, API Gateway, S3, EBS, RDS, IAM, Route 53, CloudFront, DynamoDB, CloudWatch).
TIBCO SOA products: TIBCO AMX/ActiveMatrix 2.x/3.x, TIBCO ActiveMatrix BusinessWorks 5.x, TIBCO Administrator 5.x, TIBCO Hawk 4.x, TIBCO Adapter for ActiveDatabase 6.x, TIBCO Adapter for Files 6.x, TIBCO Adapter for MQSeries.
TIBCO B2B Integration: TIBCO Business Connect 5.x
TIBCO BPM Products: TIBCO Business Studio, TIBCO iProcess Suite, AMX BPM, TIBCO Designer.
TIBCO Messaging Products: TIBCO Rendezvous 7.5.x, TIBCO Enterprise Message Service (EMS).
TIBCO BO Products: TIBCO General Interface 3.x, TIBCO Business Events 3.0.
Version control/repository systems: TIBCO XML Canon, ClearCase, SVN, Microsoft VSS.
Databases: Oracle 9i/10g, MySQL, MS SQL.
Operating Systems: Windows NT/2000/XP, Linux/Ubuntu.
General Skills: Client interaction, design, development, code reviews, and bug tracking.
Monitoring Tools: Dynatrace, SolarWinds, PagerDuty.
DevOps Tools: Puppet, Chef, Ansible, Terraform, Docker, and Splunk.
PROFESSIONAL EXPERIENCE:
Confidential
AWS Engineer
Responsibilities:
- Built S3 buckets, managed bucket policies, and used S3 and Glacier for storage and backup on AWS (see the sketch at the end of this list).
- Built a VPC and established a site-to-site VPN connection between the data center and AWS.
- Developed push-button deployment automation for application teams across multiple environments such as Dev, QA, and Production.
- Involved in designing and deploying a multitude of applications utilizing much of the AWS stack (including EC2, Route 53, S3, RDS, DynamoDB, SNS, SQS, and IAM), focusing on high availability, fault tolerance, and auto scaling.
- Experience in AWS networking: Direct Connect, VPC, NACLs, security groups, etc.
- Migrated the existing Linux environment to an AWS RHEL environment and used the auto scaling feature.
- Increased EBS-backed volume storage capacity when the root volume was full using the EBS volume resize feature.
- Created AWS Route 53 records to route traffic between different regions.
- Created users and groups using IAM and assigned individual policies to each group.
- Created SNS notifications and assigned the topic ARN to S3 for object-loss notifications.
- Created load balancers (ELB) and used Route 53 with failover and latency routing policies for high availability and fault tolerance.
- Configured security groups for EC2 Windows and Linux instances and for the Puppet master and Puppet agents.
- Experience running Docker container instances on Amazon ECS to deploy multiple Tomcat application servers.
- Worked on configuring monitoring tools such as Nagios, Splunk, and Zabbix.
- Worked on JIRA for defect/issue logging and tracking, and documented work using Confluence.
- Managed AWS EC2 instances utilizing Auto Scaling, Elastic Load Balancing, and Glacier for QA and UAT environments as well as infrastructure servers for Git and Puppet.
- Prior experience developing CloudFormation scripts for AWS orchestration with Chef and/or Puppet.
- Prior experience with automated build pipelines and continuous integration/continuous deployment (CI/CD) using Jenkins.
- Experience with CloudEndure to migrate on-premises workloads to the AWS cloud.
- Migrated F5 BIG-IP load balancer hardware from the current data center to a new location.
- Designed application network traffic flows from the old data center to the new one.
- Designed and implemented an Okta cloud-based single sign-on (SSO) solution for applications.
- Extensive participation in designing centralized workforce authentication and authorization for data center servers using Centrify Infrastructure Services.
- Designed and implemented MFA for browser applications and servers to meet SOX compliance.
- Conducted cloud PoC research projects (AWS, Google Cloud, Microsoft Azure, IBM Cloud Private, Skytap) to evaluate workload migration.
- Designed architecture for networks, cloud computing (SaaS, PaaS, IaaS), and cybersecurity.
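Illustrative sketch of the S3/SNS setup described in this role: creating a bucket, creating an SNS topic, and wiring S3 object-loss events to the topic with boto3. The bucket name, topic name, and region are assumptions for illustration only.

    import boto3

    s3 = boto3.client("s3", region_name="us-east-1")    # region is an assumption
    sns = boto3.client("sns", region_name="us-east-1")

    # Hypothetical bucket and topic names.
    s3.create_bucket(Bucket="example-backup-bucket")
    topic_arn = sns.create_topic(Name="s3-object-loss")["TopicArn"]

    # The topic's access policy must allow s3.amazonaws.com to publish
    # before S3 accepts this configuration (policy step omitted here).
    s3.put_bucket_notification_configuration(
        Bucket="example-backup-bucket",
        NotificationConfiguration={
            "TopicConfigurations": [
                {"TopicArn": topic_arn,
                 "Events": ["s3:ReducedRedundancyLostObject"]}
            ]
        },
    )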
Confidential, Illinois
AWS Engineer
Responsibilities:
- Worked as a member of a team of cloud engineers; responsibilities included setting up and managing Windows servers on Amazon using EC2, EBS, ELB, SSL, security groups, RDS, and IAM.
- Managed VPCs and subnets, made connections between different zones, and blocked suspicious IPs/subnets via network ACLs (see the sketch at the end of this list).
- Managed the CDN on Amazon CloudFront (origin: server/S3) to improve site performance.
- Created and managed buckets on S3, stored database and log backups, and uploaded images for the CDN server.
- Set up databases on Amazon RDS or EC2 instances as per requirements.
- Managed AMIs, snapshots, and volumes; upgraded/downgraded AWS resources (CPU, memory, EBS).
- Created PoCs and architecture for upcoming engagements.
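Illustrative sketch, using boto3, of blocking a suspicious IP via a network ACL deny rule as mentioned above; the NACL ID and CIDR block are hypothetical placeholders.

    import boto3

    ec2 = boto3.client("ec2")

    # Deny all inbound traffic from a suspicious address (placeholder values).
    ec2.create_network_acl_entry(
        NetworkAclId="acl-0123456789abcdef0",
        RuleNumber=90,               # evaluated before the higher-numbered allow rules
        Protocol="-1",               # all protocols
        RuleAction="deny",
        Egress=False,                # inbound rule
        CidrBlock="203.0.113.45/32",
    )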
Confidential, Illinois
AWS Engineer
Responsibilities:
- Years of experience with AWS platform as a service (PaaS) and infrastructure as a service (IaaS).
- Experience guiding the classification, planning, implementation, growth, and adoption of, and compliance with, enterprise architecture strategies, processes, and standards.
- Ability to design and manage cloud-based infrastructures that meet performance, security, and availability requirements.
- Ability to understand migration requirements and bridge the gaps.
- Expertise in architecture blueprints and detailed documentation; created bills of materials including required cloud services (such as EC2, S3) and tools.
- Hands-on experience with EC2, ECS, ELB, EBS, S3, VPC, IAM, SQS, RDS, Lambda, CloudWatch, Storage Gateway, CloudFormation, Elastic Beanstalk, and Auto Scaling.
- Demonstrable experience with developer tools such as CodeDeploy, CodeBuild, and CodePipeline; designed the overall Virtual Private Cloud (VPC) environment including server instances, storage, subnets, availability zones, etc. (see the sketch below).
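Illustrative sketch of the kind of VPC layout described above: a VPC, two subnets in separate availability zones, and an internet gateway, created with boto3. CIDR ranges and availability zones are assumptions; route tables and security groups are omitted for brevity.

    import boto3

    ec2 = boto3.client("ec2")

    # Hypothetical CIDR ranges; a real design would size these per workload.
    vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]

    # Two subnets in different availability zones (route tables omitted).
    ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24",
                      AvailabilityZone="us-east-1a")
    ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.2.0/24",
                      AvailabilityZone="us-east-1b")

    # Internet gateway for the public-facing tier.
    igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
    ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId=vpc_id)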
Confidential, Irving Texas
TIBCO Lead
Responsibilities:
- Installed TIBCO products: TIBCO TRA, TIBCO BusinessWorks, TIBCO Adapter for MQSeries, TIBCO Adapter for Files, Administrator, Hawk, and Smart Mapper.
- Designed the architecture and HLD (high-level design document) for the integration.
- Designed and implemented BW processes.
- Designed and implemented XSD schemas with future enhancements in view.
- Developed Java message objects using Castor.
- JIBs and schema generation.
- Tested business services from BW and the ESB endpoint.
- Performed unit and integration testing, bug fixing, and acceptance testing.
- Deployed and maintained business process services using TIBCO Administrator 5.1 and scripted deployment.
- Created EAR files for the developed BW components and deployed them using TIBCO Administrator.
- Followed quality procedures such as design document reviews, code walkthroughs, and testing.
- Worked with TIBCO BusinessEvents: concepts and rules.
- Configured BusinessEvents palettes, channels, and the rules network.
- Created BusinessEvents channels, concepts, and events (advisory, timer, and simple).
- Installed and started the iProcess engine.
- Designed and developed the different process definitions using iProcess Modeler.
- Deployed processes for the services through the iProcess engine.
Confidential, San Diego CA
TIBCO Sr. Developer
Responsibilities:
- Coordinated with business analysts, the team lead, and end users to gather requirements and to develop and implement the application.
- Installed TIBCO products: TIBCO TRA, TIBCO BusinessWorks, and TIBCO adapters.
- Configured the ADB Adapter for publishing and subscribing data from BusinessWorks.
- Configured the File Adapter for publishing data from BusinessWorks.
- Installed TIBCO software on Solaris 9 for development, testing, and production environments.
- Designed and developed GUI components in an MVC framework using Struts, JSPs, etc.
- Implemented JavaScript functions for client-side validation in addition to the validation framework.
- Used the Struts Tiles framework for a simplified and consistent layout.
- Used Spring as the application framework to integrate the web, service, and data layers with IoC and transaction attributes.
- Used plain Java classes to implement business logic.
- Implemented J2EE design patterns such as Service Facade, DAO, Singleton, Business Delegate, Value List, and DTO.
- Used Hibernate to interact with the database.
- Interacted with interfaces such as Savvion and TIBCO.
- Used the Eclipse IDE as the development environment.
- Used the WebLogic application server to deliver high performance and scalability.
- Implemented Log4j for all classes to set debug levels in production.
- Used JUnit for testing the application.