
AWS Data Engineer Resume


SUMMARY

  • Certified AWS DevOps Engineer with over 8 years of extensive IT experience; expertise in DevOps, Cloud Engineering, and UNIX/Linux administration.
  • Experienced in all phases of the Software Development Life Cycle (SDLC), including analysis, planning, development, testing, implementation, and post-production analysis, under methodologies such as Agile, Scrum, and Waterfall.
  • Extensive experience in developing web pages using HTML/HTML5, XML, DHTML, CSS/CSS3, SASS, LESS, JavaScript, React JS, Redux, Flex, Angular JS (1.x), jQuery, JSON, Node.js, Ajax, and Bootstrap.
  • Extensive experience with Amazon Web Services (AWS) Cloud services such as EC2, VPC, S3, IAM, EBS, RDS, ELB, Route 53, OpsWorks, DynamoDB, Auto Scaling, CloudFront, CloudTrail, CloudWatch, CloudFormation, Elastic Beanstalk, SNS, SQS, SES, SWF, and Direct Connect.
  • Experience designing, building, and operating virtualized solutions on private, hybrid, and public cloud platforms.
  • Automated infrastructure provisioning for Kafka clusters using Terraform, creating multiple EC2 instances per cluster component and attaching ephemeral or EBS volumes according to instance type across multiple Availability Zones and AWS Regions.
  • Implemented AWS X-Ray, which visually surfaces node and edge latency distributions directly from the service map. Tools such as Splunk and Sumo Logic can be used for log analysis, but for distributed tracing within AWS, X-Ray provides much better features: service maps and in-depth trace analysis with minimal configuration and little maintenance.
  • Knowledge of High Availability (HA) and Disaster Recovery (DR) options in AWS.
  • Hands-on experience architecting legacy data migration projects, such as Teradata to AWS Redshift and on-premises to AWS Cloud migrations.
  • Designed, built, and deployed a multitude of applications utilizing much of the AWS stack (including EC2, Route 53, S3, RDS, HSM, DynamoDB, SQS, IAM, and EMR), focusing on high availability, fault tolerance, and auto-scaling.
  • Strong hands-on experience deploying microservices built with Spring IO and Spring Boot on cloud infrastructure such as AWS.
  • Expertise in configuration management and automation using Chef (integrated with Jenkins), Puppet, Ansible, and Docker.
  • Experience in configuring Docker containers for branching and deploying them using Elastic Beanstalk.
  • Experience in designing, installing, and implementing Ansible configuration management for web applications, environment configuration files, users, mount points, and packages.
  • Extensively worked on Jenkins and Hudson, installing, configuring, and maintaining them for Continuous Integration (CI) and end-to-end build and deployment automation, including implementing CI/CD for databases with Jenkins.
  • Configured SSH, SMTP, build tools, and source control repositories in Jenkins; installed multiple Jenkins plugins; hands-on experience in deployment automation using Shell/Ruby scripting.
  • Experience in setting up Baselines, Branching, Merging and Automation Processes using Shell, Ruby, and PowerShell scripts.
  • Extensive experience developing a large greenfield application using AWS Cognito, Lambda, API Gateway, a Node.js backend, PostgreSQL, and a React/Redux front end.
  • Designed POCs for implementing various ETL processes.
  • Experience using build utilities such as Maven, Ant, and Gradle to build JAR, WAR, and EAR files.
  • Performed several types of testing like smoke, functional, system integration, white box, black box, gray box, positive, negative and regression testing
  • Worked in container-based technologies like Docker, Kubernetes and OpenShift.
  • Expertise in AWS Lambda and API Gateway, submitting data via API Gateway for processing by Lambda functions.
  • Managed configuration of Web App and Deploy to AWS cloud server through Chef.
  • Created instances in AWS as well as worked on migration to AWS from data center.
  • Developed AWS Cloud Formation templates and set up Auto scaling for EC2 instances.
  • Proficient in cloud provisioning tools such as Terraform and CloudFormation.
  • Responsible for distributed applications across hybrid AWS and physical data centers.
  • Wrote AWS Lambda functions in Python that invoke scripts to perform transformations and analytics on large data sets in EMR clusters (see the EMR sketch after this list).
  • Used Amazon EMR for MapReduce jobs and tested locally using Jenkins.
  • Experience setting up and managing the ELK (Elasticsearch, Logstash, and Kibana) Stack to collect, search, and analyze log files across servers, monitor logs, and create geo-mapping visualizations in Kibana, integrated with AWS CloudWatch and Lambda.
  • Strong experience implementing data warehouse solutions in Confidential Redshift; worked on various projects to migrate data from on-premises databases to Confidential Redshift, RDS, and S3.
  • Experience in ETL techniques and Analysis and Reporting including hands on experience with the Reporting tools such as Cognos.
  • Experience on Cloud Databases and Data warehouses (SQL Azure and Confidential Redshift/RDS).
  • Good knowledge of logical and physical data modeling using normalization techniques.
  • Applied normalization principles to improve performance; wrote PL/SQL ETL code for extraction, transformation, cleansing, and loading of data from source to target data structures.
  • Experience in automation and provisioning services on AWS
  • Experience building and optimizing AWS data pipelines, architectures and data sets.
  • Good working experience with Hadoop data warehousing tools such as Hive and Pig, extracting data onto the cluster using Sqoop.
  • Experience working with Teradata and batch-processing data using distributed computing.
  • Mentored junior developers and kept them updated on cutting-edge technologies such as Hadoop and Spark.
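
A minimal sketch of the Lambda-to-EMR pattern referenced above, using Boto3 to submit a Spark step to an already-running cluster; the cluster ID, bucket, and script path are illustrative, not from an actual project:

    import boto3  # AWS SDK for Python

    emr = boto3.client("emr", region_name="us-east-1")

    def lambda_handler(event, context):
        # Submit a Spark transformation script as a step on a running EMR cluster.
        response = emr.add_job_flow_steps(
            JobFlowId="j-EXAMPLECLUSTER",          # illustrative cluster id
            Steps=[{
                "Name": "nightly-transform",
                "ActionOnFailure": "CONTINUE",
                "HadoopJarStep": {
                    "Jar": "command-runner.jar",   # EMR helper for running CLI tools
                    "Args": ["spark-submit",
                             "s3://example-bucket/scripts/transform.py"],  # illustrative path
                },
            }],
        )
        return {"StepIds": response["StepIds"]}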

TECHNICAL SKILLS

  • Linux (Red Hat 4/5/6), UNIX, Ubuntu, Windows 7/8/10, and iOS
  • Subversion (SVN), ClearCase, GitHub, Code Commit
  • BAMBOO, JENKINS
  • C, C++, PYTHON, JAVASCRIPT, SQL
  • React JS, Angular JS (1.x), Node JS
  • CODEDEPLOY, CODEPIPELINE
  • CHECKMARX, SONARQUBE, NEXUSIQ
  • ANT, MAVEN
  • JIRA, Rally, Remedy
  • SHELL, PYTHON, TYPESCRIPT
  • CloudFormation, Terraform
  • Apache Tomcat, JBoss, WebSphere, Nginx
  • Oracle 7.x/8i/9i/10g/11g, Data warehouse, RDBMS
  • Ecosystems: S3, Redshift Spectrum, Athena, Glue, AWS Redshift
  • SOAP, REST, JavaScript, CSS, Angular JS, HTML
  • Amazon CloudWatch, Nagios, Splunk, Nexus
  • Chef, Ansible, Puppet, vSphere, VMware Workstation, Oracle VirtualBox, Hyper-V
  • Docker, Kubernetes, ECS
  • SELENIUM, Junit
  • FTP, HTTP, HTTPS, HTML, W3C, TCP, DNS, NIS, LDAP, SAMBA
  • NEXUS, GIT, ARTIFACTORY
  • LAMBDA, SNS, SQS, DYNAMODB, KINESIS, REDSHIFT, ATHENA
  • CLOUDWATCH, CLOUDTRAIL, EC2, ECS, VPC, IAM, CONFIG, AWS X-RAY

PROFESSIONAL EXPERIENCE

AWS DATA ENGINEER

Confidential

Responsibilities:

  • Helped developers automatically build and deploy software into production multiple times a day, safely and in compliance within a highly regulated financial industry, using tools such as Atlassian Bamboo, Bitbucket, Confluence, JIRA, Jenkins, Sonatype Nexus and Nexus IQ, SonarQube, Grunt, and Maven.
  • Delivered Function-as-a-Service (FaaS) capabilities: a category of cloud computing services providing a platform that allows customers to develop, run, and manage application functionality without the complexity of building and maintaining the infrastructure typically associated with it.
  • Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB, and deployed AWS Lambda code from Amazon S3 buckets. Created a Lambda deployment function and configured it to receive events from an S3 bucket (a minimal handler sketch follows this list).
  • Designed the data models used in data-intensive AWS Lambda applications aimed at complex analysis, creating analytical reports for end-to-end traceability, lineage, and definitions of key business elements from Aurora.
  • Writing code that optimizes performance of AWS services used by application teams and provide Code-level application security for clients (IAM roles, credentials, encryption, etc.)
  • Using SonarQube for continuous inspection of code quality and to perform automatic reviews of code to detect bugs. Managing AWS infrastructure and automation with CLI and API.
  • Creating AWS Lambda functions using python for deployment management in AWS and designed, investigated and implemented public facing websites on Amazon Web Services and integrated it with other applications infrastructure.
  • Creating different AWS Lambda functions and API Gateways, to submit data via API Gateway that is accessible via Lambda function.
  • Responsible for building CloudFormation templates for SNS, SQS, Elasticsearch, DynamoDB, Lambda, EC2, VPC, RDS, S3, IAM, and CloudWatch service implementations, integrated with Service Catalog.
  • Performed regular monitoring of Unix/Linux servers (log verification, CPU usage, memory, load, and disk space checks) to ensure application availability and performance using CloudWatch and AWS X-Ray. Implemented the AWS X-Ray service inside Confidential, allowing development teams to visually detect node and edge latency distributions directly from the service map.
  • Designed and developed ETL processes in AWS Glue to migrate campaign data from external sources such as S3 (ORC/Parquet/text files) into AWS Redshift.
  • Automate Datadog Dashboards with the stack through Terraform Scripts.
  • Developed file-cleaning utilities using Python libraries.
  • Utilized Python libraries such as Boto3 and NumPy for AWS automation.
  • Used Amazon EMR for MapReduce jobs and tested locally using Jenkins.
  • Performed data extraction, aggregation, and consolidation of Adobe data within AWS Glue using PySpark.
  • Create external tables with partitions using Hive, AWS Athena and Redshift.
  • Developed PySpark code for AWS Glue jobs and for EMR (see the Glue job sketch after this list).
  • Good understanding of other AWS services such as S3, EC2, IAM, and RDS; experience with orchestration and data pipelines using AWS Step Functions, Data Pipeline, and Glue.
  • Provided a streamlined developer experience for delivering small serverless applications that solve business problems; the platform is Lambda-based, composed of a pipeline and a runtime.
  • Found and resolved complex build and deployment issues while providing 24x7 on-call support; strong knowledge of troubleshooting and debugging application team issues.
  • Experience in writing SAM template to deploy serverless applications on AWS cloud.
  • Designed, developed, and implemented next-generation cloud infrastructure at Confidential.
  • Hands-on experience on working with AWS services like Lambda function, Athena, DynamoDB, Step functions, SNS, SQS, S3, IAM etc.
  • Developed internationalized multi-tenant SaaS solutions with responsive UIs using React or AngularJS, with Node.js and CSS.
  • Created indexes, managed forwarders and indexers, used the Splunk Field Extractor (IFX), and performed search head clustering, indexer clustering, and Splunk upgrades.
  • Installed and configured Splunk clustered search heads and indexers, deployment servers, and deployers.
  • Designed and implemented Splunk-based best-practice solutions.
  • Designed and Developed ETL jobs to extract data from Salesforce replica and load it in data mart in Redshift.
  • Responsible for Designing Logical and Physical data modelling for various data sources on Confidential Redshift.
  • Experienced with event-driven and scheduled AWS Lambda functions to trigger various AWS resources.
  • Integrated Lambda with SQS and DynamoDB via Step Functions to iterate through lists of messages and update their status in a DynamoDB table.
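
A minimal sketch of the serverless pattern described in this role (an API Gateway proxy event handled by Lambda and persisted to DynamoDB); the table name and payload fields are illustrative:

    import json
    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("Orders")  # illustrative table name

    def lambda_handler(event, context):
        # API Gateway (proxy integration) delivers the request body as a JSON string.
        item = json.loads(event["body"])
        table.put_item(Item=item)  # persist the submitted record
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"status": "stored", "id": item.get("id")}),
        }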
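
And a skeletal AWS Glue job of the S3-to-Redshift kind described above, in PySpark; the bucket, field, connection, and table names are illustrative assumptions:

    import sys
    from awsglue.transforms import Filter
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glueContext = GlueContext(SparkContext())
    job = Job(glueContext)
    job.init(args["JOB_NAME"], args)

    # Read campaign files from S3 (Parquet here; ORC/text work the same way)
    source = glueContext.create_dynamic_frame.from_options(
        connection_type="s3",
        connection_options={"paths": ["s3://example-bucket/campaigns/"]},  # illustrative
        format="parquet",
    )

    # Drop records missing a campaign id before loading
    cleaned = Filter.apply(frame=source, f=lambda r: r["campaign_id"] is not None)

    # Load into Redshift through a pre-defined Glue catalog connection (illustrative names)
    glueContext.write_dynamic_frame.from_jdbc_conf(
        frame=cleaned,
        catalog_connection="redshift-conn",
        connection_options={"dbtable": "campaigns", "database": "analytics"},
        redshift_tmp_dir="s3://example-bucket/tmp/",
    )
    job.commit()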

Environment: AWS (EC2, S3, EBS, ELB, RDS, SNS, SQS, VPC, IAM, CloudFormation, CloudWatch, ELK Stack), Bitbucket, Ansible, Python, Shell Scripting, PowerShell, GIT, Jira, JBoss, Bamboo, Docker, WebLogic, Maven, WebSphere, Unix/Linux, AWS X-Ray, DynamoDB, Kinesis, CodeDeploy, CodePipeline, CodeBuild, CodeCommit, Splunk, SonarQube.

AWS CLOUD ENGINEER

Confidential

Responsibilities:

  • Worked on server infrastructure development on AWS Cloud, with extensive usage of Virtual Private Cloud (VPC), CloudFormation, Lambda, CloudFront, CloudWatch, IAM, EBS, Security Groups, Auto Scaling, DynamoDB, Route 53, and CloudTrail.
  • Designed and built a multi-terabyte, full end-to-end data warehouse infrastructure from the ground up on Confidential Redshift for large-scale data handling of millions of records every day.
  • Supported an AWS Cloud environment with 2000+ AWS instances; configured Elastic IPs and elastic storage deployed across multiple Availability Zones for high availability.
  • Set up log analysis by shipping AWS logs to Elasticsearch and Kibana; managed searches, dashboards, custom mappings, and data automation.
  • Wrote Python scripts to process semi-structured data in formats such as JSON.
  • Used ETL component Sqoop to extract the data from MySQL and load data into HDFS.
  • Good hands-on experience with the Kafka Python API, developing producers and consumers that write Avro schemas.
  • Managed Hadoop clusters using Cloudera; extracted, transformed, and loaded (ETL) data from multiple sources such as flat files, XML files, and databases.
  • Used CloudWatch to monitor CPU utilization and system memory of AWS EC2 instances.
  • Involved in the development of the UI using JSP, HTML5, CSS3, JavaScript, jQuery, AngularJS. Worked on JavaScript framework (Backbone.JS) to augment browser-based applications with MVC capability.
  • Managed artifacts generated by Maven and Gradle in the Nexus repository and converted pom.xml builds to Gradle build files.
  • Designed infrastructure for AWS applications and workflows using Terraform, and implemented continuous delivery of AWS infrastructure with it.
  • Developed Python scripts to back up EBS volumes using AWS Lambda and CloudWatch (a minimal sketch follows this list).
  • Developed and deployed stacks using AWS CloudFormation Templates (CFTs) and Terraform.
  • Used Jenkins pipelines to drive all microservice builds out to the Docker registry and then deploy them to Kubernetes.
  • Managed Docker containerization and used Kubernetes to orchestrate the deployment, scaling, and management of Docker containers.
  • Automated builds using Maven and scheduled automated nightly builds using Jenkins.
  • Extensively worked on Hudson and Jenkins for continuous integration and end-to-end automation of all builds and deployments.
  • Resolved update, merge and password authentication issues in Bamboo and JIRA.
  • Developed and maintained Python, Shell, and PowerShell scripts for build, release, and task automation.
  • Created Pods and managed them using Kubernetes.
  • Designed and implemented large scale business critical systems using Object oriented Design and Programming concepts using Python and Django.
  • Experienced working with asynchronous frameworks such as Node.js and Twisted, and designing automation frameworks using Python and shell scripting.
  • Used Ansible playbooks to set up and configure a continuous delivery pipeline and Tomcat servers; deployed microservices, including provisioning AWS environments; automated infrastructure activities such as continuous deployment, application server setup, and stack monitoring; and integrated Ansible with Jenkins.
  • Prepared projects, dashboards, reports and questions for all JIRA related services.
  • Built a POC to explore AWS Glue capabilities for data cataloging and data integration.
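
A minimal sketch of the EBS-backup Lambda mentioned above, assuming it is fired by a CloudWatch Events schedule; the filters and snapshot naming are illustrative:

    import boto3
    from datetime import datetime

    ec2 = boto3.client("ec2")

    def lambda_handler(event, context):
        # Snapshot every in-use EBS volume in the region.
        volumes = ec2.describe_volumes(
            Filters=[{"Name": "status", "Values": ["in-use"]}]
        )["Volumes"]
        for vol in volumes:
            ec2.create_snapshot(
                VolumeId=vol["VolumeId"],
                Description="auto-backup %s %s" % (
                    vol["VolumeId"], datetime.utcnow().strftime("%Y-%m-%d")),
            )
        return {"snapshots_created": len(volumes)}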

Environment: AWS (EC2, S3, EBS, ELB, RDS, SNS, SQS, VPC, Redshift, CloudFormation, CloudWatch, ELK Stack), Jenkins, Ansible, Python, Shell Scripting, PowerShell, GIT, Microservices, Jira, JBoss, Bamboo, Kubernetes, Docker, WebLogic, Maven, WebSphere, Unix/Linux, Nagios, Splunk, AWS Glue.

AWS CLOUD ENGINEER

Confidential

Responsibilities:

  • Experience working in Agile Environment.
  • Well versed with Rally and Jira.
  • Working in DevOps model to define, develop, maintain and support products.
  • Designed, architected, and built out highly available Puppet Masters (3.x) as the team's configuration management tool, Jenkins for continuous integration, and the open-source Sensu monitoring tool (replacing Nagios) to monitor the health of all critical applications and servers.
  • Created custom Modules in Puppet to support the applications.
  • Worked on Spark SQL tasks to fetch NOT NULL data from two different tables and load the results (see the sketch after this list).
  • Able to handle whole data sets using the Hive Web Interface (HWI) in the Cloudera Hadoop distribution UI.
  • Imported complete data sets from RDBMS to the HDFS cluster using Sqoop.
  • Well versed with testing the custom modules locally using Test Kitchen and Vagrant.
  • Created dev and test environments for different applications by provisioning Kubernetes clusters on AWS using Docker, Ansible, and Terraform.
  • Integrated Jenkins to do auto build when code is pushed to GIT.
  • Achieved the Continuous Integration and Continuous deployment (CI/CD) process using GIT, Jenkins, Puppet and Custom Repositories.
  • Worked on Jenkins builds using Ant and Maven.
  • Created slave nodes for Jenkins to run builds outside of the Jenkins master.
  • Experience configuring and managing RabbitMQ and Redis.
  • Worked with and managed Big Data tools like Cassandra and Spark.
  • Wrote Terraform scripts from scratch to build Dev, Staging, Prod, and DR environments.
  • Configured servers to send the server and application data to Splunk.
  • Hands on experience with generating reports using Splunk.
  • Built and managed servers, firewall rules, storage and authentication to servers on OpenStack and AWS.
  • Well versed with AWS products such as EC2, S3, EBS, IAM, CloudWatch, CloudTrail, VPC, and Route53.
  • Experience scripting and automating in Bash and Perl to interconnect and integrate DevOps tools.
  • Good knowledge of Ruby from creating custom Puppet modules to integrate applications into Puppet.
  • Configured TCP/IP networking on newly built physical servers and brought them onto the company's network; performed day-to-day system administration, managing user keys, monitoring servers, and working break-fix issues.
  • Configured newly built physical servers through iLO to connect to the network and deployed the required applications onto them from Puppet.
  • Managed Disks and File systems using LVM on Linux.
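
A minimal Spark SQL sketch of the NOT NULL fetch-and-load task referenced above; the table names, columns, and paths are illustrative assumptions:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("notnull-join").getOrCreate()

    # Register the two source tables (illustrative names and paths)
    spark.read.parquet("hdfs:///data/customers").createOrReplaceTempView("customers")
    spark.read.parquet("hdfs:///data/orders").createOrReplaceTempView("orders")

    # Fetch only rows whose keys are NOT NULL, then load the result
    result = spark.sql("""
        SELECT c.customer_id, c.name, o.order_id, o.amount
        FROM customers c
        JOIN orders o ON c.customer_id = o.customer_id
        WHERE c.customer_id IS NOT NULL AND o.order_id IS NOT NULL
    """)
    result.write.mode("overwrite").parquet("hdfs:///data/joined")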

Environment: AWS (EC2, S3, EBS, ELB, RDS, SNS, SQS, VPC, CloudFormation, CloudWatch, ELK Stack), Bitbucket, Ansible, Python, Shell Scripting, PowerShell, GIT, Jira, JBoss, Terraform, Redshift, Maven, WebSphere, Unix/Linux, AWS X-Ray, DynamoDB, Kinesis, CodeDeploy, CodePipeline, CodeBuild, CodeCommit, Splunk, SonarQube.

AWS CLOUD DEVELOPER

Confidential, Memphis, TN

Responsibilities:

  • Set up an AWS Lambda function that runs every 15 minutes to check for repository changes and publishes a notification to an Amazon SNS topic (a minimal handler sketch follows this list).
  • Integrated services such as Bitbucket, AWS CodePipeline, and AWS Elastic Beanstalk to create a deployment pipeline.
  • Created S3 buckets in the AWS environment to store files, in some cases serving static content for a web application.
  • Configured S3 buckets with lifecycle policies to archive infrequently accessed data into cheaper storage classes as required (see the lifecycle sketch after this list).
  • Good knowledge of creating and launching EC2 instances using AMIs for Linux, Ubuntu, RHEL, and Windows; wrote shell scripts to bootstrap instances.
  • Used IAM to create roles, users, and groups, and implemented MFA to provide additional security for the AWS account and its resources; used AWS ECS and EKS for Docker image storage and deployment.
  • Used Bamboo pipelines to drive all microservice builds out to the Docker registry and then deploy them to Kubernetes; created and managed Pods using Kubernetes.
  • Designed an ELK system to monitor and search enterprise alerts; installed, configured, and managed the ELK Stack within EC2 behind an Elastic Load Balancer for Elasticsearch.
  • Created dev and test environments for different applications by provisioning Kubernetes clusters on AWS using Docker, Ansible, and Terraform.
  • Worked on deployment automation of all the micro services to pull image from the private Docker registry and deploy to Docker Swarm Cluster using Ansible.
  • Installed a private registry for local upload and download of Docker images, in addition to pulling from Docker Hub.
  • Create and maintain highly scalable and fault tolerant multi-tier AWS and Azure environments spanning across multiple availability zones using Terraform and CloudFormation.
  • Implemented domain name service (DNS) through Route 53 to achieve highly available and scalable applications.
  • Maintained monitoring and alerting of production and corporate servers using the CloudWatch service.
  • Worked on scalable distributed data system using Hadoop ecosystem in AWS EMR.
  • Migrated on-premises database structures to the Confidential Redshift data warehouse.
  • Wrote various data normalization jobs for new data ingested into Redshift.
  • Wrote scripts and indexing strategy for a migration to Confidential Redshift from SQL Server and MySQL databases.
  • Ingested data into the application using Hadoop technologies such as Pig and Hive.
  • Worked on AWS Data Pipeline to configure data loads from S3 into Redshift.
  • Used JSON schemas to define table and column mappings from S3 data to Redshift.
  • Built an on-demand, secure EMR launcher with custom spark-submit steps using S3 events, SNS, KMS, and a Lambda function.
  • Created EBS volumes for storing application files, mounted to EC2 instances as needed.
  • Experienced in creating RDS instances to serve data through servers for responding to requests.
  • Automated regular tasks using Python code and leveraged Lambda function wherever required.
  • Knowledge of containerization management and setup tools such as Kubernetes and ECS.
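
A minimal sketch of the scheduled repository-check Lambda described above; the topic ARN is illustrative, and check_repository is a hypothetical stub standing in for the real polling logic:

    import boto3

    sns = boto3.client("sns")
    TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:repo-changes"  # illustrative ARN

    def check_repository():
        # Hypothetical stub: the real function compared the repository's current
        # state against the last-seen state and returned the list of changes.
        return []

    def lambda_handler(event, context):
        # Fired every 15 minutes by a CloudWatch Events schedule rule.
        changes = check_repository()
        if changes:
            sns.publish(
                TopicArn=TOPIC_ARN,
                Subject="Repository changed",
                Message="%d change(s) detected since the last poll." % len(changes),
            )
        return {"changes": len(changes)}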
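
And a sketch of the kind of S3 lifecycle configuration described above, applied via Boto3; the bucket name, prefix, and transition schedule are illustrative:

    import boto3

    s3 = boto3.client("s3")

    # Move infrequently accessed objects to cheaper tiers, then expire them
    s3.put_bucket_lifecycle_configuration(
        Bucket="example-static-assets",  # illustrative bucket name
        LifecycleConfiguration={
            "Rules": [{
                "ID": "archive-infrequent",
                "Status": "Enabled",
                "Filter": {"Prefix": "logs/"},
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 365},
            }]
        },
    )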

Environment: AWS (EC2, S3, EBS, ELB, RDS, SNS, SQS, VPC, CloudFormation, CloudWatch, ELK Stack), Bitbucket, Ansible, Python, Shell Scripting, PowerShell, ETL, AWS Glue, Jira, JBoss, Bamboo, Docker, WebLogic, Maven, WebSphere, Unix/Linux, AWS X-Ray, DynamoDB, Kinesis, CodeDeploy, CodePipeline, CodeBuild, CodeCommit, Splunk.

DEVOPS MIDDLEWARE ENGINEER

Confidential

Responsibilities:

  • Installed and configured WebLogic Application Server 8.x/9.x/10.x/11.x using graphical, console, and silent modes, and configured the WebLogic domain.
  • Determined and suggested hardware and software specific to the System and customized it.
  • Configured Node Manager for running managed servers.
  • Installed and configured JBoss 5.1/6.0 and Apache Tomcat 6.0 in different environments: Dev, Test, QA, and Production.
  • Experience designing, installing, and implementing Ansible configuration management for web applications, environment configuration files, users, mount points, and packages.
  • Installed and configured Apache HTTP Server 2.0, Tomcat 7.0, IIS, and Sun ONE web servers in various environments; installed and configured plug-ins for Apache HTTP Server and Sun ONE Web Server to proxy requests to WebLogic Server.
  • Developed and maintained continuous integration and deployment systems using Jenkins, Ant, Maven, Nexus, Ansible, and Rundeck.
  • Experienced in creating Ansible playbooks, tasks, roles, templates.
  • Completely automated the process of building OAuth, OpenID and SAML stacks with Ansible and Jenkins.
  • Deployed and troubleshot JAR, WAR, and EAR files in both standalone and clustered environments on JBoss 5.1/6.0, WebLogic 8.x/9.x/10.x, and Apache Tomcat 6.0.
  • Performed migration and upgrade tasks such as upgrading WebLogic Server 9.x/10.x to WebLogic 11.x, updating JDKs, and installing service packs and patches for WebLogic Server.
  • Configured F5 load balancers with web servers; used F5 to improve the capacity, performance, and reliability of applications.
  • Developed and ran UNIX shell scripts and implemented an automated deployment process.
  • Solved server hang issues such as deadlocks and application- and database-level locks by taking thread dumps and analyzing them to find the root cause of the hang.
  • Set up Wily for monitoring, notification, root cause analysis and data reporting.
  • Performed performance monitoring, JVM heap sizing, and EJB monitoring using Wily Introscope; conducted load testing with Mercury LoadRunner and JMeter, with thread and heap analysis using Samurai thread dump analysis.
  • Used Subversion (SVN) to maintain present and historical source code versions and documentation.
  • Used TDA and Heap Analyzer to detect blocked and locked threads.
  • Used HP OpenView for managing applications, network conditions and status across the platform.
  • Implemented standard backup procedures for both applications and WebLogic Server.
  • Involved in assisting QA team in Load and Integration testing of J2EE applications on WebLogic Server and resolved complex issues.

ENVIRONMENT: WebLogic Application Server 8.x/9.x/10.x/11.x, JDK 1.4/1.5/1.6/1.7, JBoss 5.1/6.0, JRockit 8.x, Apache 2.x, Tomcat 7.x/8.x, Sun ONE/iPlanet, IIS 6, Solaris 8/9, Red Hat Linux, Windows 2007, F5 load balancer, SiteMinder, Ansible, Cassandra, Nagios, JMX, Oracle 8i/9i, JDBC, Subversion, EJB, JSP, Servlets, XML, MS Office, OpenSSL, SSH.
