
I'm developing a simple REST-style web application consisting of two basic modules:

Module#1: server exposing REST web services, stateless, deployed in Tomcat
Module#2: REST client

There is one Tomcat instance with Module#1 deployed.

I would like to scale it horizontally and run a second Tomcat on a second machine. I'm a complete novice when it comes to load balancing/clustering, which is why I need help. There's no need for session replication or failover.

How should I approach it?

I did some research, and these are the possible approaches I see:

1. No cluster, no third party proxy.

I run the second Tomcat on the second machine. Since I have control over both the client and the server, I can implement a very basic algorithm on the client side that randomly chooses a host before each API call (see the sketch below). There would be no need to configure a cluster or to set up a third-party proxy. Are there any potential pitfalls? Is this a correct approach?
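To illustrate, this is roughly what I have in mind on the client side (host names and the path are placeholders I made up):

    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Arrays;
    import java.util.List;
    import java.util.concurrent.ThreadLocalRandom;

    // Minimal sketch of client-side "load balancing": pick one of the known
    // Tomcat hosts at random before every call. Host names are placeholders.
    public class RandomHostClient {

        private static final List<String> HOSTS = Arrays.asList(
                "http://tomcat1.example.com:8080",
                "http://tomcat2.example.com:8080");

        public static int callApi(String path) throws Exception {
            // choose a host at random for this request
            String base = HOSTS.get(ThreadLocalRandom.current().nextInt(HOSTS.size()));
            HttpURLConnection con = (HttpURLConnection) new URL(base + path).openConnection();
            con.setRequestMethod("GET");
            con.setConnectTimeout(2000); // fail fast if the chosen node is down
            con.setReadTimeout(5000);
            return con.getResponseCode(); // the caller could retry against the other host
        }

        public static void main(String[] args) throws Exception {
            System.out.println(callApi("/api/ping"));
        }
    }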

2. Tomcat cluster

When it comes to Tomcat cluster configuration, does it mean that there are two Tomcats running on separate machines and their configuration says that they form a cluster? Do I need a separate library or tool for that? Is Tomcat enough? Will I have two processes running, as in the first approach?

3. Tomcat load balancer

What are the differences between a Tomcat cluster and a Tomcat load balancer? Again, do I need a separate library or tool for that? Is Tomcat enough?

4. Third party proxy

I found some info about things like HAProxy. Does it mean that all the calls go through it and the proxy decides which host to use? Does it mean that, apart from the two Tomcat processes, there will be a third one running separately? On which machine would this proxy run, assuming that I have two Tomcats on two separate machines?
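For context, the kind of configuration I found in HAProxy examples looks roughly like the following sketch (all names, addresses and ports are placeholders I made up), which is why I assume it runs as a separate process that everything passes through:

    # Minimal sketch of an HAProxy setup for two Tomcats; all names,
    # addresses and ports are placeholders.
    defaults
        mode http
        timeout connect 5s
        timeout client  30s
        timeout server  30s

    frontend rest_front
        bind *:80
        default_backend tomcat_nodes

    backend tomcat_nodes
        balance roundrobin                      # or leastconn, source, ...
        server tomcat1 192.168.1.10:8080 check  # Tomcat on machine 1
        server tomcat2 192.168.1.11:8080 check  # Tomcat on machine 2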

Which one should I choose? Am I misunderstanding something? Articles and answers appreciated.

BartoszMiller
  • Have you read http://serverfault.com/questions/350454/how-do-you-do-load-testing-and-capacity-planning-for-web-sites ? We can't tell you what will work for your app - you need to test and research on your own first. – voretaq7 Oct 15 '14 at 20:16
  • I haven't, thanks for the link. However, I'm asking about the way they work, so that I can figure out which one I want. At the moment I'm a total novice and I'm not sure that I understand the differences between Tomcat clustering, Tomcat load balancing without the cluster (possible?), external proxy, etc. – BartoszMiller Oct 15 '14 at 20:28

1 Answer


First, you have to see the difference between the two options: load balancing (without a cluster) and clustering with replication.

Clustering has a formal meaning. A cluster is a group of resources that are trying to achieve a common objective and are aware of one another. Clustering usually involves setting up the resources (usually servers) to exchange details on a particular channel (port) and keep exchanging their states, so a resource's state is replicated in other places as well. It usually also includes load balancing, wherein a request is routed to one of the resources in the cluster according to the load balancing policy.

A clustered architecture is used to solve one or more of the following problems:

  • A single server cannot handle the high number of incoming requests efficiently
  • A stateful application needs a way of preserving session data if its server fails
  • A developer requires the capability to make configuration changes or deploy updates to their applications without discontinuing service.

A clustered architecture solves these problems using a combination of load balancing, multiple servers to process the balanced load, and session replication.

In your case session replication is not necessary, so I don't think a cluster configuration is the approach you need.

Documentation: Apache Tomcat - Clustering/Session Replication HOW-TO
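For reference, the all-defaults setup from that HOW-TO is as small as adding one element to conf/server.xml on each node. This is only a sketch; the defaults assume working multicast between the machines and a web application marked as distributable:

    <!-- Minimal sketch of the Tomcat clustering HOW-TO defaults:
         add inside <Engine> or <Host> in conf/server.xml on each node.
         The webapp must also be marked <distributable/> in web.xml and
         multicast must work between the machines. -->
    <Cluster className="org.apache.catalina.ha.tcp.SimpleTcpCluster"/>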

Load balancing can also happen without clustering, when we have multiple independent servers that have the same setup but are otherwise unaware of each other. Then we can use a load balancer to forward requests to one server or the other, but one server does not use the other server's resources, and one server does not share its state with the others.

The fundamental feature of a load balancer is to be able to distribute incoming requests over a number of backend servers in the cluster according to a scheduling algorithm.

In both architectures you need something that implements the load balancer; one option is to use Apache HTTP Server.

[Diagram: Tomcat load balancer]

To implement a load balancer in Apache HTTP Server, you have some options, such as the following (a minimal mod_proxy sketch follows the list):

  • Using the JK native connector
  • Using Apache HTTP mod_proxy
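As a rough idea, a mod_proxy_balancer setup in httpd.conf could look like the sketch below. Host names and the /api path are only placeholders, and the relevant proxy modules have to be loaded:

    # Minimal sketch of load balancing with Apache HTTP Server and mod_proxy;
    # requires mod_proxy, mod_proxy_http, mod_proxy_balancer and
    # mod_lbmethod_byrequests to be loaded. Hosts and paths are placeholders.
    <Proxy "balancer://tomcatnodes">
        BalancerMember "http://tomcat1.example.com:8080"
        BalancerMember "http://tomcat2.example.com:8080"
        ProxySet lbmethod=byrequests
    </Proxy>

    ProxyPass        "/api" "balancer://tomcatnodes/api"
    ProxyPassReverse "/api" "balancer://tomcatnodes/api"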


Kacper Cichecki
Federico Sierra