I'm looking for an HTTP load balancer that will handle server errors silently. What I want is for every single request to be balanced so that it works out, in the worst case with a small additional delay.
If a web node returns an HTTP 500 server error, the load balancer should retry the request with another web node. If the second node also returns a 500 error, it should do the same with the last node (assume I have 3 nodes). If the last node returns a 500 error, display it to the end user.
If a server node times out (takes more than 1 or 2 seconds to answer), the request should be routed to another server; the client should receive a good answer within roughly 2 seconds.
You can use nginx with its HttpProxyModule (a pretty standard module that is usually compiled into nginx) to implement such a load balancer.
Nginx is lightweight, fast, and has a lot of functionality (you can even embed Lua code in it).
An example config for your use case:
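This is a minimal sketch of such a config (the backend addresses and timeouts are placeholders you would adjust to your setup):

```nginx
http {
    upstream backend {
        server 10.0.0.1:80;
        server 10.0.0.2:80;
        server 10.0.0.3:80;
    }

    server {
        listen 80;

        location / {
            proxy_pass http://backend;
            # retry the next upstream on connection errors, timeouts,
            # and HTTP 500 responses from a backend
            proxy_next_upstream error timeout http_500;
            # keep per-try delays short so the worst case stays around 2s
            proxy_connect_timeout 1s;
            proxy_read_timeout 2s;
        }
    }
}
```

With three servers in the upstream block, a request that hits a failing node is transparently retried on the remaining ones, which matches the behavior you described.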
The secret sauce is the proxy_next_upstream directive, which determines in which cases the request will be passed to the next server. Possible values include error, timeout, invalid_header, http_500, http_502, http_503, http_504, http_404, and off.
This behavior can also be accomplished with Apache, in two ways.
The first is failonstatus, a balancer parameter in the module mod_proxy. For example, I used to use a configuration like the one below in a production environment.
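A sketch of such a configuration (the balancer name, member addresses, and status codes are illustrative, not the original poster's exact values):

```apache
# requires mod_proxy and mod_proxy_balancer
<Proxy "balancer://mycluster">
    BalancerMember "http://10.0.0.1:80"
    BalancerMember "http://10.0.0.2:80"
    BalancerMember "http://10.0.0.3:80"
    # force a worker into error state when it returns any of these codes,
    # so subsequent requests go to the remaining members
    ProxySet failonstatus=500,502,503
</Proxy>

ProxyPass        "/" "balancer://mycluster/"
ProxyPassReverse "/" "balancer://mycluster/"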
The second way, and in my opinion the best option, is to use the module mod_proxy_hcheck: https://httpd.apache.org/docs/2.4/mod/mod_proxy_hcheck.html
Currently, I am using this module to detect backend issues
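A minimal sketch of an mod_proxy_hcheck setup (the health-check URI, interval, and member addresses are assumptions for illustration):

```apache
# requires mod_proxy, mod_proxy_balancer, mod_proxy_hcheck, and mod_watchdog
# consider a member healthy if its health check returns a 2xx/3xx/4xx status
ProxyHCExpr ok234 {%{REQUEST_STATUS} =~ /^[234]/}

<Proxy "balancer://mycluster">
    # poll /health on each member every 5 seconds; a failing member
    # is taken out of rotation before it can serve user traffic
    BalancerMember "http://10.0.0.1:80" hcmethod=GET hcexpr=ok234 hcuri=/health hcinterval=5
    BalancerMember "http://10.0.0.2:80" hcmethod=GET hcexpr=ok234 hcuri=/health hcinterval=5
    BalancerMember "http://10.0.0.3:80" hcmethod=GET hcexpr=ok234 hcuri=/health hcinterval=5
</Proxy>

ProxyPass "/" "balancer://mycluster/"
```

Unlike failonstatus, which reacts after a bad response, active health checks detect a broken backend proactively, so fewer user requests ever reach it.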
I'm guessing you want to serve HTTP?
Nginx provides a lot of functionality, including everything you are looking for: http://wiki.nginx.org
Check especially the upstream and proxy settings; there you can implement all your requirements: http://wiki.nginx.org/HttpUpstreamModule http://wiki.nginx.org/HttpProxyModule
Another possible solution to implement your requirements is
LVS
(Linux Virtual Server), which is implemented in the Linux kernel itself. If you just google for "LVS tutorial" you will get tons of results.

What you're looking for here is either a proxy or a reasonably expensive load balancer.
On the proxying side, Squid or nginx can do the job reasonably well. Which one you go with is somewhat a matter of preference, but also of how important it is to have the kitchen sink at your disposal (if it isn't, nginx is arguably the best choice).
On the hardware side of things, an F5 load balancer can do this sort of thing whilst also ensuring high scalability.