This question is more of a math question than a server question, but it is strongly server related.
If I have a server for which I can guarantee 95% uptime and I put that server in a cluster of 2, what would the uptime of the cluster be? Now, let's say I do the same, but with a cluster of 3?
Let's not consider things like a single point of failure, but purely focus on the math here. One of the things that makes this a bit complicated is that with 2 servers, for example, the chance that both are off would be 1 in 2^2, so 1/4; with 3 it would be 1 in 2^3, so 1/8. Given that each of these servers has 5% downtime, would the total average then be 1/8th of that 5%?
How would you calculate something like this?
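For concreteness, here is a quick sketch of the kind of calculation I have in mind, assuming the servers fail independently and the cluster only goes down when every server is down at the same time. I'm not sure this is the right formula, which is really what I'm asking:

```python
# Sketch of the calculation I'm imagining, assuming independent failures.
# The cluster is considered down only when all servers are down at once.

def cluster_uptime(per_server_uptime: float, servers: int) -> float:
    """Uptime of a cluster that stays up as long as at least one server is up."""
    per_server_downtime = 1.0 - per_server_uptime
    # All servers must be down simultaneously for the cluster to be down.
    cluster_downtime = per_server_downtime ** servers
    return 1.0 - cluster_downtime

for n in (1, 2, 3):
    print(f"{n} server(s): {cluster_uptime(0.95, n):.4%} uptime")
# 1 server(s): 95.0000% uptime
# 2 server(s): 99.7500% uptime
# 3 server(s): 99.9875% uptime
```

Is this the right way to think about it, or am I missing something?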