I recently had a UPS fail to do its job during business hours because I didn't realize it had already reached the end of its useful life. Yes, the battery was old and, yes, I should have manually tested it during off hours to discover the issue before it became a problem. I've replaced the battery and have verified that the automatic self-tests are enabled and that email notification works.
In this case the UPS failed not because of a power outage but because of a low-voltage drop in the data room during lunch time that it could not compensate for. So, just as we use UPSs to protect us during outages, I thought it might be wise to also protect the room from low voltage on the line before my UPSs (as a redundancy to the UPSs' built-in voltage regulation). (I have various models of APC Smart-UPS units [not the on-line type].)
I've included two graphs of last month's reported Line Input Voltages from two different UPSs on different breakers. The peaks and valleys in these graphs are normal for me, but I don't know whether they are normal in general or whether they indicate an issue with the building (I've heard they may be indicative of a bad ground somewhere).
It looks like you're experiencing either a brown-out, or something big is drawing power during some of those large voltage drops.
If the issue is that you have too much draw from your devices, you can and should split them across multiple UPSs. Batteries only work properly if you don't overload them. If you split the devices accordingly and your graphs later still show the power dropping, that means you're not getting enough power at the outlet you're plugged into. Common causes are too many devices running off the same outlet or circuit. Note: one circuit can have multiple outlets.
If the issue is just a general lack of consistent voltage, then splitting them up won't do much, but it will make it easier for each UPS to handle a brown-out if it doesn't have to carry a lot of devices.
The key isn't so much voltage as amperage. If your circuit is rated for 20 amps (average home circuits are 15 amps; server room circuits are generally built at 20 amps, but if you have high-capacity/high-usage servers they may have been built at 30 amps; check your circuit breaker or call an electrician to confirm), then drawing more than that can cause a drop in supply. Add up the amperage requirements of all the devices on the circuit and try to keep your usage around 75%-90% of the rating (depending on how lucky you're feeling). It's a personal preference, but the goal is to ensure you have enough amperage available to handle both startup amperage and maintenance amperage. For example, you may have a server that runs on 5 amps but requires 7 amps at startup before it settles down to 5 amps (these are just random numbers; you need to do the math). A quick sketch of that budgeting math follows.
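If it helps, here's a minimal Python sketch of that budget check. The circuit rating, the 80% target, and every device figure below are made-up placeholders rather than measured values, so plug in your own numbers:

```python
# Rough circuit-load budget check. All figures below are hypothetical
# examples; measure or look up your own devices' actual draw.

CIRCUIT_RATING_A = 20    # breaker rating in amps (check your panel)
TARGET_FRACTION = 0.80   # stay around 75%-90% of the rating

# (name, running amps, startup amps) -- placeholder values
devices = [
    ("server-1", 5.0, 7.0),
    ("server-2", 5.0, 7.0),
    ("switch", 1.5, 1.5),
]

# Steady-state draw with everything running
running = sum(run_a for _, run_a, _ in devices)

# Worst case: one device starting up while the rest are already running
worst_startup = max(
    running - run_a + start_a for _, run_a, start_a in devices
)

budget = CIRCUIT_RATING_A * TARGET_FRACTION
print(f"Steady-state load: {running:.1f} A (budget {budget:.1f} A)")
print(f"Worst-case startup load: {worst_startup:.1f} A")
if worst_startup > budget:
    print("Over budget: move some devices to another circuit/UPS.")
```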
Here's the key equation:
Wattage = Voltage x Amperage
If you have a 750W server that runs on 110V, you need about 6.8A (750 ÷ 110 ≈ 6.8). These are theoretical examples: your server at idle might only need 300W, at high usage it may require 700W, and on boot it may require 900W.
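As a sanity check, here's the same conversion in a couple of lines of Python, reusing the hypothetical wattages above:

```python
# Amps = watts / volts; all wattages are the hypothetical figures above.
VOLTS = 110
for label, watts in [("idle", 300), ("high usage", 700),
                     ("boot", 900), ("nameplate", 750)]:
    print(f"{label}: {watts} W / {VOLTS} V = {watts / VOLTS:.1f} A")
```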
Scenario: (3) 750W servers on a 15A circuit
If two of those servers are on at high load, you're using 13.6A (2 × 750W ÷ 110V). If you then go to turn on your 3rd server, its draw will push the circuit past 15A and overload it, causing the voltage to dip. This is where UPSs come into play. A UPS's job is to assure consistent voltage and amperage; if it senses a drop in either, it adjusts. If, however, there are too many devices, the same principles above apply.
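To make the scenario's arithmetic explicit, here's a short sketch; the 110V figure is an assumption carried over from the earlier example, since this part doesn't state the voltage:

```python
# Scenario: three 750 W servers on a 15 A circuit at an assumed 110 V.
CIRCUIT_A = 15
VOLTS = 110
SERVER_W = 750

for n in (1, 2, 3):
    amps = n * SERVER_W / VOLTS
    status = "OK" if amps <= CIRCUIT_A else "OVERLOAD"
    print(f"{n} server(s) at full load: {amps:.1f} A -> {status}")
```

Running it shows two servers at 13.6A fit under the 15A rating, while the third pushes the draw to about 20.5A and overloads the circuit.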