I'm moving my server to a co-location center, and they're not concerned with the actual wattage, which is what I've tracked, but they are very concerned with the peak amperage. They charge by the amps made available to the machine. Is there some industry-standard way I can test that? The person I spoke with at the data center is a sales guy, so he's not sure of the technical aspects of what he's asking me for.
If there's a software solution, my system is an HP DL580 G7 running CentOS 7.
What I've tried:
I have a UPS on it now that gives wattage outputs, which bounce all over the place. The highest I've seen is 800 watts, so my guess is 800 watts / 120 volts should be six and two-thirds amps. Do I provision 7 amps? That doesn't sound very precise, and it only reflects the moments I happened to be watching the readout.
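For what it's worth, the usual back-of-the-envelope approach is exactly what's described above, plus a headroom factor (commonly 20-25%) for inrush current and for peaks the UPS readout missed, rounded up to a whole amp. This isn't a formal standard, just a sizing sketch; the wattage samples and the 1.25 headroom factor below are assumptions, not measured values:

```shell
#!/bin/sh
# Sizing sketch: peak sampled watts / line voltage * headroom, rounded up.
# The sample readings are hypothetical placeholders for logged UPS output.
provisioned_amps() {
    # $1 = whitespace-separated wattage samples, $2 = volts, $3 = headroom
    echo "$1" | awk -v volts="$2" -v headroom="$3" '
        { for (i = 1; i <= NF; i++) if ($i > peak) peak = $i }
        END {
            amps = peak / volts * headroom
            # round up to the next whole amp
            print (amps == int(amps)) ? amps : int(amps) + 1
        }'
}

provisioned_amps "430 510 800 615 702" 120 1.25   # 800/120*1.25 = 8.33 -> 9
```

With an observed 800 W peak, that works out to provisioning 9 A rather than 7, which buys margin for startup spin-up of all drives and fans, typically the worst case.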
Powerstat says "Device does not have any RAPL domains, cannot power measure power usage." so I don't think it's compatible with my system.
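That powerstat error is expected: its default mode reads Intel RAPL counters, which (as I understand it) first appeared with Sandy Bridge, and the DL580 G7's Xeon 7500/E7-series CPUs predate that. A better-matched source on this box is probably the iLO 3 BMC, which meters power at the supply. Assuming `ipmitool` is installed and the IPMI kernel drivers are loaded, `ipmitool dcmi power reading` should report instantaneous/min/max watts. Here's a parsing sketch; the here-doc stands in for real command output, and all the numbers in it are hypothetical:

```shell
#!/bin/sh
# Extract the "Maximum during sampling period" wattage from `ipmitool dcmi
# power reading` output and convert it to amps at 120 V. The here-doc below
# substitutes for the real command so the script runs anywhere; on the
# server, pipe `ipmitool dcmi power reading` into the awk stage instead.
sample_output() {
cat <<'EOF'
    Instantaneous power reading:                   220 Watts
    Minimum during sampling period:                196 Watts
    Maximum during sampling period:                384 Watts
    Average power reading over sample period:      220 Watts
EOF
}

sample_output | awk -F: '/Maximum during sampling period/ {
    split($2, f, " ")                       # f[1] = watts, f[2] = "Watts"
    printf "peak: %s W -> %.2f A at 120 V\n", f[1], f[1] / 120
}'
```

Running that poll from cron over a week or two, under a deliberately heavy load (all CPUs busy, all disks spinning up), would give a much better peak figure than spot-checking the UPS display.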
Please let me know what the industry standard for measuring this is.