When ordering co-location services, a certain amount of power is included, for example 0.25 amps, 0.5 amps, 1 amp, 2 amps, etc. How do I calculate how much a server is going to require?
Watts = Volts * Amps
None of that should matter though -- you will typically be charged for how many watts you use. You can think of it like this: the volts are the water pressure, the amps are the size of the hose, and the watts are how much water has actually gone through the hose. The analogy fails in a bunch of ways, but as long as you don't build things based only on this abstraction you'll be okay.
The only way to be sure is to measure your server's draw under your expected load. A PSU that can draw 500 watts on 220 V power can pull up to about 2.3 amps, but it may draw only 0.5 amps under typical load for your workloads.
Usually you only pay attention to the amperage to figure out how big a wire you need to connect your box to the mains; if it is drawing more current it needs a larger wire (which, weirdly, is a smaller "gauge" number, but don't worry, it'll be thicker and more expensive).
Lastly, if you go beyond 15 or 20 amps (at least in the US, with 110 V circuits), you'll get into weird connectors (twist-lock connectors whose geometry varies with the current rating of the circuit; you can't plug a 20 amp twist-lock connector into a 30 amp receptacle, for instance). But again, most of the time you don't need to worry about these details unless you're looking at big iron.
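To make the watts/volts/amps arithmetic concrete, here's a minimal Python sketch; the 500 W and 220 V figures are the example numbers from above, and the 110 W "measured" draw is just an assumed value for illustration:

    # Convert a power draw in watts to current in amps at a given supply voltage.
    def watts_to_amps(watts, volts):
        return watts / volts

    supply_volts = 220        # example supply voltage from above
    psu_rating_watts = 500    # PSU nameplate rating, i.e. the absolute worst case
    measured_watts = 110      # assumed measured draw under typical load (illustrative only)

    print(watts_to_amps(psu_rating_watts, supply_volts))  # ~2.3 A worst case
    print(watts_to_amps(measured_watts, supply_volts))    # 0.5 A, what the box actually pulls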
Just a thought: if you already have your server and want to test the actual amps it draws, you could use a Kill-A-Watt or something similar to measure the actual draw before placing it in the datacenter.
Other answers correctly refer to wattage as the important measure of how much power you really use. However, many data centers and colo providers (like the two that I use, one in Canada, one in the US) will bill you a flat rate per circuit, measured in amps.
So it is useful to know the current draw in amps that your equipment will need. A very rough rule of thumb for ball-park estimation is about 2 A per "average" server. But if you need precision, then measure it precisely; don't rely on ball-park estimates. :)
You can buy power bars that show you the amps drawn by whatever you plug into them. Good ones will let you poll that data over SNMP so you can graph it or whatever.
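As a quick sanity check on that 2 A ball-park figure, a sketch like the following (the server count and the 15 A circuit are made-up numbers) shows how quickly a handful of boxes can outgrow the circuit you're paying for:

    # Rough ball-park only -- measure real servers before committing to a circuit size.
    AMPS_PER_AVERAGE_SERVER = 2.0  # the very rough rule of thumb from above

    def ballpark_amps(server_count, amps_per_server=AMPS_PER_AVERAGE_SERVER):
        return server_count * amps_per_server

    circuit_amps = 15  # assumed size of the billed circuit (illustrative only)
    for n in range(1, 9):
        needed = ballpark_amps(n)
        print(f"{n} server(s) -> ~{needed:.0f} A ({'fits' if needed <= circuit_amps else 'over'})")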
An amp is a measure of current, not power, although if you know the voltage of your power source it is trivial to turn a current reading into power consumed: W = V * A, Watts = Volts * Amps (disregarding the power factor, the phase angle between the voltage and the current).
Knowing this, you can look at your hardware's power consumption details, which will give you the maximum watt consumption; divide it by your supply voltage (110 or 220, depending on your location) and you will have how many amps it would draw -- in the worst case, of course!
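If you do want to account for the power factor, the current comes out a little higher than the plain W = V * A division suggests. A minimal sketch, with 0.95 as an assumed power factor (check your PSU's datasheet for the real figure):

    # Real power (watts) vs. apparent power (volt-amps): amps = watts / (volts * power_factor).
    def amps_drawn(watts, volts, power_factor=1.0):
        return watts / (volts * power_factor)

    print(amps_drawn(500, 110))        # ~4.5 A, ignoring the power factor
    print(amps_drawn(500, 110, 0.95))  # ~4.8 A, with an assumed 0.95 power factor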
You can calculate the amperage from the wattage of the server with the following equation: watts = volts x amps.
Therefore, since you know your operating voltage and the wattage of your PSUs, you can calculate the amperage by dividing the wattage by the voltage. For instance, 1100 W @ 110 V = 10 A.
This is the "ideal" calculation and doesn't take efficiency or load level into account. But if you put in the worst-case figures (i.e. maximum PSU draw) then you will get your worst-case-scenario (WCS) amperage, which the co-lo company will thank you for.
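A small sketch of that worst-case calculation, using the 1100 W @ 110 V example above and the circuit sizes from the question; anything beyond those numbers is assumption:

    # Worst-case amperage from the PSU nameplate rating -- the figure to give the co-lo.
    def worst_case_amps(psu_watts, volts):
        return psu_watts / volts

    wcs = worst_case_amps(1100, 110)
    print(f"Worst-case draw: {wcs:.1f} A")  # 10.0 A

    # Circuit sizes from the question, in amps.
    for circuit in (0.25, 0.5, 1, 2):
        print(f"{circuit} A circuit: {'enough' if circuit >= wcs else 'not enough'}")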
Most modern servers have some sort of on-board management card. For example, IBM x3550 servers have the BMC interface for in-band or out-of-band management, and among other things it shows the amount of power (in watts, see previous posts) the server uses. On Debian 5, for example, you can read that figure from the command line, as in the sketch below.
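The exact command depends on the BMC, but here is a minimal sketch assuming ipmitool is installed and that the BMC exposes at least one sensor whose unit is reported as Watts (sensor names and units vary a lot between vendors):

    # Shell out to ipmitool and print any sensor reading reported in watts.
    # Assumes ipmitool is installed and the IPMI kernel modules are loaded.
    import subprocess

    output = subprocess.check_output(["ipmitool", "sensor"], text=True)
    for line in output.splitlines():
        fields = [f.strip() for f in line.split("|")]
        if len(fields) >= 3 and fields[2].lower() == "watts":
            print(f"{fields[0]}: {fields[1]} W")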
You can only get the real reading of the current by inserting an ammeter in the power lead. You may be able to borrow an AC ammeter, or a low power tong tester, from a friendly electrician.
Insert the ammeter in series with one of the mains wires - or clip the tong tester around one of the mains wires. Either way will require a modified mains lead. And do be careful - mains voltages can kill.
The current will vary depending on the power management built into the server, so you will need to run the server under a simulated full-load condition in order to see the maximum current draw.