I'm attempting to figure out the power/cooling draw for some old servers that I'd virtualized.
Starting with info in this question, I've taken the 880W the old boxes appeared to be drawing (according to the APC logs) and converted it to roughly 3,000 BTU/hr, or about a quarter of a ton of cooling (1 ton = 12,000 BTU/hr).
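For reference, here's the quick sanity check I did on that conversion (using the standard factors of 3.412 BTU/hr per watt and 12,000 BTU/hr per ton):

```python
# Quick conversion check: watts -> BTU/hr -> tons of cooling
watts = 880
btu_per_hr = watts * 3.412          # 1 W ~= 3.412 BTU/hr
tons = btu_per_hr / 12_000          # 1 ton of cooling = 12,000 BTU/hr
print(f"{btu_per_hr:.0f} BTU/hr ~= {tons:.2f} tons")   # ~3003 BTU/hr ~= 0.25 tons
```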
At this point I'm scratching my head as to how to figure the cost for cooling. I'm sure that depends on what I'm using to cool. I'd imagine that ambient temperature counts for something at this point, but I'm not sure if it's significant.
[edit] What I'm missing - I believe - is any idea of what 1 ton of cooling costs. Is that a sensible thing to shoot for, or am I barking up the wrong tree? [edit]
In any case, any pointers on what info to gather next, what to do with it, or what's wrong with the above figuring (if applicable) are most welcome.
Here's the easy way. You can do a lot of fancy math, but in the final analysis, the amount of electricity it takes to cool the equipment is equal to the amount needed to power it. There is solid scientific rationale behind this in the bowels of APC's web site, if you're curious.
So, your 880W load with cooling would be 1760W total. From here it's clear sailing. Let's say your kilowatt-hour (kWh) rate from the utility company is $0.17/kWh.
Your annual load will be 1760 watts * 24 hrs/day * 365 days/yr / 1000 watts/kW = 15,418 kWh per year.
15,418 kWh * $0.17/kWh = $2,621/yr
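As a rough sketch of that math (the 880W load and the $0.17/kWh rate are just the example figures above):

```python
# Annual cost sketch: cooling assumed to draw as much power as the IT load itself.
it_load_w = 880            # measured IT load (from the APC logs)
rate_per_kwh = 0.17        # example utility rate in $/kWh

total_kw = it_load_w * 2 / 1000            # IT load + an equal cooling load
annual_kwh = total_kw * 24 * 365           # running 24x7
annual_cost = annual_kwh * rate_per_kwh
print(f"{annual_kwh:,.0f} kWh/yr -> ${annual_cost:,.0f}/yr")   # ~15,418 kWh -> ~$2,621
```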
You not only have to account for the electricity you use to power the systems, but also the electricity used to cool them.
You should end up with two sets of numbers: the electricity used by the systems (880W in your case) and the electricity used to cool them (convert tons to BTU/hr to watts to kWh), then add them up.
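Something like this minimal sketch, where the 1.2 kW/ton cooling efficiency is only an assumed placeholder (pull the real figure from your AC unit's spec sheet):

```python
# Add the two meters: IT electricity plus cooling electricity.
it_load_kw = 0.880                 # measured IT load
cooling_tons = 0.25                # ~3,000 BTU/hr of heat to remove
ac_kw_per_ton = 1.2                # ASSUMED efficiency; use your unit's spec sheet

cooling_kw = cooling_tons * ac_kw_per_ton
total_kw = it_load_kw + cooling_kw
annual_kwh = total_kw * 24 * 365
print(f"total draw {total_kw:.2f} kW, {annual_kwh:,.0f} kWh/yr")
```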
You computed the cooling you need, not necessarily the cooling you provide; these can be two separate numbers. If your ambient temperature is around 70°F, they're close enough.
You need to know your AC unit's draw in watts per ton (or per hour) to calculate your cooling costs.
You may not have easy access to that data, but if you're shopping around you can usually find the draw of an AC unit on its spec sheet. Then you just multiply it out to get the cost over an hour, a week, or whatever.
Ambient air temp in your data center is almost irrelevant unless you're talking about a tiny number of servers. When our AC fails, our room goes from 68°F to 90°F in about 20 minutes. I mean, if your ambient air temp is a frosty -10°F, then sure, it can help (that's how we replaced our AC units last winter, by piping in Minnesota January air).
I am guessing you're trying to put an ROI on your virtualization, to prove it is cost effective. So calculate the cooling load of your virtual server, which I presume you added in order to remove the physical ones. Subtract that from the cooling load you calculated above for the servers you removed. Now you know your net gain in cooling load.
Then calculate the draw of your AC unit per ton or per hour, and use that to figure out your cost in dollars per ton or hour.
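Here's a sketch of that ROI calculation; the 300W virtual host draw, the 1.2 kW/ton AC efficiency, and the $0.17/kWh rate are all placeholders you'd swap for your own numbers:

```python
# Net cooling savings from virtualization, priced out per year.
removed_servers_w = 880        # physical boxes taken out of service
virt_host_w = 300              # ASSUMED draw added by the new virtualization host
ac_kw_per_ton = 1.2            # ASSUMED AC efficiency (spec sheet value)
rate_per_kwh = 0.17            # ASSUMED utility rate

net_heat_w = removed_servers_w - virt_host_w
net_tons = net_heat_w * 3.412 / 12_000            # watts -> BTU/hr -> tons
cooling_kw_saved = net_tons * ac_kw_per_ton
annual_savings = cooling_kw_saved * 24 * 365 * rate_per_kwh
print(f"~{net_tons:.2f} tons less cooling, ~${annual_savings:,.0f}/yr saved on cooling")
```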
0.880 kW * kWh rate = $/hr
So at $0.10/kWh, you'd be spending about 8.8 cents per hour.
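The same arithmetic in a couple of lines, with the cooling rule of thumb from the first answer folded in:

```python
load_kw = 0.880                          # measured server load
rate = 0.10                              # $/kWh
print(f"${load_kw * rate:.3f}/hr")       # $0.088/hr for the servers alone
print(f"${load_kw * 2 * rate:.3f}/hr")   # ~$0.176/hr once cooling is included
```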