I have been comparing our server room logs of power usage, compressor usage, and outside temperature, and I'm not sure whether something is wrong with my understanding of how these systems work or with our records.
- Hot day: +30 °C outside, ~50% compressor use, ~25 kW electricity use
- Cold day: -30 °C outside, ~50% compressor use, ~25 kW electricity use
- Inside it's a constant 23 °C.
- The equipment being cooled generates about 28 kW of heat, as measured by the UPS readout.
- This is a 20-year-old Liebert 20T HVAC unit hooked up to an outside condenser.
- The ~25 kW figure includes the condenser's electricity use.
- Compressor readings come in increments of 0, 25, 50, 75, and 100%.
- This Liebert has two compressors, and from what I gather (maybe wrongly?) each compressor has three speed modes: off, medium, and high. So two compressors on medium reads 25 + 25 = 50%, one on high and one on medium reads 75%, and both on high reads 100%. See the sketch below for the mapping I'm assuming.
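To make that assumption concrete, here is the mapping I'm guessing at. The mode names and the per-compressor percentages are my own assumption, not anything from the Liebert documentation:

```python
# My assumed mapping from compressor states to the panel percentage.
# Assumption: each of the 2 compressors contributes 0% when off,
# 25% at medium, and 50% at high.

STEP_PCT = {"off": 0, "medium": 25, "high": 50}  # per compressor

def readout_pct(comp_a: str, comp_b: str) -> int:
    """Panel percentage for a given pair of compressor states."""
    return STEP_PCT[comp_a] + STEP_PCT[comp_b]

assert readout_pct("medium", "medium") == 50   # both on medium
assert readout_pct("high", "medium") == 75     # one high, one medium
assert readout_pct("high", "high") == 100      # both on high
```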
I would think that on a cold day the compressor use should be way lower than on a hot day. I also assume that lower compressor utilization means a large reduction in electricity use.
We have one reading per day (manually entered), and because the compressor use oscillates between 25, 50, and 75% as its internal thermostat tries to hold a steady 23 °C, it's possible we were unlucky and caught the cold-day reading during the brief time it was in the "up" part of the curve. To eliminate that possibility I took averages: the 50% and 25 kW figures are averages over multiple days, rounded (±10%) for the purpose of highlighting that they are essentially the same. The effect I'm looking for is a 30% to 80% difference due to temperature. On a really cold day, wouldn't the cooling be essentially free? (The back-of-the-envelope sketch below shows why I expect that.)
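For context, here is the reasoning behind my expectation. It uses the idealized Carnot bound, so the absolute numbers are far better than anything a real 20-year-old unit would achieve, but the trend is what I'm asking about: the warmer it is outside, the more compressor work should be needed per kW of heat removed, and below 23 °C the heat could in principle leave for free.

```python
# Idealized (Carnot) lower bound on compressor power needed to move
# the measured 28 kW heat load from a 23 C room to the outside air.
# Real equipment is nowhere near Carnot; only the trend is the point.

T_INSIDE_C = 23.0     # room setpoint from our logs
HEAT_LOAD_KW = 28.0   # IT heat load per the UPS readout

def ideal_compressor_kw(t_outside_c: float) -> float:
    """Carnot-minimum compressor power (kW). Returns 0 when it is
    colder outside than inside, since heat can then flow out 'for
    free' (the free-cooling / economizer case)."""
    t_in_k = T_INSIDE_C + 273.15
    t_out_k = t_outside_c + 273.15
    if t_out_k <= t_in_k:
        return 0.0
    cop = t_in_k / (t_out_k - t_in_k)  # Carnot COP for cooling
    return HEAT_LOAD_KW / cop

for t_out in (30.0, -30.0):
    print(f"{t_out:+.0f} C outside: >= "
          f"{ideal_compressor_kw(t_out):.2f} kW of compressor work")
# +30 C outside: >= 0.66 kW of compressor work
# -30 C outside: >= 0.00 kW of compressor work
```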
Are my readings wrong, or am I completely wrong in my assumption that there should be massive power savings in cold weather with this kind of system?