I'm trying to work out how much electricity is required to power 'x' number of computers. I know it's a vague question because some computers draw more than others (e.g. different chipsets, HDDs, video cards, PSUs, etc.).
So, let's just assume it's a mum-and-dad Dell computer with average, run-of-the-mill parts. Nothing fancy. 20" LCDs.
This is to help calculate the generator power required to keep around 'x' computers running in a LAN. The real figure is in the hundreds, but I'm assuming I can just figure out the base power draw for one machine and then multiply it by the number of seats (a rough sketch of that arithmetic follows the list below).
I understand this doesn't include:
- Switches
- Servers
- Cooling (fans), etc.
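For what it's worth, here's a minimal sketch of the per-seat arithmetic I have in mind; the wattages, seat count, and headroom factor are placeholder assumptions, not measurements.

    # Rough per-seat estimate. All numbers here are placeholders, not measurements.
    WATTS_PER_PC = 300        # assumed average desktop under load
    WATTS_PER_MONITOR = 100   # assumed 20" LCD
    SEATS = 200               # hypothetical seat count ("in the hundreds")
    MARGIN = 1.25             # rough headroom for the things excluded above

    total_watts = SEATS * (WATTS_PER_PC + WATTS_PER_MONITOR) * MARGIN
    print(f"Generator budget: ~{total_watts / 1000:.1f} kW")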
I did some stats on this a while ago, FWIW, using the handy-dandy Kill-A-Watt.
Typical Developer Dell PC
(2.13 GHz Core 2 Duo, 2 GB RAM, 10k RPM 74 GB main hard drive, 7200 RPM 500 GB data drive, Radeon X1550 video)
Standard Developer ThinkPad T60 Laptop
(Core 2 Duo 2.0 GHz, 100 GB HDD, ATI X1400 video)
LCDs
It turns out with the LCDs the default brightness level has a lot to do with how much power they draw. I almost immediately turn any LCD I own down to 50% bright, just because my eyes are overwhelmed if I don't...
While I don't have exact numbers, I have run LAN parties with around 20 people in a conference room.
We had a power board, which had its own breaker. We had about 18A breakers, for 6480W in total, which works out to 324 watts per machine across 20 machines. Not really a lot for gaming (we blew a breaker once, but I don't think we had 20 people at the time, more like 17 or 18).
So if it's just office-type computers, 6000-6500 watts should be good.
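As a sanity check on that figure, here's the same arithmetic as a tiny sketch, using the breaker capacity and machine count from above; treat them as ballpark numbers.

    # Sanity check: total breaker capacity divided across the seats.
    TOTAL_WATTS = 6480   # combined breaker capacity from above
    MACHINES = 20        # seats in the room

    print(f"{TOTAL_WATTS / MACHINES:.0f} W available per machine")  # 324 W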
There are two parts to this:
The monitor is basically constant. Work out hours per day x days per week x power rating (in watts), divide by 1000, and you have a figure in kilowatt-hours per week.
The PC is a little harder because it uses a certain power level when it's idle and a higher power level when it's doing something. Certain peripherals like optical drives basically use no power when they're not being used and, say 10W or so when they are (figure is for argument's sake).
Generally speaking though, a PC (excluding monitor) shouldn't be drawing more than about 150W under load so use that as a baseline figure. Dedicated graphics cards and other factors can take this to 600W or more.
Generally assume it's under load at least 80% of the time it'll be used. Also take into account people who don't turn their machines off.
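Putting those pieces together, here's a minimal sketch of the weekly energy estimate described above; apart from the 150 W load baseline and the 80% load assumption, the wattages and hours are illustrative assumptions.

    # Weekly energy per seat: constant monitor plus an idle/load split for the PC.
    # Wattages and hours are assumptions for illustration.
    HOURS_PER_DAY = 8
    DAYS_PER_WEEK = 5
    MONITOR_W = 40        # assumed 20" LCD draw
    PC_IDLE_W = 80        # assumed idle draw
    PC_LOAD_W = 150       # baseline load figure from above
    LOAD_FRACTION = 0.8   # under load ~80% of the time it's on

    hours_on = HOURS_PER_DAY * DAYS_PER_WEEK
    pc_avg_w = PC_LOAD_W * LOAD_FRACTION + PC_IDLE_W * (1 - LOAD_FRACTION)
    kwh_per_week = (MONITOR_W + pc_avg_w) * hours_on / 1000
    print(f"~{kwh_per_week:.1f} kWh per seat per week")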
Although I haven't had real experience figuring out how much power it takes to keep multiple machines running, one thing to keep in mind is to have enough power for the maximum load.
If tripping the circuit breaker or overloading the generator is unacceptable, then I would figure out a conservative estimate for power consumption: find out the maximum power consumption of each component and round the values up.
A rough guesstimate for an "average" computer would be something along the lines of 300 W for the machine and 100 W for the LCD, but your mileage may definitely vary.
Something that is not obvious but can cause a lot of problems is the ground fault circuit interrupter (GFCI) (http://en.wikipedia.org/wiki/Ground_fault_circuit_interrupter). I'm not sure how common these are in the US, but here in Sweden most newer houses have them.
Every computer leaks some current to ground; I'm not exactly sure how much, but it's a couple of mA. If you have a GFCI that trips at 30 mA (which is the most common rating here in Sweden), you might run into problems before you overload the fuse.
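A quick sketch of both points, the conservative wattage budget and the GFCI leakage limit; the per-machine leakage figure is an assumption within the "couple of mA" range mentioned above.

    # Conservative budget plus a GFCI leakage check.
    # Per-machine leakage is an assumption ("a couple of mA").
    MACHINES = 20
    WATTS_PER_SEAT = 300 + 100      # machine + LCD, rounded up as above
    LEAKAGE_MA_PER_MACHINE = 2.5    # assumed earth leakage per computer
    GFCI_TRIP_MA = 30               # common trip threshold (Sweden)

    print(f"Power budget: {MACHINES * WATTS_PER_SEAT} W")
    machines_per_gfci = int(GFCI_TRIP_MA / LEAKAGE_MA_PER_MACHINE)
    print(f"Roughly {machines_per_gfci} machines per GFCI before nuisance trips")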
I have organized a few LAN parties as well. We have 16A breakers here, and I have been putting 10 PCs on each breaker without any problem. We have 220V here in the Netherlands, so that's about 3500W per breaker for 10 PCs, or roughly 350W per PC.
There are some great answers already that show you the average load while doing some common tasks, but if you're doing a power budget, you really only need to make sure you have enough for the maximum load.
What you need to know are the operating AC voltage and the maximum draw in amps. These can be found in the technical specs and are often printed on the power supply itself. You get the wattage by simply multiplying these two numbers.
W = V(AC) * A
The converse is also true: if you have a 400W power supply, it draws ~3.6A (400W / 110V).
As an example, on the power brick of my Dell mini I see 110V (1A), so I'm looking at 110W of power. That's the theoretical maximum load. I'll most likely use less than this, but not more, according to most electrical codes.
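Here's the same conversion as a small sketch; the voltage and the example ratings are the ones used above, and the helper functions are just for illustration.

    # Nameplate conversions: W = V * A, and the converse A = W / V.
    def watts(volts, amps):
        return volts * amps

    def amps(watts_rating, volts):
        return watts_rating / volts

    print(watts(110, 1.0))           # Dell mini brick: 110 W theoretical max
    print(round(amps(400, 110), 1))  # 400 W supply: ~3.6 A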
Let's take another example, a Dell 1905FP 19" Flat Panel Monitor. A bit of Googling will get you to the Technical Documentation page. Scrolling down to the electrical section shows the following:
AC input voltage / frequency / current: 100 to 240 VAC / 50 or 60 Hz ± 3 Hz / 1.5A (max.)
Standard AC voltage in the US is 110V, so you're looking at 165W max (110V * 1.5A). That page also shows that the normal power consumption is between 32W and 65W.
To figure out your actual power budget, spec out an average system, find out the maximum power load for it, then multiply by the number of systems you expect to have. That's your power budget.
Now, you can either get a generator that will power all your devices at maximum load if uptime is a priority, or get one that will handle 50-75% of maximum load if price is a concern. As you can see from that Dell monitor, most devices will operate at 30-50% of maximum load under normal conditions.
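As a final sketch, here's the whole budget put together; the monitor maximum is the 165 W nameplate figure worked out above, while the desktop maximum and the seat count are assumptions for illustration.

    # Generator sizing from a per-seat maximum-load budget.
    SEATS = 200            # hypothetical number of systems
    PC_MAX_W = 300         # assumed maximum draw for an average desktop
    MONITOR_MAX_W = 165    # 1905FP nameplate maximum (110 V * 1.5 A)

    budget_w = SEATS * (PC_MAX_W + MONITOR_MAX_W)
    print(f"Full-load budget: {budget_w / 1000:.1f} kW")

    # Most devices run at 30-50% of nameplate maximum, so a cheaper option
    # is a generator sized for 50-75% of the full-load budget.
    for fraction in (0.50, 0.75, 1.00):
        print(f"{fraction:.0%} of budget: {budget_w * fraction / 1000:.1f} kW")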