[K12OSN] electricity use

Julius Szelagiewicz julius at turtle.com
Wed Mar 14 14:53:53 UTC 2007


On Tue, 13 Mar 2007, Robert Arkiletian wrote:

> On 3/13/07, Brad Thomas <bthomas at bhbl.org> wrote:
> > I am a social studies teacher and I've been building a lab in my classroom
> > of old, discarded computers over the last two years (I was up to 20). I
> > have been using small distros like DSL (DamnSmallLinux) to make them work,
> > but was planning to switch to a k12ltsp setup before the end of the year.
> > However, my principal just sent an e-mail last week instructing me to
> > remove all but 6 of the computers from my room implying that they were
> > using too much electricity. I just got back from a school planning council
> > meeting where she and an assistant principal said that they called Dell
> > (we buy all our new fat machines from Dell) and Dell said there should
> > only be one computer per 20 amp circuit (which translates into one per
> > room I think). As far as I can tell (using a Watts Up meter) one
> > computer-and-monitor combination draws a little more than 1 amp, so I don't get
> > this. Can anyone out there give me some guidelines they go by? Or steer me
>
> I am not an electrician but I do teach basic electric circuits in
> physics. The basic equation for power is P = IV (power = current x voltage).
> So 120 volts (the North American standard) x 20 A of current
> equals 2400 watts of power. Now if you remember, those old boxes
> probably have 200 W power supplies max, plus a monitor at ~75 W. Add another 25 W
> for safety. So I would say 300 W/box is reasonable, although less would
> probably work, because I doubt they would draw the maximum the PS unit is rated for.
> So a 20 A circuit (2400 W) supports 8 machines working at full
> tilt.
>
> Now compare that to what an eBox 2300 plus a 19 in LCD monitor consumes: 15 W + 40 W.
> Plus the server, don't forget.
>
> > to a good site? How much planning goes into ensuring proper electrical
> > flow into your k12ltsp labs?
> >
Well, there are some assumptions here that don't hold up in reality:
1. V * A = W is true for DC (direct current); for AC, say 120 V, one must
also multiply by a "power factor" arrived at experimentally. It is around 0.9
for UPS-type loads and around 0.8 for AC motor loads.
2. A circuit rated at 20 A will -not- deliver 20 A for long - the circuit
breaker will disconnect at a sustained 18 A.

So from 1. and 2. it follows that one won't get much more than about 1,950 watts
(120 V x 18 A x 0.9 = 1,944 W) from a 20 A 120 V AC outlet.
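For concreteness, the derating in points 1 and 2 can be sketched in a few lines of Python (the 0.9 power factor and the 18 A sustained figure are the rough estimates above, not measured values):

```python
# Usable real power from a nominal 20 A, 120 V AC branch circuit.
VOLTS = 120          # North American line voltage
SUSTAINED_AMPS = 18  # the breaker won't hold 20 A for long (point 2)
POWER_FACTOR = 0.9   # experimental estimate for UPS-type loads (point 1)

usable_watts = VOLTS * SUSTAINED_AMPS * POWER_FACTOR
print(usable_watts)  # 1944.0, i.e. a bit under 1,950 W
```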

On the other hand, the specs on power supplies are the maxima that can be
drawn. If you remove disk drives from old boxes, the typical load is below
150W.
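Putting those two numbers together gives a rough headcount per circuit (a sketch only; the 300 W nameplate figure and the ~150 W typical load are the estimates from this thread, not measurements of any particular box):

```python
# Stations per 20 A / 120 V circuit under two assumptions about draw per box.
usable_watts = 120 * 18 * 0.9    # ~1944 W usable after derating

print(int(usable_watts // 300))  # 6  -- worst case, nameplate ratings
print(int(usable_watts // 150))  # 12 -- closer to the typical measured load
```

Either way, the "one computer per 20 A circuit" figure is off by a large factor.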

The Dell recommendation of one computer per 20 A circuit has more to do
with the harmonic distortion introduced by the power supply than with the
load.

Take all this with some distrust - I used to be a mathematician. What you
might want to do is borrow a good meter (with the owner?) and measure
the real power consumption. You will be surprised at how little the stations
draw and how much the server does.
julius
