[Baylisa] Q: Estimating DC Power

Nick Christenson npc at GangOfOne.com
Wed Jun 4 17:33:29 PDT 2014


> As best I can tell, the Commonly Accepted Best Practice is that you aspire
> to not exceed 80% capacity on any given circuit.  

A lot of people will tell you that this is enshrined in the National
Electrical Code.  It isn't true.  What the NEC says is that you can't
have a single appliance on a branch circuit with more than one outlet
that draws more than 80% of the circuit's rated load.  One appliance
on a single outlet circuit is allowed to draw 100% of the load, or 
multiple appliances are allowed to draw an aggregate of 100% of the 
load, as long as no one device exceeds 80%.  
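The rule as described above is easy to sketch in code. This is an illustrative check only (plain Python, not an official NEC calculation), and the numbers in the examples are made up:

```python
def branch_circuit_ok(rating_amps, loads_amps):
    """Check a multi-outlet branch circuit against the rule described
    above: no single appliance may draw more than 80% of the circuit's
    rating, but the aggregate may reach 100%.  Illustrative sketch,
    not electrical advice."""
    if max(loads_amps) > 0.8 * rating_amps:
        return False  # one appliance exceeds 80% of the rating
    return sum(loads_amps) <= rating_amps  # aggregate within 100%

# 20A circuit with 12A + 6A appliances: allowed
print(branch_circuit_ok(20, [12, 6]))   # True
# Single 17A appliance on a multi-outlet 20A circuit: not allowed
print(branch_circuit_ok(20, [17]))      # False
```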

Note, the loading rules are not designed to protect the circuit
breaker, they're designed to protect the wiring.  The problem is that 
a circuit breaker rated for 20A isn't going to be rated for continuous 
duty at 20A forever.  So, while the electrical code will allow you to 
run 19.9A off a 20A breaker, we can't be shocked if that breaker 
eventually trips or fails.

So, at what continuous load is it safe to run a 20A circuit?  Nobody
really knows.  There certainly isn't a hard cutoff.  However, because
just about everyone follows the 80% rule of thumb, collectively we
have a lot of experience that tells us that it's safe to run circuits
for long periods of time at 80% load.  I've run circuits at 90-95%
load for moderately long periods of time (several months) without
problems, but that was years ago, it's a small sample size, and it 
doesn't make me comfortable.

Note also that you need to account for the power factor of the
equipment.  That is, because the power draw of the equipment can be
non-linear, the apparent power (volt-amps) of an AC appliance can
exceed its measured power draw in Watts.  The details are long and
complicated, but the upshot is that the current your breaker actually
sees is your measured wattage divided by (line voltage times power
factor), and that figure is what must not exceed the rating of your
circuit breaker.  

Twenty years ago, a lot of computer gear had power factors of 0.8
or even lower, but power draws these days are much closer to
linear, with most modern data center servers having a power factor 
in excess of 0.9.  If you're working with a generous margin for 
error, you can probably get away with ignoring this, but if you're
dipping into those margins, you want to be a little more careful.
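The arithmetic above is a one-liner; here it is as a back-of-the-envelope check.  The wattage, voltage, and circuit size are hypothetical, and the 0.8 vs 0.9 power factors are the figures discussed above:

```python
def apparent_amps(watts, volts, power_factor):
    """Convert a measured real-power draw in watts to the current the
    breaker actually sees.  Apparent power (VA) = watts / power factor,
    and current = VA / volts."""
    return watts / (volts * power_factor)

# Hypothetical load: 1800 W measured at 120 V.
print(round(apparent_amps(1800, 120, 0.9), 2))  # 16.67 A -- just over 80% of a 20A circuit
print(round(apparent_amps(1800, 120, 0.8), 2))  # 18.75 A -- uncomfortably close to the 20A rating
```

The same 1800 W of "real" draw lands on opposite sides of the 16 A (80%) line depending on power factor, which is why ignoring it only works when you have generous margins.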

> But then Management wants
> to know why.  And my answer is:
> 1) Estimating power consumption is not a precise science

This is true.  Also, as equipment ages (power supplies, for example), 
it's possible that it will start drawing more power. 

> 2) Power fluctuates, you want some room for error

Yeah, but if you're measuring at peak consumption, you're good.  On
the other hand, sometimes you add an extra hard drive to servers, 
upgrade their RAM, or replace power supplies with ones that aren't
exactly like what was in there before, and it's good to have a little
slack.  If you're running identical machines that will never be 
individually upgraded on a maintenance contract, then I'd be willing
to run with less margin.

> 3) Getting it wrong means blowing a fuse on the PDU, losing a rack, and a
> prolonged outage...

Yes.  Getting it wrong on the low end costs a little money in unused
capacity.  Getting it wrong on the high end causes serious downtime
and maybe hardware damage.

> So, I have competing strategies:
> 1) Most Conservative: Take the *max* momentary measurement observed as your
> peak power consumption.  That's your 80% baseline.

This is the way I see most people setting up their gear.

> 2) Take the *average* peak consumption as your peak consumption.  That's
> your 80% baseline.

I wouldn't do this one.  Measure your real peak draw on the circuit and
use that as your baseline.

> 3) Take the max momentary measurement as your 100% of circuit capacity, or
> 90% ... as long as your average peak doesn't exceed 80% of circuit capacity
> ...

Under restricted conditions, I'd be willing to do this up to 90%
capacity:  (1) Management explicitly signs off on the risks, (2) There
is no way anyone would do something like plug another piece of gear,
even if it's something as small as a laptop or phone charger, into
those circuits, (3) I've measured the power factor of the equipment
in question and am confident that it exceeds 0.9, and (4) the gear 
is uniform and won't be individually upgraded without remeasuring 
and rebalancing circuits at that time.
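The strategies above can be compared with a small sketch.  The measurements here are hypothetical, and the 80% and 90% thresholds are the rules of thumb discussed earlier:

```python
def circuit_ok(rating_amps, max_momentary_amps, avg_peak_amps, strategy):
    """Check one circuit against the three strategies described above.
    1: max momentary draw must stay within 80% of the rating.
    2: average peak draw must stay within 80% (not recommended above).
    3: max momentary within 90% AND average peak within 80%."""
    if strategy == 1:
        return max_momentary_amps <= 0.8 * rating_amps
    if strategy == 2:
        return avg_peak_amps <= 0.8 * rating_amps
    if strategy == 3:
        return (max_momentary_amps <= 0.9 * rating_amps
                and avg_peak_amps <= 0.8 * rating_amps)
    raise ValueError("unknown strategy")

# 20A circuit, 17A momentary spike, 15A average peak:
print(circuit_ok(20, 17, 15, 1))  # False -- spike exceeds 16A
print(circuit_ok(20, 17, 15, 3))  # True  -- spike under 18A, average under 16A
```

Note that strategy 3 passing where strategy 1 fails is exactly the extra risk being signed off on in conditions (1) through (4).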

> I am of course comfortable with Most Conservative, but I'm not the one
> writing the checks, and I'd rather not spend money we don't need to spend
> ... how do you folks estimate power needs?

There isn't a lot of widespread knowledge about running this close to
the margins, and what is known is situationally dependent.  Like anything
else, there are risks that are hard to quantify.  The big factor is not
so much the cost of an outage as what value you assign to the cost of
being wrong about your assumptions.

Generally, as you run anything closer to redline, MTTF goes down and
systems require more hand holding which increases costs.  Balancing
these isn't easy, but I get nervous around folks who try to push 
these margins without acknowledging they're taking on risk.

I'd quantify the known costs, explain the issues with the unknown risks,
and present these to management and let them decide and sign off on 
what they want to do.  

Hope this helps.

-- 
Nick Christenson
npc at gangofone.com
