Saturday, February 17, 2007

Geek alert

I'm going to go all geeky on you for a minute or two. If you aren't even remotely interested in technology, then look away now.

An article on Slashdot caught my eye, claiming that in the past five years server power consumption in the US has doubled. At first I thought they meant each server had doubled its usage, but it seems that what they mean is that overall consumption by servers in the US has risen from 0.6% of consumed electricity in 2000 to 1.2% in 2005. This is attributed to all sorts of things, including the rise of the Blade centre.

But, from experience, it seems that Blades often replace old, inefficient servers and can do more on a per-unit basis, which means that in a fair comparison of, say, productivity per watt consumed, the new systems win out on efficiency. On top of this, many organisations installing large server farms are also getting rid of traditional desktops for users and replacing them with thin client terminals and Citrix environments running on those shiny new Blade farms. I would be interested to see what the power consumption difference is between the two models, but I'd be willing to wager that it is lower with the thin client/server farm option than with all those desktops sitting about the place.
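Just to make that wager concrete, here's a quick back-of-envelope calculation in Python. The wattage and users-per-blade figures are purely my own guesses (not measured anywhere), so treat it as a sketch of the comparison rather than gospel.

    # Back-of-envelope comparison of power per user for the two models.
    # All figures below are assumed/illustrative, not measured values.

    DESKTOP_WATTS = 150      # assumed draw of a traditional desktop PC
    THIN_CLIENT_WATTS = 15   # assumed draw of a thin client terminal
    BLADE_WATTS = 400        # assumed draw of one blade hosting the Citrix sessions
    USERS_PER_BLADE = 30     # assumed number of concurrent users per blade

    def watts_per_user_desktop():
        """Traditional model: every user has a full desktop on their desk."""
        return DESKTOP_WATTS

    def watts_per_user_thin_client():
        """Thin client model: a terminal per user plus a share of a blade."""
        return THIN_CLIENT_WATTS + BLADE_WATTS / USERS_PER_BLADE

    if __name__ == "__main__":
        print(f"Desktop model:     {watts_per_user_desktop():.1f} W per user")
        print(f"Thin client model: {watts_per_user_thin_client():.1f} W per user")

With those (admittedly made-up) numbers the thin client model comes out at under 30 W per user against 150 W for the desktop model, even though all of the growth shows up on the server side of the ledger.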

As a result, the article seems a little disingenuous. I would be very interested to see whether any comparative studies have been done between the two models mentioned above, and of the efficiency of new versus old servers, to find out whether the headline rise in consumption is automatically a bad thing or whether we're actually reducing overall consumption per user by growing our server farms.
