Perhaps this article from CIO has part of the answer. Lovins claims a rack of servers in 2001 used 30 to 60 watts per square foot. However, CIO mag states that:
...a rack of servers installed in data centers just two years ago might have consumed 2 kilowatts and emitted 40 watts of heat per square foot. Newer, "high-density" racks, which cram more servers into the same amount of space, are expected to consume as much as 25 kilowatts and give off as much as 500 watts of heat per square foot by the end of the decade.
So, it looks like it was stable for a while but is now rocketing upwards. This doesn't seem to mean too much on its own - denser servers obviously mean more power consumption - and of course the data center is the whipping boy for the power/computing debate as a whole.
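As a sanity check on the CIO numbers (a rough sketch - the 50 square feet of floor allocated per rack is my assumption, not the article's; a per-square-foot figure only makes sense if the rack's draw is averaged over its share of aisles and cooling space, since 2 kW over a rack's own ~7 sq ft footprint would be far more than 40 W/sq ft):

```python
def watts_per_sq_ft(rack_kw, allocated_sq_ft):
    """Power density if one rack's draw is spread over its allocated floor area."""
    return rack_kw * 1000 / allocated_sq_ft

# A 2 kW rack averaged over ~50 sq ft of floor lands right on the
# 40 W/sq ft the article cites (50 sq ft is my assumed allocation).
print(watts_per_sq_ft(2, 50))    # 40.0

# A 25 kW "high-density" rack over the same allocation gives 500 W/sq ft,
# matching the end-of-decade projection.
print(watts_per_sq_ft(25, 50))   # 500.0
```

Under that assumption both CIO figures fall out of the same floor allocation, and Lovins' 30 to 60 W/sq ft for 2001 sits right in the same range - the jump is in the racks, not the accounting.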
Some projections seem a little nuts - 50 percent of our energy budget going to cyberinfrastructure by 2020. RAND seemed more realistic in 2002 with 5 percent, though even then the cost of power quality and reliability will outpace the cost of the electricity itself.