Monday, July 03, 2006

This article is from 2005, but it's still timely. The notion is that it costs more to power a server over its lifetime than to buy the iron in the first place. That is at 9 cents per kilowatt-hour, and prices are going much higher. I quote:

If server power consumption grows 20 percent per year, the four-year cost of a server's electricity bill will be larger than the $3,000 initial price of a typical low-end server with x86 processors. Google's data center is populated chiefly with such machines. But if power consumption grows at 50 percent per year, "power costs by the end of the decade would dwarf server prices," even without power increasing beyond its current 9 cents per kilowatt-hour cost, Barroso said.

How many IT people actually see the electric bills? A break-even analysis on this would be interesting.
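As a rough sketch of that break-even, here is the arithmetic behind the quoted claim: at $0.09/kWh, what baseline draw makes a server's four-year electricity bill equal its $3,000 purchase price? The growth rates and prices are from the article; the rest is back-of-the-envelope math that ignores cooling and power-distribution overhead, which in a real data center would push the break-even wattage even lower.

```python
# Break-even sketch: four-year electricity cost vs. a $3,000 server
# at $0.09/kWh, with consumption growing each year (figures from the
# article; no cooling overhead included).

RATE = 0.09          # dollars per kWh (from the article)
SERVER_PRICE = 3000  # dollars, typical low-end x86 server (from the article)
HOURS_PER_YEAR = 8760

def break_even_watts(annual_growth, years=4):
    """Baseline draw (watts) at which the electricity bill over `years`
    equals SERVER_PRICE, if consumption grows `annual_growth` per year."""
    growth_sum = sum((1 + annual_growth) ** i for i in range(years))
    cost_per_watt = HOURS_PER_YEAR / 1000 * RATE * growth_sum
    return SERVER_PRICE / cost_per_watt

for g in (0.20, 0.50):
    print(f"{g:.0%} annual growth: break-even at ~{break_even_watts(g):.0f} W")
```

At 20 percent growth this works out to roughly 700 W of baseline draw before raw electricity alone matches the purchase price over four years, so Barroso's numbers presumably fold in cooling and distribution losses on top of the server's own draw.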