Monday, July 23, 2007

Economics of Virtualization May Be "Off Planet"

With the promise of reduced costs and increased efficiency, the virtualization rage continues in the techno-sphere. The basic premise of virtualization is to make one server do the work of many; this increases utilization and hence requires fewer servers. Fewer servers mean less power, which in turn means less CO2, thus saving the planet. Simple.

But now there's a counterpoint: "yes, you have fewer servers in a virtualized environment, but each one of those servers is more heavily utilized, and because they are doing more work their power consumption goes up. The net gain is zero." Can that be? When I interviewed Foedus, they claimed one could get up to a 20-to-1 reduction in hardware using virtualization; it's hard to believe that doesn't more than make up for the extra power. Quocirca did their own analysis and came to the same conclusion. On the other hand, when I interviewed John Engates of Rackspace, he agreed that the power to run the heavier-laden box beats the cost of buying it: the juice beats the iron.
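The intuition is easy to check on the back of an envelope. An idle server still draws a large fraction of its peak power, so consolidating many lightly loaded boxes onto one busy box eliminates a lot of idle baseline. Here's a minimal sketch in Python; every wattage and utilization figure is an illustrative assumption of mine, not a number from Foedus or Quocirca:

IDLE_WATTS = 200.0   # assumed baseline draw of an idle 2007-era server
PEAK_WATTS = 350.0   # assumed draw at full utilization

def server_watts(utilization):
    """Approximate draw at a given utilization (0.0 to 1.0)."""
    return IDLE_WATTS + (PEAK_WATTS - IDLE_WATTS) * utilization

# Before: 20 lightly loaded servers at ~4% utilization each.
before = 20 * server_watts(0.04)

# After: one box carrying the same work at ~80% utilization
# (20 servers x 4% = 80% of one server's capacity).
after = server_watts(0.80)

print(f"before: {before:.0f} W, after: {after:.0f} W")
print(f"power reduction: {before / after:.1f} to 1")

With these made-up numbers the consolidated box draws about 320 W against roughly 4,100 W for the sprawl, nearly a 13-to-1 power saving. The "net gain is zero" argument only works if servers drew nothing at idle, and they don't.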

There must be missing pieces of the puzzle here. Perhaps not all servers virtualize well, or we need to take into account the specific kinds of applications being served; other items, such as ambient temperature and facility design, clearly make a big difference as well. The takeaway is that, like most things, establishing what power savings you will get from an infrastructure virtualization project is not straightforward, but there is a potential win-win here: you will be saving not only the planet but also money. And with the total power cost over the lifetime of a server currently estimated at around 50 per cent of the hardware cost, self-interest may play as big a part here as enlightened altruism; a rough tally of that 50 per cent figure is below. I'm still for jumping on the calliope. (Quocirca :: The Register)
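That 50 per cent estimate is easy to sanity-check. A minimal sketch, again with every price and wattage an assumption of mine rather than a figure from the article:

SERVER_PRICE = 4000.0    # assumed purchase price, USD
AVG_DRAW_WATTS = 300.0   # assumed average draw of the server itself
POWER_PRICE = 0.10       # assumed electricity price, USD per kWh
LIFETIME_YEARS = 4       # assumed service life

hours = LIFETIME_YEARS * 365 * 24
lifetime_kwh = AVG_DRAW_WATTS / 1000.0 * hours
power_cost = lifetime_kwh * POWER_PRICE

print(f"lifetime power cost: ${power_cost:,.0f}")
print(f"share of hardware price: {power_cost / SERVER_PRICE:.0%}")

# With these numbers: ~10,500 kWh, about $1,050, or roughly 26% of the
# hardware price. Double it for datacenter cooling and power distribution
# overhead (a PUE near 2 was common) and you land right around 50 per cent.

So the server's own draw gets you about a quarter of the way, and the facility overhead needed to keep it running plausibly carries the estimate to half the hardware cost.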
