It wasn't supposed to be that way. Management, troubled by server utilization rates in the range of 10 percent, was delighted to discover virtualization as a way to get more out of existing hardware. Some organizations have begun cost-saving initiatives that require all new applications to reside in virtual machines unless a business case can be made to break policy. Unfortunately, the temptation for some overzealous bean counters is to pile too many applications onto each physical server.
OK, there could be a tendency to load a virtual server up to the breaking point. But 10 percent? Come on, we are talking about massive waste here, the equivalent of driving a tank to the 7-11 to get a quart of milk. Typically, 40 percent is about the figure where the hardware people start getting queasy anyway; even so, I challenge anyone to name another case where you buy a piece of equipment and baby it like this. Mitchell concludes that "such determinations can't always be made purely on a technical basis," but I have to disagree. You benchmark, weigh the options, and allocate the timeslices to the applications. It's a numbers game, and the numbers can be way better than they are now.
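To make the arithmetic concrete, here's a back-of-the-envelope sketch in Python. The 10 and 40 percent figures are the ones tossed around above, and the function name is mine; real capacity planning would also weigh memory, I/O, and peak load, not just average CPU.

```python
def consolidation_ratio(avg_util_pct: int, ceiling_pct: int) -> int:
    """How many similarly sized workloads fit under the host's comfort ceiling."""
    return ceiling_pct // avg_util_pct

# Hypothetical figures from the post: 10%-busy workloads, 40% comfort zone.
print(consolidation_ratio(10, 40))  # -> 4 workloads per physical server
```

Even with this crude math, four 10-percent workloads per box quadruples effective utilization without getting near the breaking point the bean counters flirt with.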