November 13th, 2008

Working at 99% CPU utilization

Posted by Paul Murphy @ 12:15 am

The key metric used in data processing management is system utilization - and that same metric has been adopted to push the idea that consolidation through virtualization makes sense.
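
As a rough illustration (not from the original post), the utilization metric itself is just busy time divided by elapsed time. A minimal sketch, assuming a Linux host and the standard aggregate "cpu" line in /proc/stat, might compute it like this:

    # Sketch: approximate overall CPU utilization on Linux by sampling
    # /proc/stat twice. Assumes the usual field layout:
    # "cpu user nice system idle iowait irq softirq steal ..."
    import time

    def read_cpu_times():
        with open("/proc/stat") as f:
            fields = f.readline().split()[1:]   # drop the "cpu" label
        values = [int(v) for v in fields]
        idle = values[3] + values[4]            # idle + iowait count as not busy
        return idle, sum(values)

    def cpu_utilization(interval=1.0):
        idle1, total1 = read_cpu_times()
        time.sleep(interval)
        idle2, total2 = read_cpu_times()
        busy = (total2 - total1) - (idle2 - idle1)
        return busy / (total2 - total1)         # fraction of the interval spent busy

    if __name__ == "__main__":
        print(f"CPU utilization: {cpu_utilization():.1%}")

Pushing that fraction toward 99% is exactly what the data processing tradition, and the virtualization pitch, treat as success.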

The key factor driving the evolution of system utilization as a management metric was the high cost of equipment - first in the 1920s when data processing was getting started, and again in the 1960s when the transition from electro-mechanical to digital equipment got under way.

What made it possible, however, was something else: because batch processing of after-the-fact data was the natural norm for the gear, user management expected printed reports to arrive on their desks on a predictable schedule wholly divorced from the processes generating the data.

Thus what drove data processing management to use its gear 24 x 7 was money - but what made it possible to do this was user management’s acceptance of the time gaps implicit in an overall process consisting of distinct data collection, data processing, and information reporting phases.

Almost a hundred years later data processing still operates in much the same way - and the utilization metric is still its most fundamental internal performance measure. And, since its costs are still high too, the financial incentive for high utilization continues to carry justificatory weight.
