
How Much Data is Processed by Servers?


No data is available about the amounts of data processed through DoD's $36.3 billion IT budget. However, information is available about the amount of data currently processed by servers. With the DoD IT budget ten times larger than that of any single US corporation, it is possible to infer some of the information processing issues involved. In 2008 the US accounted for 38% of the global installed server population. *
According to studies by the University of California, San Diego, in 2008 the world's servers processed 9.57 zettabytes of information, which is nearly 10 to the 22nd power bytes. That is ten million million gigabytes, or 12 gigabytes of information daily for each of the 3.2 billion workers in the world's labor force. The world's 151 million businesses process an average of 63 terabytes of information per company annually.
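A rough check of these conversions takes only a few lines of Python. In the sketch below, the 9.57 zettabyte and 151 million business figures come from the text; the decimal (SI) definitions of zettabyte, gigabyte, and terabyte are assumptions of the sketch, since the study's convention is not stated here.

# Back-of-the-envelope check of the unit conversions quoted above.
ZETTABYTE = 10**21   # bytes, decimal (SI) convention -- an assumption of this sketch
GIGABYTE  = 10**9
TERABYTE  = 10**12

total_bytes = 9.57 * ZETTABYTE   # world server workload in 2008 (figure from the study)
businesses  = 151e6              # number of businesses worldwide (figure from the study)

print(f"Total processed: {total_bytes / GIGABYTE:.2e} GB")                # ~9.6e12, about ten million million GB
print(f"Per business:    {total_bytes / businesses / TERABYTE:.0f} TB")   # ~63 TB per company per year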
There are about 2.5 megabytes in a long book. It would be necessary to stack such books from the Earth to Neptune and back about 20 times to equal one year's capacity of existing servers. Each of the world's 3.2 billion workers would have to read through a stack of books 36 miles long each year to read 9.57 zettabytes of text.
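The book-stack analogy can be reproduced with equally simple arithmetic. In the sketch below, the 2.5 MB per book and 9.57 zettabyte figures come from the text; the roughly 5 cm spine thickness of a "long book" and the roughly 4.3 billion km average Earth-Neptune distance are assumptions made for illustration.

# Illustrative reconstruction of the book-stack comparison.
MEGABYTE  = 10**6
ZETTABYTE = 10**21

books = 9.57 * ZETTABYTE / (2.5 * MEGABYTE)   # ~3.8e15 books of 2.5 MB each

book_thickness_m = 0.05    # assumed spine thickness of a "long book"
earth_neptune_km = 4.3e9   # assumed average Earth-Neptune distance

stack_km = books * book_thickness_m / 1000
print(f"Round trips to Neptune: {stack_km / (2 * earth_neptune_km):.0f}")   # ~22, roughly the "about 20 times" in the text

workers = 3.2e9
print(f"Miles of books per worker: {stack_km / workers / 1.609:.0f}")       # ~37, close to the 36 miles cited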
The total number of servers in the world in 2008 was 27.3 million, with 10.3 million in the USA. 95% of the world's total zettabytes in 2008 were processed by low-end servers costing $25,000 or less. The remaining 5% was processed by more expensive servers. In 2000, 87.8% of processing on US servers was performed by low-end servers. By 2008 that number had risen to 96.0%, which indicates that low-end servers will continue to carry a greater share of the processing workload.
The following graph shows the total number of USA servers, including an estimated equivalent number of low-end servers based on 50% performance/price/year improvements: **
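As a rough illustration of how such an equivalent count can be derived, the sketch below normalizes servers purchased in different years into base-year units under the 50% performance/price/year assumption. The installed-base numbers in it are hypothetical placeholders, not figures from the graph or the study.

# Sketch: convert servers purchased in different years into "year-2000 equivalents",
# assuming performance/price improves 50% per year.
IMPROVEMENT = 1.5   # 50% gain per year

def equivalent_servers(installed_by_year, base_year):
    """Weight each year's purchases by the performance/price gain since base_year."""
    return sum(count * IMPROVEMENT ** (year - base_year)
               for year, count in installed_by_year.items())

installed = {2000: 4.0, 2004: 6.0, 2008: 10.3}   # hypothetical purchases, in millions
print(f"{equivalent_servers(installed, 2000):.0f} million year-2000 server equivalents")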


The total annual world server sales in 2008 were $53.3 billion, with entry-level servers accounting for $30.8 billion. It is interesting to note that large computer complexes, such as those operated by Google, Yahoo, or Facebook, depend on small-scale servers for information processing. High-end servers show slower gains in performance/price/year than low-end servers and are not preferred in billion-dollar computer complexes.
It follows that most information processing in the world is performed by low-end servers (392 billion million transactions in 2008), with only 10 billion million transactions executed by high-end servers. This pattern is expected to persist. The purchase of computers for cloud computing is not likely to shift in favor of mainframe equipment.
Transaction processing workloads amount to approximately 44% of the total bytes processed. Such transactions are "commodity" applications, such as e-mail, that are not handled efficiently by low-end servers. The overhead costs for administering a large number of low-end servers are excessive unless they are aggregated into huge complexes that use identical hardware management and software control architectures.
SUMMARY
More than 2,000 DoD system projects, with budgets under $1 million per year, are likely to be processed on low-end servers. As the performance/price of servers has been improving since 2000, the share of the work performed by low-end servers has been increasing. The current OMB guidelines, which count as data centers only operations with more than 500 sq. ft., are becoming irrelevant. A rack-mounted server costing less than $25,000 occupies only 10 sq. ft. of space, yet has greater processing power than a 1990 mainframe. Consequently, the current DoD approach to reducing the number of data centers will miss contractors who continue installing low-end servers in increasingly stand-alone configurations.
Most of the workload on low-end servers consists of Web services and commodity computing. There are large economies of scale available through virtualization of workloads and consolidation for processing on large servers, which also reduces the number of operating personnel needed to support many low-end stand-alone servers.
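A simple way to see the consolidation arithmetic is to compare many lightly loaded stand-alone servers against a smaller number of virtualized hosts. The utilization and capacity figures in the sketch below are assumptions chosen for illustration, not data from this post.

# Sketch of the consolidation argument, using assumed utilization and capacity figures.
standalone_servers = 1000
avg_utilization    = 0.10   # assumed: a stand-alone low-end server is ~10% busy
host_capacity      = 4.0    # assumed: one consolidation host ~ 4 low-end servers
target_utilization = 0.60   # assumed: planned utilization of the consolidated hosts

workload     = standalone_servers * avg_utilization   # total demand, in low-end-server units
hosts_needed = workload / (host_capacity * target_utilization)
print(f"{standalone_servers} stand-alone servers -> about {hosts_needed:.0f} virtualized hosts")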
What is now emerging from the proliferation of servers is the "big data" challenge. Server capacities are almost doubling every other year, driving similar growth rates in stored data and network capacity. Computing is now driven by increasing data volumes and the need to integrate an ever-increasing number of heterogeneous data sources. There is a need for rapid processing of data to support data-intensive decision-making.
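For reference, "almost doubling every other year" corresponds to a compound growth rate of roughly 41% per year, as the short calculation below shows.

# Doubling every two years expressed as a compound annual growth rate.
doubling_period_years = 2
annual_growth = 2 ** (1 / doubling_period_years) - 1
print(f"Implied annual growth: {annual_growth:.0%}")                      # ~41% per year
print(f"Growth over a decade: about {(1 + annual_growth) ** 10:.0f}x")    # ~32x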
It is necessary to re-examine the current approaches to realizing economies of scale in DoD IT. This places pressure in the direction of more centralized management of IT resources.




For comments please e-mail paul@strassmann.com