Google’s Server Power

Google has a lot of servers. Based on various reports, I would estimate they have close to 600,000, if not more; three years ago, experts estimated around 450,000 [1]. A thought occurred to me recently: why doesn't Google offer its computing power as a service, for a fee?

During off-peak hours many of those servers sit idle, so it would not take much to harness that spare capacity as a compute cluster. Think about it. The fastest single supercomputer in 2009 is Oak Ridge National Laboratory's Cray XT5 "Jaguar" [2]. It has roughly 150,000 processing cores, while Google is estimated to run some 600,000 whole servers. To put it another way, Jaguar's peak speed is 1.759 petaflops, yet a server cluster of 450,000 computers is estimated to be capable of reaching 100 petaflops [3].
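To make that comparison concrete, here is a quick back-of-envelope sketch using only the estimates above; the per-server throughput it prints is implied by those estimates, not a figure Google has published.

```python
# Back-of-envelope comparison using the figures quoted above.
jaguar_petaflops = 1.759        # Jaguar's quoted speed [2]
cluster_servers = 450_000       # server count behind the 100 PF estimate [3]
cluster_petaflops = 100.0       # estimated aggregate speed [3]

# Implied average throughput per Google server (an inference, not a spec)
per_server_gflops = cluster_petaflops * 1e6 / cluster_servers
print(f"~{per_server_gflops:.0f} GFLOPS per server")            # ~222 GFLOPS

# How many Jaguars the estimated cluster would be worth
print(f"~{cluster_petaflops / jaguar_petaflops:.0f}x Jaguar")   # ~57x
```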

Google has been tight-lipped about how many servers it really has; some estimates are in the hundreds of thousands, others in the millions. Google did reveal in April 2009 that it builds its own modular data centres out of shipping containers, each packed with 1,160 servers and its own battery backup [4]. Even a single container holds a respectable amount of computing power. The important questions are how many of these containers Google has built, and how many of them are dedicated to cluster computing beyond what the search engine needs; a rough container count is sketched below.
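For a rough sense of scale, here is the container arithmetic implied by the figures in this post; the server estimates are the ones discussed above (and in the 2011 update below), not confirmed numbers.

```python
# How many shipping-container modules the server estimates would imply.
servers_per_container = 1_160   # per Google's April 2009 disclosure [4]

for estimate in (450_000, 600_000, 900_000):
    containers = estimate / servers_per_container
    print(f"{estimate:>7,} servers ~= {containers:,.0f} containers")
# 450,000 servers ~= 388 containers
# 600,000 servers ~= 517 containers
# 900,000 servers ~= 776 containers
```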

These are questions worth raising, especially given the amount of public data Google collects. And since Google chose to become a publicly traded company, it is arguably in the public's interest to know.

It is odd that Google does not offer such powerful computing services as a source of revenue, which raises the question: who is using all that capacity? The CIA? The FBI? NASA?

If there is even a slight chance that these server clusters are sitting unused, then there is an opportunity for organizations such as medical and non-profit scientific research groups to put that computing power to work. The benefits would be enormous, and given Google's large profits, perhaps a tax receipt could be arranged for donating the service.

Since Google keeps all of this under a tight lid, we can only speculate that the capacity is being offered to government bodies interested in military, surveillance, and cryptographic number crunching. It is critical that people understand that computing at this scale renders older encryption still in everyday use essentially useless. For example, in 2001 Pascal Junod published the paper "On the Complexity of Matsui's Attack", a study of the linear cryptanalysis attack that Mitsuru Matsui devised against the Data Encryption Standard (DES) [5]. DES is still in use today: in ATMs (bank machines), email, and remote access systems, to name a few. The attack described there recovered a DES key in about 50 days on 12 powerful computers (roughly 2^43 DES evaluations). That is about 600 machine-days of work; spread across hundreds of thousands of Google servers, the same job would take minutes, not days. In other words, Google's cloud could chew through DES-protected data almost as fast as it arrives!
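Here is that scaling argument in miniature, using only the figures above; the assumption that the key search parallelizes cleanly across commodity servers (and that each server is at least as fast as the original machines) is mine.

```python
# Scaling the reported DES attack (50 days on 12 machines, ~2**43 DES
# evaluations) to a Google-sized cluster, assuming linear speed-up.
attack_machine_days = 12 * 50   # ~600 machine-days of work

for servers in (12, 450_000, 600_000):
    minutes = attack_machine_days / servers * 24 * 60
    print(f"{servers:>7,} machines: {minutes:,.1f} minutes")
# 12 machines: 72,000.0 minutes (the original 50 days)
# 450,000 machines: 1.9 minutes
# 600,000 machines: 1.4 minutes
```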

So we need to think twice before giving Google the opportunity to get involved in changing the laws that govern the use of the Internet.

Update: Reports in 2011, based on Google's energy usage, estimated that the company runs over 900,000 servers.


[1] http://www.datacenterknowledge.com/archives/2009/05/14/whos-got-the-most-web-servers/

[2] http://en.wikipedia.org/wiki/Cray_XT5

[3] http://en.wikipedia.org/wiki/Supercomputer

[4] http://news.cnet.com/8301-1001_3-10209580-92.html

[5] Pascal Junod, "On the Complexity of Matsui's Attack", http://eprint.iacr.org/2001/056.pdf
