
Google is big. Really BIG

technologyreview.com

When vice president of engineering Urs Hoelzle gave a talk about Google's Linux clusters at the University of Washington in November 2002, he quoted a figure of 1,000 queries per second, but noted that the measurement had been taken at 2:00 a.m. on December 25, 2001.

His point, obvious to everybody in the room, was that even by November 2002 Google was handling far more than 1,000 queries per second; just how many more, though, was anybody's guess.

The facts may be seeping out. Last Thanksgiving, the New York Times reported that Google had crossed the 100,000-server mark. If true, that means Google is operating perhaps the largest grid of computers on the planet.

"The simple fact that they can build and operate data centers of that size is astounding," says Peter Christy, co-founder of the NetsEdge Research Group, a market research and strategy firm in Silicon Valley. Christy, who has worked in the industry for more than 30 years, is astounded by the scale of Google's systems and the company's competence in operating them.

This ability to build and operate incredibly dense clusters is, as much as anything else, the secret of Google's success. And the reason, explains Marissa Mayer, the company's director of consumer Web products, has to do with the way Google started at Stanford.

Instead of getting a few fast computers and running them to the max, Mayer explained at a recruiting event at MIT, founders Sergey Brin and Larry Page had to make do with hand-me-downs from Stanford's computer science department. They would go to the loading dock to see who was getting new computers, then ask if they could have the old, obsolete machines that the new ones were replacing. Thus, from the very beginning, Brin and Page were forced to develop distributed algorithms that ran on a network of not-very-reliable machines.

Today this philosophy is built into the company's DNA. Google buys the cheapest computers that it can find and crams them in racks and racks in its six (or more) data centers. "PCs are reasonably reliable, but if you have a thousand of them, one is going to fail every day," said Hoelzle. "So if you can just buy 10 percent extra, it is still cheaper than buying a more reliable machine."
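A rough sketch of the arithmetic behind that trade-off, using made-up prices and failure rates purely for illustration (none of these numbers come from Google):

```python
# Illustrative back-of-the-envelope comparison: a large fleet of cheap
# commodity PCs with spare capacity versus fewer, pricier "reliable"
# machines of equal total capacity. All figures are assumptions.

CHEAP_PRICE = 1_000         # assumed cost per commodity PC (USD)
RELIABLE_PRICE = 5_000      # assumed cost per higher-reliability server (USD)
FLEET_SIZE = 1_000          # machines needed to serve the workload
DAILY_FAILURE_RATE = 0.001  # roughly 1 in 1,000 machines fails on a given day

# Hoelzle's observation: with a thousand PCs, expect about one failure a day.
expected_daily_failures = FLEET_SIZE * DAILY_FAILURE_RATE
print(f"Expected failures per day: {expected_daily_failures:.1f}")

# Overprovision the cheap fleet by 10 percent to absorb those failures.
cheap_fleet_cost = int(FLEET_SIZE * 1.10) * CHEAP_PRICE

# The "reliable" alternative: same capacity, no spare margin assumed.
reliable_fleet_cost = FLEET_SIZE * RELIABLE_PRICE

print(f"Cheap fleet + 10% spares: ${cheap_fleet_cost:,}")
print(f"Reliable fleet:           ${reliable_fleet_cost:,}")
```

With these assumed numbers the overprovisioned cheap fleet costs about $1.1 million against $5 million for the reliable one, which is the point of Hoelzle's remark: as long as the software tolerates individual failures, buying extra cheap machines wins.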

