The Internet is a Series of Tubes

Turns out the internet really is a series of pipes, er, tubes. Here is a picture from one of Google’s Data Centers.

Here is a view inside the Douglas County, Georgia data center. The colorful pipes send and receive water for cooling the facility. Bikes are the preferred method of transportation inside the massive center.

The pipes are for running cooling water. Here is another data center:

Inside the Council Bluffs, Iowa data center there is over 115,000 square feet of space. These servers allow services like YouTube and search to work efficiently.

More here.


The senator called it a "series of tubes" rather than pipes. I actually got into a heated argument with some friends over whether that was a dumb analogy or not. I said it was close enough for an explanation of traffic issues, and that he would have gotten a lot more support if he had said "pipes." They said no, he was a moron. They had never heard the term 'pipe' in its computer-related sense. They didn't like me mocking their ignorance after that.

What was much more absurd, but much less commented on, was what directly preceded the "series of tubes" line:

"Ten movies streaming across that, that Internet, and what happens to your own personal Internet? I just the other day got… an Internet was sent by my staff at 10 o'clock in the morning on Friday. I got it yesterday [Tuesday]. Why? Because it got tangled up with all these things going on the Internet commercially."

I second Sean: the analogy was fine. And of course, "pipe" was used as a metaphor for inter-process communication by the computer geniuses who created UNIX and C.
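For anyone who hasn't met the term: a UNIX pipe really is a little tube, a kernel buffer with a write end and a read end, the same primitive shells use for `cmd1 | cmd2`. A minimal sketch in Python (the string is just an illustration):

```python
import os

# Create a kernel pipe: read_fd is the read end, write_fd the write end.
read_fd, write_fd = os.pipe()

# Push bytes in one end...
os.write(write_fd, b"a series of tubes")
os.close(write_fd)  # closing the write end signals end-of-data

# ...and they come out the other.
data = os.read(read_fd, 1024).decode()
os.close(read_fd)
print(data)  # prints "a series of tubes"
```

In real use the two ends usually belong to different processes, with the kernel buffering the bytes in between, which is exactly the "data flowing through a tube" picture.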

As John's quote illustrates, the appropriateness of the analogy was rather accidental.

Well, I never disputed that he was a moron. I only said that it sounds like a pretty decent version of the explanation you need to give someone who is not familiar with technology. The 'Internet' that was mailed to him was obviously his own riff. But I'm sure he had some smart lobbyist carefully explaining the issues to him, only to be undone by an intern who sent something four days late and used a bullshit excuse.

Fine, but then it was not the "tubes" bit that was the problem! And I would bet many of the people who have jumped on the "tubes" metaphor have no more clue about Internet architecture than the Senator did.

I think the meme spread because "tube" is a funny word and the phrase was the catch line from the whole clip.

But also, perhaps, because it is sort of true. As a phrase it's sort of "yes but not quite" and in that "not quite" there's a lot of room for absurd speculations.

It reminds me of all the people who likely have zero clue about the physics or chemistry of how magnetism works, yet derided ICP's silly song with a line about magnets.

That's great, but I believe Sen. Stevens said it was a series of *tubes* (if that's what you're referring to).

Why is there a bike in the first pic?

For the oompa-loompas


Many datacenters, presumably including Google's, are huge, and it beats running everywhere.

Exactly. A friend of mine works at Google and she has toured one of their data centers. She said it's massive. Like mile-long massive.

Intentional or not, the bicycle helps you see the scale of the facility. And the commitment of Google to its color scheme.

Pix are from an article in this month's Wired magazine. Interesting comments on the evolution of data centers, from racks of desktop-type boxes in the '90s to racks with hundreds of bare, specialized motherboards plugged into very large backplanes. The energy consumption is huge, but a lot of engineering has gone into reducing electrical costs and managing heat
in very economical ways. Considering how desktop computers age out over 4-5 years, and hard drives have gone from under 0.5 TB 5-6 years ago to 3 TB now, one can imagine the investment made in these server farms, with complete rebuilds over 3-5 year time frames.

" one can imagine the investment made in these server farms with complete rebuilds over 3-5 yr time frames."

There won't be complete rebuilds. The racking, cooling, and electrical infrastructure are good for decades. They'll just swap out HDs and servers as needed. There's no reason a given rack slot can't go through 10 servers over the next 30 years.

Are the pipe colors randomly assigned or a code? I was surprised that one of the photos shows a fire suppression system that is pressurized water. I'd have thought they use Halon / Foam or something like that.

These places must be top terrorism targets. Crippling Google for even a few days would have an effect far more spectacular than hitting a refinery or power plant.

Marginal Revolution probably runs down the green pipe. I'm not sure if that's political affiliation though. All the Google pipes could be colour coded for political affiliation. Yellow might signify amber 'don't know yet', you know like for 1st year students?

I believe the colors are coded. The original photo has a caption saying that pink pumps water to an outside cooling tower. It's like the Centre Pompidou, I think.

The colors would indicate function, return water, supply from chiller, whatever.

As for halon, I wouldn't want to be anywhere in the room when it blew. You would need to be able to evacuate the room before it let loose, and with a room this size that would be impractical. I suppose the strategy is a water spray tied into a power shutdown. If it goes, you fire up the bulldozer and start again.

I think they want to avoid the bulldozers. The caption says:

"Google says it keeps pipes like these ready with highly-pressurized water in case of a fire. The water is cleaned and filtered, so if Google uses it it won't contaminate the facility."

Apparently they think they can salvage 'em after the dunk.

Having recovered a data center after a good dunking, as long as when the equipment dries out there isn't a mineral residue (hence the filtering), it'll generally come back up just fine. You can run into issues if the water hits while the power is still on, but even then most stuff will work fine.

Takes some time with large fans to dry everything out, but most data centers have decent blowers already and the ability to bring in dry out fans helps immensely.

I've seen racks of servers and switches with water pouring over them running fine until the water eventually shorted out the power connections. Most water routes follow the cables down rather than infiltrating into the tightly packed cabinets. Still, if you run your own racks, you might want to consider storing some clear plastic tarps to toss over your racks for temporary protection.

Google tends to have an "open" generic hardware server layout, so water might affect them a bit more than it would big iron or blade servers.


A few years back when Los Alamos was building another iteration of "world's fastest supercomputer," cooling the thing was an issue. Water is not abundant at 7,300 feet in New Mexico, but wastewater was an available resource that did the job.

Water is only used as a heat sink in such operations, so all you need is a big enough reservoir that can effectively radiate heat away. After you fill the reservoir all you need to add is enough to account for loss. So theoretically, if you used a sealed system, you'd never need any additional water.

However, it's generally cheaper to use an open loop system, ergo waste water, river water, etc. You take it in, dump heat into it and then send it back slightly downstream of where you got it from.
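The arithmetic behind the heat-sink point is simple: heat removed equals mass flow times water's specific heat times the temperature rise. A back-of-envelope sketch with made-up numbers (the 1 MW load and 10 °C rise are illustrative assumptions, not figures from Google or Los Alamos):

```python
# How much water flow does it take to carry away 1 MW of server heat?
# Q = m_dot * c * dT  =>  m_dot = Q / (c * dT)
SPECIFIC_HEAT_WATER = 4186.0   # J/(kg.K), specific heat of liquid water
heat_load_w = 1_000_000.0      # assumed IT heat load: 1 MW
delta_t_k = 10.0               # assumed supply-to-return temperature rise

flow_kg_per_s = heat_load_w / (SPECIFIC_HEAT_WATER * delta_t_k)
print(f"{flow_kg_per_s:.1f} kg/s of water (~{flow_kg_per_s * 60:.0f} L/min)")
# prints "23.9 kg/s of water (~1433 L/min)"
```

In a closed loop that same water circulates forever and only the reservoir's ability to shed heat matters; in an open loop you just keep taking in and returning that flow.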

Google idiots have no need to keep all that data in California when keeping it in cold, cold Northern Canada eliminates the cooling costs. Google is, after all, an internet company, and its workers can be at a distance from the data.

The first looks like an outtake from Gilliam's Brazil, the second looks like a design from Lang's Metropolis. Do we ever escape nostalgia?

Comments for this post are closed