The ghost in the machine

I visited two wonderful churches in Barcelona. The first, of course, was La Sagrada Familia. Ramez Naam put it best: this is “the kind of church that Elves from the 22nd Century would build.” I can’t add to that, however, so let me turn to the second church.

The Chapel Torre Girona at the Polytechnic University of Catalonia in Barcelona is home to the MareNostrum: not the world's fastest supercomputer, but certainly its most beautiful.

[Image: the MareNostrum supercomputer at the Barcelona Supercomputing Center]

Although off the usual tourist path, it’s possible to get a tour if you arrange one in advance. As you walk around the nave, the hum of the supercomputer mixes with Gregorian chants. What is this computer thinking, you wonder? Appropriately enough, the MareNostrum is thinking about the secrets of life and the universe.

In this picture, I managed to capture within the cooling apparatus a saintly apparition from a stained glass window.

The ghost in the machine.

[Image: a saintly apparition from a stained glass window, reflected in the cooling apparatus]

Hat tip: Atlas Obscura.

Comments

I just visited La Sagrada Familia. My immediate comment was "this puts the Vatican utterly to shame". True enough. Perhaps more relevantly, the interior of La Sagrada Familia is now open to the public. The church was consecrated as a Catholic basilica by the Pope back in 2010. Services and concerts are now held on a published schedule.

I cannot recommend La Sagrada Familia highly enough. Go on a tour and get the explanations of every detail. The museum under the church alone is well worth it. Visit the spires as well. Allow several hours.

It turns out that I visited La Sagrada Familia many decades ago. Back then it wasn't nearly as popular. So little money was available that completion was projected for after 2100. Now it has 300+ construction workers building it full time. Completion is currently projected for 2026 (the 100th anniversary of Gaudí's death). Lots of Spanish Civil War history in just one place.

Be sure to go online and get your tickets in advance. So much easier than waiting in line. Probably also helps to pick a sunny day to accentuate the impact of the stained glass windows.

Colin,

+1 on both points. La Sagrada Familia is now very popular and lines can be long.

42

38?

Raspberry Pi versus Cray X-MP supercomputer: http://vk5tu.livejournal.com/50955.html

that a $38 (2015) kids' computer actually does match an $8,000,000 (1988) supercomputer in Whetstone shows pretty clearly that there is no great stagnation
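For the curious, here is roughly where such single-number comparisons come from. A minimal sketch in C, assuming nothing beyond the standard library: it times a loop of floating-point work and reports MFLOPS. This is not the actual Whetstone suite (which mixes floating-point, integer, branch, and transcendental kernels); it only illustrates the style of measurement behind the linked comparison.

    /* Toy MFLOPS measurement. NOT the real Whetstone benchmark, just an
     * illustration of how a single throughput number gets produced. */
    #include <stdio.h>
    #include <time.h>

    int main(void) {
        const long n = 100000000L;          /* 100M iterations */
        double x = 0.5;
        clock_t start = clock();
        for (long i = 0; i < n; i++)
            x = x * 1.0000001 + 0.0000001;  /* one multiply + one add = 2 flops */
        double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
        /* print x so the compiler cannot optimize the loop away */
        printf("x = %f, %.1f MFLOPS\n", x, (2.0 * n / secs) / 1e6);
        return 0;
    }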

The nanosecond you drive it off the lot, a new supercomputer loses ten percent of its value.

John,

"that a $38 (2015) kids’ computer actually does match a $8,000,000 (1988) supercomputer in Whetstone shows pretty clearly that there is no great stagnation"

Actually, this shows that there WAS no great stagnation. Back in the real world, CPU clock rates peaked about 10 years ago and have plateaued since. There is more to the story. However, the "Golden Age" of Moore's Law ended some time ago.

No, not quite. Line widths continue to decline, so transistors continue to get smaller and faster. Defect density continues to decline, so chips can get bigger with more circuitry on them -- note that Moore's Law concerned transistors per chip, and a large part of the increase came from making bigger chips, which is constrained by the defect density on the silicon wafer.

We are now stressed by three factors that were less important a decade or two ago. The speed of off-chip communications has not kept pace with on-chip bandwidth, so the penalty for accessing off-chip DRAM grows ever higher. Larger on-chip caches mitigate this, but not completely. Power density constrains our ability to run a cutting-edge chip at full speed, and it imposes problems for densely packed datacenters -- one rack can consume 50 kW, which is a huge amount of power. And as the transistors get smaller and faster, the wires get shorter but their cross-sectional area shrinks too, so they don't scale at the same rate -- the transistors are getting faster while the wires are not.
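The off-chip penalty is easy to demonstrate. A minimal sketch in C, with illustrative buffer sizes as assumptions: it chases pointers around a random cycle, first in a buffer that fits in cache, then in one far larger than any cache, and reports nanoseconds per hop. On typical hardware the large walk is an order of magnitude slower per access, because every hop waits on DRAM.

    /* Pointer-chasing latency sketch: cache-resident vs. DRAM-resident walks.
     * Buffer sizes are illustrative assumptions; tune them to your machine. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Walk a random single cycle of n slots; return nanoseconds per hop. */
    static double ns_per_hop(size_t n, long hops) {
        size_t *next = malloc(n * sizeof *next);
        if (!next) { perror("malloc"); exit(1); }
        for (size_t i = 0; i < n; i++)
            next[i] = i;
        /* Sattolo's algorithm builds one random cycle, so the walk touches
         * every slot and the hardware prefetcher cannot guess the next hop.
         * Two rand() calls are combined because RAND_MAX may be small. */
        for (size_t i = n - 1; i > 0; i--) {
            size_t j = (((size_t)rand() << 16) ^ (size_t)rand()) % i;
            size_t tmp = next[i]; next[i] = next[j]; next[j] = tmp;
        }
        size_t idx = 0;
        clock_t start = clock();
        for (long h = 0; h < hops; h++)
            idx = next[idx];                 /* each hop depends on the last */
        double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
        free(next);
        /* fold idx into the result so the loop cannot be optimized away */
        return secs * 1e9 / hops + (double)(idx & 1) * 1e-12;
    }

    int main(void) {
        const long hops = 50000000L;         /* 50M dependent loads per run */
        printf("~32 KB buffer (cache): %.1f ns/hop\n",
               ns_per_hop(4096, hops));
        printf("~512 MB buffer (DRAM): %.1f ns/hop\n",
               ns_per_hop(64 * 1024 * 1024, hops));
        return 0;
    }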

MT,

More transistors used to translate into greater throughput per cycle, and ever-higher clock speeds were a multiplier. Now clock speeds are roughly stable, and the extra transistors are used to put more cores on one chip. Unfortunately, exploiting more cores to gain additional throughput is much harder than simply running on a faster chip. The exceptions are applications that are inherently multi-threaded (web servers, SQL databases, etc.).
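Amdahl's law makes the difficulty quantitative: if only a fraction p of a program's work can run in parallel, n cores give a speedup of 1 / ((1 - p) + p / n). A minimal sketch in C, with illustrative figures:

    /* Amdahl's law: why more cores is a weaker lever than a faster clock. */
    #include <stdio.h>

    static double amdahl(double p, int n) {
        return 1.0 / ((1.0 - p) + p / n);
    }

    int main(void) {
        /* Even a program that is 95% parallel tops out below 20x no matter
         * how many cores you add; a faster clock speeds up everything. */
        printf("p = 0.95, n = 8:    %.2fx\n", amdahl(0.95, 8));     /* ~5.9x  */
        printf("p = 0.95, n = 1000: %.2fx\n", amdahl(0.95, 1000)); /* ~19.6x */
        return 0;
    }

This is why the inherently multi-threaded applications mentioned above are the exception: for them p is close to 1, so the extra cores actually pay off.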

Note that Gordon Moore himself recognizes this:

"I see Moore’s law dying here in the next decade or so."

Put differently, we got so good at chip density and petaflops that we just don't care. It's a new game: battery life, basically, upon which Moore is silent.

Gorgeous view of the MareNostrum. If anyone needs a definition of the word "aesthetics", that picture will suffice.

Another chapel-turned-computer-center can be found at Rensselaer Polytechnic Institute.

ok, but what about a computer-center-turned-chapel?

It reminds me of those old Star Trek episodes (in fact, every other episode) in which the Enterprise crew learns that the natives worship an ancient supercomputer in a temple or cave and have all their basic material needs taken care of by said computer. Then Captain Kirk convinces the computer to self-destruct so the population can go from complete dependence to total independence with no warning or preparation, because starvation and chaos build character.

And that's why the Prime Directive was really just sarcastic irony.

Hey man, I'll have you know that in "The Apple" Kirk does not convince the ancient supercomputer to blow itself up; the Enterprise just blows it up the old-fashioned way, with phasers.

And in "For the World Is Hollow and I Have Touched the Sky" they disable the supercomputer god-figure using its instruction manual.

Points taken. Nevertheless, he convinced Landru, the ruler of Beta III, to self-destruct ("The Return of the Archons").
He also convinced Nomad, the intelligent probe, and M-5, the intelligent supercomputer, to destroy themselves; he disabled the androids from Mudd's planet, the android from Holberg 917-G, and the Controller from Sigma Draconis VI. He destroyed the androids from Exo III, the Kalandan colony computer, and the computers which chose the casualties in the Vendikar-Eminiar war (risking the beginning of total war and the annihilation of both civilizations).
The lesson here is clear: if you depend on a computer to manage your civilization, prevent the destruction of your species, carry on the mission of your long-gone civilization, pilot your starship, conduct scientific research, or keep an immortal Earthman company, you should not let James T. Kirk anywhere near it. If you fail to observe this principle, you have only yourself (and Starfleet's Human Resources Department) to blame.

Too true, too true. Don't let Kirk anywhere near your superintelligent computer or your girl.

Also, Kirk should destroy your computer for bringing up "Spock's Brain".

"Kirk should destroy your computer for bringing up 'Spock’s Brain'”.
A deserved fate, I agree, but it would be more useful if he traveled to past (or is it the future?)-in Star Trek, time-travelling is easier than boarding a plane, there are dozen of ways of doing it and apparently they rely on the honor system to assure no one will try to change history), to convince the typewriters of people writing "Spock's Brain" to self-destroy. I would pay to watch it.
In fact, when he was not killing computers or gods (Apollo, "Who Mourns for Adonais?"; the god of Sha-Ka-Ree, Star Trek V), he was killing god-computers. When he was not messing around women or robots, he was messing around robot-women. He also sabotaged the Kobayashi Maru program (in two different universes!). When all you have is a hammer, you keep hammering people until they stop yelling and moving.

'What is this computer thinking, you wonder?'

Computers don't think, Mr. Cowen.

…err, Mr. Tabarrok.

I actually live two blocks away from the magnificent Torre Girona chapel.

Let me know if you need restaurant advice for Barcelona.
