*One Giant Leap*

The author is Charles Fishman, and the subtitle is *The Impossible Mission That Flew Us to the Moon*.  Here is one excerpt:

It [NASA’s Mission Control] was the first real-time computing facility IBM had ever installed.


…the Apollo flight computer was the first anywhere to have responsibility for human lives.

That computer had 73 kilobytes of memory and had 0.000002 percent of the computing capacity of an iPhone.  And don’t forget this:

At least while you were headed outbound, you’d have plenty of fuel to correct things.  Coming home from the Moon is a lot less forgiving.  The heat of reentry, the splashdown targeting into the ocean, and the g-forces piling up on the spaceship and the astronauts inside combine to create a very thin slice of air you need to slide your spaceship into.  The command module had just 1 degree of latitude on reentry.  Too shallow an angle, and your space capsule skips off the top of the atmosphere like a flat stone — out into space and a wide orbit around the Earth, from which there was no rescue.  Too steep a cut into the atmosphere, and the speed, heat, and g-forces would combine to incinerate your space capsule.  And unlike on the way out, on the way back there are no go-arounds.

Definitely recommended, gripping from start to finish.  Overall the best history of how the space revolution and the computer revolution were interconnected.
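Inverting the 0.000002 percent figure above gives a feel for the scale (a back-of-the-envelope sketch; which iPhone model and which benchmark underlie the book's comparison are not specified):

```python
# The excerpt says the Apollo computer had 0.000002 percent of an
# iPhone's computing capacity.  Inverting that ratio:
apollo_fraction = 0.000002 / 100        # convert percent to a fraction
iphone_multiple = 1 / apollo_fraction   # how many "Apollo computers" per iPhone
print(f"{iphone_multiple:,.0f}")        # roughly 50 million
```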


And they did that in 1969.

1866: Properly functioning telegraph cable laid across the Atlantic.
1969: Humans set foot on the moon.
2065: Telegraph cable laid across the Sea of Tranquility.

"It [NASA’s Mission Control] was the first real-time computing facility IBM had ever installed."

No! It was the SAGE system: the Semi-Automatic Ground Environment, a system of large computers and associated networking equipment that coordinated data from many radar sites and processed it to produce a single unified image of the airspace over a wide area.

Here is an assortment of books on the moon landing (including Fishman's book): https://www.nytimes.com/2019/06/14/books/review/moon-landing-anniversary.html The question presented in all of these books is whether it was worth the cost ($180 billion in today's dollars). As Cowen points out in his comment on Fishman's book, the greatest contribution was in computing. Here's a long interview of philosopher David Chalmers about artificial intelligence, in particular artificial general intelligence, that shows just how far we are expected to go with computing: https://www.nytimes.com/2019/06/18/opinion/david-chalmers-virtual-reality.html

I should have mentioned that Chalmers is predicting artificial general intelligence in as few as 40 years and as long as 100 years. Chalmers: "Artificial general intelligence, A.G.I., is a system capable, like us humans, of performing open-ended tasks independent of specific problems or contexts — conversation, common-sense reasoning, experiential learning and so on. The popular science fiction example is HAL 9000 from the film “2001: A Space Odyssey.”"

Let's hope the longer time frame is true for AI. The human race likely won't survive the invention of superintelligent beings.

Of course it was worth the cost. $180 billion in today's dollars from start to finish is peanuts. How much did waging the Vietnam War cost per year? Answer: about $1 trillion in today's dollars. Unsurprising the NYT would concentrate on anything negative, though.

Ya think? My city is preparing to tear down and rebuild its convention center (RIP convention center, 1992-2020) for $1,875,000,000. So we could've had like a hundred convention centers strewn around the country like Carnegie libraries - except empty most of the time - each with a lifespan of 28 years.

Or perhaps the Times thinks the cure for cancer or racism lay in that $180 million dollars.

*$180 billion

Apollo cost about $19.4 billion in actual dollars spent at the time they were spent. There were two *years* of the Vietnam War that cost more than that.

By that measure, at least, Apollo was a bargain. The economic impact across the U.S. and the world for decades was 100 times what we spent, if not more.

It was a $180 billion dollar (or $300 billion -- see Larry Kummer below) sporting match against an opponent who never quite admitted he was playing the game.

It was a very impressive feat, but more advances would have been made in computing if the money had been spent directly on making advances in computing. Same with the other technological improvements attributed to Apollo. The basic problem with the program was that the spillover effects were incidental: the money was actually being spent on putting people on the Moon.

NASA low-balled the cost of the manned space program. Including omitted costs, Project Gemini, and the lunar landers, the cost in today's dollars would be almost $300 billion. The same spending as a fraction of today's GDP would be roughly $700 billion.

These numbers are from this excellent analysis by Casey Dreier, using new data from NASA archives.


The publisher of Fishman's book says it describes "the extraordinary impact Apollo would have back on Earth, and on the way we live today." Total nonsense on both points. The money would probably have produced higher gains for America in other R&D, or public infrastructure.

For more about this missed opportunity, see "Why we have not gone into space. Why we must."


The money probably just wouldn't have been mobilized for other public purposes if Apollo didn't happen.

Oh, who knows, maybe we would have dammed the Grand Canyon:


The wonderful possibilities boggle the mind.

Raised the Titanic.

Reversed the flow of the Mississippi.

Much of the opposition to the space program has come from the right not the left. Historian Walter McDougall, a conservative, believed the space program was a step on “The Road to Serfdom.” “To train X thousands of engineers, to reach the moon by 19xx, to place X numbers of missiles in silos regardless of Soviet deployment, to plan for economic growth of X percent without unemployment or inflation, these were not the assignments of a free society but the dictates of a command economy.” He further argued that the Soviet Union made its way into space first because it was the world's first technocracy, which he defines as "the institutionalization of technological change for state purpose." McDougall's view is consistent with the libertarian view, which opposes state action, whether the space program or public transit.

And yet that other right, the populist, patriotic, flag-waving right, currently assailed rather unfairly I think, found the undertaking momentous, impressive, and awe-inspiring.

Some on the left don't believe we should go. Some on the right don't believe we *did* go.

"…the Apollo flight computer was the first anywhere to have responsibility for human lives."

That is one of the main technology spillovers from the space race. NASA literally reused one of the Apollo flight computers in its F-8 program (at what is now the Armstrong Flight Research Center) to develop fly-by-wire aircraft technology: https://www.nasa.gov/centers/armstrong/news/FactSheets/FS-024-DFRC.html

You don't think computers would have been invented anyway and used for flight control? Programmable computers had been around for 20 or more years before NASA, and the really enabling technology was the transistor, invented in 1947.

Before there were digital computers small and lightweight enough to go into a spacecraft or airplane, there were analog computers. The X-15 space plane used these for stability augmentation, for example.

Of course, digital computers enable the implementation of much more complex algorithms, as well as greater precision.

The point isn't so much that microelectronics would not have happened without Apollo (or the Cold War: those missiles needed guidance-and-control systems), but that by providing a market for this technology long before costs reached consumer-friendly levels the space race significantly accelerated development.

"but that by providing a market for this technology long before costs reached consumer-friendly levels the space race significantly accelerated development."

Robotics engineer Hans Moravec has graphed computer speed, in millions of instructions per second (MIPS) per $1,000 (1995 dollars), from 1900 on. Ray Kurzweil shows this exponential graph (on a log-linear scale) in his talks.

The trend from the early 20th century is smooth with no "greatly accelerated" part in the 1960s due to a space program or anything else.

See the second graph in this paper:

Transistors are basic components, first built in 1947. The first Airbus with a digital flight control computer flew in 1988; the first Boeing not until 1994. If transistors were the enabling technology, why did it take half a century to bring the civilian application of flight control computers to market? Middle-class people could buy cheap Ataris and Nintendos with lots of transistors before civilian airplanes had this technology. As cthulhu explains below, lots of brains worked on logic, reliability, and redundancy to produce a great solution.

Minuteman was doing more to help the IC market by 1965. Apollo was big, but it wasn't that big.

Phase 2 of the F-8 DFBW program was much more important than Phase 1, which was the Apollo Guidance Computer reuse. Phase 2, which built all-new computers, demonstrated the frame-synchronous multichannel voting scheme which is the heart of all redundant aircraft and spacecraft flight control computers today.
But the Apollo Guidance Computer was definitely amazing.

I should explain that redundancy among the flight computers, the sensors, and the control-surface actuation is the only way to get sufficient reliability for anybody other than a test pilot to trust their life to DFBW. That's why the F-8 DFBW Phase 2 was so important.
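That voting idea can be sketched as mid-value selection across redundant channels (a toy illustration, not the actual F-8 implementation; real systems vote synchronously every frame and also manage failed-channel shutdown):

```python
def mid_value_select(channels):
    """Pick the median command so a single failed channel is outvoted.

    With three redundant flight computers, one wild value cannot win:
    the median always comes from a healthy channel.
    """
    ordered = sorted(channels)
    return ordered[len(ordered) // 2]

# Each frame, three computers independently compute an elevator command;
# here the third channel has failed hard-over.
commands = [0.502, 0.499, 47.0]
print(mid_value_select(commands))  # 0.502: the failed channel is outvoted
```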

"That computer had 73 kilobytes of memory and had 0.000002 percent of the computing capacity of an iPhone."

So, what is everyone planning to do with their personal supercomputer today?

Not that putting unicorn hair on a live chat isn't a valid use...

"640K is more memory than anyone will ever need on a computer" -- apocryphally attributed to Bill Gates.

Nonetheless, it's conceivable that there may be diminishing returns to incremental improvements in computing capacity.

I was playing with OpenSCAD yesterday and actually bogged down my computer for a 5 minute render(*). Amazing, almost like the old days. But for the most part, when people buy "more powerful" computers they are buying "even more excess capacity."

* - single threaded, so it was "fast" while I waited.

As an undergrad, I had a summer job processing solar wind particle data from a lunar instrument package. The computer we used was the hand-me-down from the pre-Apollo Gemini program. The program loaded from paper tape, and the computer had 4 kilobytes of memory. About the size of three household refrigerators.

It's worth recognizing that nothing else in human history has changed at the rate we have experienced with computers.

Not trying to be pedantic, but we don't really experience the rate of change of increasing computer power directly; instead we experience its effects. In 1993, almost everyone was using land-line phones, and by 2003 most were also using cell phones. Fifteen years after that, 80% of people in the richer countries use a smartphone.

True enough. And as noted below, memory and bandwidth have increased at broadly similar rates, although neither would matter without the increased computer performance.

I compared the impact of this improvement to auto mileage for a class some years ago. IIRC, if auto mileage had improved similarly, you could drive across the country on a thimbleful of gas. Or today, probably feed the world from crops grown in your backyard.
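A rough recheck of that comparison (every number here is assumed for illustration: a 25 mpg baseline, a 3,000-mile coast-to-coast trip, and the roughly 50-million-fold capacity ratio implied by the 0.000002 percent figure quoted above):

```python
MILES = 3_000              # assumed coast-to-coast distance
BASE_MPG = 25              # assumed 1969-era fuel economy
IMPROVEMENT = 50_000_000   # ~1 / 0.000002%

gallons = MILES / (BASE_MPG * IMPROVEMENT)
microliters = gallons * 3.785_411_784 * 1_000_000  # gallons -> liters -> microliters
print(f"{microliters:.1f} microliters")  # ~9 uL, far less than a thimbleful (~2-3 mL)
```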

Everyone seems to reference Moore's Law here, yet during this time hard-disk capacity grew at least as fast as improvements in processors and semiconductor memory.

And network speeds.

Can you show where network speeds were increasing exponentially?

Remember the pro-tip, always google first.


Interesting, I always thought the space race was in the 50s and 60s, not from 1983.

lol, and do you know what I was thinking when I posted that link?

There is no truth so simple that someone won't argue with it on the internets. But in case you are really so foolish as to pursue this, Engineer started with:

"It's worth recognizing that nothing else in human history has changed at the rate we have experienced with computers."

I'm going to go with "human history runs up until today."

I just asked if you could show networking speeds were increasing exponentially during that time and you haven't. Maybe they did, but I'd like to see a graph of that.

No you didn't. You asked "Can you show where network speeds were increasing exponentially?"

Ever hear of "context"? What period do you think I was asking about on a post called *One Great Leap*?

Giant Leap... not to be confused with the Great Leap.

Far be it from me to rush to the defense of anonymous, but in this case it seemed like he made a fair response to your question, and that you were asking specifically about the '60s wasn't implied at all.

It's easy enough to find evidence of exponential growth of phone services in that era, which is probably the most analogous data you can get.

I'm not trying to pick a fight here. I don't think anyone really doubts network speeds were growing fast in that era.

Anonymous' comment was right below "...yet during this time hard-disk capacity grew at least as fast as improvements in processors and semiconductor memory."

See the time context?

Maybe network speeds were increasing exponentially then, so I asked if he could show it. His graph is for the space shuttle era.

If you have a graph of the exponentially increasing network speeds of the 50s and 60s, I'd like to see it.


There’s a kink in the 70s with the introduction of optical fiber, but it’s exponential throughout.


See table 2.2 for phone calls per day, which I think is going to dominate network bandwidth in that era.

I suggest trying Google. Do you actually think there's some chance this isn't true, or are you just being pedantic?

If you want to nitpick the various exponential growth trends, now is the time you want to focus on. Things were healthy back then.

It is surprisingly hard to find a good graph.


See figure 18 for international calls. Dialing back the snark: you might not think it, but telephone systems were being rapidly built out in the 50s and 60s. It was not a fait accompli at that point.

That's quite a bit for not showing some exponentially increasing network speed in the 50s and 60s. I don't think it happened then but I don't know.

That first graph is not "exponential throughout" but has jumps, and it isn't a graph of increasing network speed but of the bandwidth-distance product (BL). We need the "L" part taken out.

Define "network." If you're thinking strictly of computers, we were into the era of increasing bus width and faster clocking, and more generally sophisticated IO. The CDC-6600 came out in 1964. So we were definitely getting rapidly growing bandwidth out of each individual link, at least in high-end implementations.

If you are thinking of total bandwidth, which is what I was thinking of, phones likely dominate and were growing quickly.

The book's title (forget the lost subtitle) does help us think of contemporary space exploration and aeronautical entrepreneurs we might like to see take "one giant flying leap" somewhere . . . . I can think of at least three nominees, that is, for the first one-way trip to Mars.

The moon landing never happened.

Very true. But under President Captain Bolsonaro, it won't be long until Brazil achieves the feat before Red China and Heathen India.

Neither did "President Captain Bolsonaro." He's a figment of your imagination, TR.

He's actually Stanley Kubrick without a beard.

Don't say that in front of Buzz Aldrin: https://www.youtube.com/watch?v=CF_OeMkSAmg

That's no moon....it's a space station.

General Kenobi!

You are a bold one.

Okay, I can sort of see how pushing the whole "Apollo was a hoax" thing could make sense for some people now.

After all, it would take a massive amount of Russian collusion...

'the Apollo flight computer was the first anywhere to have responsibility for human lives'

Guess it depends on how one defines responsibility and computer in terms of fire control systems. This may not be a computer by some measures, but it is also 76 years old - 'One advantage the American destroyers had was the radar-controlled Mark 37 Gun Fire Control System which provided coordinated automatic firing solutions for their 5 in (127 mm) guns as long as the gun director was pointing at the target. A dual-purpose system, the Mark 37's gunfire radar and antiaircraft capabilities allowed the destroyers' guns to remain on target despite poor visibility and their own radical evasive maneuvering.' https://en.wikipedia.org/wiki/Battle_off_Samar

More information is also found at https://en.wikipedia.org/wiki/Ship_gun_fire-control_system#MK_37_GFCS

Those are analog computers and not really programmable, which I think is a distinction that makes a difference.

I should also note that analog computers in the form of stability augmentation systems have been used in aviation for several decades, arguably all the way back to the first Sperry autopilot in 1912. And during the '50s, analog stability augmentation systems became more-or-less fully flight critical components of military aircraft.

Apollo crews just flew around Earth for a few days in 1969. It was already a great achievement for that time. The rest is science fiction. At each Apollo anniversary, NASA launches a campaign announcing that it will "go back" to the Moon by some date in the future. It was the same in 1999 and 2009, and now in 2019. I can bet any of you that nothing will happen by 2025. Moreover, it is very insulting to the rest of the world to imagine that something Americans supposedly achieved in 1969 is still out of reach of the other nations of the world today. Honestly, if it were possible, we would have North Koreans going to the Moon as we speak. It would be even simpler for them, as they don't need to come back.

But, but, but, daguix, how can they go around it several times when it is clear that the Earth is flat? I mean, the idea that it is round is just science fiction. This is obvious. Somebody as smart as you should know that.

One of the readers above mentioned the fire control computers on US Navy destroyers. What came to mind when I read the article was the navigational system (Sperry?) on the USS Nautilus on its trans-Arctic underwater cruise. I suspect that it comes close to being a computer.

Magnetic-core memory was the predominant form of random-access computer memory for 20 years between about 1955 and 1975. Such memory is often just called core memory, or, informally, core.

Core memory uses toroids (rings) of a hard magnetic material (usually a semi-hard ferrite) as transformer cores, where each wire threaded through the core serves as a transformer winding. Three or four wires pass through each core.

Each core stores one bit of information. A core can be magnetized in either the clockwise or counter-clockwise direction. The value of the bit stored in a core is zero or one according to the direction of that core's magnetization. Electric current pulses in some of the wires through a core allow the direction of the magnetization in that core to be set in either direction, thus storing a one or a zero. Another wire through each core, the sense wire, is used to detect whether the core changed state.
Magnetic core made the mainframe ubiquitous. By 1975, cores were packing some 256 kilobits onto a large 10-by-10-inch board.
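The read cycle described above, where the sense wire only pulses when a core actually flips, so every read is destructive and must be followed by a write-back, can be modeled in a toy sketch (illustration only; real cores are selected by coincident half-currents on the X and Y wires):

```python
class CoreBit:
    """Toy model of one magnetic core storing a single bit."""

    def __init__(self, value=0):
        self.value = value

    def read(self):
        # Drive the core toward 0; the sense wire pulses only if the
        # magnetization actually flips, i.e. the core was holding a 1.
        sensed = (self.value == 1)
        self.value = 0              # reading is destructive
        return 1 if sensed else 0

    def write(self, bit):
        self.value = 1 if bit else 0

core = CoreBit(1)
bit = core.read()   # sense wire fires: the core held a 1, and is now 0
core.write(bit)     # the controller must restore the value after every read
```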

Some math/CS nerd probably already said this, but it doesn't require huge amounts of memory to write the basic equations needed to go to the moon, relative to providing such monetizable/taxable goods and services as dwarf porn, baby wipes, funny cat videos, outrage youtube uploads (because some acne-faced fast-food worker was "unprofessional") and of course ammo, without which your beloved right to keep and bear arms is pretty much just talk.

"...it doesn't require huge amounts of memory to write the basic equations needed to go to the moon"

I almost wrote that...
