Science is getting less bang for its buck

By Patrick Collison and Michael Nielsen in The Atlantic:

…we ran a survey asking scientists to compare Nobel prizewinning discoveries in their fields. We then used those rankings to determine how scientists think the quality of Nobel prizewinning discoveries has changed over the decades…

Our graph stops at the end of the 1980s. The reason is that, in recent years, the Nobel Committee has preferred to award prizes for work done in the 1980s and 1970s. In fact, just three discoveries made since 1990 have yet been awarded Nobel Prizes. This is too few to get a good quality estimate for the 1990s, and so we didn’t survey those prizes.

However, the paucity of prizes since 1990 is itself suggestive. The 1990s and 2000s have the dubious distinction of being the decades over which the Nobel Committee has most strongly preferred to skip back and award prizes for earlier work. Given that the 1980s and 1970s themselves don’t look so good, that’s bad news for physics…

Why has science gotten so much more expensive, without producing commensurate gains in our understanding?

There is much more evidence and argument at the link. This appendix provides more detail on their empirical work. Self-recommending, if there ever was such a thing.

Comments

This seems like another edition of "for some reason people are surprised that growth doesn't continue at the same rate forever".

Diminishing returns are the norm and do not need an explanation.

That would be a shocking departure from how innovation is normally modeled. Most models assume exponential growth, i.e., that one advance opens up two more via knowledge sharing and the like.

I'll add that your curve would have pretty dire consequences for humanity in general. Pretty much every dystopian vision boils down to slowdown in innovation --> shortages in goods --> social chaos.

Early in the curve, innovation begets innovation, and there is rapid progress, but eventually progress slows down as low-hanging fruit of potential improvements/discoveries/etc is consumed, and there is less and less potential benefit of further innovation.

This is absolutely true within bounded fields, with numerous examples. Cars improved a lot less from 1988-2018 than they did from 1900-1930, for example. How much have printed books improved in the last 100 years?

Physics is a bounded field like that. There is almost certainly nothing left to be discovered that is even 1% as important as Newtonian principles.

In the bigger picture, we haven't seen "innovation" slow down because new fields keep opening up. Still, I do expect that eventually overall progress will be subject to diminishing returns as well.
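
To make the disagreement concrete, here is a minimal sketch contrasting the two views: pure exponential growth, where each advance enables more advances, versus a logistic "low-hanging fruit" curve, where progress saturates as the easy discoveries are used up. All parameters below are invented for illustration, not estimates of anything.

```python
# Toy comparison of two models of cumulative discoveries.
# All parameters are illustrative assumptions, not estimates.

def exponential_discoveries(years, rate=0.03, initial=100.0):
    """Each advance opens up more advances: constant proportional growth."""
    stock, series = initial, []
    for _ in range(years):
        stock *= 1 + rate
        series.append(stock)
    return series

def logistic_discoveries(years, rate=0.06, initial=100.0, ceiling=10_000.0):
    """'Low-hanging fruit' view: growth slows as the pool of easy finds is depleted."""
    stock, series = initial, []
    for _ in range(years):
        stock += rate * stock * (1 - stock / ceiling)
        series.append(stock)
    return series

exp = exponential_discoveries(200)
log = logistic_discoveries(200)
for year in (0, 50, 100, 150, 199):
    print(f"year {year:3d}: exponential {exp[year]:12.0f}   logistic {log[year]:10.0f}")
```

The two curves are nearly indistinguishable early on; they only diverge once the logistic ceiling starts to bind, which is essentially the disagreement in this thread.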

I think you've oversimplified the dystopian thing. There are many storylines that begin with technological overshoot as the source of the shortages, or as what enables or precipitates them. Blade Runner. 1984. Brave New World. Just about any post-war or post-plague novel.

I guess you might argue that Mad Max is about a failure to come up with an alternative to petrol. But it is at the same time about an over-reliance on petrol creating the conditions for itself. The innovation effort crawls further up its own bunghole, rather than identifying the need to change the paradigm completely.

In any case, absent a major reversal in what we think we know about earth-bound physics, Nobels are going to be pretty boring until we start colonizing space without needing to carry earth-replication with us, do time travel, end aging, or find a way to preserve consciousness outside the physical body.

Actually I have a paper on estimating average growth rates over the very long run that shows that, correcting for the effects of WW2, growth in GDP per hour worked peaked in the mid-20th century. That was when most of the low-hanging fruit was being harvested.

However, I think that growth will begin to accelerate again soon, as China and India finish most of their industrialization process and their 2.8 billion people start producing ideas at the rate the 900 million people in the developed world produce them today.

Because the low-hanging fruit has already been collected?

That is exactly what is being suggested. Tyler has made this argument but not all agree. It may also be true that the low-hanging fruit of the "known unknowns" has been picked. But perhaps there are major discovery areas we are still ignorant of.

We have a lot more scientific tech around today than even 10 years ago, but not many Prizes for engineering a slightly smaller or better microchip or medicine!

Engineering is perpetually undervalued.

Well, Dalén's Nobel Prize for physics was basically for engineering:

https://en.wikipedia.org/wiki/Gustaf_Dal%C3%A9n

Mind you, he was the Daredevil of engineering.

I suspect that there's more focus now on improvements and enhancements than on discoveries. There's a large body of opinion that progress derives more from incremental improvement than from discovery.

William Thomson, Lord Kelvin (1900): "There is nothing new to be discovered in physics now. All that remains is more and more precise measurement."

I felt a whipsaw in the middle of this article. It moved from "fully explored" physics, to "wide open" computer science and biology, and then back to "science" slowing down again.

That's the answer isn't it?

Stop complaining that highly understood areas are understood and focus on high growth fields.

(We are a rich society and you can keep spending money on expensive physics if you want, but it looks like it will drag down "the total ROI of 'science'.")

Note that there are also lots of "one young guy" successes in computer science. A guy named Guido wrote Python.

A guy named Guido wrote Python *in the late '80s*.

A guy named Graydon designed Rust.

America has become decadent.

Eh, how do you quantify that? After all, people were saying that in the 1970s and the 1980s, and no doubt before.

But now it is true. As Brazil's future Foreign Affairs Minister pointed out, America has proved unable to stop Red China's rise and prevent the Maoization of the world.

But what has that got to do with “decadence”? You haven’t drawn any connection between the growth and evolution of nations and some sort of moral positioning. If anything, it has far more to do with failures to advance STEM and support basic research than “decadence”, unless you are using the term in some very unusual manner.

The point is, there was a time when Americans used to build things. Now, they just sell apps and derivatives to one another. It is not enough to stand up to Red China. As President Bolsonaro pointed out, God has anointed him to lead Brazil against Red China.

What is Brazil's contribution to science and technology? Less than your hated Red China unfortunately.

It must also be said that, if you go back as recently as 1900, much scientific knowledge looks either rudimentary or plain wrong, particularly in physics, which was also acting as the limit on knowledge in chemistry. And science often looked like people being able to prove that they were wrong, but not sure what could be right. 1850 isn't recent, but scientific practice then looks to me more like 1750 than 1930, and as for knowledge, try the 1600s. The method correctly identifies a great kicking down of the door, but a sample size of 93 judging 10 events per decade is small! Very small!

Are Tyler's recommendations getting less valuable over time?

1) As science becomes more complex (think particle colliders), research requires larger and larger teams. Until the Nobel Committee starts thinking in terms of teams, the awards will stay stuck in the 70s.

2) Science has always been preceded by engineering advances. R&D takes an ever-larger share of corporate spending, so I assume engineering cost growth has outpaced inflation by a meaningful spread.

In 100 years this concern will look silly.

More like 10 to 15 years.

My benchmarks are 1. repair damaged neural tissue, 2. find the "off" switch for metastatic cancer, and 3. nuclear fusion. I've been told all my life we're 20 - 30 years away from any one of these. I'm old enough that the deadline has been reached and I'm still being told 20 - 30 more years. Oh well.

1. Stem cells have already significantly helped stroke patients in a 2016 Stanford trial. Stanford, the University of Arizona, and the University of Wisconsin plan on treating heart failure to the extent that patients will no longer need medication in 2020, 2020, and 2025, respectively.

2. I don't know about an off switch, but everything points to far more effective treatments in 2028 compared to today.

3. Nuclear fusion now looks around 10 years away, maybe 15.

4. Isn't the coming explosion of virtual reality a thing, considering it didn't exist commercially in 2013?

5. Excellent machine translation is here that didn't exist 10 years ago.

VR will be the next heroin. I'm not terribly excited about it.

Metastasis remains a death sentence. Something deeper beyond our ability to cut, irradiate or poison rogue cells is going on.

Machine translation will become obsolete as the globe adopts English as the lingua franca.

Stem cells, I certainly hope for the best. I would love to see paralytics walking again and no more ALS.

Fusion will pretty much eliminate what remains of actual Scarcity, and the economists will have to get real jobs. I'm looking forward to my cheap, subsidized existence but again, controlled fusion seems to be a perennial moving target.

It's too early to know what all the uses of VR will be in the 2020s.

It will take a while before everyone knows English.

The goal is to detect cancer before it spreads and get rid of it. That is likely in the 2020s.

The advances in nuclear fusion have been great enough that it is no longer 20 years away.

Everybody alive is in a pre-cancerous stage. That's the price of having a body that repairs itself. At some point, a switch gets flipped in the genetics of cells and they go rogue. I don't know that the research is going down that road. Or maybe it is.

Otherwise, it reads like you agree with me more than you disagree.

@Todd K. What's the evidence for 3? I mean, what progress has been reported on controlled nuclear fusion?

Just looking up a few articles from the past couple of years, I don't think it can be predicted to a 5-year window, but probably a 10-year one: 2025 to 2035.
In 1998, I was arguing against the alarmist view of climate change and said:
1) Solar in wide use by 2030, based on the exponentially decreasing solar cost curve, then added "plus or minus five years."
2) Nuclear fusion by 2030, but not sure how much of the energy mix.
3) It will be feasible to take CO2 out of the atmosphere in the 2020s and easier/less expensive in the 2030s if such an action is needed.

All three still look like they will be correct twenty years later.
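
For what it's worth, the sort of back-of-the-envelope projection behind a claim like (1) can be sketched as below. The starting cost, decline rate, and "wide use" threshold are assumptions chosen purely for illustration, not the figures used in 1998.

```python
# Toy projection of an exponentially declining cost curve.
# start_cost, annual_decline, and target are illustrative assumptions.

def projected_cost(start_cost, annual_decline, years):
    """Cost after a given number of years at a fixed annual percentage decline."""
    return start_cost * (1 - annual_decline) ** years

start_cost = 10.0      # $/W in the base year (assumed)
annual_decline = 0.08  # 8% cost decline per year (assumed)
target = 1.0           # $/W threshold taken to mean "wide use" (assumed)

for year in range(0, 41, 5):
    cost = projected_cost(start_cost, annual_decline, year)
    note = "  <- below threshold" if cost <= target else ""
    print(f"year {year:2d}: ${cost:5.2f}/W{note}")
```

Under these assumed numbers the threshold is crossed around year 28-30, and a percentage point or two of error in the decline rate moves the crossing date by several years, which is why "plus or minus five years" is the natural hedge.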

Probably optimistic but interesting. On 3) I am skeptical, but mostly because I don't know much. Without 3), let us note that 1) and 2) will not be sufficient to curb CO2 emissions as is (arguably) needed, even with 100% clean electricity produced by a mix of fusion, solar, and some hydro.

That's because only about one third (and less in less-developed countries) of our energy use is electricity. We would also need to raise that proportion. This means electric cars, electric trucks, electric planes (?), electric ships (?), and also electric heating for buildings, etc. There will also be problems storing the electricity produced by solar especially, so we need better batteries. Much more technological progress is required than "just" mastering fusion and making solar super-cheap.

Right. I did add in 1998 that better storage will be needed, which seemed very likely by 2030. It isn't obvious that CO2 sequestering will be needed considering the temperature increase range, but it's possible.

I also said that from 1999 to 2009, the temperature increase would be outside the error range but very small, and that the public would lose interest in global warming. It turned out that there was no increase in the temperature over those ten years and that public interest did wane, although much of that was because of the exciting Great Recession. Climategate came at the very end of that ten-year period.

I'm answering this too late (probably), but have a look at the progress made at MIT on fusion. The determining factor is that superconductor electromagnets have dropped in price (and size; the two are related). Interestingly, the reason for the drop is capitalism, not government intervention...

Wow, let's use a totally unscientific, subjective survey method to measure the temporal change in the quality of scientific innovation! Really dumb.

Ask a small nonrandom sample of "scientists" their opinion about the quality trend in the highly subjective awarding of Nobel Prizes -- then magically distill those answers and generalize them to ALL scientists & science itself. Certainly "Social Science" quality is in a steep nose dive.

I suspect that even "a scientist" will have trouble ignoring the cumulative rewards of older science, as well as the risk in the new and fresh.

It will always be a safer bet to pick a discovery for which applications exist.

Uh-oh, you just made an argument for publicly-funded research and development.

Of course ;-), I'm one of those people who is aware he is using the Internet.

A deciding how to spend B's money on C is never efficient.

There are numerous and very idealistic billionaires with more money than they or their families can ever spend. For the most part, they park it in government bonds and equities. When they do spend the money, it seems to be on abortive education ventures or protein powder for countries with 4 - 6+ TFR.

Maybe they've looked at the science and don't think there's anything else out there. I don't know. I don't know why billionaires aren't setting up their own de facto countries and populating them with da Vincis. The billionaire class seems strangely pedestrian. But I guess a single-minded, mundane focus is how you get to be a billionaire.

Seems to me most of the tech billionaires are indeed trying to either protect themselves from the desperate impoverished masses, or trying to protect their own children from the addictive mind-control gadgets they created and got rich peddling.

The article is simply confirming TC's biases. I also detected this measurement error: "However, the paucity of prizes since 1990 is itself suggestive. The 1990s and 2000s have the dubious distinction of being the decades over which the Nobel Committee has most strongly preferred to skip back and award prizes for earlier work. Given that the 1980s and 1970s themselves don't look so good, that's bad news for physics…" So two things: the 70s and 80s were "not that good" and the 90s and 00s were "even worse" in this subjective survey. Hence, if you believe the article, real Nobels were given only for stuff before the 1970s? If so, they are valuing nuclear inventions too heavily over stuff like DNA replication and gene splicing.

Because the great corporate-funded research centers (Bell Labs, PARC, etc.) have ceased to exist or are shells of their former selves.

Those guys did great work, but we live in a different world, with much more diverse and distributed development:

"Solomon Hykes started Docker in France as an internal project within dotCloud, a platform-as-a-service company,[8] with initial contributions by other dotCloud engineers including Andrea Luzzardi and Francois-Xavier Bourlet"

One guy and a small team. The software debuted to the public at PyCon in Santa Clara in 2013, and it had been downloaded more than 13 billion times as of 2017.

There is no great stagnation and there never was. 0 to 13 billion in 4 years.

The software has been downloaded 1.5 times for every single person on earth? THAT's the innovation that deserves the Nobel Prize: convincing people to buy something over and over.

1. It's free.

2. It's likely pulled automatically whenever new "instances" of software services are provisioned.

Still, even with automatic provisioning, it is one guy really changing the way things are done in the world.

13B downloads? I'd bet 90% of the planet has no idea what Docker is.

In a sentence, what does the software do?

I've used it briefly, but I'm hardly an expert. I would say that as the number of servers in the cloud hit the millions, everyone badly needed a way to put jobs on them easily and scalably (from 1 to 10 to 10K copies of your software on that many servers). I think Docker was one of the first pivotal, effective ways to do that.

The first "container software?"

In other words, an arcane application used by an extremely narrow sector to build, what, smartphone apps? Word processors and spreadsheets?

The server side for smartphone apps probably is the biggest segment. Narrow but huge.

More apps on my phone. There is no Great Stagnation.

While that is an acceptably curmudgeonly thing to say, it is also true that most of the Earth's billions interact with technology through their cell phone.

Nobel prize winners:

A Bell>S Jobs>this app guy

Are any of you aware of this paper?
https://web.stanford.edu/~chadj/IdeaPF.pdf

Someone has a working sense of irony, and I'd rather make an attribution to Tyler and MR than to any scribe or editor in the employ of The Atlantic.

The irony for yours truly consists in the stated concern with "the temporal velocity of scientific discovery", since elsewhere I hew to the definition of "science" as "any intellectual discipline dedicated to proving that, while its processes can be described with provisional accuracy, Nature cannot be trusted to be natural in any preferred temporal regard"--which is the attitude modern science inherited directly from medieval alchemy in its quest to "speed Nature and its processes along".

If anyone wants a hint about just how poor the Nobel-dispensing Scandinavians are at assessing talent, just scan the list of all the Nobel laureates in Literature since the Nobel philanthropy was launched.

Galileo>Newton>Einstein>whatsisname

Handy>Dixon>Zeppelin>Coldplay

Galileo better than Newton? Surely you jest.

Newton and Einstein are orthogonal.

Instead of using the greater-than symbol >, perhaps I should have used a leads-to symbol =>

I was using > in the spirit of a Grateful Dead setlist, where it denotes a song that transitions into the next without interruption.

Speaking of symbols, let's focus on the point, rather than the specific names.

Hopefully we can preempt an argument about whether there's an interim step between Zeppelin and Coldplay, or how innovative rap and trance music was.

If the rate of science is slowing, wouldn't the AVERAGE age of discoveries honoured by Nobel Prizes increase?

Stuff the survey results - is there any trend here?
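
One way to check is sketched below. The (prize year, discovery year) pairs here are placeholders, not the actual Nobel record; the point is just the computation you would run on the real archive: average discovery-to-prize lag by decade of award.

```python
from collections import defaultdict

# Placeholder (prize_year, discovery_year) pairs -- NOT real Nobel data,
# just the shape of what you would pull from the Nobel archives.
awards = [
    (1975, 1960), (1978, 1965), (1982, 1968), (1985, 1972),
    (1991, 1973), (1995, 1978), (2001, 1980), (2006, 1985),
    (2011, 1988), (2017, 1994),
]

lags_by_decade = defaultdict(list)
for prize_year, discovery_year in awards:
    decade = (prize_year // 10) * 10
    lags_by_decade[decade].append(prize_year - discovery_year)

for decade in sorted(lags_by_decade):
    lags = lags_by_decade[decade]
    print(f"{decade}s: mean lag {sum(lags) / len(lags):.1f} years (n={len(lags)})")
```

A rising mean lag would be consistent with the article's observation that the Committee keeps reaching back to work from the 1970s and 1980s.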

At least I am very excited about recent breakthroughs in genome-wide association studies and polygenic scores (which don't seem to be mentioned in the article). It's nothing less than a scientific revolution, and we are in the middle of it!

Work done in the 70s and 80s may have been so fundamental that it provided the foundation for the work of the 90s. That might be a probable reason to award the older guys. Moreover, even though the Nobel Prize may be the biggest prize for research in the sciences, we should probably look into what other great prizes have focused on.

Uh, this seems completely off base with regard to biomedical science.

Simple points:

The 2018 Nobel Prize - the earliest cited work is from the 1990s. Some from 2005 (!) (The Nobel Prize site lists the citations underlying the prize). Immunotherapy for cancer is obviously a major innovative medical advance.

The 2012 Nobel Prize was clearly based on the ability to generate iPSCs (induced pluripotent stem cells). This ability was made possible in the 2000s. iPSCs are an enormous fundamental advance.

Other points:
1. There is obviously no trend with biomedical science (Physiology or Medicine) and Chemistry - look at the graphs. If anything, there is a slight upward trend, especially with Chemistry.

Here are two obvious Nobel Prizes and a reasonable, narrow "why" for each:
1. Next-generation sequencing. (NGS has already allowed us to understand the molecular deficits underlying hundreds of diseases for which there was no known cause - this was not possible before NGS. Look at Mendelian genetic disease to see why.)
2. Development of methods for ancient DNA extraction. (This great technical advance allows us to understand the recent migration and evolution of humans, work that is ongoing and has already led to surprises and insights.)

It's stunningly obvious that there have been incredible advances in computer science that just don't fit in a Nobel Prize category - but easily would be thought of as key advances.

As a former academic biomedical scientist (professor) and current biotech scientist, I really don't get why there is such a narrow obsession with CRISPR. Just seems weird and off to many of us.

And for those of you looking for new stuff that impacts health, here is a very recent (open access) publication:
https://www.nejm.org/doi/full/10.1056/NEJMoa1810858

Uh, maybe a Nobel for PARP-inhibitors? Look at the data in the above and argue that PARP inhibitors are no big deal...

"As a former academic biomedical scientist (professor) and current biotech scientist, I really don't get why there is such a narrow obsession with CRISPR. Just seems weird and off to many of us."

I've felt that way about a number of paleontological trends. I think that all this really shows is that scientists are humans. We follow trends, like anyone else. And when the sexy new theory, method, or whatever comes along, we're as prone as anyone else to succumb to the desire to fit in. Doesn't hurt that popular ideas get the funding, too, which adds another layer of humanity to this--researchers themselves follow trends, and those granting research funds follow them.

A counter-theory: Perhaps since the '90s it has gotten a lot easier to go back and review the work that was done decades ago and recognize insights that were initially missed. If the 1970s had had Google, perhaps they would have gone back more to the '30s and '40s.

I have never seen an account of the concept of "low-hanging fruit" that wasn't just a circular argument.

Does a non-question-begging account exist?

Expansions in science generally follow engineering breakthroughs that allow us to look at the world in ways we couldn't before. Astronomy was going nowhere until the invention of the telescope. Microscopes paved the way to huge advances in biology. Particle physics was dependent on the development of giant colliders for advancement.

The problem we face now is that we have explored the low-hanging fruit - the things we can see and measure easily. We learned what we could from small telescopes, and needed gigantic projects like Hubble and the new large observatories on Earth to progress.

This means that exploring the boundaries of the known is getting increasingly expensive, and therefore the domain of governments and large organizations like CERN. This concentrates scientific research into the hands of fewer people, which lowers the rate at which we discover new things.

Maximum advancement happens when the 'adjacent possible' opened up by new technologies can be explored by as many people as possible. This is how complex systems expand and learn. By concentrating scientific research into the hands of smaller groups of people, we lower the rate of discovery.

There are areas where this is not happening. Crowd-sourced 'citizen science' movements have led to big discoveries that no one expected, such as the existence of strange dust clouds around 'Tabby's Star'. Amateur astronomers are still important for discovering comets and asteroids and other periodic phenomena, but they don't win Nobel prizes for it.

Another similar issue is increasing specialization. Back in Newton's day, it was still possible for a single individual to be highly educated in many different fields, as the sum total of human knowledge was much smaller. This ability to draw lessons from multiple fields allowed individuals with few resources to make breakthroughs, which in turn attracted more individuals to try. But now we have people who know a lot about very specific things, but who can't possibly know a lot about many things. Therefore, they need to work in teams and be collaborative, which then involves managers, bureaucracy, politics, and decision-making at higher levels where the actual knowledge is lowest. In my opinion, this makes research less efficient and drives research down 'approved' areas of inquiry, which minimizes the chance discovery of unknown unknowns, which is where the real breakthroughs come from.

Consider the rise of SpaceX. NASA had already evaluated self-landing rockets, but a single accident in the DC-X program caused the bureaucratic enemies of the program to kill it. We wasted another two decades throwing away rockets before an entrepreneur came along and decided that re-landing was the right way to go. Had NASA remained the only game in town for rocket launch, we would be spending half a billion dollars per launch for decades to come, and the opportunities for space advancement would be severely curtailed. The revolution happened because someone outside the establishment took a big risk. That rarely happens today inside 'big science'.

It's incorrect to claim that "science has gotten much more expensive" when there is such a huge disparity in US federal funding between physics and the life sciences. For instance, in 2017, funding for the life sciences totaled nearly 50% of federal R&D funding vs. less than 10% for the physical sciences, a 5x differential. The reverse was true through the 50s and most of the 60s.

Two thoughts occur.

First, all the low-hanging fruit has been collected. Take biology, for example. Linnaeus got his start doing what we now consider fairly basic taxonomy. Stuff that would be considered suitable for undergraduates now. Ecology, on the other hand, occurs on scales beyond that of a human lifetime (rather by definition in some cases). In geology the case is similar. The first geologic map was revolutionary; today something with that little detail and that many inaccuracies wouldn't pass Field Camp. As the easier discoveries are made, obviously the remaining discoveries will be harder, take longer, and cost more.

Second, looking at Nobel Prize-winners--or any award, for that matter--will obviously bias the results. The Nobel system is known to be flawed, but ignoring that, you're looking (if you buy into the hype) at the top 1% of research and making a statement about the other 99%. That's just bad statistics. The reality is that the rest of science can be chugging along, quite happily and, outside of those in the field, without notice, continuing to make modest advancements at a steady or even increasing rate, without any significant Nobel prizes being awarded.

I spent three years of my life working with a paper out of Palaeontologia Electronica. It was your standard paleo report about an outcrop of limited extent--sedimentology, stratigraphy, paleoenvironmental analysis, faunal descriptions and assessment, etc. Good, solid work, but nothing that would ever get noticed outside of paleontology. Yet it was key to preserving numerous paleontological resources (i.e., fossils). That experience would never be accounted for by looking at the top percent of scientific research, yet it demonstrates the absolute best outcome of such research.

This short tweetstorm:
https://threadreaderapp.com/thread/1064741431363989505.html

This is amazing work. In this case DNA sequencing quite literally *is* the new microscope (see the Deming post referenced in MR a couple weeks back). And see Marcotte's other recent publication on using *microscopy* to perform *protein* sequencing. "There is no great stagnation" in scientific tools. They are advancing rapidly, and if anything it is the software needed to process the massive data sets being generated that lags the tools significantly.

Could policy improvements be made to push a greater share of funding to more individual scientists? At an earlier age? And to fund much needed software development to make the tools more broadly useful and available? Yes, yes and yes! And maybe that's what Nielsen and Collison were getting at.

The Marcotte lab is but one example of incredible output, with tool development a subset of that. To claim that scientific returns are diminishing seems premature. How much did it cost to sequence DNA in 2000? How much does it cost now? (About 100,000 fold less). What could you do with the DNA sequencing output in 2000? And what can you do with it now? Potentially use it to map the brain connectome!?! Are you kidding me? How exciting is that! It is hard to see how that could be viewed as getting less bang for the buck. The ultimate return on this investment will be so great as to be incalculable. What now is the ROI for the harnessing of electricity?
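
The "100,000 fold" figure is roughly the right order of magnitude. A quick sanity check, using approximate, commonly cited cost-per-genome points (treat the exact values as assumptions):

```python
import math

# Approximate, commonly cited cost-per-genome figures; exact values vary by source.
cost_2001 = 100_000_000  # ~$100M per human genome circa 2001 (assumed)
cost_2018 = 1_000        # ~$1,000 per genome circa 2018 (assumed)

years = 2018 - 2001
fold_decline = cost_2001 / cost_2018
annual_decline = 1 - (cost_2018 / cost_2001) ** (1 / years)
halving_time = math.log(2) / -math.log(1 - annual_decline)

print(f"fold decline: {fold_decline:,.0f}x over {years} years")
print(f"implied average annual cost decline: {annual_decline:.1%}")
print(f"implied cost-halving time: {halving_time:.1f} years")
```

On those assumed numbers the cost halved roughly every year, far steeper than a Moore's-law pace of halving every couple of years, which is the usual point made about sequencing costs.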

From the article: "Even to the scientists doing these experiments, it wasn’t obvious whether they were unimportant curiosities or a path to something deeper. Today, with the benefit of more than a century of hindsight, they look like epochal experiments, early hints of a new fundamental force of nature."

That is, "The future is already here — it's just not very evenly distributed."
