A Very Depressing Paper on the Great Stagnation

December 17, 2016 at 7:33 am in Economics, Science, Web/Tech

Are Ideas Getting Harder to Find? Yes, say Bloom, Jones, Van Reenen, and Webb. A well-known fact about US economic growth is that it has been relatively constant for a hundred years or more. Yet we also know that the number of researchers has increased greatly over the same period. More researchers and the same growth rate suggest a declining productivity of ideas. Jones made this point in a much earlier paper that has long nagged at me. With just one country and rising world growth rates, however, I wondered if the US had somehow had offsetting factors. Bloom, Jones, Van Reenen, and Webb now return to the issue with a more detailed investigation of specific industries, and the picture isn’t pretty.

Moore’s law (increasing transistors per CPU) is often trotted out as the stock example of an amazing increase in productivity, and it is, when measured on the output side. But when you look at Moore’s law from the perspective of inputs, what we see is a tremendous decline in idea productivity.

The striking fact, shown in Figure 4, is that research effort has risen by a factor of 25 since 1970. This massive increase occurs while the growth rate of chip density is more or less stable: the constant exponential growth implied by Moore’s Law has been achieved only by a staggering increase in the amount of resources devoted to pushing the frontier forward.

[Figure 4 from Bloom et al.: effective research effort vs. the growth rate of chip density]
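
A quick back-of-the-envelope version of that input/output comparison (a sketch in Python; the 45-year span for “since 1970” is my assumption):

```python
import math

years = 45          # roughly 1970 to the paper's endpoint (an assumption)
effort_factor = 25  # rise in effective research effort, from the quoted Figure 4

# Chip-density growth stayed roughly constant, so research productivity
# (growth per unit of research effort) must have fallen ~25-fold.
decline_rate = math.log(effort_factor) / years  # continuous rate, ~7.2%/yr
halving_time = math.log(2) / decline_rate       # ~9.7 years
print(f"~{decline_rate:.1%}/yr decline, halving every ~{halving_time:.1f} years")
```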

In some ways Moore’s law is the least disturbing trend, because massive increases in researchers have at least kept growth constant. In other areas, growth is slowing despite many more researchers.

Agricultural yields, for example, are increasing, but at a constant or declining rate, despite big increases in the number of researchers.

[Figure: growth rate of agricultural yields vs. research effort]

Since 1950 life expectancy at birth has been growing at a remarkably steady rate of about 1.8 years per decade, but that growth has been bought only by an ever-increasing number of researchers. Here, for example, is cancer mortality as a function of the number of publications or clinical trials. Each clinical trial used to be associated with ~8 lives saved per 100,000 people; today a new clinical trial is associated with only ~1 life saved per 100,000.

[Figure: life expectancy and cancer mortality vs. publications and clinical trials]

And how is this for a depressing summary sentence:

…the economy has to double its research efforts every 13 years just to maintain the same overall rate of economic growth.
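
Unpacking that sentence (a sketch: doubling effort while growth stays constant means aggregate research productivity halves every 13 years):

```python
import math

doubling_years = 13  # research effort must double every 13 years (quoted above)
# Holding growth constant while effort doubles means research productivity
# halves over the same span:
decline_rate = math.log(2) / doubling_years
print(f"implied decline in research productivity: ~{decline_rate:.1%}/yr")  # ~5.3%
```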

In my TED talk and in Launching I pointed to increased market size and increased wealth in developing countries as two factors which increase the number of researchers and therefore increase the global flow of ideas. That remains true. Indeed, if Bloom et al. are correct then even more than before we can’t afford to waste minds. To maximize growth we need to draw on all the world’s brain power and that means we need a world of peace, trade and the free flow of ideas.

Nevertheless, the Bloom et al. findings cut against optimism. The idea of the Singularity, for example, comes from projecting constant or increasing growth rates into the future, but if it takes ever more researchers just to keep growth rates from falling, then growth must slow as we run out of researchers. As China and India become wealthy the number of researchers will increase, but better institutions can only postpone the slowdown temporarily. Most frighteningly, can we sustain a world of peace, trade, and the free flow of ideas with lower growth rates?

Just because idea production has become more difficult in the past, however, doesn’t necessarily make it so forever. We could be in a slump. Breakthroughs in ideas for improving idea production could raise growth rates. Genetic engineering to increase IQ could radically increase growth. Artificial intelligence or brain emulations could greatly increase ideas and growth, especially since we can create AIs or EMs faster and at far lower cost than we can create more natural intelligences. That sounds great, but if computers and the Internet haven’t already had such an effect, one wonders how long we will have to wait to break the slump.

I told you the paper was depressing.

1 chrisare December 17, 2016 at 7:37 am

This seems an apt explanation:

https://www.amazon.com/gp/product/0525952713/

2 Ray Lopez December 17, 2016 at 12:13 pm

The link is to this: “The Great Stagnation: How America Ate All the Low-Hanging Fruit of Modern History, Got Sick, and Will (Eventually) Feel Better. Hardcover – June 9, 2011, by Tyler Cowen (Author)”

My comments on TC’s observations are twofold:

1) It’s not depressing. Note the first figure is for the growth rate of chips, which has NOT GONE DOWN. That’s optimistic! A 35%/yr growth rate means things double roughly every two years (see the quick check after point 2). Bacteria do that. So we’re having to run faster to ‘stay in the same place,’ but that same place is doubling good things every two years. Ditto for crops (all positive growth). As for cancer, it’s a tough disease to kill, and some say the secret to eternal life is to find a cure for cancer. Hard problems need bigger incentives.

2) A better (Ray Lopez) patent policy would allow society to break free of the Great Stagnation. But even with the Great Stagnation you have positive growth in microprocessors (the gold standard for growth), in crops (all positive growth), and even (barely) in cancer prevention. And to be frank, most people implicitly don’t believe innovation can be taught (pace myself, who sees that it can be) and don’t care that we’d have flying cars and live to 120 if we had a better patent policy. They are in many ways perfectly happy the way things are; they just like to complain. The kids are alright, says The Who.
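
A one-line check of the doubling arithmetic in point 1 (a sketch):

```python
import math

# Doubling time implied by 35%/yr growth:
print(math.log(2) / math.log(1.35))  # ~2.31 years -- close to "every two years"
```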

3 Pshrnk December 17, 2016 at 7:25 pm

+1

TC being a Christmas troll

4 Yoav December 18, 2016 at 7:25 am

Well, we might be getting more transistors per chip, but we keep getting less from each transistor.
The low-hanging fruit in voltage reduction, frequency increases, and CPU architecture has been depleted.
That is why, if you measure performance per chip rather than number of transistors, it is mostly flat in recent years.

5 Kelly Parks December 30, 2016 at 1:33 am

You’re overlooking the obvious. Increased government regulations have been a serious drag on growth in many fields. Endless FDA regulations make the process of drug approval longer and more costly so of course the rate of innovation decreases and people die by the thousands waiting for drugs that could have saved them. That’s the cause of the dip in effectiveness of cancer research.

6 bmcburney December 18, 2016 at 12:34 pm

No, that book does not contain anything that can be described as an “apt explanation.” You could say it contains an accurate, if only partial, description. At best, what pass for “explanations” and “solutions” in that book are half-hearted speculations. As, perhaps, they must be. It may be, as TC implies, that from now on sustained progress will require people (or machines) smarter than ourselves. It may be that the corruption of science into mere “cargo cult” behavior is a product of those fundamental limitations. We can’t actually do the science any more, but we do know how to wear lab coats and wave our hands.

Personally, I tend to believe this reverses cause and effect. I think science doesn’t work as well as it used to because an important ingredient of the scientific process is now missing. As Feynman pointed out, science requires an extreme level of personal integrity. Is that level of integrity still a part of science? You might visit Retraction Watch and draw your own conclusions. As a lawyer I have interviewed and cross-examined approximately one hundred experts in STEM fields. For what that may be worth, in my opinion a STEM expert could probably be found to testify under oath that up is down, left is right, and two plus two equals seven. This is why science worked best when it was practiced by stuffy old Victorians. For a while, probably, some post-Victorians managed to retain the attitude of integrity in the lab even after it had vanished from the rest of the world. Now, not so much.

Read “The Hockey Stick Illusion” by A.W. Montford. Regardless of whether you believe in Catastrophic Anthropogenic Global Warming or not, look at the way Mann’s data and methods remained obscure for years while the conclusions of the original hockey stick paper made him a world-famous hero of science. Of course, once the data and methods were known, the original paper had to be given up for dead, but new papers had to be written reaching the same conclusion. In time the new papers, too, were debunked, and now only the conclusion remains, like the smile on the Cheshire Cat after the rest is gone. That part is what we now call settled science, and if you dispute it, TC’s colleagues at GMU will RICO your ass.

7 chrisare December 17, 2016 at 7:40 am

As the title refers to. Duh.

8 Scott December 17, 2016 at 7:54 am

Are all the people counted as “effective researchers” really working on the measured variables? Is every researcher in the semiconductor industry trying to make more transistors, or are some of them focused on how to better use the ones we produce? You can make a great or a mediocre CPU with the same number of transistors. In farming, is everyone focused on yield? Are some focusing on shelf life (if that wasn’t factored into the definition of “yield”) while others work on the optimum shade of red for selling tomatoes? Is some of the counted medical research on cosmetic surgery? …

To be fair, I didn’t follow the links and read the articles so maybe these are addressed.

9 Mark December 17, 2016 at 9:46 am

I read the article pretty quickly, but the authors do address this point. If researchers are switching to more promising areas, then we should expect to see that reflected in the aggregate measure for the whole economy. Of course, once you’re looking at the aggregate, you have room for all kinds of other plausible causes. I was less convinced by individual outcome measures of Moore’s law (vs energy efficiency) or lives saved (vs quality-life years gained), but we still do care about those measures, and altogether it is a bit of a downer.

My wild guesses for sources of future growth: AI, solar, and AI combined with sensors (automating the transportation and mobility sectors)

10 Stormy Dragon December 17, 2016 at 11:12 am

Also, is total yield (rather than yield per acre or cost per bushel) the right measure for agriculture?

11 Troll me December 17, 2016 at 2:41 pm

In agriculture, “yield” means “yield per acre” (or whatever unit of land is being used).

12 Stormy Dragon December 17, 2016 at 3:09 pm

Ah sorry, I thought it meant total production.

13 chuck martel December 17, 2016 at 7:59 am

” Genetic engineering to increase IQ could radically increase growth.”

That’s what’s depressing.

14 Troll me December 17, 2016 at 2:45 pm

How much more than “enough” do we need?

What’s the next goal?

Constraints on these technologies will be needed to ensure we don’t go into an endless arms race that doesn’t really deliver anything that anyone actually wants.

So, maybe we reach Alpha Centauri a few thousand years earlier. Or … we get smarter faster and destroy everything sooner? It’s really not obvious. So in the meantime, obviously, frameworks which constrain the retarded possibilities would be good.

15 Alain December 17, 2016 at 2:57 pm

Lol, wut?

Who decides, you?

How about this: many think being smarter is better. If they want to fund such research, then tough beans for you and your ilk. Ok?

16 Troll me December 17, 2016 at 5:33 pm

Me, of course.

So let’s all 7 billion of us have a massive duel to determine which “me” gets to decide.

My “ilk”. What’s “ilky” about me?

Speaking of potential risks is SOOOOO EVIL.

(P.S. – think culture, not how high the mountain of crap we don’t need can be built.)

17 Bill Quick December 26, 2016 at 6:41 pm

“I don’t want” /= “We don’t need.”

18 Prankeapple December 17, 2016 at 5:20 pm

That’s a dumb point of contention. Taking your idea to its logical end would mean that we stop researching new technology or trying to push the ball of civilization forward altogether.

I’m always puzzled when people become skittish about increasing the IQ of the human species – it’s obvious to me that a smarter race would be an unalloyed good. Jews are a sterling example of this – at less than 0.2% of the world’s population, they hold an astonishing 41% of the Nobel Prizes in economics, 28% in medicine, and 26% in physics. They also have a mean IQ of 115, which is about one standard deviation above the European mean. Imagine if you could engineer the average human to have a mean IQ of 130 or 145 – the world would become a paradise.

19 Troll me December 17, 2016 at 5:36 pm

If you do not extend an idea to logical absurdity, then it will not be absurd.

Attending carefully to risks relating to a potential “genetic arms race” (I’m actually more concerned about the effects this would have on culture and quality of life, than the potential to explode entire planets by year 3300 instead of 4000) … is not the same as suggesting that we should all become Amish.

20 chuck martel December 17, 2016 at 8:35 pm

What makes you think that intelligent people are necessarily “good”? Even now the elites that believe that they’re smarter than everyone else are making bad decisions that screw up life for the normals. Albert Einstein, for instance, was an avowed socialist. Maybe you, too, are a socialist, and desire to live in a socialist paradise like Cuba, where bright guys like Al would make the decisions. Or you could just move to Havana. The astonishing thing about the Nobel prizes is that anybody takes them seriously.

21 Troll me December 18, 2016 at 2:11 am

Does it not seem ever so slightly suspicious that the main target of genocide in the 20th century has these differential characteristics?

Personally, I attribute the observed difference to a culture of bookishness and heated open debate that leads them to perform well on standardized tests in addition to a variety of intellectual fields.

But … refer to point 1. Also, the fact that I’m necessarily “anti-semitic” to even wonder such a thing … kind of makes me wonder even more.

22 Troll me December 18, 2016 at 2:12 am

But they ARE the Chosen People after all.

So maybe we should just all make more space for our betters.

23 Pshrnk December 17, 2016 at 7:26 pm

Amplify our Neandertal genes!

24 Normal December 18, 2016 at 8:07 am

It doesn’t help that the high IQ individuals we already have get diverted from science to finance, which provides no lasting benefit to society.

25 Sam The Sham December 17, 2016 at 8:14 am

Don’t forget that a fair amount of growth has been caused by debasing our currency and propping up bubbles. It’s gonna be fun.

26 Joël December 17, 2016 at 8:16 am

Obviously, if you hire more people to do research, the marginal quality of an added researcher will decrease, and anyone who has worked in research knows that this reduction is drastic. In basically any field, most researchers nowadays, despite their best efforts, are not capable of making any contribution of value.

Is that sad? I am not sure. When I read “agricultural yields, for example, are increasing but the rate is constant or declining,” I think, “Sure. So what?” Constant rates mean exponential growth, and any commodity produced with exponential growth at a non-trivial rate (like the 3.5% for corn in the 1960s that Alex wishes were still here) will soon enough (a question of centuries, not of millennia) exceed any available resource on our planet. For example, corn production alone, if growing at 3.5%, would in about 700 years contain more water than exists on earth.
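
The order of magnitude here checks out (a rough sketch; the round starting figures are my assumptions):

```python
import math

corn_tonnes = 1.0e9    # rough current world corn production (assumed)
water_tonnes = 1.4e18  # approximate total mass of water on Earth (assumed)
growth = 1.035         # 3.5%/yr

years = math.log(water_tonnes / corn_tonnes) / math.log(growth)
print(f"corn mass would exceed all water on Earth after ~{years:.0f} years")  # ~600
```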

We cannot reasonably want that, so we should rejoice at the diminishing growth rates, at least while we are stuck on this planet.

27 Mark December 17, 2016 at 9:51 am

The authors try to handle this by adjusting for a researcher’s average wage within industry. If lower quality researchers are added, the assumption is that they do not command the same salary as higher quality folks.

28 dcardno December 17, 2016 at 12:45 pm

I think the lower salary assumption requires that ‘researcher quality’ is observable – it may not be.

29 anon December 17, 2016 at 11:36 am

That is a good point. We enjoy exponential growth at certain times and in certain places, but it can’t be the norm.

30 Troll me December 17, 2016 at 3:03 pm

Maybe it’s better to use something like “constant dollars” or otherwise using baselines as the point of comparison?

Say, measuring 2016 productivity gains as a function of the economic size in 2000.

Because if you’re interested in the overall magnitude, it seems better to use a magnitude (e.g., yield in 2000) as a constant point of reference rather than an exponential comparison.

This would probably erase much of the supposed stagnation issues – and, probably, it’s really debatable whether one approach (focus on magnitudes) overestimates growth or the other approach (exponential) underestimates it.

You get to a point where the focus on magnitude becomes absurd. Say, my annual pay increase being an additional “stone age productivity unit” every year. The numbers wouldn’t be that useful. But counting by magnitude compared to a recent point of reference, precisely like the “constant dollars” measure already commonly used in economics, would be interesting.

Surely, for example, we can all understand that using constant 1860 dollars makes 2016 data difficult to compare or understand. Instead, we use a recent point of comparison and state the 1860s standard of living in 2016 dollars (the 2016 dollars being the ones we can relate to – even 1980 or 2000 dollars would be quite OK over that kind of timeframe).

31 Careless December 18, 2016 at 1:57 am

We cannot reasonably want that, so we should rejoice at the diminishing growth rates,

This is really dumb. Yes, we can’t want it, so we wouldn’t do it. But that doesn’t mean that being able to grow ever more corn on less land wouldn’t be a good thing.

32 kevin December 19, 2016 at 8:42 am

I thought he laid out a pretty thorough rationale for why “ever more corn on less land wouldn’t be a good thing.” That is, since there is water in corn, and water is a finite resource on earth, at a certain point all the water on Earth would be in the corn and none left for other things – i.e., drinking water.

33 AlanG December 17, 2016 at 8:24 am

I think there is a false equivalence going on here. Throwing more money into research does not equal increased productivity (or whatever other outcome one desires to measure). Look at public health. The three big ideas that improved longevity are public sanitation and clean water, antibiotics, and vaccines. Everything else was just a marginal contribution. The ‘War on Cancer’ was declared by President Nixon back in 1971, but other than some of the blood cancers (leukemia and lymphoma), what progress has been made? Early detection and surgery are still the best ways to treat solid tumors.

Even Moore’s law, while still operable, is not as powerful as it was in the early days. I’m old enough to have learned programming using punch cards on an IBM mainframe, and to remember how Seymour Cray developed the early supercomputers. All of this has been supplanted by faster CPUs and better programming languages, but where are we now? The advances in CPU design over the last 15 years have not led to marked changes in the uses of high-end computing in STEM fields. One might point only to the IBM Watson project as something that “might” prove useful, but that’s more AI and algorithm than an increase in computing power.

I suspect the same could be said of agriculture. The greatest development in the 20th century was hybrid corn, which did more to increase yields than anything else (not shown on Alex’s graph). What we have seen over the past 20 years is GMO crops that have led to incremental increases in performance in the US but probably hold out much more promise in the developing world, where fertilizer and pesticides are far more costly to farmers.

Lots of technologies end up seeing incremental (if that) increases after certain fundamental discoveries that resulted in huge changes. Nothing bad about this at all. We still end up seeing most goods and services at a lower price after adjusting for inflation.

34 garten December 17, 2016 at 10:33 am

“More researchers and the same growth rate….”

Very vague and jumbled argumentation by Bloom, Jones, Van Reenen, and Webb.
It in itself represents non-productive “research.”

The referenced category of “researchers” seems quite arbitrarily limited to technical researchers.
Economists and the masses of social-science/academic researchers, with their vast output of studies and analyses, don’t increase the growth rate (??)
(Of course not – they decrease it.)

Even most nominal ‘scientific research’ is faulty, wasted effort. Wasting scarce capital and labor resources on unproductive activities is guaranteed to hinder growth.

The key to economic growth is savings & productive capital investment.

If the growth rate is lacking, look to the well-known economic basics… not to some silly bean count of supposed “researchers.”

35 Todd Kreider December 17, 2016 at 1:15 pm

“The ‘War on Cancer’ was declared by President Nixon back in 1971 but other than some of the blood cancers (leukemia and lymphoma) what progress has been made?”

A lot.

Leukemia 5 year survival was 35% in 1975 and 65% in 2010
Myeloma 5 year survival was 25% in 1975 and 50% in 2010
Non-Hodgkin lymphoma was 47% in 1975 and 72% in 2010
Hodgkin lymphoma was at 72% in 1975 and 88% in 2010

Breast cancer 5 year survival was 75% in 1975 and 90% in 2000 (little improvement since)
Prostate cancer 5 year survival was 65% in 1975 and 99% in 2008
Thyroid cancer 5 year survival was 92% in 1975 and 99% in 2008
Bladder cancer 5 year survival was 72% in 1975 and 78% in 2008
Kidney cancer 5 year survival was 50% in 1975 and 75% in 2008
Colon cancer 5 year survival was 50% in 1975 and 67% in 2008
Melanoma of skin 5 year survival was 82% in 1975 and 93% in 2008
Lung cancer 5 year survival was 11% in 1975 and 19% in 2008
Brain cancer 5 year survival was 23% in 1975 and 36% in 2008
Stomach cancer 5 year survival was 14% in 1975 and 32% in 2008
Liver cancer 5 year survival was 3% in 1975 and 20% in 2008

This doesn’t include the gains of the past few years. And there has been a lot of progress in immunotherapy, so cancer mortality should drop sharply in the early 2020s.

36 dearieme December 17, 2016 at 4:24 pm

“Breast cancer 5 year survival was 75% in 1975 and 90% in 2000”. If you detect breast cancer earlier then more people survive 5 years even if there has been no advance at all in treatment. The survival rate is a statistic so incomplete that it verges on the bogus.

37 Pshrnk December 17, 2016 at 7:30 pm

Thank You @dearieme

38 garten December 17, 2016 at 6:07 pm

Those “5 Year Survival” statistics are very misleading. Cancer ‘mortality rates’ are the proper metric.

Leukemia mortality rates have not significantly changed since the 1970s, as is the case for most cancers.

Our improved technical ability to diagnose cancers at an earlier stage of the disease… gives the false impression that cancer patients are living longer.

Also, there are lots of deceptive manipulations made to cancer statistics– they now include things that are not really cancer… for example, more women with mild or benign diseases are being included in cancer statistics and reported as being ‘cured’ of cancer.

The U.S. alone spends about $60B per year on cancer research, with very little success. We are no closer to curing/preventing cancer than we were 40 years ago.
Our modern advanced cancer treatments (like chemotherapy) kill as many cancer patients as they save.

39 Todd K December 17, 2016 at 9:51 pm

But you are assuming early detection accounts for all or almost all of the improvement in 5 year survival rates. That isn’t the case.

No improvement in breast cancer treatment since 1975? Evidence?
In 1975, 31 out of 100,000 died of breast cancer. It increased a little and then back down to 31 out of 100,000 in 1993. In 2000, it was 27 out of 100,000. In 2013, 21 out of 100,000 died of breast cancer.

Overall mortality from cancer out of 100,000: 1975 = 200; 1990 = 215; 2000 = 200; 2013 = 163

That’s roughly a 19% drop in mortality since 1975 (and since 2000), and a 24% drop from the 1990 peak.
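
A quick check from the numbers given above (a sketch):

```python
rates = {1975: 200, 1990: 215, 2000: 200, 2013: 163}  # deaths per 100,000 (quoted above)

for base in (1975, 1990, 2000):
    drop = 1 - rates[2013] / rates[base]
    print(f"drop since {base}: {drop:.1%}")
# -> 18.5% since 1975 and since 2000; 24.2% from the 1990 peak
```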

40 dearieme December 17, 2016 at 4:26 pm

“The three big ideas that improved longevity are: public sanitation and clean water; antibiotics, and vaccines.” Spot on.

41 Dallas Weaver Ph.D. December 17, 2016 at 7:42 pm

Are you saying what no one sees – that sanitary engineers have saved more lives than all the doctors put together? People can’t see their own culture.

In China, it was their cooking style (everything is sanitized before eating and you drink only boiled water) that allowed them to stop fecal-oral pathogen transmission, and they don’t see how important that is either.

42 liberalarts December 18, 2016 at 12:40 am

I have to think that research into the problems of smoking, and the subsequent decline in smoking rates, has to be right up there among ideas that have extended longevity.

43 harpersnotes December 17, 2016 at 8:24 am

Diminishing marginal returns to increasing investment in the number of researchers should not be depressing. Not for economists anyway. For economists it should be the default expectation, and much of managing depression is really just about managing expectations. After all, the ‘marginal revolution’ seems to me in a sense the revolution in thinking afforded by taking an economic perspective which very often means thinking in terms of diminishing marginal returns. Depressing is when a person’s expectations get adjusted to reality. It’s the brain saying, “Don’t put so much dopamine into those particular neural circuits!” So I say – now that we have some ‘depressing’ news, now we can start to make some real progress! For example they can start to disaggregate the ‘total number of researchers’ variable into different components and apply productivity measures to those. (Hat tips to Hannibal (tv) dialogue writers on ‘managing expectations’, and to Niels Bohr quotes on ‘real progress’.)

44 Emericus Durden December 17, 2016 at 8:45 am

If we indeed have a problem with fewer and fewer innovative ideas as time goes on, we have only our own beliefs to blame for it. All ideas must originate in the imagination which, as we know from personal experience, is limitless and infinite. Therefore the source of our ideas is limitless and infinite. So why are we not constantly overwhelmed with millions of innovative ideas literally every single day? I think the answer lies in the fact that our beliefs about ourselves, society, and the universe prevent us from considering those innovative ideas seriously. In particular, the single belief in scientific progress has, with time, limited the quality and quantity of innovative ideas we seriously entertain and elaborate on. If a new idea doesn’t logically accord with the already established beliefs underlying scientific progress, we discard it because it’s assumed false; in the process, we assume that most of what our imagination serves up is false. Thus, as long as we strictly adhere to the core beliefs of scientific progress, our imagination will be stuck in this ‘conceptual rut’ or ‘dogmatic slumber.’ I’ve written and published on this issue several times. It’s a real problem, and its only solution seems to lead into irrationality, when in fact it leads to consciousness expansion. — Emericus Durden

45 Lanigram December 17, 2016 at 10:21 am

Progress proceeds one funeral at a time.

46 TvK December 17, 2016 at 8:47 am

This has been known for quite a while in the transistor industry. They’ve been pushing Moore’s law for a while and it’s getting harder and harder. It’s possible to make better transistors, but they will cost more.

Sophie Wilson (the designer of the first ARM processor), “The Future of Microprocessors,” 22 Nov 2016 (check around 32:00 for the economic problems):

https://www.youtube.com/watch?v=_9mzmvhwMqw

47 rayward December 17, 2016 at 9:09 am

Does economic growth affect the discovery of big ideas or does the discovery of big ideas affect economic growth? Where do big ideas come from? Researchers working in their garages? Or researchers working in the lab for Big Corp? And are big ideas, once discovered, more likely to be brought to market if discovered in the garage or discovered in the lab of Big Corp? Are big ideas more or less likely to be discovered, or brought to market, in industries with more or less competition? Are big ideas more or less likely to be discovered, or brought to market, in places with lower or higher levels of inequality? The idea that we are running out of big ideas is a very small idea.

48 Jonathan S December 17, 2016 at 9:49 am

Your thoughts were nearly identical to my thoughts. I’m skeptical that more research in a particular field leads to greater economic output from that field. For instance, governments, foundations, and donors can throw all the money they want at medical research, but that doesn’t guarantee that those researchers will actually uncover new treatments, etc. Similarly, there is plenty of research being done in fields where there might be a negative impact on economic output (e.g., public policy).

49 Lanigram December 17, 2016 at 10:28 am

There is a lot of friction for new ideas to overcome inside of institutions: turf wars, egos, dueling authorities, and all the usual and customary human frailties.

50 Jonathan S December 17, 2016 at 4:31 pm

Definitely.

One of my best friends was doing a post-doc in astrochemistry. The team he was a part of routinely published articles that they knew to be false at the time of submission so that they could later publish a follow-up publication that “corrected” the original article. All of this was peer-reviewed. They were able to get away with it because they basically had a monopoly on their area of expertise (no competing labs), so nobody was able to call BS.

51 Bill December 17, 2016 at 11:06 am

Diversity and collaboration across fields also affect outcomes; increasing the size of silos does not.

Good book on this, recommended by a math prof friend, is entitled How Information Grows. A very good book.

52 Max December 17, 2016 at 9:51 am

But might they be looking in the wrong sector?
The next big step forward will probably not be in a sector or business field that everyone is looking at right now.
In my opinion, the biology field especially is still in its infancy with regard to marketable ideas and research projects. However, the brightest and best have picked computer science, physics, and similar fields as their destination at the moment, so this might slow down advances in other fields.

Also, IMO, the bigger the bias against markets and selling products, the lower the actual output. This hypothesis holds for many fields studied in universities.

53 Thomas Bayes December 17, 2016 at 9:55 am

Maybe, just maybe, we’re going about it all wrong. Some previous commenters have alluded to this. As a researcher myself at one of the more prestigious medical schools on the US east coast, I see every day the effect of the extreme pressure to perform that these hypercompetitive environments create. Fresh minds come in all the time, the best on the planet, and no one makes it more than a few years before the spark is gone, except for the occasional legitimate genius (I know a few), and the psychopaths (who end up in charge). It’s a really great situation if you want to take an existing paradigm and wring out every last drop. It’s a terrible situation for coming up with genuinely new ideas and getting them off the ground. The fact that this happens at the rate it does is a testament to the irrepressible nature of intelligence. I shudder to think of a human future where we genetically engineer drones for I.Q. and then fling them into this machine. We need to take a step back and give people the space they need to dream, and the support they need to take risks. I may sound like a Marxist utopian idealist, but this type of situation has arisen in differing contexts (I hear that cold wars are great for this). I can’t yet see how it’s going to work on our new dark timeline.

54 Lanigram December 17, 2016 at 10:33 am

“…new dark timeline.”

What do you mean?

55 Thomas Bayes December 17, 2016 at 11:20 am

General closed-mindedness seems to be having something of an ascendant moment just now. At the extreme, I don’t think that a loose collection of warring city-states is a great place to dream and take risks either.

56 Lanigram December 17, 2016 at 11:54 am

“…closed-mindedness…ascendant…”

What makes you think that?

“…warring city-states…”

To which states are you referring?

You have some interesting ideas.

57 Behemot December 17, 2016 at 12:39 pm

Interesting ideas… I think at least in one age (Renaissance Italy), warring city-states were actually a great place to dream and take risks.

However, I do understand that the more industrial/bureaucratic/cooperative/large-scale nature of modern research and innovation means that those days are gone.

58 Thomas Bayes December 17, 2016 at 1:19 pm

The threat of a powerful neighbour can certainly be a driver of innovation. We haven’t really been out of the primate house for that long, and we are quite good at banding together to throw rocks at existential threats as long as they’re readily evident to our primary senses in consensus. But yes, there’s a scale mismatch now. At one point basic research involved lowering myself into a tub of water. Now I have to recreate the conditions inside a star.

However, a future city-state reality won’t resemble any past example. At least there’s that. I dislike the term “post-geographic”, but… we are, for instance, having this exchange right now. Interesting times.

59 Jim Tobias December 17, 2016 at 12:42 pm

Exactly. Without a culture dedicated to cognitive openness and devotion to facts and logic, no level of ‘investment’ will spur true and sustainable innovation. The US is reaching Lysenko levels of political suppression of inconvenient truths. Luckily for the world, this dooms our supremacy. I for one will welcome our new scientific overlords, no matter which Asian language they speak.

60 Pshrnk December 17, 2016 at 7:35 pm

@Jim Tobias nailed the only “depressing” part.

61 Troll me December 17, 2016 at 3:11 pm

I don’t think there’s anything Marxist about thinking that researchers work better when they have space to think on their own terms.

62 Troll me December 17, 2016 at 3:13 pm

Rather more capitalist, actually. And democracy. And freer. And more what people want.

The benefits of the alternatives?? Maybe some overlord can get off on the changing decimal places in his relative advantage or disadvantage compared to rival overlords?

There is nothing of value that such an approach could deliver to a slave puppet drone researcher.

63 daguix December 17, 2016 at 10:07 am

You can draw the same conclusion just by reading the summary of each Nobel Prize in medicine since the beginning. The Prize goes each year to research that is less and less impressive.

64 dearieme December 17, 2016 at 4:31 pm

Surely that can’t be right? Are you really saying that the penicillin prize was for work less impressive than all the work done for earlier prizes? If so, can you elaborate on what you mean by “impressive”?

65 GoneWithTheWind December 17, 2016 at 10:23 am

Necessity is the mother of invention. Not more researchers and more money.

66 Troll me December 17, 2016 at 3:14 pm

Nothing like engineering crises in the minds of researchers to make them want to “get in with you”. (A rejection of a possible alternative consistent with your reasoning…)

67 GoneWithTheWind December 17, 2016 at 5:32 pm

Even your fine analytical mind failed to grasp my reasoning…

68 Troll me December 17, 2016 at 5:37 pm

Sorry. Good point. It set me on some tangent …

69 Zod December 17, 2016 at 10:28 am

We can’t extrapolate these diminishing marginal returns on investment to the economy (and real world) at large. True, it takes more and more investment to wring out further increases in integrated circuits, crop yields, and lives(!). It’s what we human beans do with those returns that produces outsized growth (quantitative) and improvements in living conditions (quantitative and qualitative). At some tipping point, circuits became cheap and powerful enough to move from the laboratory to billions of mobile devices. A better-fed (living) person is infinitely more productive and creative than a starving (or dead) one.

Agreed also that there are plenty of suboptimizing factors in big-ideas research – the comments named a few. My own Pollyanna p.o.v. is that some combination of large greedy corporations, idiot/“change the world” VCs/entrepreneurs and, yes, government programs will produce the next great things.

70 Troll me December 17, 2016 at 3:16 pm

But opposing diminishing marginal returns, there are network effects that positively affect the rate of discovery as a function of population, potentially with greater-than-scale returns to population size (theoretically, though maybe that only works at smaller levels, as in the Silicon Valley example).

71 Curt F. December 17, 2016 at 10:31 am

Moore’s law supposedly says that transistors increase at 35% per year. Over 25 years, this implies an 1,800-fold increase in “output.” Meanwhile, research effort has supposedly gone up 25-fold. Thus, the cumulative increase in output is 72 times the cumulative increase in input. And we are supposed to lament this fact?

Comparing relative growth rates in output to relative growth in inputs is very misleading.

72 Curt F. December 17, 2016 at 10:35 am

Actually it’s forty-five years, which makes for a 720,000-fold increase in output. So the cumulative increase in output is “only” 30,000 times the cumulative increase in input. What a disaster!
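
For what it’s worth, the cumulative factors check out, though the ratio of the growth *rates* is the ratio of the logarithms of those factors, not of the factors themselves (a quick sketch):

```python
import math

years = 45
output_factor = 1.35 ** years  # ~730,000x transistors at 35%/yr
input_factor = 25              # rise in research effort (Figure 4 of the paper)

# Ratio of exponential growth rates = ratio of the logs of the factors:
rate_ratio = math.log(output_factor) / math.log(input_factor)
print(f"output factor ~{output_factor:,.0f}; ratio of growth rates ~{rate_ratio:.1f}")
```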

73 Alain December 17, 2016 at 12:15 pm

+1

74 Steve Sailer December 17, 2016 at 8:32 pm

The formulation of Moore’s Law was crucial to its continuing to hold over the last third of the 20th century. It gave firms and investors the confidence to bet enormous amounts of resources on making it come true.

75 AntiSchiff December 17, 2016 at 10:46 am

It has perhaps been hinted at in comments here, but I’ll make it more explicit. How do we know that after a certain point having more researchers doesn’t start to lower productivity? What if, for example, data starts being produced more rapidly than it can properly be synthesized? To the degree that happens, AI data analysis and model-building could be very beneficial.

And I wonder how big a problem low-powered/sloppy statistical testing is, especially in the social sciences.

76 John Seater December 17, 2016 at 10:49 am

The Jones et al. paper tells us nothing at all about the prospects for further economic growth. Its results are consistent with modern theories of growth that predict the results they present but that also predict continued economic growth at the rate seen historically.

Most of the evidence in Jones et al.’s paper is aggregate, relating aggregate R&D inputs to a growth rate. Aggregate evidence, however, is irrelevant. The authors themselves explain why (without admitting the final conclusion): modern theories of endogenous growth explain that what matters is R&D effort *per firm*, not R&D effort in the aggregate. If the number of firms grows and each firm devotes the same amount of resources to R&D, the aggregate amount of resources will grow at the same rate as the number of firms but the economy’s growth rate will be constant. The correlation, whatever it may be, between aggregate R&D and economic growth is uninformative. See the following references: Peretto (JEG, Dec 1998; JME 1999), Dinopoulos and Thompson (JEG, Dec 1998), and Howitt (JPE 1999). There have been a large number of extensions over the past 18 years to taxation, government spending, foreign trade, technology transfer, corporate governance, demography, and other issues. There also have been many articles published that test the theory in various ways, finding that the theory is not rejected by the data, whereas its competitors are.
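
A toy illustration of the per-firm point (my sketch, not the cited models; the numbers are arbitrary): aggregate R&D can grow without bound while the growth rate, pinned to per-firm effort, stays flat.

```python
# Assume, as in the endogenous-growth story above, that growth depends only on
# R&D per firm, while the number of firms expands with the economy.
def growth_rate(rnd_per_firm: float) -> float:
    return 0.02 * rnd_per_firm / 10.0  # assumed calibration: 2% growth at 10 units/firm

firms, rnd_per_firm = 100, 10.0
for decade in range(4):
    print(f"decade {decade}: aggregate R&D = {firms * rnd_per_firm:>9,.1f}, "
          f"growth = {growth_rate(rnd_per_firm):.1%}")
    firms = int(firms * 1.5)  # firm count grows; per-firm R&D does not
```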

Jones et al. do one set of tests with firm-level data from Compustat. Those tests show diminishing returns to R&D effort at the firm level. However, their finding is precisely what one would expect from the literature (see the previous references and those below) and so does not constitute a refutation of that literature or anything either surprising or alarming. If Jones et al. want to do a true test of endogenous growth theory with firm-level R&D data, they must control for (1) diminishing returns to R&D inputs (the exponent on S, the number of scientists, in their formulation), which should be positive but less than 1 according to a large body of evidence in the IO literature, (2) cross-firm, cross-industry, and cross-country knowledge spillovers, which are very big according to another body of evidence from the IO literature, (3) endogenous rent protection, which can raise the costs of doing R&D without reducing the growth rate (see Dinopoulos and Syropoulos, Economic Theory, 2007), and (4) a large increase in marginal tax rates and government regulation over the last 60 years that has been shown to have decreased growth rates because of increased *exogenous* costs (see Dawson and Seater, J. Econ. Growth, Jun 2013). Jones et al. control for none of those things.

Jones et al.’s results, as they now stand, are completely consistent with modern endogenous growth theory and do not tell us anything at all about the prospects for future economic growth.

77 Mark Thorson December 17, 2016 at 11:05 am

People (including engineers) often confuse Moore’s Law with chip density. Moore’s Law is about devices per chip, and although chip density is part of that, chip size has contributed about as much and will probably become a larger contributor in the future as we hit the wall on transistor and wire size. Chip size is driven by defect density, which continues to go down, allowing larger chips. At the high-performance end of the market, thermal considerations are already a limiting factor, but not all chips run hot. Large neural networks of the future needn’t be hot, but may require large chips.

78 Alain December 17, 2016 at 12:17 pm

Large neural networks of today run very, very hot.

79 Mark Thorson December 17, 2016 at 1:29 pm

What neural networks would you be referring to? I worked with the Intel 80170, and it ran cold. That’s an older chip, but there haven’t been many hardware neural-network chips to compare it to. There have been DSP chips used for neural networks that do run hot, but they are not neural-network chips – they run serial software emulations of parallel neural-network models. A large neural-network chip that is mostly interconnect won’t necessarily need to use much power, especially if only a few neurons are switching at any particular time, as a future brain is likely to be. On the other hand, a chip used for something like image processing will have many or all of its neurons switching all the time, like a retina.

80 improbable December 17, 2016 at 2:53 pm

I don’t know anything beyond news articles, but it doesn’t sound like Google’s TensorFlow chips run cold…

https://www.wired.com/2016/05/google-tpu-custom-chips/

81 Mark Thorson December 17, 2016 at 5:13 pm

That doesn’t sound like a neural network chip. It sounds like an accelerator for software simulation of neural networks. Those are completely different things. A neural network chip has a separate circuit for each neuron, and each circuit operates in parallel — there’s no software simulation at all. This can be the least amount of power used to implement a neural network because no instructions are being executed, but it depends on the neural network model and the problem being solved.

82 stephan December 18, 2016 at 12:08 am

Yes, the number of components per integrated circuit for minimum cost was Moore’s original formulation. For a quick-and-dirty look I took two microprocessors 44 years apart: the Intel 4004 (1971), with 2,300 transistors and an area of 12 mm^2, and the Intel 22-core Xeon Broadwell-E5 (2015), with 7.2B transistors and 456 mm^2. I get, FWIW:

number of transistors per chip doubles every 2.04 years

chip size doubles every 8.4 years

number of transistors per unit area doubles every 2.7 years
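
These doubling times follow directly from the two data points (a quick check of the arithmetic):

```python
import math

years = 44                # Intel 4004 (1971) to 22-core Xeon Broadwell-E5 (2015)
t0, a0 = 2300, 12.0       # transistors, die area in mm^2 (4004)
t1, a1 = 7.2e9, 456.0     # transistors, die area in mm^2 (Xeon)

def doubling_time(ratio: float) -> float:
    return years * math.log(2) / math.log(ratio)

print(f"transistors per chip double every {doubling_time(t1 / t0):.2f} years")         # ~2.04
print(f"chip area doubles every {doubling_time(a1 / a0):.1f} years")                   # ~8.4
print(f"transistors per mm^2 double every {doubling_time((t1/a1)/(t0/a0)):.1f} years") # ~2.7
```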

83 Roger Sweeny December 17, 2016 at 11:14 am

Maybe more researchers means less progress. They have to wade through all the mediocre–and just plain wrong–research that has accumulated.

84 yes December 18, 2016 at 3:40 am

It’s sort of like if Nathan, rather than commenting here, were publishing research all day.

85 MikeP December 17, 2016 at 11:39 am

I think it’s a slump before a leap. Isn’t it clear that we’re about to reap the benefits of recent tech advancements that are enabling imminent and huge changes in automation? Self-driving cars, trucks, drone deliveries, inspection, security, etc. starting now; robots doing every task imaginable soon.

86 Behemot December 17, 2016 at 12:25 pm

+1.

87 chuck martel December 17, 2016 at 4:22 pm

What’s the real benefit of a self-driving car or a drone delivery? How often will self-driving cars drop a passenger off at the wrong hotel and then what? Or the pizza drone leaves the pepperoni/onion at the home that wanted a sausage/green pepper? Hopefully, all these super-automated dreams will be realized and retro-entrepreneurs will be able to personally take me to the airport and deliver my pizza. I’ll like it better that way.

88 Briant Wolfe December 17, 2016 at 11:50 am

Maybe we need more interdisciplinary research. If we are throwing more researchers at increasingly narrow specializations, we are perhaps losing the view of the forest, and even the trees, by focusing too much on individual components. I think that drawing together learnings from the detail, and even from other disciplines, can help, but there are a lot of cultural forces inhibiting that sort of behavior and attention.

Historically, we have measured the output of stuff as an indicator of productivity. What happens when we have enough stuff and turn our attention to more intangible pursuits? How do we properly measure that as an indicator of value?

89 Thomas Bayes December 17, 2016 at 1:22 pm

Yes! Exactly!

90 chuck martel December 17, 2016 at 11:56 am

What’s with this growth obsession? Growth in what? If it’s GDP, simply make more byzantine laws and regulations so there’s more legal billing, more law school admissions, more civil suits, more divorces, more child custody cases, and so on.

91 Adovada December 17, 2016 at 12:03 pm

This is sure to be the case with long-established, heavily worked-over technologies. Maybe look at solar panels, drones, smart phones. The story may not be so depressing there?

92 Axa December 17, 2016 at 12:24 pm

For Moore’s law: Microsoft announced a couple of weeks ago that Windows 10 now runs on ARM processors. These processors run on 2-3 watts of input.

93 collin December 17, 2016 at 12:44 pm

Well, the second most important economic idea (after opportunity cost) is the law of diminishing returns. Maybe innovations are naturally decreasing because we picked the best innovations first. For instance, the greatest health advancement in history was the really simple idea that doctors need to wash their hands to control the spread of germs. (Which leads to the question: why did it take so long?) And in terms of life expectancy, doesn’t adding any year over 70 take extra effort of advancement and quality health care?

And we also forget how long innovations really took to spread. By the 1880s indoor plumbing was the future and a truly great advancement for mankind. Even by the 1940 census (60 years later), 35% of US households did not have indoor plumbing. Heck, 20% of Indian households are still without plumbing. Compare that to the global growth of smartphones.

94 George December 17, 2016 at 12:45 pm

Isn’t part of the problem the fact that there are fewer entrepreneurs today?

95 Krzys December 17, 2016 at 12:46 pm

The majority of the extra inputs (i.e., researchers) have only marginal value and produce little. It’s the sociology of higher-education expansion that drives those inputs, not any real need. I’m pretty sure that if we look at German ratios of inputs to outputs, it’s gonna look way better, since they have not succumbed to the college-for-everyone insanity.

96 Luke Edwards December 17, 2016 at 1:19 pm

The sickness in American research is well known. On average, an NIH researcher will not get his first grant until his 40s [0]. The increasingly bureaucratic funding agencies are ever more risk-averse.

Unfortunately globalization, trade, and peace make this problem worse. Globalization spreads the American system to every corner of the Earth. It is seen as the only possible system.

Fragmentation increases institutional variety. The price of greatness is surprisingly small: a Mars mission is estimated by NASA to cost only a quarter of the price of the F-35 fighter program. A country free from the ideologies of social democracy and worldwide hegemony might find the resources to do it.

Unfortunately, everybody who matters is a social democrat nowadays. They can’t cut back spending on the global empire.

[0] https://nexus.od.nih.gov/all/2012/02/13/age-distribution-of-nih-principal-investigators-and-medical-school-faculty/

97 Luke Edwards December 17, 2016 at 1:41 pm

There is an Olsonian (Mancur Olson) story to tell here about institutional sclerosis, combined with institutional imperialism.

98 Troll me December 17, 2016 at 3:29 pm

Interesting. It’s the social democrats who want the empire?

Sounds pretty much opposite to most of post-WWII history if there is a relative distinction to be made between social democrats and non-social democrats on that specific question.

I understand that some “drawbridge up” people are not at all social democrats. This does not therefore imply that social democrats are on the opposite end of that spectrum. Among those on the most hawkish+globalist extremes of the hawkish and globalist spectra, I imagine basically none are social democrats.

99 dearieme December 17, 2016 at 4:39 pm

“Globalization spreads the American system to every corner of the Earth.” I was speaking to a British scientist the other day. He held it as axiomatic that the MIT way was best. He declined to explain why that must be a universal truth. He refused to accept that other ways might have other merits. I hope his science isn’t performed in such an uncritical spirit.

100 Troll me December 17, 2016 at 5:40 pm

MIT is best because people like him think so. Which causes MIT to be best. Because it is best.

Being the best attracts the best, and therefore sustains itself. Although competition seems to be heating up …

101 Roger Sweeny December 17, 2016 at 5:37 pm

A Mars mission is estimated by NASA to cost only 1/4th the price of the F-35 fighter program.

NASA also said that the Space Shuttle would be a “space truck”: cheap, reliable, and safe. That particular set of lies got the program approved. They are hoping for a similar outcome when it comes to funding a mission to Mars.

102 Peter Schaeffer December 17, 2016 at 1:58 pm

The pharmaceutical industry is a case study in diminishing returns. After WWII, the pharmaceutical industry invented numerous innovative drugs (words like “blockbuster”, “breakthrough”, “astounding”, etc were quite justified). Prices were quite low and the drugs were marvels. For a generation or so, that has no longer been true. Each new generation of drugs is substantially less impressive than its predecessor… But costs vastly more. A reasonable analysis shows that drug development productivity has fallen by a factor of at least 100 (perhaps 1000). Note that increased regulation accounts for only a small fraction of the productivity decline.

Cancer drugs are a tragic example. The single most important drug for treating many types of cancer is 5-FU (Fluorouracil). It was invented back in the 1950s and costs just a few dollars (literally). The newer anti-cancer drugs (which are used with 5-FU and don’t replace it) add a few months of (miserable) life expectancy, but cost 1,000-10,000 times as much (literally).

Here is an easy way to understand the modern pharmaceutical industry. Imagine if every 5 years, new PCs cost twice as much, but were only half as fast. But of course, they were very popular because the government made them “free” and trial lawyers (plus Sarah Palin) got together to demonize anyone who dared to challenge the status quo.

103 dearieme December 17, 2016 at 4:42 pm

An excellent account is available in paperback: “The Rise And Fall Of Modern Medicine ” by Dr James Le Fanu.

104 patent agent December 17, 2016 at 2:05 pm

Google’s Michelle Lee and others have done their best to weaken the US patent system via AIA, PTAB, and the USPTO’s unstable behaviour in the wake of the Alice decision. If the reforms under discussion in China are adopted there, then China will be a more inventor-friendly venue than the US (especially in software).

To attract investment, startups need to be able to prove that they can remain economically viable once the big corporations try to go after their market. In areas where network effects are difficult for a startup to utilize (e.g., who can try to compete with Amazon in the cloud?), patents are about the only way to do this. But under the new regime Google, Facebook, etc. have begun to freely infringe patents, knowing that they can usually outlast the infringed party in court, and that in the worst case the infringer will only have to pay the regular license fees.

105 Carl Shulman December 17, 2016 at 2:15 pm

“Nevertheless the Bloom et al findings cut optimism. The idea of the Singularity, for example, comes from projecting constant or increasing growth rates into the future but if it takes ever more researchers just to keep growth rates from falling then growth must slow as we run out of researchers”

This may be true of Ray Kurzweil, but these data are quite compatible with an intelligence explosion from AI that can fully substitute for human R&D staff (and in line with calculations that I know people like Nick Bostrom and I have done before, using similar datasets). In the graph above you have an improvement in chip density of over 700,000x alongside an increase in labor inputs of about 25x.

Using this metric (which actually understates the total increase in world computation), if the labor inputs were themselves automated by AI, they could increase with computation. 700,000 >> 25, so you would have rapidly accelerating progress until physical limits and diminishing returns meant that technological advance no longer increased inputs more than the need for inputs.

Similar dynamics were involved in our technological advance thus far: technological advance allowed much larger populations, which allowed more innovation effort (offsetting diminishing returns). The demographic transition and constraints of human reproduction have prevented the supply of highly skilled labor from catching up to what the Earth could theoretically sustain, but AI allows the production of labor substitutes much more rapidly. Your colleague Robin Hanson’s papers describe the logic here fairly clearly.
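
A heavily stylized sketch of the feedback loop described here (my toy model, not Shulman’s or the paper’s; the 25x and 700,000x figures are from the comment above, everything else is assumption):

```python
import math

# Historical calibration: ~25x research input bought ~700,000x chip density,
# so output ~ input**elasticity with elasticity ~ 4.2.
elasticity = math.log(700_000) / math.log(25)

# Toy feedback: if AI labor substitutes for researchers, next period's input
# growth equals last period's output growth, and growth compounds on itself.
rate = 25 ** (1 / 45) - 1  # start at the historical input growth rate, ~7.4%/yr
for step in range(4):
    rate = (1 + rate) ** elasticity - 1
    print(f"step {step}: implied output growth ~{rate:,.0%}/yr")
```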

Speaking as someone interested in the idea of an intelligence explosion given AI, I think this is a mistake.

106 Carl Shulman December 17, 2016 at 2:41 pm

Basically it goes to #1 in this list, but not #3:

http://www.yudkowsky.net/singularity/schools

107 Asher December 18, 2016 at 3:23 am

I can’t understand “singularity” and “optimism” in the same sentence. In my mind the singularity is the ultimate dystopian scenario. Since when is believing that human life and civilization as we have always known them will soon come to an end “optimism”?!

108 Peter Schaeffer December 17, 2016 at 2:18 pm

For some more detail, see

“Diagnosing the decline in pharmaceutical R&D efficiency”
Jack W. Scannell, Alex Blanckley, Helen Boldon & Brian Warrington
Abstract

“The past 60 years have seen huge advances in many of the scientific, technological and managerial factors that should tend to raise the efficiency of commercial drug research and development (R&D). Yet the number of new drugs approved per billion US dollars spent on R&D has halved roughly every 9 years since 1950, falling around 80-fold in inflation-adjusted terms. There have been many proposed solutions to the problem of declining R&D efficiency. However, their apparent lack of impact so far and the contrast between improving inputs and declining output in terms of the number of new drugs make it sensible to ask whether the underlying problems have been correctly diagnosed. Here, we discuss four factors that we consider to be primary causes, which we call the ‘better than the Beatles’ problem; the ‘cautious regulator’ problem; the ‘throw money at it’ tendency; and the ‘basic research–brute force’ bias. Our aim is to provoke a more systematic analysis of the causes of the decline in R&D efficiency.”

“Why has R&D productivity declined in the pharmaceutical industry?”
Ruffolo

Abstract: “Productivity in pharmaceutical R&D has been on the decline for the past several years, and much has been written on the subject. The causes for the decline in productivity are many and complex. Some of the causes are external to R&D and therefore difficult to address, such as growing regulatory conservatism and lack of international regulatory harmonisation. However, a number of the causes for the decline in productivity are internal to R&D groups and can be addressed by R&D managers, such as cost-containment and maximum use of resources. This article focuses on some of the major issues that have caused productivity to decline, and some of the areas where those who manage large R&D organisations may focus to improve R&D productivity.”

“Data shows declining productivity in drug R&D”

“Drugmakers still rely heavily on sales from an aging portfolio of products and the proportion of sales from newer medicines actually fell last year, after a decade of record research spending yielded few new winners.
The 2010 Pharmaceutical R&D Factbook, compiled by CMR International, a Thomson Reuters business, and released on Monday, painted a gloomy picture of the global pharmaceuticals sector.
New drugs launched within the last five years accounted for less than 7 percent of industry sales in 2009, down from 8 percent in 2008, the Factbook showed, highlighting the big problems that companies are having in trying to reinvigorate their portfolios.
Many companies — including Pfizer Inc, GlaxoSmithKline Plc and AstraZeneca Plc — have been taking a knife to research operations in a bid to improve returns, a trend which analysts expect to gather momentum this year and next.”

109 Troll me December 17, 2016 at 3:45 pm

Also, if a breakthrough discovers something that’s basically free, it doesn’t show up in the stats much.

Consider that $1 oral rehydration salts that prevent death from severe diarrhea constitute a significantly greater advance than a $500,000 cancer treatment that extends life, on average, one week longer than the competitor’s $150,000 course of treatment.
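Putting that in cost-per-life-year terms (the prices are from the comparison above; the life-year figures are my illustrative assumptions):

```python
# Back-of-envelope cost-effectiveness using the comment's prices.
ors_cost = 1.0                     # $1 oral rehydration salts
ors_life_years = 30.0              # assumption: an averted child death saves ~30 life-years

extra_cost = 500_000 - 150_000     # incremental cost of the new treatment
extra_life_years = 7 / 365.0       # one additional week of life, on average

print(f"ORS:          ${ors_cost / ors_life_years:.2f} per life-year")
print(f"new oncology: ${extra_cost / extra_life_years:,.0f} per life-year")
# Roughly $0.03 vs. $18,250,000 per life-year: the cheap advance is nearly
# invisible in GDP, while the expensive one dominates measured spending.
```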

So there might be some “accounting” problems too …

110 Peter Schaeffer December 17, 2016 at 11:57 pm

TM,

In real life, society (not just medicine) devised any number of ultra-cheap technologies (vaccines, chlorinated water) for improving human health many decades ago. Because they were ultra-cheap, they don’t show up in the GDP numbers. However, the impact on death rates was profound. Type ‘death rates 1900 chlorinated water’ into Google and look at the images. What is remarkable is that the greatest gains in life expectancy were a consequence of ultra-cheap technologies. It is quite true that you won’t find the benefits in the GDP numbers; however, they definitely show up in the health statistics. Conversely, the period since 1970 is characterized by astoundingly greater spending and much more marginal gains. Classic diminishing returns.

111 Troll me December 18, 2016 at 1:28 am

In real life, we just said the same thing.

112 Peter Schaeffer December 19, 2016 at 12:58 pm

TM,

“In real life, we just said the same thing.”

I am not so sure we did. What is striking about the pre-1960 period is that so many (near) zero-GDP health care innovations appeared (chlorinated water, sanitation, vaccines, antibiotics, etc.) and had a huge impact. Since 1960, not so much. Yes, ORT (oral rehydration salts) qualifies (trivial cost, huge impact). So do mosquito nets (I think). However, since 1960, there just haven’t been that many super-cheap, super-effective health care innovations. The actual innovations have tended to be considerably less impressive and vastly more expensive.

A few other notes. Chlorinated water is really cheap. Sanitation requires some major infrastructure and may not be so cheap. The actual benefits of vaccines and antibiotics may well be overstated. Type ‘death rates vaccines’ into Google (images). Death rates from infectious disease fell astoundingly well before vaccines and antibiotics took off. Why? Perhaps chlorinated water and sanitation were bigger influences. Perhaps the real hero is Fritz Haber (and Taylor Swift).

113 Troll me December 17, 2016 at 2:28 pm

If you have an economy that is 10 times larger, with 10 times as many competitors, you might have 10 times as many researchers in the same activity, much of it duplicating each other’s efforts. They will not find 10 times as many ideas, because they are working on the same things.

(You’d think that, on average, 10 teams should find the “best answer given current knowledge” faster than 1 team working on the same problem, assuming roughly similar team characteristics and no lone Stephen Hawking among them, for example. But faster is not ten times more.)
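A toy simulation of the duplication effect (entirely my construction, with made-up parameters): when teams sample from the same finite pool of findable ideas, ten teams turn up only about four times as many distinct ideas as one team, not ten times as many.

```python
import random

def distinct_ideas(n_teams, pool_size=200, draws_per_team=50, seed=1):
    """N teams independently search one shared idea pool; duplicates count once."""
    rng = random.Random(seed)
    found = set()
    for _ in range(n_teams):
        found.update(rng.randrange(pool_size) for _ in range(draws_per_team))
    return len(found)

base = distinct_ideas(1)
for n in (1, 2, 5, 10):
    print(f"{n:>2} teams -> {distinct_ideas(n):>3} distinct ideas "
          f"({distinct_ideas(n) / base:.1f}x one team)")
```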

114 improbable December 17, 2016 at 2:48 pm

Yes, this was my thought too. The semiconductor industry itself has grown enormously, so wouldn’t you see much the same graph if all the managers had just blindly kept R&D’s budget at 10% of turnover?
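A sketch of how mechanical that could be (my numbers are assumptions): peg R&D at a fixed share of exponentially growing industry revenue, and measured “research effort” multiplies roughly 25x over the paper’s window even if nothing about idea production changes.

```python
# R&D pegged at a constant share of exponentially growing revenue.
# Illustrative assumptions: 10% share, 7%/yr revenue growth, 1971-2014.
rd_share = 0.10
revenue_growth = 0.07
years = 2014 - 1971

rd_start = rd_share * 1.0                            # revenue normalized to 1
rd_end = rd_share * (1 + revenue_growth) ** years
print(f"measured R&D effort: {rd_end / rd_start:.0f}x its starting level")
# ~18x under these numbers; ~8%/yr revenue growth gives ~27x, all without
# any change in how ideas are actually produced.
```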

115 Troll me December 17, 2016 at 2:30 pm

On the matter of gene tinkering and AI, etc., I suggest that attending fastidiously to the costs and risk-weighted costs side of the equation is important.

How do you weight an infinite cost at small probability far in the future? What if there are a hundred or a thousand such pathways to account for as a function of such ideas?

116 Troll me December 17, 2016 at 2:37 pm

On AI and gene modification ideas

Let’s say we were all brought up on the same brain prosthesis (see brain-computer interfaces, BCIs, for some indication of present possibilities and of technological possibilities in coming years).

Development might become so similar that we would be highly vulnerable to “mind hacking”, which would be more universally applicable due to the more uniform development process (theoretically) – not to mention the risk of being unable to function without a prosthesis.

In such a case, some foreign nation might more easily “hack” us all directly, or perhaps even some alien species might arrive, observe a few patterns, and, for lack of cognitive diversity (which I believe to be a function of cognitive liberty), “mind hack” humanity overnight, without a single laser fired or bomb dropped.

There would be EMPs, yes. But very very micro. Not blowing stuff up. Hacking your mind.

Gene modification ideas might be relevant here, with similar risks and also affecting those other risks, but I think the main thing is that we don’t have a clue what second or third order effects might be associated with whatever tinkering some mad scientists might be tempted to engage in.

Which mad scientists? Would be good to know.

117 Troll me December 17, 2016 at 2:39 pm

While a lot of researchers can be working on different new explorations at the same time, and thus a larger number of researchers should have a direct positive impact on the expansion of the cutting edge, quite a lot of research activity essentially repeats old work in new organizations, or is otherwise involved in the propagation and maintenance of knowledge, far more than in the creation of new knowledge.

So … that’s gotta affect the analysis somehow.

Just think how much time you spend in the lab before having competence as a researcher in science. And even then, perhaps only a few percent of students who are taught by these researchers will end up directly in research – mostly they will be applying this knowledge instead. Which means that a lot of researchers are also highly involved in non-research activities, regardless of whether at the cutting edge or not.

So, while there may be declining returns of new knowledge as a function of the number of researchers that an economy is sustaining, this is a highly incomplete accounting of their activities.

Also, as the economy becomes more advanced, consider the people and organizations who would previously have been “tinkering”, or otherwise experimenting and designing methods in ways that could yield new scientific evidence or knowledge. These tinkerers have no more tinkering to do (the low-hanging fruit is mostly gone, perhaps), so unsolved, productivity-raising problems are now more likely to be addressed through formal “research” as opposed to other things.

Say … a farmer who is also a handyman, trying lots of ways to modify his tools for his specific needs. Most of those kinds of problems have already been solved, so further improvement now involves formal research activities built into, say, a farm-equipment manufacturer.

(Also, considering the relatively lower previous levels, maybe this reflects an earlier under-allocation – or, more likely in this case, breaking through certain areas of scientific knowledge lowers the barrier to entry into these fields of research. But even though it is worth it for companies or governments to allocate resources to attract or direct such research activities, this does not automatically make solutions fall from the sky 25 times faster. With 25 people at the same cutting edge instead of one, perhaps we will just have 25 instead of 1 person working on the same problem until some other area of knowledge figures something out that makes the further breakthrough possible.)

118 Mark Thorson December 17, 2016 at 3:13 pm

How has the output of classical music, plays, and poetry grown relative to population growth? We should have double the output of first-rate sonnets, concertos, and abstract expressionist art compared to the 1950s. Do we, or has that stagnated too?

119 Asher December 17, 2016 at 4:35 pm

Maybe you should see a psychologist. Are you also very depressed about the fact that the sun will explode someday? Or the fact that someday you will die, as all human beings have always done after a few decades of life? Is our life today so awful that we should be unhappy about the fact that future generations may enjoy a similar rather than greater material level of well being?

I don’t think my childhood was materially any happier than my parents’, and I don’t think my kids’ is materially any happier than mine. Most likely the opposite is true, because when I was a kid there were no cellphones to let your parents keep tabs on you all the time. Why should I be unhappy to hear that my grandchildren’s standard of living will be similar?

Pardon me but being depressed about a constant rate of innovation sounds absolutely sick to me.

Thousands of years ago, reasonably healthy people with a modicum of leisure were content; I can’t see why someone with a 21st-century standard of living should be depressed.

120 Troll me December 17, 2016 at 5:46 pm

But how will we beat the empire the next galaxy over if we do not pre-self-robot-and-“win”-everything just in case maybe there’s an evil empire in the next galaxy over … and+and+and+and+ mostly things that probably will never happen?

(P.S. – if the evil empire the next galaxy over figures out that we might pre-self-robot-and-“win”-everything, they might just have to crush us into stardust before we get the chance. “Good” or “good or evil as necessary” (e.g., not wanting to crush and dominate everything, but not dumb either) are almost certainly better than pre-self-robot-and-“win”-everything.

So .. let’s be human. Perspectives such as your own could be quite useful in such discussions.)

121 The Lunatic December 17, 2016 at 6:06 pm

So, one of our three examples is a case where linear increases in input are maintaining exponential growth in output (chip density), while the other two (agriculture and health) are cases where there are both massive regulatory apparatuses standing in the way of converting new developments into actual products and massive government intervention in the functioning of the markets.

122 Dallas Weaver December 17, 2016 at 8:25 pm

The Lunatic is onto something. The raw idea is the easy part; it then has to pass through the reality filters. The first filter is technological (will it actually work, even in theory?), then economic (will it pay?). Then we get to the real barriers: is there a market, and what permissions from regulators are required? For anything really innovative, in the world of the precautionary principle, this regulatory filter is a killer.

Imagine just using modern technology to transfer root-zone competition genes (root zones are war zones for plants) to corn or soy, eliminating the need for Roundup and Roundup Ready crops. How long would the approvals take, and how much would they cost? Ten years and a hundred million dollars. My VC and AI investor friends and I would be too close to our “sell by dates” to invest in a 10-year dream that depends on some idiot political bureaucrats. What IRR would be required to take that long-term risk?

123 carlospln December 18, 2016 at 1:12 am

You want to fuck with the genomes of grain crops & you’re worried about some IRR?

‘In the world of the precautionary principle’

Created & maintained to keep the world safe from people like you.

124 Alain December 18, 2016 at 11:19 am

And you get to decide, of course.

Luckily, you are a nothing and you will never have power.

125 The Lunatic December 18, 2016 at 1:19 pm

Every single grain crop’s genome has been fucked with since the 1950s by the expedient of exposing seed to high levels of artificial ionizing radiation and mutagenic chemicals in order to induce random genome changes, then hybridizing the new mutants into the germ line of the species. And idiots like you eat them without a care in the world.

But if somebody dares to make a planned, targeted, limited change by careful methods aimed at a specific effect, one that afterward goes through three separate regulatory testing processes (FDA, EPA, and USDA) for safety before entering the food supply, that you scream about.

And unfortunately you and the other nothings like you cumulatively have power, and the world is restrained from the objectively safer, cheaper, controlled path to crop improvement in favor of the mutate-randomly-and-breed method.

126 Maury Nickelson December 17, 2016 at 7:04 pm

In order to boost the production of ideas, it would be prudent for the government to reform the education system. Altering the way in which children are taught would, in theory, boost the production of new ideas. Instead of focusing on teaching a set curriculum, teachers should focus on encouraging and cultivating critical thinking skills.

127 Andao December 17, 2016 at 10:06 pm

Anecdotally (from my non-tech background) it seems like there was a seismic shift around 2007-08 when chip makers stopped caring about speed and focused exclusively on power consumption and size.

Not a bad thing, just very different from the trend. Presumably this will greatly expand the pool of people with access to powerful tech. Maybe speed doesn’t matter anymore if you can manufacture millions of mini chips for next to nothing.

128 Alain December 18, 2016 at 11:20 am

Read about the Pentium-4.

129 The Lunatic December 18, 2016 at 1:29 pm

They didn’t stop caring about speed; they hit serious physical limits. If anybody could figure out how to deliver a 6 GHz processor (without super-exotic cooling), they would ship it, and make a fortune.

130 Andao December 19, 2016 at 12:39 am

Could they, though? Is it possible there isn’t demand for x speed at y price? Video game consoles haven’t been upgraded in many years, even though faster tech exists.

131 buddyglass December 17, 2016 at 10:43 pm

Someone on Facebook (gasp!) pointed out the following: it’s not the nominal # of researchers we should care about but some GDP-relative measure of R&D investment. If it takes twice as many researchers to generate the same yearly gains but the economy is 4x as large, then that’s not necessarily cause for concern. Just a thought.
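In numbers (the hypothetical figures are the comment’s own):

```python
# buddyglass's example: absolute effort up, research intensity down.
researcher_multiple = 2.0    # twice as many researchers for the same gains
gdp_multiple = 4.0           # economy now four times as large

print(f"researchers per unit of GDP: {researcher_multiple / gdp_multiple:.2f}x")
# 0.50x: the same steady growth now costs half as large a share of the
# economy, which reads as progress, not decline, on a GDP-relative metric.
```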

132 stephan December 18, 2016 at 12:23 am

There are going to be a lot of depressed singularitarians out there when it becomes obvious that nothing too special will happen in their lifetimes. Growth is now a tortoise, not a hare.

133 Todd K December 18, 2016 at 9:23 am

Donald Trump Tweeted his way into the presidency. If that isn’t a sign of the coming Singularity (any day now), I don’t know what is.

By the way, I didn’t notice any mention of the exponentially increasing MIPS (millions of instructions per second) per $1,500 of computation that Hans Moravec graphed in the ’90s, a chart that Kurzweil puts up at most talks.

From what I’ve read, that hasn’t slowed down as of 2014.

134 Sondre R December 18, 2016 at 11:16 am

This “idea TFP” they are debunking here kind of feels like a straw man.

For it to hold in these examples, exponential growth would have to produce an exponentially growing growth rate.

E.g., transistors: constant idea TFP, as they define it, predicts that the price/performance of transistors would be improving 25,000% per year by now. So that is the null hypothesis they are trying to debunk.
Wait, what? Who exactly thought that was the case?
I certainly didn’t. And I certainly don’t think Tyler did. Is it really depressing that this fairy tale isn’t true? What did you expect, exactly?

In their conclusion they do present what is actually the normal view, namely: knowledge is non-rival. Make a hammer once, and now everybody can get a hammer. Therefore we can have exponential growth forever with ever-falling “idea TFP”.
What’s more, knowledge is cumulative. Make Uber once, and now anyone can make “Uber for X”. So the door actually stays open for increasing exponential growth.

The only thing this paper seems to rule out is a notion that would have expected a 25,000% yearly improvement in the price/performance of transistors today, or similar achievements in life expectancy. And whoever expected that: I am glad you are now enlightened. Welcome to the sane world.

Sweet.

P.S.
– The proxy for “researchers” is a specific bookkeeping account deflated by high-skilled wages. What if that isn’t equal to researchers? It sounds implausible to me. Everything else has become more capital-intensive; I know for certain that transistor R&D has. You have a few researchers with some very expensive machines.
– Also, this is minor, but could there have been a change over the decades in how people booked expenses to this account?

135 jorgensen December 18, 2016 at 1:06 pm

What parts of “low hanging fruit” and “diminishing returns” could possibly come as a surprise to anyone?

When they said they were doubling the number of calculated digits of pi every two years, did anyone really think they were doubling the usefulness of pi every two years?

Science and technology are stalling out in providing growth here at the frontier of what is possible, but there is still an enormous amount of catch-up growth potential for 80% of the world’s population.

136 jorgensen December 18, 2016 at 3:01 pm

Maybe economists need to spend more time contemplating the economic consequences of the first and second laws of thermodynamics.

137 jorgensen December 18, 2016 at 5:55 pm

The yield per unit of research effort probably peaked around 1890, when the AC electric motor was invented. It has been all diminishing returns since then. The gasoline engine had been invented before that, and the diesel engine was invented a few years later.

138 Thanatos Savehn December 18, 2016 at 8:04 pm

The attempt to industrialize research, especially that conducted by way of p-hacking, has been an unmitigated disaster. Not only has it wasted billions of dollars, it has herded three generations of promising young scientists into intellectual dead ends. The good news is that there are lots of good ideas for those willing to do real research and not just noise mining. Take this month’s biggest paper in Cell as an example of what might be – a spectacular demonstration of a link between gut dysbiosis and Parkinson’s. It would never have been discovered via data dredging – it took instead a hunch drawn from the work of others, who identified neuro-signaling and protein-folding signaling molecules among the pee and poop of gut microbes and said “that’s weird”.

The revolution is upon us.

139 Rob Thorpe December 18, 2016 at 8:52 pm

The authors are very confused about all this.

Moore’s law is not directly about CPUs. It’s about the density of transistors that can be achieved. Memory has always outperformed CPUs on this metric. In recent years Moore’s “law” of a doubling every two years has slowed.

When discussing Moore’s law the authors tell us which businesses they used as a benchmark. This is done first in a graph (“Intel, AMD, Fairchild, National Semiconductor, Texas Instruments, Motorola”). They mention more in a footnote (“International Rectifier, Texas Instruments (including 50% of their research since they were mainly a calculator company early on), a tenth of the R&D by Motorola (mainly a television company early on), Analog Devices, Unitrode, Semtech, and Ripley”).

I work for one of those firms. Those firms are primarily IC firms: they make silicon chips. They design chips, not the equipment used to make chips. It’s that equipment that determines things like Moore’s law, because the equipment determines the resolution of semiconductor processes. Intel is the only company on the list that itself invests heavily in pushing semiconductor processing forward.

The task of all of the companies mentioned is to apply available semiconductor technology. They design chips for PCs, smartphones, TV, networks, instruments, industrial equipment, sensors and practically everything else you can think of that has electronics in it. Their job is to apply what has been created by the advances in semiconductor technology. They don’t develop that technology themselves though; ASML, Canon, Nikon and Ultratech do that.

I’ll make an analogy. Let’s say that the semiconductor-process businesses are like concrete producers. The chip makers (Intel, AMD, TI, Fairchild, etc.) are like builders and architects. This research attributes the productivity of the concrete makers to the architects and builders who use their product.

140 Alexander Hamilton December 19, 2016 at 12:27 am

Didn’t read much of the paper, as these guys have a well-deserved reputation as being some of the sloppiest researchers around.

Generally, for any given technology, the growth path is logistic (slow at first, then fast, then slow again). Of course Moore’s Law will run out of steam. But altogether new goods and technologies will also be developed, and there’s no easy way to quantify the introduction of new goods. Are there “more” new inventions today than yesterday? How do you count a new invention vs. an adaptation of an existing technology? Impossible to answer, in fact, and these guys don’t do it.

141 bill reeves December 19, 2016 at 10:58 am

There are two elements you forgot to cover:

1. Research today is much more government funded, with government priorities and regulation. The way you win in today’s science is via publication, or what my Osage ancestors called ‘counting coup’ – essentially a popularity contest with the older, more established, and retrograde scientists in charge of the process. Science used to be tied more to money making; now it’s tied to status seeking and is resolutely not-for-profit.
2. An enormous proportion of this type of not-for-profit research ends up being fraud: 70 percent of the best biomedical studies can’t be reproduced by pharmas, who have every incentive to do so. Since there are no stakes, much of what is done is derivative, sloppy, or even fraudulent, as is most status seeking, whether in high school, politics, or science.

Most of the new scientists are just doing state-sponsored busywork. It’s a waste of time.

142 Kelly December 19, 2016 at 11:28 am

I was surprised that a search for “ZMP Researcher” did not show any results on this post and comments. Or “ZMP Academic?”

143 Achal K December 19, 2016 at 5:51 pm

And most investors/shareholders think letting companies spend 3x to 4x their R&D budgets on share buybacks and dividends is a good idea. (Look at the bar chart in this article: https://www.bloomberg.com/news/articles/2016-11-21/goldman-how-corporations-will-spend-their-huge-piles-of-overseas-cash)

144 Fred Zimmerman December 26, 2016 at 2:57 pm

Analysis driven by data availability — it is easy to count researchers, hard to create a physical model of innovation v. nature of universe.

145 Tom Davis December 26, 2016 at 7:21 pm

I wonder what role changes in regulatory burden play in innovation. One of the authors of the ACA itself admitted earlier this year that the law regulated physician-driven innovation out of the healthcare system, at great cost to everyone. As the mass migration of practicing clinicians accelerates, it is hard to see how healthcare delivery can recover from such losses.

146 Mark Buehner December 26, 2016 at 8:02 pm

Doesn’t this assume all researchers are equal? If we’re increasingly squandering research talent due to social, political, or economic factors, that would render this analysis suspect.

147 TheRadicalModerate December 27, 2016 at 1:13 am

An obvious hypothesis to look at is that “minds” are not the essential growth medium for “ideas”. What if the primary growth medium is simply network effects? What if only a small number of minds are necessary to catalyze those network effects? If that were true, then the idea growth rate of particular scientific disciplines or technologies would be relatively constant, irrespective of the number of researchers/engineers, which is pretty much what we see, isn’t it?

Another hypothesis: the growth rate of technology commercialization is sub-linear, while the growth in ideas is still exponential. This also kinda makes sense: commercialization requires industrial design, developing marketing plans, writing documentation, and training sales forces, none of which would seem to have any propensity for exponential growth.
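A toy version of that second hypothesis (my construction; every number is made up): let ideas compound exponentially while commercialization capacity grows only linearly, and measured output soon tracks the slower pipeline, looking stagnant no matter how fast ideas accumulate.

```python
# Exponential ideas, linear commercialization: measured output is the
# scarcer of the two. All parameters are illustrative assumptions.
for t in range(0, 60, 10):
    ideas = 100 * 1.07 ** t       # assumed 7%/yr growth in usable ideas
    pipeline = 100 + 15 * t       # assumed linear commercialization capacity
    shipped = min(ideas, pipeline)
    print(f"year {t:>2}: ideas {ideas:6.0f}  pipeline {pipeline:4.0f}  shipped {shipped:4.0f}")
# After the crossover (~year 25 here), "shipped" grows linearly regardless
# of how fast the stock of ideas compounds.
```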

Or maybe Sturgeon’s Law has a hitherto undiscovered exponential component.

148 Kelly Parks December 30, 2016 at 1:34 am

You’re overlooking the obvious. Increased government regulation has been a serious drag on growth in many fields. Endless FDA regulations make the process of drug approval longer and more costly, so of course the rate of innovation decreases and people die by the thousands waiting for drugs that could have saved them. That’s the cause of the dip in effectiveness of cancer research.
