Are Ideas Getting Harder to Find?

There is now another paper on this theme by Nicholas Bloom, Charles I. Jones, John Van Reenen, and Michael Webb, abstract:

In many growth models, economic growth arises from people creating ideas, and the long-run growth rate is the product of two terms: the effective number of researchers and their research productivity. We present a wide range of evidence from various industries, products, and firms showing that research effort is rising substantially while research productivity is declining sharply. A good example is Moore’s Law. The number of researchers required today to achieve the famous doubling every two years of the density of computer chips is more than 18 times larger than the number required in the early 1970s. Across a broad range of case studies at various levels of (dis)aggregation, we find that ideas — and in particular the exponential growth they imply — are getting harder and harder to find. Exponential growth results from the large increases in research effort that offset its declining productivity.

Here is the NBER link.


Moore's Law seems a particularly poor example. Ideas that refine an existing technology or process would seem to be non-comparable to sui generis ideas that create a technology or process.

Moore's Law isn't just the refining of the same technology, it encompasses completely new forms of data calculation, from mechanical, to vacuum tubes, to transistors, to memristors and quantum computing. A transistor is by no means simply a "refined mechanical computer" - it's a completely different technology that draws from completely different fields.

A transistor is basically a switch dude, sure it works a bit differently but it IS actually a refinement of the idea of a mechanical switch.

What's a "switch dude"? Are you thinking of switching back from dudes?

Ideas are hard.

Several generations were taught that commas/grammar, facts, history, math, personal responsibility, STEM, virtue are [take your pick] genocide, fascism, fundamentalism, homophobia, Islamophobia, racism, sexism, white nationalism, white supremacy.

It's not their fault that they're idiots.

They're all "switch dudes", especially the last: transistors, honkin' brass-and-steel circuit breakers, levers that reroute trains, Maxwell's Demons.

"They’re all “switch dudes”, especially the last: transistors, honkin’ brass-and-steel circuit breakers, levers that reroute trains, Maxwell’s Demons."

One of the most theoretically impressive toys that my older brother received was a Digi-Comp II, a toy that showed children how a digital computer "made decisions". The switches were literal plastic switches mounted on a plastic board, each switch tilted to either the left or the right to indicate 0 or 1. Instead of electricity running through the switches, you dropped marbles into a chute and they would rattle through the maze of switches, maybe leaving them as is or maybe switching them depending on how you'd set up i.e. programmed the board.

I was too young to know what all that meant but it was certainly cool watching the marbles roll through the switches, an educational version of pachinko or pinball.

Unfortunately the construction quality was poor and the switches soon became sticky and then broken.

Digi-Comp I's lasted slightly longer before breaking but they were less cool, instead of marbles it was a complex erector-set looking structure and had only 3 display bits. IIRC the Digi-Comp II had 7 or 8.

It is the refining of the same technology. The law merely describes the number of transistors on a chip doubling every 18-24 months.

That's it.

Scott is referring to Kurzweil's Law of Accelerating Returns.

+1 @J. Ott, thread winner. I will also point out that human population growth has slowed after an exponential increase that ran from the 19th century to the end of the 20th. So if society depends largely on uncompensated Good Samaritan genius for innovation (the traditional way societies in the West innovate), then naturally the number of such ideas will slow as population growth slows. Solution: better patent laws, so that the people who invent but currently work on Wall Street for the money will start inventing again.

So your proposal to foster more innovation is to make new ideas harder to use?

Exactly. It is the free ridership problem. When music is free, who makes money producing it in the studio? Nobody, so you have to go through the hassle of touring (most of these aging rockers don't like to tour, I can assure you, but they must).

That only works to a point. How many more Lord of the Rings books will J.R.R. Tolkien write if the copyright is extended? Zero. He's dead.

To be sure, there's a free ridership issue at play, but only to a point: I'm not going to be the person who invents quantum computing regardless of whether the patent would last 10 years or 100 years.

You're right to identify that Wall Street has captured a lot of society's best and brightest, but I think that's primarily because a career in research/science is such a bad deal. You grind away to get a Ph.D., then work as a postdoc earning less than the janitors who clean your bathroom, and then compete hard to get funding.

Maybe the solution to the free rider problem would be one that makes it easier to live as a scientist – more funding for research and researchers and no visa schemes that can reduce wages for people in science. I think we could justify the costs by doing a better job of selecting candidates for funding or diverting funding from fields that are less likely to be economically relevant (like the arts and race/gender ‘studies’).

There are at least 3x as many people in America now as there were in the 1920s - do we have 3x the innovation?

It's weird to claim that innovation more or less happens as a function of the number of people participating in innovation when we now have more people than ever but less innovation.

Agree with Ott. Moore's law is very specific - it forecasts the number of transistors on an integrated circuit. It has nothing to do with vacuum tubes, mechanical switches, or other technologies. Moore created the law in 1965, and at that time the IC existed as well as bipolar, NMOS and CMOS. Those were the big ideas.

‘When God is thinking and calculating, he is creating the world,’ as Heidegger once cited. The purely mechanical view of technology has created an obsessive-compulsive behaviourism. Thought fractals exist in the online culture of blogs and newspapers and memes only insofar as they serve as comedic metaphors. But to Leibniz, this was not nearly sufficient: though they can express a contingency of affinity, they do not solve the infinity problem. It is not the logical maximum but the determination of the problem, and thus its final solution, where the beauty lies.

Let's take a plenum. When the light of a bulb is emitted, in Leibniz's world, that bulb was always going to emit light on that date in that plenum. But what if a mirror is placed at such a distance, after the bulb switch is flicked on and before the light has reached it, that the light reflects off the mirror? Is that not a synthetic truth, or is it an example of syncretism?


Correct. People often try to make Moore's Law mean something broader. That's wrong; it's just about maximum integrated circuit density.

Perhaps Moore's Law and the learning curve power function (Henderson's Law) are the same thing. Henderson said that every time you double total output you get X% more efficient. The higher the X, the steeper (and better!) the learning curve. We have to do a lot more research and build a lot more chips now to double total transistor production.
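Henderson's learning curve can be sketched in a few lines of Python; the 20% learning rate and $100 first-unit cost below are purely illustrative, not figures from the chip industry:

```python
import math

def unit_cost(cumulative_units, first_unit_cost=100.0, learning_pct=20.0):
    """Henderson's learning curve: each doubling of cumulative output
    cuts unit cost by learning_pct percent (illustrative numbers)."""
    b = math.log2(1.0 - learning_pct / 100.0)   # progress exponent (negative)
    return first_unit_cost * cumulative_units ** b

print(round(unit_cost(1), 1))   # 100.0
print(round(unit_cost(2), 1))   # 80.0  (20% cheaper after one doubling)
print(round(unit_cost(4), 1))   # 64.0  (another 20% after the second)
```

The higher the learning percentage, the steeper the power-law decline, which is why a steep curve is "better" in Henderson's framing.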

Oh man I forget yesterday was the first anniversary of Hillary Falls Down Day. Did you guys do anything special?

Look how many more movie sequels there are each year than there used to be. Of course ideas are getting much harder to find.

And when was the last time we had a new major religion? The 7th Century A.D.? Religious ideas are almost impossible to find.

Mormonism doesn't count? What about Scientology? Sikhism is relatively new.

Well, set a threshold for major religions. If you make it 1% of global population, it's Sikhs and Mormons. 2% and neither of them qualify.

When Christianity began it wasn't a "major religion", that took centuries. Who is to say that Mormonism won't be a much bigger religion in the future? You're looking at something at a point in time.

Part of the problem is the big production companies are getting squeezed. To keep their bottom line only movies with name recognition get funded since they can count on a certain number of people buying tickets.

RE: New Religion.
I personally think the radical left / SJWs is the new religion.
Original Sin = White Privilege
Devil / Satan = NAZIs
Jihadis / Crusaders = ANTIFA
Missionaries = Activist
Church = Universities
Priests = University Profs

How can people improve the world if they don't live in the real world? Or they only start to live in the real world AFTER 4 years undergrad + 6 years Ph.D. + 5 years of "experience" = 15 years. By the time they've gained enough experience to gamble on turning their mastery of a subject or domain into advancement of the whole field, the cost of failure is too high.

I'm sympathetic to this. On the one hand, so much extra schooling tends to create a follower mentality inconsistent with generating new ideas. OTOH, it's entirely possible that the extra schooling is more a consequence of prior research productivity than a cause of diminished future productivity. So much low and middle hanging fruit has been picked that current and future generations have to climb ever higher up the tree to find something new.

"So much low and middle hanging fruit has been picked that current and future generations have to climb ever higher up the tree to find something new."

The brain implants should help.

Yes, your technomania might be treated with those.

With an attitude like that, we might not let you have any.


LOL. I'll take my chances.

I don't think this is a new development. The marginal cost of new ideas has been rising pretty much since the beginning of history.

The entirety of ancient Magna Graecia was 5 million people at most, translating into maybe 1.5 million free citizens. They produced Socrates, Plato, Aristotle, Pythagoras, Archimedes, Ptolemy, Euclid, Hippocrates, Menelaus, Thales, Pappus, Herodotus, and Democritus. All would have qualified for the Nobel or the Fields had those existed in antiquity.

In general make a list of the most important ideas or thinkers in human history. If you weight the total count of each period by global population, you'll find that the past has significantly disproportionately higher representation. The distant past even more so. The situation even looks worse when you consider that pre-industrial populations had much lower average IQs due to malnutrition.

This is why continued global population growth is so important. It's the only reason progress doesn't grind to a halt. Yes the global frontier of ideas is more difficult to mine now, but we have exponentially more people. If that ceases to be true the rate of innovation will slow down substantially.

Sounds plausible.

Moore's Law is more of a goal than a scientific law. Moore inspired vast efforts to make his "law" come true.

Someone needs to start poking holes in the condom factories in China?

Our new Norman Borlaug?

"This is why continued global population growth is so important. It’s the only reason progress doesn’t grind to a halt. Yes the global frontier of ideas is more difficult to mine now, but we have exponentially more people. If that ceases to be true the rate of innovation will slow down substantially."

Until computers can come up with new ideas.

"pre-industrial populations had much lower average IQs due to malnutrition"

Is this true? Are you referring to early agricultural cultures who subsisted on cereals?

Fortunately we have also more people looking for new ideas. Or do we?

And how widely afield are they looking?

Probably too much is invested in "known next big things" like AI and VR and not enough on weird stuff.

A famous example of "why digital was different" was to compare chip advancement to automotive: "Cars would cost pennies and get millions of mpg," etc. You have to understand how chips were different to understand why they advanced as they did.

Basically, it was a game of miniaturization that started easy, and got harder over time. Chips got more powerful and chip manufacturing lines got more expensive apace.

Everyone always expected a limit, as the next complications in manufacturing became too expensive to recoup. It is amazing that chip makers have finagled a way this long.

That said, I see no reason that this miniaturize-and-repeat cycle should generalize to "ideas."

(If you could fit in a five inch car, it might be very cheap. There is a movie in that!)


By the way, is Moore's Law still a thing? Is it still happening?

My last two computers (say 2012 and 2015) weren't all that much faster, which was different from all the computers I'd had going back to my IBM XT in 1984. But they are pretty fast as it were, so maybe Moore's Law is still going on in more needed fields?

Transistors per dollar peaked in 2012. There are other things to optimize for, like low power, but for most uses transistor count equals power.

Moore's Law is continuing with a 2.5-year doubling. It looks like it will have to end around 2025, although the Law of Accelerating Returns will continue for several years after that. In 2009, Intel said the exponential trend would continue until at least 2030 using 3D chips, etc.

That's because software swallowed all that increase in power and pissed it down a drain. I'm writing this on a 1.2 GHz iBook G4, and the Apple software is lethargic. Today, this computer is practically an antique. When I was in college, if you had asked me whether we would ever have instruction cycle times faster than 1 nanosecond, I would have confidently said no. That's beyond any reasonable advances in technology. Today, we're throwing processors like that in the garbage bin.

Debian on a mere quad core screams, but the switch to SSD is part of that.

The trick is deciding if you are software bound, or cpu, or disk, or network bound. For most people with older systems it is probably disk and network. SSD is a cheap fix. Light software helps though.

Moore's Law is still happening, sort'a. The PC I had back about 2010 had an AMD Athlon dual-core processor operating at about 2 GHz. In 2015 I had a 4-core AMD integrated CPU-GPU operating at 3.2 GHz. And now I have an 8 core AMD FX8370 CPU running at 4.0 GHz.

So speed is going up, but the main thing is the number of processors. My cynical bet is that the number and speed of processors in PCs will eventually come to some sort of limit, but we'll be assured ever faster and better processors are manipulating our data "on the Cloud" so Moore's Law is still true even if our home desktop units aren't changing.

And that's where Moore's Law meets Amdahl's Law and shuts down. Software isn't infinitely parallelizable, and even small amounts of parallelization can have such a high upfront cost that it's frequently not worth the (immense) trouble.

Correct. Also, lots of software is still not parallelized, or not parallelized very much. Parallel programming is hard.
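The ceiling Amdahl's Law imposes is easy to see with a short sketch; the parallel fractions and core counts below are illustrative:

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Amdahl's Law: overall speedup when only parallel_fraction of the
    work can be spread across n_cores; the serial remainder dominates."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A program that is 90% parallel can never exceed 10x, no matter the cores.
print(round(amdahl_speedup(0.9, 8), 1))      # 4.7
print(round(amdahl_speedup(0.9, 1000), 1))   # 9.9
```

This is why piling on cores helps less and less: the serial 10% alone caps the speedup at 1/0.1 = 10x.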


"By the way, is Moore’s Law still a thing? Is it still happening?"

Moore's Law exactly as originally stated is irrelevant. What's relevant is the number of computations per second per dollar spent. That has been doubling about every 14 months, with no sign of stopping. In particular, computers (graphics processors) capable of ~20 petaflops, roughly the power of a human brain (YMMV), should be available for ~$1000 in the early 2020s. And then double that in ~14 months, four times that in 28 months, eight times that in 42 months, etc.
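The compounding described above follows directly from a fixed doubling period; a quick sketch, taking the commenter's 14-month figure as given:

```python
def perf_multiplier(months, doubling_months=14):
    """Computations per second per dollar, relative to a baseline,
    assuming a fixed doubling period (14 months per the comment)."""
    return 2 ** (months / doubling_months)

for m in (14, 28, 42):
    print(m, perf_multiplier(m))  # 14 -> 2.0, 28 -> 4.0, 42 -> 8.0
```

At that rate, cost-performance improves about 2^(12/14) ≈ 1.8x per year, which is how the "double, quadruple, octuple" schedule in the comment arises.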

"Nvidia CEO Jensen Huang has become the first head of a major semiconductor company to say what academics have been suggesting for some time: Moore’s Law is dead."

“Nvidia CEO Jensen Huang has become the first head of a major semiconductor company to say what academics have been suggesting for some time: Moore’s Law is dead.”

Yes, but if you read the column and read my comments, you'd see he's talking about the original statement of Moore's Law, which is irrelevant.

"The enablers of an architectural advance every generation — increasing the size of pipelines, using superscalar tweaks and speculative execution — are among the techniques that are now lagging in the effort to keep pace with the expected 50 percent increase in transistor density each year, Huang told a gathering of reporters and analysts at the Computex show in Taipei."

That's not relevant. As I previously commented, what's relevant is number of instructions per second per unit of cost. (Or the inverse of that...cost per million or billion instructions per second.) *This* is a relevant article:

That does not look as up to date as my other eetimes link.

Does it show the 2012 reversal in cost per transistor?

And your last chart, the one that relies on highly vectorized MIPS?

As long as we understand it does not apply to 99% of all operations.

"That does not look as up to date as my other eetimes link. Does it show the 2012 reversal in cost per transistor?"

What's important is the cost (of GPUs) per million or billion instructions per second (or the inverse of that...millions or billions of instructions per second per unit cost). Your EE Times link does not directly address that question...although the photo with the caption: "Nvidia’s Huang predicts further advances to come from GPU computing" has an upper curve that may address that question (the y-axes units aren't shown).

The cost per transistor is not what's important. The cost per million or billion instructions per second is what's important.

I agree that we should no longer be focusing on CPUs, but instead should be focusing on GPUs, since they perform a much greater share of the world's AI calculations.

Moore's law on the chip has pretty much topped out, but it has moved to the system as a whole. More things are being taken off the main chip and subsystems are being optimized.

Here is a Bell Labs researcher addressing the problem

One issue could be that STEM careers are now so low status and shitty that the talented, highly intelligent people don't go into the field any more.

That's a bizarre meme you've been spreading. What's the yearly income of:

Econ phds
Art history majors
Software engineers
Physics phds

I don't get your point at all. If your meme is that it's easy to outsource, that's not really true. If your meme is that it's easy to insource with immigrants from Asia then you're half right but mostly wrong. If you count bio and psychology then you're doing it wrong. The skills still need to be in demand relative to supply.

What's your thesis here, exactly? Just trolling? A college kid applying for banking with an engineering degree has a much better shot than one with an art history or "business" degree.

The physics doctorates who left for wall street are making far more than those doing applied research.


The people who do well in STEM seldom care about status and love what they do.

The accounting illusion of ideas being "harder to find" is created by the economists' definition of IDEA, which, under present conditions, has little to do with STEM. We have a rapidly growing number of STEM-type ideas with growing productivity, thanks to Google Scholar and the ability to access the knowledge base of the world. However, the veto filter created by regulators, standards bodies, activists, NIMBYs, politicians, etc., all of whom have the power to prevent STEM ideas from becoming IDEAS in the economists' sense, has created this illusion and a real economic impact.

For example, after the transistor was invented, the largest and most advanced user of electronics in the world, the US DOD, spent the late '50s through the '60s trying to advance vacuum tube technology and ban the purchase of chips, the justification being the sensitivity of chip technology to EMP pulses in a nuclear war. However, they couldn't block all uses and markets, and a little-known geophysical firm now called Texas Instruments blew the field open. The DOD couldn't write standards outside of MIL specs, but it did slow the adoption of modern electronics within the DOD. In the early '80s, my cousin was drooling over the chips in my new Mac (when they had just come out), which were two generations ahead (4 times faster, 25% of the size, and 25% of the power) of anything he could use.

If the number of researchers required has also been growing exponentially, that means that the "transistor density per researcher" has only been doubling about every 2.5 years rather than every 2 years. Which is... still quite fast.
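The arithmetic here checks out; the 45-year span below is an assumption, while the 2-year doubling and 18-fold researcher increase come from the paper's abstract:

```python
import math

years = 45                  # early 1970s to mid-2010s (assumed span)
density_doubling_yrs = 2.0  # Moore's Law: density doubles every 2 years
researcher_growth = 18      # "more than 18 times larger", per the abstract

density_doublings = years / density_doubling_yrs          # 22.5 doublings
researcher_doublings = math.log2(researcher_growth)       # ~4.2 doublings
net_doublings = density_doublings - researcher_doublings  # ~18.3
doubling_time = years / net_doublings
print(round(doubling_time, 2))  # ~2.45 years per doubling of density/researcher
```

So even after deflating by research headcount, density per researcher still doubles roughly every 2.5 years.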

Bah, humbug! Back in the Dark Ages of the Nixon administration, in 1970 or so, about 2.8% of US GNP went into research and development. People thought that was an impressively large figure and attributed much of US industrial pre-eminence to such spending. And now close to half a century has passed, and the US is spending 2.7% of GNP on R&D.

Do those numbers suggest to you that "the number of researchers ... has been growing exponentially"? If so, I've got a bridge to sell you.

If there's some truth in that exponential growth notion, it's that companies around the world have been increasing their R&D teams, but that the number of leading edge chip manufacturers has fallen considerably.

In 2009, someone invented this idea: pick up a telephone and call a cab. Result: the biggest money-losing cab company of all time.

HAHAHAHA That's a great comment.

I've spent many hours doing patent searches, on my ideas and those of my employers. I am impressed how thoroughly the idea space has been explored. It is seldom that an idea is so new and unique that there is little or no prior art. If you want to find a new idea to exploit, just look in the patent literature. If you find something good from an independent inventor and it's more than 5 years old, there's a good chance you can buy it for a song. The vast majority of patents are money losers for their inventors.

Their model is based on very different assumptions from Martin Weitzman's 1998 QJE paper "Recombinant Growth" which posited that much like network effects, new ideas are typically created by combining existing ideas, and thus there is exponential growth as the number of ideas grows and thus the number of combinations grows and grows.

But it's perhaps not surprising that a 1998 paper would make an assumption of unlimited new ideas out there, and a 2017 paper would assume ... whatever they're assuming, diminishing marginal returns or increasing marginal costs of research or a limited number of new ideas.

In a world where technology prices fall at high rate, it is difficult to get an accurate measure of productivity. When the price of some services falls to almost 0, society might actually be gaining a lot, but traditional measures are not going to capture the reality.

Examples abound where human effort has been replaced by technology, often in the form of software. The real price of the resulting products or services also falls, so it appears that productivity gain is negative, and yet more value is being delivered to customers/users, even though they paid less for it.

At the same time, other things have dramatically changed over the last 50 years that would also affect this discussion. Maybe the rate of idea generation has been affected by the perceived opportunities out there. Economists can't measure the effects of ever-increasing regulatory burdens, often because what is lost is unseen: the new things never created because the barriers to entry in many fields have become so large. Or the effects on young potential entrepreneurs.

A simple case in point: many a future hard-charging entrepreneur had a lemonade stand or some other micro business as a youngster. 35 years ago or more, there was no one to tell them they couldn't. Now the news is replete with young people being stymied by overbearing government, with ridiculous scenes of children having their lemonade stands shut down by the city or county. Their perception of what is possible in the future changes with those kinds of experiences.

More damaging, probably, are the thousands of examples that permeate our society, not easily observed unless you are running a business, of people facing escalating difficulties in starting and running a new company. Often they are hit hard by regulations they only find out about after they are fined. (This is not theoretical; I know many business owners who have complained about this to me.)

As someone who has had several new ideas and even a patent, I would say that there are many more ways for ideas to die these days.

First it is increasingly complicated to get money to study something. Government grants take increasing more time to write and the rejection rate continues to climb. We now have professional grant writers because spending resources on just the application for money can be cost effective. On the private side getting corporate funding is harder as there are ever fewer corporations actually looking for new things and the profit margins demanded grow ever higher to compete with the many derivative products with minimal margins (e.g. computer/IP base products).

Say you have the money to try an idea, though. Now you need to get into the nuts and bolts of actually working with it. That means hiring procedures, environmental impact reviews, IRBs, etc. All of these domains have had decades to accumulate bureaucratic detritus, which makes it ever slower to do something new. It is decidedly non-trivial to comply with the many requirements regarding non-discrimination, environmental protection, and the like; even just documenting that your project has no impact in any of these areas is a substantial endeavor.

Now say you actually have an idea you want to monetize. Almost certainly there is some regulatory body that can have a dramatic impact on your new idea's profitability. Regulatory capture and status quo bias both mean you can watch your entire profit margin get drained, say because a new health law needs a source of revenue and your whole field gets hit with the tax, or because some Californian ballot proposition makes it easy to be sued on speculative grounds. Again, we have had decades for people in these domains to figure out ways to get paid or to build petty fiefdoms with regulation and oversight.

I am reminded of how Southwest took a simple idea, flying just between the three largest cities in Texas, and snuck in under the wire because there was not enough on the ground for legal, regulatory, and lobbyist challenges to ground their big idea, though fighting those challenges sucked down huge amounts of their capital. Yet twenty years later they were able to successfully block high-speed rail connections between those same destinations by using their entrenched position. I suspect that many, many ideas die because exploring them requires too much effort fighting red tape and bureaucracy.

That was too realistic of a comment for this rather philosophical blog. Tonight I misheard a baseball announcer talking about all-night baseball games - I thought at first - what could be more wonderful, games lasting all night - but then I realized all the little guy was trying to say was that all the games on some following Tuesday were night games (not all-night games, just night games, usually done and wrapped up by eleven or so - sad!). Good luck with your patents. Appreciate the fact that you have received genuine gratitude, if that is the case, and if it is not the case, well, God loves us all anyway. True that, as I used to say to the nurses on the fictional night shift to wake them up a little bit. Live not by lies - few people know this, but that is the unofficial motto of the patent office. (PTO). Orwell (not Lewis, not Zamyatin, not Huxley) said it - it was the best thing he ever said, and I am not sure he even knew what he was saying, at least not the way he should have known. Well he is a big shot for the poor kids who have to be humble and learn about him for AP English points (sad!) but really he was interesting to be around, he did not want to be boring and he certainly did not want too much monetization - he wanted enough, but not too much. At least I hope so, but who knows? I understand dogs and cats pretty well but writers, famous or not, and other humans... who knows? Simpler than dogs and cats in some ways, less simple in others. One hopes for the best for everyone, of course.


Economists somehow think that exponential growth is forever possible with constant inputs. It isn't. During the 40 years of Moore's law, the density of computer chips increased by 2^20, or approximately one-million-fold. And researcher input is supposedly 18-fold larger. So should we all lament that outputs increased a factor of ~60,000-fold more than inputs? This doesn't make sense, and it didn't the last time this topic came up either.
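The ~60,000-fold figure follows from the commenter's own numbers (40 years of 2-year doublings against an 18-fold input increase):

```python
years = 40
density_gain = 2 ** (years // 2)   # doubling every 2 years -> 2^20, ~one million
researcher_gain = 18               # the paper's reported increase in input
ratio = density_gain / researcher_gain
print(f"{ratio:,.0f}")  # 58,254 -- output grew ~60,000x more than input
```

Whether that ratio means research productivity is "declining" depends entirely on whether one expected output to scale linearly with input, which exponential-growth models do not.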

Even the claim of 18 fold input research increase is an accounting artifact. What were areas of research and expertise originally outside of the semi-conductor industry shifted to in-house or to being counted as support industries adding to the industry R&D accounting.

For example, clean room technology existed in hospitals and biology labs long before the transistor was even invented. However, when chips went from the sizes we created in metallurgical labs in the '60s to the point where a virus was a boulder on a chip, the accountants simply shifted this whole area of evolving technology from hospitals to the chip sector.

Yes, R&D money continued in clean room technology, and the "chip" industry didn't get as much "free" spill-in of technology from biosecurity, such as BSL-4 facilities (also bunny-suit operations working with nasty bugs like Ebola), and even spilled technology back to biosecurity. Should this increase in the number of researchers be considered of great accounting significance?

If they broke down the number of Ph.D. employees working in actual chips VS those working in supporting areas ranging from vacuum physics to metrology to CAT scanning, you would find this ratio decreasing over time, thanks to accounting. However, the productivity of the individual researcher in shrinking the chips could be actually increasing.

1) Research isn't funded like it used to in America. You put in less money, you get less research.

2) All sorts of stuff is probably being invented in Korea, Japan, China, and India, but they don't publish in English, so we don't know about it.

This is too broad a categorical statement to be testable or relevant.

Take CRISPR, for example. This is a new base technology that will lead to many inventions.

Or, take a combination of inventions, not just a single one, which leads to several other inventions: e.g., cell phone becomes a portable computer which, along with the computer, creates large databases from which other discoveries are made (e.g., Big Data, or Big Data in healthcare from computerized medical records which lead to further discoveries on treatment methods or early detection)

In other words, invention can involve one significant event, or a combination of little events which lead to whole new fields.

The "This" I refer to is the paper above.


Very few of the so-called researchers are doing anything of value, especially at this stage in the game, where we're ~60 years from a big-C innovation.

Big-C stuff gets done by kids in their early to mid 20s, yet they're getting almost zero research funding...and this is probably a good thing. "Researchers" in their 40s, 50s, and 60s are just Straussian parasites.

Yep, the solution for America is calling the Red Guards and closing down universities.

What does big-c stand for? I won't be back to find out, I just wanted you to know you wasted your chance to communicate your idea.

What if, rather than a decline in research productivity, we see this as a signal of the necessary future state of employment? If ideas and true innovation (on many fronts: technological, economic, social, cultural) are what drive society forward, and keeping up with the needs of society requires ever-increasing output, then the necessary future state of employment is the individual as the idea engine.

Constant required output of ideas and declining returns to research input imply increasing research input: capital and individuals.
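That implication can be sketched with a few lines of arithmetic (my own illustration with invented numbers, not figures from the paper): if idea output g = n(t) · p(t) must stay constant while per-researcher productivity p(t) decays, the head count n(t) has to grow geometrically.

```python
# Sketch with made-up numbers: hold idea output g constant while
# per-researcher productivity decays, and solve for the head count.
def required_researchers(g, p0, decay, years):
    """Head count n(t) = g / p(t), with p(t) = p0 * (1 - decay)**t."""
    return [g / (p0 * (1 - decay) ** t) for t in range(years)]

headcount = required_researchers(g=1.0, p0=1.0, decay=0.10, years=30)
# With productivity falling 10% a year, the required head count
# roughly doubles every seven years.
```

The same structure, run in reverse, is how the paper's 18-fold figure for Moore's Law is derived: hold the doubling rate fixed and ask how many researchers it now takes.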

I remember reading this a while ago. I especially disagree with what it says about Moore's law.

I wrote about it on Reddit:

Quite a good post, and you get additional credit for putting it up EIGHT MONTHS AGO. I'm not there to personally congratulate you, but I sure hope a couple of people patted you on the back for that; it'd be totally deserved.

ARE ideas getting harder to find?

Didn't a duly-convened court somewhere just accord property rights to a (foreign-born) primate?

No guarantees on the quality of the ideas, mind you, but who needs a critical brain to generate an idea (marketable or no)? The District of Columbia itself, e.g., is stuffed with brains, not all equipped with critical functionality.

"In many growth models, economic growth arises from people creating ideas, and the long-run growth rate is the product of two terms: the effective number of researchers and their research productivity."

Then those growth models are wrong. Economically-useful ideas come from and are driven by plenty of people other than "researchers." Container freight was created by Malcolm McLean, who was running a trucking company at the time. The creation of the SABRE computerized reservations system was sparked by a conversation between an IBM sales rep and the CEO of American Airlines. The inventor of the practical telegraph, Samuel Morse, was a professional artist, not a professional researcher. Tens of thousands of less-galactic but economically-important innovations each year are brought into being by people whose job title may be Product Manager, Logistics Specialist, Sales Representative, or Factory Worker.

It's dangerous out there to have new ideas. If the bureaucrats don't tangle you up, the SJWs will decide it's racist or something.

"All mankind’s progress has been achieved as a result of the initiative of a small minority that began to deviate from the ideas and customs of the majority until their example finally moved the others to accept the innovation themselves. To give the majority the right to dictate to the minority what it is to think, to read, and to do is to put a stop to progress once and for all."

Mises, Ludwig von (1927). Liberalism

I read the draft of this paper. The thesis proposed in the draft is an artifact of how they did their measurement and accounting and says nothing about "Are Ideas Getting Harder to Find?". I did a detailed analysis of the draft (link below) and got no comments back from the authors.

We live in an era of the fastest rate of knowledge increase and the fastest rate of increase in the number of new Ideas in the history of man. To create this illusion of decreasing Ideas, they defined Ideas as those that succeed in getting through the usual technological, economic, and market filters that all useful Ideas must pass, plus extra filters on implementation, such as regulation, standards, and politics, all of which can kill real Ideas. What happens in these rate-limiting non-technology, non-science areas determines the economic growth rate.

In my comment, I looked at how Ideas really build on existing Ideas and grow at a factorial rate, which is even faster than exponential.
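A toy version of that factorial claim (my own construction, not the commenter's actual analysis): if n accumulated Ideas can be chained together in any order, the number of possible recombination sequences is n!, which eventually outruns any fixed exponential.

```python
import math

# Toy model: count the orderings in which n accumulated Ideas can be
# chained. n! eventually dominates c**n for any fixed base c.
def recombination_chains(n):
    return math.factorial(n)

# Against an exponential with base 10, factorial is still behind at
# n = 20 but far ahead by n = 30.
```

Whether real recombination behaves this way depends, of course, on how many of those chains are actually useful.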

Any comments on my comment would be welcome.

I would really like to understand how you get innovative per-capita growth that appears exponential in a society built from individual Ideas or areas that all have sigmoidal growth curves (rapid increase followed by saturation, steady state, and decline).
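One standard answer can be sketched as a toy simulation (my own construction, with assumed parameters): successive S-curves, each individually saturating but each launched with a larger ceiling than the last, sum to an aggregate that grows at a roughly constant exponential rate in the middle of the wave train.

```python
import math

def logistic(t, t0, cap, rate=1.0):
    """A single technology: sigmoidal growth saturating at `cap`."""
    return cap / (1.0 + math.exp(-rate * (t - t0)))

def aggregate(t, n_waves=40, spacing=1.0, growth=1.3):
    """Sum of successive S-curves; each wave's ceiling is `growth`
    times the previous one."""
    return sum(logistic(t, t0=k * spacing, cap=growth ** k)
               for k in range(n_waves))

# Away from the edges of the wave train, the period-over-period growth
# factor of the aggregate settles near `growth`: saturating parts sum
# to something roughly exponential.
ratios = [aggregate(t + 1) / aggregate(t) for t in range(10, 20)]
```

The exponential envelope only holds while new, bigger waves keep arriving, which is exactly the commenter's question restated.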

Corollary: eugenics programs are more important than ever. Ideally it's done with 'positive eugenics' (opt-in genetic engineering of fetuses), but however it's done, people with stratospheric IQs are becoming ever more essential to the progress of civilization and technological growth. Additionally, Flynn effect notwithstanding (which I think of as a 'software upgrade'), there's compelling evidence that humans have become dumber on average over the past century (think of this as a 'hardware downgrade') due to dysgenics and differential breeding patterns:
