The Decline of Science in Corporate R&D

That is the subtitle; the title of the paper is Killing the Golden Goose, and the authors are Ashish Arora, Sharon Belenzon, and Andrea Patacconi. The abstract shows what an important paper this is:

Scientific knowledge is believed to be the wellspring of innovation. Historically, firms have also invested in research to fuel innovation and growth. In this paper, we document a shift away from scientific research by large corporations between 1980 and 2007. We find that publications by company scientists have declined over time in a range of industries. We also find that the value attributable to scientific research has dropped, whereas the value attributable to technical knowledge (as measured by patents) has remained stable. These effects appear to be associated with globalization and narrower firm scope, rather than changes in publication practices or a decline in the usefulness of science as an input into innovation. Large firms appear to value the golden eggs of science (as reflected in patents) but not the golden goose itself (the scientific capabilities). These findings have important implications for both public policy and management.

There is an ungated version here (pdf).  Of course, for better or worse, this means there is more of a burden on universities.

Comments

Do I speed read a 43 page paper and post, or dishonestly just say "First!" and use this as a placeholder? I'm getting lazy... first!

Good paper, doesn't contradict any of my priors. Minor criticisms, observations:

This is partly wrong for the period 1980-2007: "By contrast, (v) patenting by firms has increased and (vi) the implied value of patents, including the premium paid for patents in M&A, has remained stable." Patenting has increased dramatically, but so have the prices paid for patents in M&A; Google the Motorola patents, among many others. It used to be that you'd get a few tens of thousands of dollars per patent, about the cost of obtaining one, but Google (and others) now pay several hundred thousand dollars per patent.

"We interpret these patterns as part of a longer historical process wherein firms are specializing in different parts of the innovation value chain, in which large firms withdraw from science internally. They are becoming less reliant on internal research and more reliant upon external inventions." - possibly true, due to 'off-sourcing' science to/from colleges, as TC, says, Google: 'In 2002, The Economist magazine said, "Possibly the most inspired piece of legislation to be enacted in America over the past half-century was the Bayh-Dole act of 1980."' (Bayh-Dole says you can privatize and patent university R&D).

Another factor to explain the decline: "A concern is that our measure may merely reflect changes in how firms protect their knowledge. Large firms may still be investing in science but publishing less, perhaps in order to patent or better protect their research findings..." Very true IMO.

Killer sentence, rebuts TC and Gordon: "Firms would reduce investment in science if science itself becomes less useful for innovation (Jones, 1999; Gordon, 2012). However, we do not see any decline in the number of patent citations to science over time, nor do we find any evidence that the science used in inventions is growing older. Thus, the decline in private investments in science cannot be explained away by the changing character of innovation in the economy."

"We argue that lower investments in science by large firms likely reflect reduced incentives to develop significant new products and processes internally." -true, and an argument we need even strong, better patents.

"Second, the decline in investments in science is also associated with narrowing firm scope. A famous conjecture in innovation studies is that investments in basic research are more profitable in diversified firms, either because of scope economies in research.." - true, recall "Ma Bell" and "Bell Labs". What we need is more basic research by government, or more monopolies as before (especially in telecom, arguably). AlexT type prize funds are also good.

"What we need is more basic research by government, or more monopolies as before (especially in telecom, arguably)."

The claim is therefore that the net neutrality push may harm research? I don't disagree that, e.g., CableLabs does research, and while the theory isn't crazy, I dunno.

CableLabs does not do basic science research.

CableLabs provides a forum for companies that do technical research to establish a common standard and IPR pool that is blessed by the cable operators.

I wonder how distinct monopoly research and government research are. Among other things, AT&T famously created Unix and the C language, and kicked off a whole technical subculture that came to dominate the Internet. But as far as I can see, AT&T didn't recapture much profit from it.

On the other hand, I can easily imagine that a company that depends on various congressmen for perks will plonk a research lab here or there in order to keep the politicians happy. And those politicians would have seen it only partly as pork. They would also have had high-minded ideas about getting basic research done.

AT&T did create the longest and most successful corporate research effort of all time. In 1934, the head of Bell Labs declared the development of a semiconductor amplifier was the highest priority of the labs. It took 14 years to get there, but it was worth it. The Bell System was the largest single consumer of vacuum tubes.

Unfortunately, that has been an exception. Large corporate research divisions usually succeed at short-term goals applying incremental improvements in technology, but big breakthroughs are rare and seldom available just by pouring in research dollars. It's simply a more reliable path to wait for some start-up to develop what you want, then acquire them. Even that road is fraught with peril, but at least you get a chance to look at what you're buying before you put down the money.

While the transistor may be "the invention of the century," it's somewhat difficult to see how (or at least by how much) AT&T profited from this. Although many others surely did.

AT&T had to sell licenses to use its semiconductor patents at minimal cost to all comers after the 1956 Consent Decree, which not only jump-started Silicon Valley but also the old Japan, Inc. But AT&T was forbidden from profiting from this licensing, and was limited to developing semiconductor devices and computers only for the purpose of providing telephone services.

For AT&T semiconductors might have lowered the cost of long-distance telephone service, thus greatly stimulating demand for it. But at least through 1970 or so the main technical driver behind these lower costs would have been terrestrial microwave radio, which in this time period depended more on specialized vacuum tubes than on semiconductors.

One could, perhaps, make a better case for the Bell Labs research that led to semiconductor lasers and fiber optics?

The Belle chess-playing program developed at Bell Labs may have been in violation of the 1956 consent decree.

http://cm.bell-labs.com/cm/cs/who/dmr/leskmemo4.gif

As a PhD with 20 years of industry experience, I have to say, this makes sense to me. I never was able to understand why corporations would conduct basic science research when they have basically zero chance of converting research results into shareholder return. It's more likely that someone else will, assuming anyone ever does. Basic research results are a public good, requiring public investment.

Kenneth Duda
Menlo Park, CA

@Duda -- good points about public goods remaining public, pace the Bayh-Dole Act of 1980, which does exactly the opposite; see http://en.wikipedia.org/wiki/Bayh%E2%80%93Dole_Act (but the government does have 'march in' rights, and a non-exclusive license, among other things). Bayh-Dole is common in the medical arts, not so much in software, which is probably why you are not aware of it.

"March in" rights have been asserted through Bayh-Dole (and Stevenson-Wydler, for non-University federal research laboratories) so seldomly that they are a threat that is not credible.

The problem that Bayh-Dole addressed was real, but it was like killing a fly with DDT, and a flyswatter, and a tax break. It went way overboard. If the response had been to just make all government-funded research public domain, it would have been great, but instead of fixing the licensing problem it just privatized it.

Is it too late to fix this? A court case? If government cannot own IP, how can it make it and give it away?

If We the People pay for something, can the courts tell We the People that we cannot own it?

A corporation is no different from government, other than lacking the sovereignty of government. Both are collectives: the former of voluntary collectivists, the latter of involuntary collectivists by birth.

Sure, much basic research doesn't fit in a short-sighted financial investment model, but that doesn't mean that humans are simply incapable of organizing other institutions to facilitate it outside of big government, coercive taxation, and politicians.

Your suggestions are welcome.

Although the technology to make it easy, low-cost, and trust-free is still in development, I would humbly suggest prediction markets and crowdfunding to pay for these public goods.

Amazingly, I'm with Jan.

If, unlike Jan, you believe in strictly keeping government to some well-defined proper roles, then basic science is a perfect fit because:

* It's all about public goods, really globally spread positive externalities with difficult-to-predict benefits.

* Commercial R&D seems to happen only to the extent that companies get get monopolies (including patents) or form other cozy relationships between government.

* The institutions that spend the money are universities and other somewhat independent bodies. This reduces the bad effects of expanding the role of government.

Yes, I agree on this issue. Funding of basic science is, in my mind, a reasonable government function.

Like this example: http://newsroom.unsw.edu.au/news/science-technology/australian-teams-set-new-records-silicon-quantum-computing

[Australian Research Council funded by AUS gov't]

Some companies did it because basic research was the best way to attract a certain kind of people who, despite being completely imprevisible, could add tons of value to other projects just by being around to talk to, or by joining a team for a while on short notice.

Nowadays, it may be that projects have become too complex for that, but my best bet is that it's mainly because this model does not fit the "commoditized worker" model that management schools preach.

Imprevisible. Cool...I appreciate you teaching me a new word. :=)

It would be interesting to see how much open source innovation has influenced R&D spending. Look at how much mobile phone companies can simply rely on Android instead of developing their own platforms, for example. A lot of software that was once very expensive and proprietary is now free.

Open source software is not basic science research.

The term "open source" now extends beyond software to basic science. It is basically an organizational option about how to publish.

A lot of open source software is used in basic science research, and affects its costs.

Quite a lot of open source software is funded through basic science research grants. I suppose you're arguing, then, that the NSF and NIH are making mistakes when funding things like Cytoscape?

Finally, in computer science a lot of open source software *is* basic science research.

I wonder how much of the issue is that software "doesn't count" as science in the eyes of most people.

It's not experiments and titration and physics degrees, but it seems pretty fundamental. Almost like math.

It is true that when a computer scientist invents something new it makes sense for her to publish it as open-source software, and computer scientists have been doing so since before open source became a meme.

But software is mostly engineering, not science. Indeed, many big projects like the Linux kernel are self-consciously conservative and incremental in their innovations.

It depends on what kind of software you are talking about. A vast amount of open source software consists of implementations (libraries) of new algorithms and analytical methods. This is the published format of basic research.

How much of this is due to the massive increase in information/knowledge causing science to splinter into tiny specialities?

So one scientist's or one company's contribution to science means much less than it used to (the days in which a da Vinci could master both art and math are over).

But a patent - however small - is quantifiable and thus sought as a measure of achievement.

Overall, it's hard to dispute that scientific discovery is continuing at a rapid pace.

I'm not sure what their denominator is (too lazy to read 43 pages).

Are they saying papers per industry employed scientist declined? Papers per industry sector? Papers per firm? Papers per $ of turnover?

Too many variables here. If they want to conclude that research commitment is declining they can always construct a suitable post hoc narrative that fits their story.

"Industry" science started much after da Vinci, though. Basic science was already becoming pretty splinted by the time the modern industrial and corporate economy came about.

When modeling this, it's useful to view a university's CTO (for the last 10-15 years) as an agent/broker for his or her university's science faculty and students. Doing this, it's pretty clear this is just an accounting paper.

"Public goods require public investment." Countries are just big companies. If the US or France conducts basic research, then any other country is able to monetize it.

If a company decides to lay off or fire a worker, its ongoing commitment to that worker is reduced or eliminated entirely. The cost of that worker to the company drops drastically.

But, what is the equivalent in the case of a country and a citizen it no longer finds useful? What would it mean, in terms of costs, for a country to "fire or lay off a citizen"?

Creative destruction: euthanasia and sale of body parts to the highest bidders.

"this means there is more of a burden on universities" <- Or more of a cartel.

"We find that publications by company scientists have declined over time in a range of industries. "

This must have been tugging at my subconscious because hours later it popped back into my mind.

Do scientists today "publish" or do they "do?"

Are the geeks at SpaceX publishing their findings on landing a rocket booster or are they just making it happen?

Does Venter publish anywhere as he synthesizes life?

Palmer Luckey didn't write a paper about the algorithms that made the Oculus Rift prevent "swim"; he was literally in his parents' basement and posted them on a community forum.

Perhaps they're looking for science in all the wrong places. It's jumped the banks and isn't flowing where it used to.

Publications are not a priority in industry, and they would often share proprietary information with competitors. Much of what comes out of biological or pharmaceutical research, in academia or industry, consists of potentially very useful intermediate steps that are not easily protected as IP. Publishing them gives away any edge you may have after finding them. In academia, the publication IS the product. This isn't true in industry.

One exception is publications from small startup companies. They seem (no evidence) to have more publications, I believe to lure in freshly minted, idealistic PhDs.

"Publications are not a priority in industry" Yes, but that's putting it mildly. Publications can be regarded suspiciously. Are you trying to get another job? Wouldn't you time be better spent on [some company project]? Isn't this just a vanity thing?

And often they are right. From the point of view of getting ahead in a company, you might very well better spend your time on that cutting edge company project.

When people move around from firm to firm with high frequency and publicly share source code, academic publishing starts to feel a little quaint. It's not like the information isn't getting around.

Is what SpaceX is doing technical/engineering research or more scientific research? There are a lot of difficult technical challenges to solve, but are they really breaking new ground and discovering new knowledge? I honestly don't know, but I think the distinction between technical/applied research and basic research (universities, the former Bell Labs, etc.) is the key one in the paper.

Is this still an important distinction? Sure, it's important if you're in one camp or the other, since one interpretation means you get grants and the other doesn't, but what about for consumers of the results of all this? Does it matter if the knowledge involves bat reproduction or graph algorithms? Why does the first count but not the second?

I think there's a logical distinction to make between technical problem-solving and the generation of new knowledge without directly applicable results, i.e., 'basic' research. Designing an incrementally more efficient intake manifold for an engine is an engineering challenge, but it's not necessarily scientific research. Breaking down the fundamental physics of air pressure, flow, fluid mixing, etc. within an intake manifold and providing new insights into how those things work would be.

If I'm in either camp, it's the applied side.

I cannot speak to whether the SpaceX group publishes.

However, J. Craig Venter publishes in the peer-reviewed literature regularly. A quick search on the Web of Science returns 38 papers over the last five years (2009-2014) where Venter is either first author or a coauthor. Other methods of searching may return different results, but I think it's pretty clear that Venter publishes.

Anyone who has ever done basic research understands that serendipity and preparedness, rather than planning, drive progress. Predicting research results is like predicting the stock market; thus, the vast majority of research will not result in a product. As a result, the model has been that the NIH funds basic research, and Pharma buys the resulting patent and does compound screening and product development. Starving the NIH hurts industry and the economy. That this is news is simply amazing.

Nature had a story about R&D in companies, April 2014, and looked at company presence in research papers. "Since 1996, the number of papers associated with older, manufacturing-oriented companies has plummeted, while publications involving younger technology and Internet companies have increased." http://www.nature.com/news/basic-science-finds-corporate-refuge-1.15124?WT.mc_id=TWT_NatureNews

Serial killer seal. Very Tyler-esque thing to link.

http://www.smru.st-and.ac.uk/documents/2173.pdf

I just skimmed briefly, and as I expected, they made one very strange choice: removing publications in conference proceedings from their sample. Since (a) conference proceedings are the dominant mode of publication in computer science and machine learning, and (b) these fields are probably the most represented by scientists employed at technology companies for the past 30 years, it's not clear to me that this decision makes any sense at all.

Yeah, I think this is trying to avoid counting computer-related developments as science. Which probably matters a lot if you're a condensed matter physicist or something, but not so much to the rest of us.

I can see that there are methodological differences between what a botanist does and what an engineer at Google does, but it's really hard to deny that the engineer at Google is creating knowledge.

It's not just product engineers who are being ignored. A lot of academic computer science (and even economics!) research goes on at Google, Microsoft, and Yahoo. For instance, http://research.google.com/pubs/papers.html lists a couple thousand papers, only a fraction of which would be counted when ignoring conference proceedings.

A lot of people (including most computer scientists) argue that computer science isn't really a "science," but they are being a bit pedantic and making a methodological distinction. Clearly this work contributes to knowledge. I might make the argument that the academic side of computer science contributes less to knowledge than the practitioner side, but that's just a quirk of the field that makes it harder to measure.

The conference proceedings oversight makes no sense.

So is archeology science? It contributes to knowledge. Computer science isn't science; it's engineering. Engineers need to get over their inferiority complex and just own it. Stop trying to be scientists. It's not 1950s America anymore, where scientists were riding high from the Manhattan Project and the aura of Einstein.

"So is archeology science?"

I don't know. I'm arguing it's not an important question, and trying to add things up based on this distinction is going to lead to misleading conclusions.

Only some computer science is engineering; much of it is indeed science. There are a lot of subfields within CS that make it difficult to generalize about the whole of the field. Most people probably conflate CS with software engineering. It's important to note that most CS departments originated within math departments, which their academic research still resembles.

Corporate America has found that it is cheaper to have the local research university conduct the basic research that will lead to a product, and then license from the university and commercialize the product, paying relatively low (without risk adjustment) license fees to the university. And, if you have friends in high places who can direct university research, or if the academic researcher owns on the side a company that will commercialize the product, what a sweet deal. Why would you do the research in-house if the basic research risks could be borne by the state?

Hey, if I can get you to build me a stadium, I should at least be able to get you to fund research that benefits me, and then license it to me and my academic buddy really cheap.

It is often even better than that. They can put up a few thousand to pay a graduate student to do the research and get really cheap labor while having some of the best and brightest in the field do the leg work under the care of a university faculty member.

This happens all the time, and I think it is mutually beneficial, as long as the university doesn't sell out ethics for money and publishes the actual results, not what the funding organization wants. I'm sure that happens, but I think it is rare.

And sometimes the companies directly pay for research done by the universities; it's just not done by "corporate scientists" and so would not be included by this paper, from what I understand of its methodology. But the basic research spending and risks are borne by the corporation in that case.

It's just a form of outsourcing in that case.

"Scientific knowledge is believed to be the wellspring of innovation." By whom?

Consider the possibility that as science has come more and more under the sway of managerialism in the universities, and become more bureaucratic, and as the age at which one gets tenure grows greater and greater - so that you spend more and more years working on other people's interests - fewer spirited and intelligent people have pursued science.

It's probably worth noting that a number of tech companies got their start as university research, and the ideas spawned the startups. At the last tech conference I was at, you could pretty easily pick out the better papers by whether the professor also had a side startup made up of former graduate students.

I always thought it was the wrong approach for green energy solutions to try to fund companies. It seems like the better approach is to fund the research at the universities and let the startups form around the most successful research.

Reminds me of the Xerox PARC mouse legend:

http://www.newyorker.com/magazine/2011/05/16/creation-myth

Key bit:

"In the nineteen-nineties, Myhrvold created a research laboratory at Microsoft modelled in part on what Xerox had done in Palo Alto in the nineteen-seventies, because he considered PARC a triumph, not a failure. “Xerox did research outside their business model, and when you do that you should not be surprised that you have a hard time dealing with it—any more than if some bright guy at Pfizer wrote a word processor. Good luck to Pfizer getting into the word-processing business. Meanwhile, the thing that they invented that was similar to their own business—a really big machine that spit paper out—they made a lot of money on it.” And so they did. Gary Starkweather’s laser printer made billions for Xerox. It paid for every other single project at Xerox PARC, many times over."

But how long will universities still be with us? Their degrees are becoming worth less and less in the job market, even as federal benefits continue to drive the price sky-high, and political correctness (on the part of both university administrators and federal regulators) makes them places no self-respecting man would be caught dead in. I believe they are doomed. I wouldn't go to one today, nor send my kids.

"federal benefits," consistent cuts in state funding for years/decades, same difference.

I think that the rise of the pernicious fixation on shareholder value since c. 1976 can help to explain the declining levels of corporate investment in R&D.

http://pastspeaks.com/2015/02/05/killing-the-golden-goose-corporate-rd-and-shareholder-value-ideology/

What are the incentives? How much of that money has moved from R&D to the legal/accounting/lobbying arms of companies?

If I work on a project at work with my fellow coworkers or with subordinates, using company time and property, and we discover something interesting and potentially marketable, it would be uncouth and probably illegal to take those people with me and start a side business developing and marketing the product we came up with during our time at work. Would I be depriving shareholders of potential returns?

When this is done at universities, between professors and grad students no one bats an eye. Who funds the universities and what is their return when these start-ups succeed?

Only at very poorly run universities. There are patent and rights agreements signed before students or post-docs even walk in the door the first day.

What this study doesn't capture is that there tends to be a cyclical trend from academic research to innovative startups to R&D at large firms (rinse, repeat). When the economy is booming, ideas from academia start making their way into startups. The successful startups get bought up and rolled into large firms, which put the ideas to market. When the boom recedes, everyone heads back to academia.
