Has the division of labor hindered knowledge integration and productivity growth?

…we suggest that this division of innovative labor has not, perhaps, lived up to its promise.  The translation of scientific knowledge generated in universities to productivity enhancing technical progress has proved to be more difficult to accomplish in practice than expected.  Spinoffs, startups, and university licensing offices have not fully filled the gap left by the decline of the corporate lab.  Corporate research has a number of characteristics that make it very valuable for science-based innovation and growth.  Large corporations have access to significant resources, can more easily integrate multiple knowledge streams, and direct their research toward solving specific practical problems, which makes it more likely for them to produce commercial applications.  University research has tended to be curiosity-driven rather than mission-focused.  It has favored insight rather than solutions to specific problems, and partly as a consequence, university research has required additional integration and transformation to become economically useful.  This is not to deny the important contributions that universities and small firms make to American innovation.  Rather, our point is that large corporate labs may have distinct capabilities which have proved to be difficult to replace.

That is from Ashish Arora, Sharon Belenzon, Andrea Patacconi, and Jungkyu Suh, “The Changing Structure of American Innovation: Some Cautionary Remarks for Economic Growth,” recommended, an excellent paper spanning several disciplines.  I would myself note this is further reason not to split up the major tech companies.


'I would myself note this is further reason not to split up the major tech companies.'

Which again shows a basic misunderstanding of 'tech companies,' all of whom are fundamentally based on the GPL (hardware manufacturers excepted, though whether they still count as tech is open to question these days). And all of whom started as just a couple of people with an idea.

Or to be more concrete - apart from manipulating user feeds (without consent) to create differing emotional results, what notable research has Facebook (or divisions such as Instagram or Whatsapp) been engaging in?

You fail to realize that producing software that does what their systems do has necessitated developing technologies to accomplish it. You don't know what they are. But without them these complex applications couldn't exist.

This is why university research is so attractive. It looks like smart people are doing important things, they wear the right clothes and fit all the right prejudices. The people with a real problem that they solve by trial and error aren't quite as romantic.

'that producing software that does what their systems do has necessitated developing technologies to accomplish it'

Or the other way round, of course. Which is why the question of whether a company like Intel is still considered a tech company in these discussions is quite relevant. However, it was not Facebook driving the development of hardware over the past two decades; it was basically the market of billions of people buying PCs and mobile/smart phones. The server market is real, of course, but a small enough fraction that a company like Apple essentially abandoned it completely.

(The word you might be looking for is 'enterprise' - but we are not talking about SAP or Oracle as tech companies that need to be broken up, right?)

'The people with a real problem that they solve by trial and error aren't quite as romantic.'

When did the appeal of a couple of people in a garage creating a gigantic tech company fade in lustre?

It is pretty amazing that companies like Intel have moved out of the conversation about "tech" in favor of Facebook and Uber.


Facebook keeps going to Asia to drive its innovation in hardware like phones, microphones and speakers, and head-mounted displays, just like Apple, Google, Microsoft, ...

Paying American workers to learn on the job how to build factories to manufacture stuff costs too much, and high costs kill jobs.

But if you can't manufacture stuff, you can't design stuff that can be manufactured without paying workers boatloads of money to go to Asia to be in the factories, and those high costs kill jobs.

The only way to create jobs is to outsource all innovation jobs in hardware to Asia to cut costs.

In an article last year, I went over all of this tech. You can find it here: https://www.americanactionforum.org/insight/breaking-up-tech-means-breaking-up-technology-and-teams/, but here is the nut graf:

Any trust-busting action would also require breaking up the company’s technology stack — a general name for the suite of technologies powering web sites. For example, Facebook developed its technology stack in-house to address the unique problems facing Facebook’s vast troves of data. Facebook created BigPipe to dynamically serve pages faster, Haystack to store billions of photos efficiently, Unicorn for searching the social graph, TAO for storing graph information, Peregrine for querying, and MysteryMachine to help with end-to-end performance analysis. The company also invested billions in data centers to quickly deliver video, and it split the cost of an undersea cable with Microsoft to speed up information travel. Where do you cut these technologies when splitting up the company?
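None of the internal systems named above has a public API, but the kind of graph-association store the passage describes can be sketched generically. All names here (GraphStore, add_edge, mutual) are invented for illustration; this is not TAO's or Unicorn's actual interface:

```python
from collections import defaultdict

# A generic sketch of a social-graph association store, in the spirit of
# the systems the quoted passage names. Names and API are invented.
class GraphStore:
    def __init__(self):
        self._assoc = defaultdict(set)  # (source_id, edge_type) -> target ids

    def add_edge(self, source, edge_type, target):
        self._assoc[(source, edge_type)].add(target)

    def neighbors(self, source, edge_type):
        return self._assoc[(source, edge_type)]

    def mutual(self, a, b, edge_type):
        # The query layer leans directly on the storage layout. Splitting
        # storage and querying across separate companies would cut here.
        return self.neighbors(a, edge_type) & self.neighbors(b, edge_type)

g = GraphStore()
g.add_edge("alice", "friend", "carol")
g.add_edge("bob", "friend", "carol")
g.add_edge("alice", "friend", "dave")
```

Even in this toy, the "mutual friends" query only works because it assumes how the edges are stored, which is the intertwining the article is pointing at.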

all of whom

Surprising coming from you, but corporations are indeed people. All we have to do is get you to start writing them love letters.


Large tech companies use GPL software when it's standalone and there are no sensible competitors, but they are pretty much allergic to it. Not even a fraction of one percent of the code written in major tech companies links to GPL code.

There is significant research coming from companies, and some is patented. Pagerank is the obvious example from Google, and it's hard to argue against old Bell Labs as a major source of research bringing the world decades worth of technological advancements.
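For readers who haven't seen it, PageRank itself is compact enough to sketch. Here is a toy power-iteration version in Python, illustrative only and nothing like Google's production implementation:

```python
# Toy PageRank via power iteration. Graph: node -> list of outbound links.
def pagerank(graph, damping=0.85, iterations=50):
    nodes = list(graph)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, links in graph.items():
            if links:
                share = damping * rank[node] / len(links)
                for target in links:
                    new_rank[target] += share
            else:
                # Dangling node: spread its rank evenly over all nodes.
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        rank = new_rank
    return rank

# "c" is linked to by both "b" and "d", so it ends up ranked highest.
ranks = pagerank({"a": ["b"], "b": ["c"], "c": ["a"], "d": ["c"]})
```

The point of the example is how little machinery the original insight required; the patentable part was the idea, not the volume of code.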

There are plenty of valid arguments against private research, which might or might not be good enough to prefer public research. That private research leads nowhere is just not one of them.

Did any company ever really get a positive return on its investment from corporate research? We hear all about the great innovations from Xerox PARC, Bell Labs, and IBM's various facilities, but none of these companies really seems to have benefited in any significant way from their innovations (apart from the tax incentives they received).

I still think large prizes are the way to go for basic research - we don't really know what areas are the most fruitful for research, there could be significant areas that are currently under-researched where rapid progress could be made just because they are unfashionable or not fitting a particular agenda. Perhaps if we introduced more retrospective prizes that would assist there. So have say $1bn split up into 20 prizes every year awarded to the most significant findings that year in any discipline - with the award being decided by a common vote of say the US Academy of Sciences. You can stage it like the Oscars - that maybe would even get more kids interested in basic science. Fame is a great spur to effort.

I think you are making too narrow a definition of what constitutes a "corporate lab". I think you need to stop thinking of Bell Labs and Xerox Parc, and think of research and development at, let's say, Intel or Pfizer.

Possibly drug companies are the exception to this rule. But I would bet that the median drug company loses money on R&D. There are a few big winners in drug companies, but most research ends in failure. I don't think drug companies have cracked the research problem yet - knowing what areas to research ahead of time that will be successful is a very hard problem. If you already know your research will be successful then you probably don't need to research it.

'significant findings that year in any discipline'.

What does the US Academy of Sciences know except their specific discipline? What does a research physicist, who by definition knows a whole lot about a very very narrow and specialized slice of his field know about another very very narrow slice of another field, say medicine or chemistry or something else?

Serious question. And the physicists I know understand how narrow their specialities are.

If the significance of a new innovation cannot be explained to outsiders, then by definition it is not significant.

'If the significance of a new innovation cannot be explained to outsiders'

You mean like explaining the Internet of 1991, in particular its significance?

'then by definition it is not significant'

You mean like the Internet of 1991?

Why yes, exactly.

Except the significance of the Internet was blindingly obvious, as a way to connect PCs, representing an immense jump forward in networking.

What is still extremely hard to explain is that the Internet is a concept for networking digital devices using a shared protocol, creating a network that includes a vast array of devices and services, without even needing to talk about things like the World Wide Web, which is a fairly minor part of what makes the Internet significant.

Though maybe a lot of people who use the Internet still don't understand its significance today.
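The "shared protocol" point can be made concrete: two endpoints that both speak TCP can exchange bytes regardless of what hardware or OS sits at either end. A minimal loopback sketch in Python, purely illustrative:

```python
import socket
import threading

# Two endpoints speaking the same protocol (TCP) over loopback.
# In the real Internet the endpoints could be any devices at all;
# the shared protocol is what makes them interoperable.
def echo_server(server_sock):
    conn, _ = server_sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)  # echo the bytes back unchanged

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

thread = threading.Thread(target=echo_server, args=(server,))
thread.start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello")
    reply = client.recv(1024)

thread.join()
server.close()
```

Neither side needs to know anything about the other except the protocol, which is the whole significance the comment above is gesturing at.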

My first job in the early 80s was doing R&D in medical diagnostics for a Fortune 100 company.

It was classic multidisciplinary corporate research, with all types of engineers, chemists, and medical technologists.

Sure, we made money.

Perhaps that's one clue. The multidisciplinary projects are the ones that have to be done in a corporate lab.

And as the abstract implies, once you stop doing that kind of project, certain products become out of bounds.

What did they have you doing, making coffee?

I was writing RTOSes and then the machine control stack on top.

I can see why they discontinued the project.

What's really amazing (easy to play off these things) is that it took PCs another twenty years to get preemptive multitasking.

It wasn't that hard, it just took the orientation to do it.

The '84 Macintosh used cooperative multitasking that ended up similar in complexity, but with serious limitations.

My memory was a little fuzzy. The Amiga had PMT, and then Win NT/95 in the mid 90s. So 15 years, from my perspective, to hit the mainstream.
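The cooperative model's limitation is easy to demonstrate. Below is a toy cooperative scheduler in Python, with generators standing in for tasks (purely illustrative, nothing like a real RTOS): each task must yield voluntarily, so a task that never yields stalls everything, which is exactly what preemption fixes.

```python
from collections import deque

# Toy cooperative scheduler: tasks are generators that yield control
# voluntarily. Nothing here can preempt a running task; a task that
# never yields would hang the whole loop, the core weakness that
# preemptive kernels eliminated.
def run_cooperatively(tasks):
    trace = []
    ready = deque(tasks)
    while ready:
        name, task = ready.popleft()
        try:
            next(task)                   # run until the task yields
            trace.append(name)
            ready.append((name, task))   # reschedule at the back
        except StopIteration:
            pass                         # task finished; drop it
    return trace

def worker(steps):
    for _ in range(steps):
        yield  # voluntarily hand control back to the scheduler

trace = run_cooperatively([("a", worker(2)), ("b", worker(3))])
# Tasks interleave only at their own yield points: a, b, a, b, b
```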

I invented the internet, too, by the way.

No, that was the government.

Don't argue with yourself. You know they couldn't have done it without you.

But strangely enough I *did* enjoy making the coffee.

And ugh, memories of old coffee on the Bunn hotplate for three hours ..

1. Does Google or Facebook count as university research as both got their start in the classroom?

2. The big corporate researchers tend to be tech companies, and their efforts are skewing toward big government applications with military, intelligence, and law enforcement uses. These have consequences for liberty, as we see in places like China.

3. The rest of corporate America and their investors have decided to not do R&D and instead choose to buy up promising startups that have none of the bureaucratic layers that need to be fed and with the complete buy-in of their management. This is perfectly rational.

Agree with 3. Corporate M&A has replaced R&D.

Replaced? For a company like Microsoft, that has been true to a major extent since the early 1990s. It is just that they were really good at identifying a solid product early on, then re-branding it.

Facebook, by contrast, definitely prefers to have people think that Facebook, Instagram, and Whatsapp are distinct entities - which is an interesting commentary about Facebook as a brand.

I agree with not splitting tech companies but we should regulate them when it comes to:
- privacy. Default to opt-in, with stricter handling of personal data.
- app stores. These are monopolies with the usual problems for both consumers and developers.
- termination of service. When you get arbitrarily banned or deplatformed by these tech services you have no recourse and often the procedures are handled by robots with no access to a real person. I'm thankful my electric, gas, water, bank, cable companies don't do this.

Of course, these can be handled at the state level, as Maine, Massachusetts, New York, and California have done recently.

'I'm thankful my electric, gas, water, bank, cable companies don't do this.'

Because you are a paying customer, and governmental regulation in these areas is strict (likely too strict for anyone that donates to the Mercatus Center).

Using a free service, or even more relevantly, expecting to profit from a free service through advertising, is not the same as being a paying customer.

If you are truly concerned about freedom, don't use companies at all. Mastodon is like BitTorrent - peer-to-peer sharing that is not subject to corporate control, and thus prevents being arbitrarily banned or deplatformed.

Interesting thesis but one that's hard to prove, and it's not clear what the policy prescriptions are, if any.

In their conclusion, they state "the fundamental challenge of managing long-run research inside a for-profit corporation remains a formidable one." I.e. they admit that it may be no accident (nor a mistake of policy) that corporate research lapsed into corporate development; there's no or little profit in doing corporate research.

They (and Tyler in his comment) harken back to ... IIRC it was Schumpeter who argued that monopolists may have more incentive to innovate because they can capture all the profits that result from innovation. But that was always speculative, and counterbalanced by the x-inefficiency and complacency that a monopolist has, compared to the lean and hungry competitive firm or wanna-be entrant looking for the newest innovation that will give them an edge over the competition. Kodak didn't lead the way in electronic cameras, and the taxi companies didn't lead the way in app-based ride-sharing. And IBM didn't lead the way with the personal computer, although they were a significant follower.

'And IBM didn't lead the way with the personal computer'

Actually, they pretty much did, though not in a narrow sense. The fact that IBM offered a template that was not restricted played what is likely the single largest role in the growth of the PC (do keep in mind where that term actually comes from).

IBM did not derive the profits that companies like MS or Dell did over the following decades based on the PC, of course.

IBM did not lead in the sense of being a pioneer, true. It did lead in actually being the origin of the PC, in both a literal and in a more figurative sense. The story is much broader than any single company, but IBM played the leading role in creating the PC, most definitely including paying for the development of a PC operating system.

Where IBM failed was in trying to put the shared-profit genie back in the bottle, with things like PS/2 or OS/2. In these areas, its leadership was a total failure, as nobody much followed them into what was clearly intended to be an IBM walled garden.

Indeed, the decline of corporate research is, in part, because managing research in a corporation is difficult. Making money (and showing that one has made money from research to those in charge of managing it) is difficult (but perhaps Google X and the like will prove me wrong!). Thus, it is no accident. The policy prescription, if any, is that the nostalgic longing for the days of yore is misplaced. We have to make the system we have work better. The other observation is that much (though not all) corporate research was spurred by tough anti-trust, which closed off acquisitions as a means of significant growth. If big incumbents have to rely on developing new products for growth, they are more likely to invest in research.

We have wagered our future on tech and it isn't delivering. "We wanted flying cars, instead we got 140 characters." That's Peter Thiel's famous quote. Recall several years back when tech promised to build a car, not just any car but a driverless car. How long did it take for tech to abandon that effort? Not long. Building a reliable car is hard, really hard.

Recall a few years back when GE set out to reinvent the industrial firm by combining tech with its heavy industry roots. It was a miserable failure. My observation is that tech has not lived up to its promise, not for failure to attract capital (indeed, tech is a black hole for capital), but for taking the easy path to profits, which is to say advertising, and to wealth, which is to say rising asset (stock) prices.

In his next blog post, Cowen points out the low rate of growth of the construction sector. Well, construction is construction, the old economy (industry), where investment in technology is minimal, not the new economy (tech), where investment in technology is a black hole. I recall my excitement in 2016 when the NYT ran a long article on GE's transformation (titled "G.E., the 124-Year-Old Software Start-Up"). What GE couldn't produce were easy profits from advertising or from a rising stock price, and investors soon turned against GE: its stock price plummeted, its CEO was fired, and it abandoned the transformation. Unless and until the day of reckoning for tech (and it will come), it will continue to be the black hole where capital disappears along with productivity growth in the "real" economy. Read it and weep: https://www.nytimes.com/2016/08/28/technology/ge-the-124-year-old-software-start-up.html

Consider the relationship between GE and CNC.

Hella innovation there, and production impossible decades earlier.

Kind of a bs article, read this for the real story:


GE got into numerical control in the 1950s, and of course they had to be good at it. Or someone eats your lunch.

“It occurs to me,” Sara said, crossing one leg over the other, shuffling her bangs and looking at Tom. “A man grows old, right, say he crosses a Rubicon. Might he want some toys?”
“I thought you were a painter,” I said.
“What repression a man must hold for a boy with toys.” She picked up a pile of towels with a brush on top and handed them to me. “I was a painter. Storms and forests, the occasional joyride. All maps in their way and poignant faces in mine. I think I lost my voice. Maybe my skin lost its color.”
“She got tired in paradise,” Tom said.
“’Of, Tom, not in.’ But he’s right, you know. I saw a tree for what it was, a bow with arrows, and who am I to refuse its enquiry.”

I was educated in a science, switched to engineering. I spent 30 years doing R&D at companies large and small.

I think this abstract is legit.

I only skipped around the actual paper, but I found the observations useful.

One thing I didn't see was comment on the perverse trend to *make* universities into quasi corporate research centers. When a uni develops a thing, and then tries to license it, rather than just share it with anyone and everyone, that creates a blockage. It slows dissemination. It certainly confuses the mission.

FWIW, I'd like to see more research in the public space for the public domain, *and* a return to corporate research for patent and profit.

This lazy "let someone else do (and pay for) it" isn't working.

My pet conspiracy theory is that the Bayh-Dole act, or at least the idea behind it, is a driver behind the corporatization of university research. It gives private industry relatively low-cost access to the exceedingly small proportion of academic research that has near-term value, but gives private industry no oversight over the academic research. The low cost of good R&D led corporations to slowly scale back their own R&D, especially on medium- and long-term topics, and rely more on licensing university-developed technologies. But most university R&D isn't necessarily "good", so large increases in academic R&D budgets result in only tiny increases in the amount of technology suitable for licensing.

I too would like to see more public-domain research and more corporate research. (In the drug industry, though, this is very difficult because highly desirable "composition of matter" patent claims necessarily need to be made WAY in advance of any clinical trial results. If there were some way for industry to get decent patent protection for just executing well-run clinical trials, it would help a lot.)

Looking back, Bayh-Dole seems like a critical error, with unintended consequences.

Specialization breaks up the extended family through labor mobility. Specialization is also bad for the brain.

'University research has tended to be curiosity-driven rather than mission-focused. It has favored insight rather than solutions to specific problems.'

Would it be more accurate to say, "University research has tended to be about getting grants and/or writing articles to put on your CV. This means doing what interests your university peers, which will often not be what will be useful to people in general."

Not fully caffeinated yet, and already I find a remarkable sentence quoted at MR!

"The translation of scientific knowledge generated in universities to productivity enhancing technical progress has proved to be more difficult to accomplish in practice than expected."

So remarkable, says I, it bears repeating and re-reading:

"The translation of scientific knowledge generated in universities to productivity enhancing technical progress has proved to be more difficult to accomplish in practice than expected."

HOW could our beloved Cognitive Elites, Inc., have become so poor at translating? Boggles minds.
