…all qualified scientists would get some guaranteed funding — no grants required. But there should be one added step: everyone must anonymously allocate a fraction of their funds to other researchers of their own choosing.
The goal of this system would be to let scientists devote more of their time to research…
In SOFA [Self-Organizing Funding Allocation], every participant starts with the same allocation of funding every year but must allot a portion to other scientists. Reasons to select someone could range from, ‘That was a great paper’ to ‘I think they will release useful data.’ Those who get the most give the most, because scientists give a percentage of everything received under SOFA. To avoid currying favour, this process will be anonymous…
We can limit collusions and kickback schemes — the financial equivalent of citation cartels — by mandating a minimum number of recipients and restricting people from designating frequent collaborators, or colleagues at the same institution. Counteracting gender, age and prestige biases that plague conventional peer review might even be easier in SOFA because they are measurable.
Here is the Johan Bollen piece in Nature.
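The core mechanics of the proposal can be made concrete with a toy simulation. Everything below is a sketch under invented assumptions: the group size, the give-away fraction, the minimum recipient count, and the "merit" weights standing in for how scientists choose whom to fund are all illustrative, not part of Bollen's actual proposal.

```python
import random

def simulate_sofa(n_scientists=50, base_grant=100_000.0, give_fraction=0.5,
                  n_recipients=5, n_rounds=20, seed=0):
    """Toy simulation of a SOFA-style system: every scientist receives the
    same base grant each year, then must pass a fixed fraction of everything
    received on to a minimum number of distinct peers of their choosing."""
    rng = random.Random(seed)
    # Hypothetical 'merit' weights standing in for perceived quality;
    # donation targets are drawn in proportion to them.
    merit = [rng.uniform(0.5, 2.0) for _ in range(n_scientists)]
    funds = [base_grant] * n_scientists
    for _ in range(n_rounds):
        incoming = [base_grant] * n_scientists  # equal base allocation each year
        for i in range(n_scientists):
            give = funds[i] * give_fraction  # give a cut of everything received
            # mandatory minimum number of distinct recipients, never oneself
            pool = [j for j in range(n_scientists) if j != i]
            weights = [merit[j] for j in pool]
            chosen = set()
            while len(chosen) < n_recipients:
                chosen.add(rng.choices(pool, weights=weights)[0])
            for j in chosen:
                incoming[j] += give / n_recipients
        funds = incoming
    return funds
```

One property falls out immediately: because each round re-injects the same total base funding and recycles a fixed fraction of the previous round, total funds in circulation converge to a geometric-series limit while the distribution across scientists concentrates toward whoever the crowd keeps selecting.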
I was very happy with how this turned out; here is the audio and transcript. Here is how the CWT team summarized it:
Michael Pollan has long been fascinated by nature and the ways we connect and clash with it, with decades of writing covering food, farming, cooking, and architecture. Pollan’s latest fascination? Our widespread and ancient desire to use nature to change our consciousness.
He joins Tyler to discuss his research and experience with psychedelics, including what kinds of people most benefit from them, what it can teach us about profundity, how it can change your personality and political views, the importance of culture in shaping the experience, the proper way to integrate it into mainstream practice, and — most importantly of all — whether it’s any fun.
He argues that LSD is underrated; I think it may be good for depression, but for casual use it is rapidly becoming overrated. Here is one exchange of relevance:
COWEN: Let me try a very philosophical question. Let’s say I could take a pill or a substance, and it would make everything seem profound. My receptivity to finding things profound would go up greatly. I could do very small events, and it would seem profound to me.
Is that, in fact, real profundity that I’m experiencing? Doesn’t real profundity somehow require excavating or experiencing things from actual society? Are psychedelics like taking this pill? They don’t give you real profundity. You just feel that many things are profound, but at the end of the experience, you don’t really have . . .
POLLAN: It depends. If you define profundity or the profound as exceptional, you have a point.
One of the things that’s very interesting about psychedelics is that our brains are tuned for novelty, and for good reason. It’s very adaptive to respond to new things in the environment, changes in your environment, threats in your environment. We’re tuned to disregard the familiar or take it for granted, which is indeed what most of us do.
One of the things that happens on psychedelics, and on cannabis interestingly enough — and there’s some science on it in the case of cannabis; I don’t think we’ve done the science yet with psychedelics — is that the familiar suddenly takes on greater weight, and there’s an appreciation of the familiar. I think a lot of familiar things are profound if looked at in the proper way.
The feelings of love I have for people in my family are profound, but I don’t always feel that profundity. Psychedelics change that balance. I talk in the book about having emotions that could be on Hallmark cards. We don’t think of Hallmark cards as being profound, but in fact, a lot of those sentiments are, properly regarded.
Yes, there are those moments you’ve smoked cannabis, and you’re looking at your hand, and you go, “Man, hands, they’re f — ing incredible.” You’re just taken with this. Is that profound or not? It sounds really goofy, but I think the line between profundity and banality is a lot finer than we think.
COWEN: I’ve never myself tried psychedelics. But I’ve asked the question, if I were to try, how would I think about what is the stopping point?
For my own life, I like, actually, to do the same things over and over again. Read books. Eat food. Spend time with friends. You can just keep on doing them, basically, till you die. I feel I’m in a very good groove on all of those.
If you take it once, and say you find it entrancing or interesting or attractive, what’s the thought process? How do you model what happens next?
POLLAN: That’s one of the really interesting things about them. You have this big experience, often positive, not always though. I had, on balance . . . all the experiences I described in the book, with one notable exception, were very positive experiences.
But I did not have a powerful desire to do it again. It doesn’t have that self-reinforcing quality, the dopamine release, I don’t know what it is, that comes with things that we like doing: eating and sex and sleep, all this kind of stuff. Your first thought after a big psychedelic experience is not “When can I do it again?” It’s like, “Do I ever have to do it again?”
COWEN: It doesn’t sound fun, though. What am I missing?
POLLAN: It’s not fun. For me, it’s not fun. I think there are doses where that might apply — low dose, so-called recreational dose, when people take some mushrooms and go to a concert, and they’re high essentially.
But the kind of experience I’m describing is a lot more — I won’t use the word profound because we’ve charged that one — that is a very internal and difficult journey that has moments of incredible beauty and lucidity, but also has dark moments, moments of contemplating death. Nothing you would describe as recreational except in the actual meaning of the word, which is never used. It’s not addictive, and I think that’s one of the reasons.
I did just talk to someone, though, who came up to me at a book signing, a guy probably in his 70s. He said, “I’ve got to tell you about the time I took LSD 16 days in a row.” That was striking. You can meet plenty of people who have marijuana or a drink 16 days in a row. But that was extraordinary. I don’t know why he did it. I’m curious to find out exactly what he got out of it.
In general, there’s a lot of space that passes. For the Grateful Dead, I don’t know. Maybe it was a nightly thing for them. But for most people, it doesn’t seem to be.
COWEN: Say I tried it, and I found it fascinating but not fun. Shouldn’t I then think there’s something wrong with me that the fascinating is not fun? Shouldn’t I downgrade my curiosity?
POLLAN: [laughs] Aren’t there many fascinating things that aren’t fun?
COWEN: All the ones I know, I find fun. This is what’s striking to me about your answer. It’s very surprising.
We even talk about LSD and sex, and why a writer’s second book is the key book for understanding that writer. Toward the end we cover the economics of food, and, of course, the Michael Pollan production function:
COWEN: What skill do you tell them to invest in?
POLLAN: I tell them to read a lot. I’m amazed how many writing students don’t read. It’s criminal. Also, read better writers than you are. In other words, read great fiction. Cultivate your ear. Writing is a form of music, and we don’t pay enough attention to that.
When I’m drafting, there’s a period where I’m reading lots of research, and scientific articles, and history, and undistinguished prose, but as soon as I’m done with that and I’ve started drafting a chapter or an article, I stop reading that kind of stuff.
Before I go to bed, I read a novel every night. I read several pages of really good fiction. That’s because you do a lot of work in your sleep, and I want my brain to be in a rhythm of good prose.
Definitely recommended, as is Michael’s latest book How to Change Your Mind: What the New Science of Psychedelics Teaches Us About Consciousness, Dying, Addiction, Depression, and Transcendence.
That is a new paper by Mikko Packalen and Jay Bhattacharya, here is the abstract:
The National Institutes of Health (NIH) plays a critical role in funding scientific endeavors in biomedicine that would be difficult to finance via private sources. One important mandate of the NIH is to fund innovative science that tries out new ideas, but many have questioned the NIH’s ability to fulfill this aim. We examine whether the NIH succeeds in funding work that tries out novel ideas. We find that novel science is more often NIH funded than is less innovative science but this positive result comes with several caveats. First, despite the implementation of initiatives to support edge science, the preference for funding novel science is mostly limited to work that builds on novel basic science ideas; projects that build on novel clinical ideas are not favored by the NIH over projects that build on well-established clinical knowledge. Second, NIH’s general preference for funding work that builds on basic science ideas, regardless of its novelty or application area, is a large contributor to the overall positive link between novelty and NIH funding. If funding rates for work that builds on basic science ideas and work that builds on clinical ideas had been equal, NIH’s funding rates for novel and traditional science would have been the same. Third, NIH’s propensity to fund projects that build on the most recent advances has declined over the last several decades. Thus, in this regard NIH funding has become more conservative despite initiatives to increase funding for innovative projects.
Models developed for gross domestic product (GDP) growth forecasting tend to be extremely complex, relying on a large number of variables and parameters. Such complexity is not always to the benefit of the accuracy of the forecast. Economic complexity constitutes a framework that builds on methods developed for the study of complex systems to construct approaches that are less demanding than standard macroeconomic ones in terms of data requirements, but whose accuracy remains to be systematically benchmarked. Here we develop a forecasting scheme that is shown to outperform the accuracy of the five-year forecast issued by the International Monetary Fund (IMF) by more than 25% on the available data. The model is based on effectively representing economic growth as a two-dimensional dynamical system, defined by GDP per capita and ‘fitness’, a variable computed using only publicly available product-level export data. We show that forecasting errors produced by the method are generally predictable and are also uncorrelated to IMF errors, suggesting that our method is extracting information that is complementary to standard approaches. We believe that our findings are of a very general nature and we plan to extend our validations on larger datasets in future works.
That is from A. Tacchella, D. Mazzilli, and L. Pietronero in Nature. Here is a Chris Lee story about the piece. Via John Chamberlin.
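The "fitness" variable in the abstract is computed from a binary country-by-product export matrix via the nonlinear fitness-complexity iteration the same group introduced earlier. Here is a minimal sketch of that iteration, under the assumption that the standard map is used; the matrix below and the iteration count are illustrative, not the paper's actual data.

```python
import numpy as np

def fitness_complexity(M, n_iter=50):
    """Iterate the nonlinear fitness-complexity map on a binary
    country-by-product matrix M (M[c, p] = 1 if country c competitively
    exports product p; rows and columns must each have at least one 1)."""
    n_countries, n_products = M.shape
    F = np.ones(n_countries)  # country fitness
    Q = np.ones(n_products)   # product complexity
    for _ in range(n_iter):
        # a country's fitness sums the complexity of what it exports
        F_new = M @ Q
        # a product is penalized when low-fitness countries can export it
        Q_new = 1.0 / (M.T @ (1.0 / F))
        # normalize both to mean 1 at every step to keep the map bounded
        F = F_new / F_new.mean()
        Q = Q_new / Q_new.mean()
    return F, Q
```

On a nested (triangular) matrix the map behaves as intended: the most diversified country ends up with the highest fitness, and the product exported only by that country ends up the most complex.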
Here is the whole post, here is one excerpt:
If you’re 10–20: These are prime years!
- Go deep on things. Become an expert.
- In particular, try to go deep on multiple things. (To varying degrees, I tried to go deep on languages, programming, writing, physics, math. Some of those stuck more than others.) One of the main things you should try to achieve by age 20 is some sense for which kinds of things you enjoy doing. This probably won’t change a lot throughout your life and so you should try to discover the shape of that space as quickly as you can.
- Don’t stress out too much about how valuable the things you’re going deep on are… but don’t ignore it either. It should be a factor you weigh but not by itself dispositive.
- To the extent that you enjoy working hard, do. Subject to that constraint, it’s not clear that the returns to effort ever diminish substantially. If you’re lucky enough to enjoy it a lot, be grateful and take full advantage!
- Make friends over the internet with people who are great at things you’re interested in. The internet is one of the biggest advantages you have over prior generations. Leverage it.
- Aim to read a lot.
- If you think something is important but people older than you don’t hold it in high regard, there’s a decent chance that you’re right and they’re wrong. Status lags by a generation or more.
- Above all else, don’t make the mistake of judging your success based on your current peer group. By all means make friends but being weird as a teenager is generally good.
This paper explores the physics of the what-if question “what if the entire Earth was instantaneously replaced with an equal volume of closely packed, but uncompressed blueberries?”. While the assumption may be absurd, the consequences can be explored rigorously using elementary physics. The result is not entirely dissimilar to a small ocean-world exo-planet.
Here is the full analysis, via M.
Here is the transcript and audio, I am very pleased (and honored) to have been able to do this. She is an autism researcher, and so most of the discussion concerned autism, here is one excerpt:
COWEN: What would be the best understanding of autism, from your perspective?
DAWSON: The best understanding is seeing autism as atypical brain functioning, resulting in atypical processing of all information. So that’s information across domains — social, nonsocial; across modalities — visual, auditory; whatever its source, whether it’s information from your memory, information coming from the outside world, that is atypical. So that is very domain-general atypicality.
What autistic brains do with information is atypical. How it’s atypical, in my view, involves what I’ve called cognitive versatility and less mandatory hierarchies in how the brain works, such that, for example, an autistic brain will consider more possibilities, will nonstrategically combine information across levels and scales without losing large parts of it, and so on. And that applies to all information.
That is strictly my view. I’m not sure anyone would agree with me.
COWEN: Now often, in popular discourse, you’ll hear autism or Asperger’s associated with a series of personality traits or features of personality psychology — a kind of introversion or people being nerdy in some regard. In your approach, do you see any connection between personality traits and autism at all?
DAWSON: There is a small literature that shows some connection. I think it’s very weak, and I say no, I don’t think autism is about personality. Autism is sort of orthogonal to personality. The two are not related. Whatever relation there is does not . . . arises from some third factor, let’s say. If there is one — and again, the evidence is, I think, very weak connecting autism to personality — so just say that maybe, if there’s something, let’s say that personality in autistics might be more high variance. That would be my totally wild guess, but I don’t think autism itself is about personality.
And here is Michelle again:
We don’t — I hope we don’t look at a blind person who is a successful lawyer and assume that he is only very mildly blind or barely blind at all, and then look at a blind person who has a very bad outcome and assume that they must be very severely blind.
We do make those kinds of judgments in autism, saying, “The more atypical the person is, the worse they must be in some sense.” That kind of bias has not only harmed a lot of autistic people, it really has impeded research.
Here is Michelle on Twitter. We discuss and link to some of her research in the discussion.
It is a short essay, here are a few scattered bits:
The real output of the US manufacturing sector is at a lower level than before the 2008 recession; that means that there has not been real growth in US manufacturing for an entire decade. (In fact, this measure may be too rosy—the ITIF has put forward an argument that manufacturing output measures are skewed by excessive quality adjustments in computer speeds. Take away computers, which fewer and fewer people are buying these days, and US real output in manufacturing would be meaningfully lower.) Manufacturing employment peaked in 1979 at nearly 20 million workers; it fell to 17 million in 2000, 14 million in 2008, and stands at 12 million today. The US population has grown by 40% since 1979, while the number of manufacturing workers has nearly halved.
I think we should try to hold on to process knowledge.
Japan’s Ise Grand Shrine is an extraordinary example in that genre. Every 20 years, caretakers completely tear down the shrine and build it anew. The wooden shrine has been rebuilt again and again for 1,200 years. Locals want to make sure that they don’t ever forget the production knowledge that goes into constructing the shrine. There’s a very clear sense that the older generation wants to teach the building techniques to the younger generation: “I will leave these duties to you next time.”
There’s an entertaining line in the Brad Setser piece I linked to earlier. He tells us that one of the reasons that the US has such a high surplus in the services trade is that Americans have a low propensity to travel abroad. I don’t view that as a great way to earn a trade surplus.
There is much more at the link.
Namely the fear of owing other people, or institutions, a favor, or maybe just the possible perception of such?:
The researchers believe reciprocity anxiety is likely to be greater the bigger a favour and the more public its receipt. They think it’s a trait that companies should take an interest in – while loyalty schemes, vouchers and other freebies have obvious appeal to many customers, results from two initial studies suggested that these marketing strategies are actually likely to deter others…
In a follow-up study, volunteers imagined a shop attendant offering them a free drink and plate full of snacks. Afterwards, high scorers in reciprocity anxiety scored lower for customer satisfaction and they said they would be less willing to visit the store again and less willing to spread a good word about the shop.
“Reciprocity works to establish a psychological bond” between customer and firm, the researchers said, but the discomfort it causes can backfire among those high in reciprocity anxiety, especially if they feel the benefits reflect badly on them or that they will struggle to reciprocate (around 18 per cent of people tested in these new studies scored highly in the trait; age and gender were unrelated).
…I wonder how it might impact the ways that people manage their friendships and other relationships – perhaps high scorers in reciprocity anxiety are inclined to turn down invitations, seek help or receive other friendly favours, putting them at risk of loneliness and isolation.
The average length of a published economics paper has more than tripled over the past four decades, and some academics are sick of wading through them. At this year’s American Economics Association conference, Massachusetts Institute of Technology professor David Autor compared a 94-page working paper about the minimum wage to “being bludgeoned to death with a Nerf bat” and started a Twitter hashtag, #ThePaperIsTooDamnedLong.
…Between 1970 and 2017, the average length of papers published in five top-ranked economics journals swelled from 16 pages to 50 pages, according to an analysis by University of California, Berkeley economists Stefano DellaVigna and David Card.
Longer papers can include more-robust statistical analysis, engage in multifaceted arguments or address complex topics. Some economists speculate paper inflation is also the product of the laborious peer-review process, in which other economists act as referees and read drafts, then demand any number of additions before publication…
Economists also tend to write defensively, including redundant material even in early versions of papers to head off possible quibbles that might come up during the review process, said Samuel Bazzi, an economics professor at Boston University.
That is from Ben Leubsdorf at the WSJ. One question is whether longer papers are better from a scientific point of view. A second and more important question is whether long papers are better for attracting genius talent to the economics profession.
For the pointer I thank the excellent Samir Varma.
Of course, poor kids can still soar in school, and rich ones can flunk out, but few would deny that money is a powerful influence on people’s futures. Now, consider that household income explains just 7 percent of the variation in educational attainment, which is less than what genes can now account for. “Most social scientists wouldn’t do a study without accounting for socioeconomic status, even if that’s not what they’re interested in,” says Harden. The same ought to be true of our genes.
“Education needs to start taking these developments very seriously,” says Kathryn Asbury from the University of York, who studies education and genetics. “Any factor that can explain 11 percent of the variance in how a child performs in school is very significant and needs to be carefully explored and understood.”
The researchers are to the point:
What policy lessons or practical advice do you draw from this study?
None whatsoever. Any practical response—individual or policy-level—to this or similar research would be extremely premature and unsupported by the science.
Formal training programs, which can be called education, enhance cognition in human and nonhuman animals alike. However, even informal exposure to human contact in human environments can enhance cognition. We review selected literature to compare animals’ behavior with objects among keas and great apes, the taxa that best allow systematic comparison of the behavior of wild animals with that of those in human environments such as homes, zoos, and rehabilitation centers. In all cases, we find that animals in human environments do much more with objects. Following and expanding on the explanations of several previous authors, we propose that living in human environments and the opportunities to observe and manipulate human-made objects help to develop motor skills, embodied cognition, and the use of objects to extend cognition in the animals. Living in a human world also furnishes the animals with more time for such activities, in that the time needed for foraging for food is reduced, and furnishes opportunities for social learning, including emulation, an attempt to achieve the goals of a model, and program-level imitation, in which the imitator reproduces the organizational structure of goal-directed actions without necessarily copying all the details. All these factors let these animals learn about the affordances of many objects and make them better able to come up with solutions to physical problems.
Obviously his talents in crypto and programming are well-known, but he is also a first-rate thinker on both economics and what you broadly might call sociology. You could take away the crypto contributions altogether, and he still would be one of the very smartest people I have met. Here is the audio and transcript. The CWT team summarized it as follows:
Tyler sat down with Vitalik to discuss the many things he’s thinking about and working on, including the nascent field of cryptoeconomics, the best analogy for understanding the blockchain, his desire for more social science fiction, why belief in progress is our most useful delusion, best places to visit in time and space, how he picks up languages, why centralization’s not all bad, the best ways to value crypto assets, whether P = NP, and much more.
Here is one excerpt:
COWEN: If you could go back into the distant past for a year, a time and place of your choosing, you have the linguistic skills and immunity against disease to the extent you need it, maybe some money in your pocket, where would you pick to satisfy your own curiosity?
BUTERIN: Where would I pick? To do what? To spend a year there, or . . . ?
COWEN: Spend a year as a “tourist.” You could pick ancient Athens or preconquest Mexico or medieval Russia. It’s a kind of social science fiction, right?
BUTERIN: Yeah, totally. Let’s see. Possibly first year of World War II — obviously, one of those areas that’s close to it but still reasonably safe from it…
Basically, experience more of what human behavior and what collective human behavior would look like once you pushed humans further into extremes, and people aren’t as comfortable as they are today.
I started the whole dialogue with this:
I went back and I reread all of the papers on your home page. I found it quite striking that there were two very important economics results, one based on menu costs associated with the name of Greg Mankiw. Another is a paper on the indeterminacy of monetary equilibrium associated with Fischer Black.
These are famous papers. On your own, you appear to rediscover these results without knowing about the papers at all. So how would you describe how you teach yourself economics?
Highly recommended, whether or not you understand blockchain. Oh, and there is this:
COWEN: If you had to explain blockchain to a very smart person from 40 years ago, who knew computers but had no idea of crypto, what would be the best short explanation you could give them, basically, for what you do?
BUTERIN: Sure. One of the analogies I keep going back to is this idea of a “world computer.” The idea, basically, is that a blockchain, as a whole, functions like a computer. It has a hard drive, and on that hard drive, it stores what all the accounts are.
It stores what the code of all the smart contracts is, what the memory of all these smart contracts is. It accepts incoming instructions — and these incoming instructions are signed transactions sent by a bunch of different users — and processes them according to a set of rules.
Wealthier countries allocate a greater proportion of their workers to science and engineering, fields which produce ideas that often benefit everyone. This is one reason why we all gain when other countries become rich. It’s not just the number of scientists and engineers that matters, however. In a clever paper, Agarwal and Gaule demonstrate that equally talented people are more productive in wealthier countries.
Agarwal and Gaule collect the scores of thousands of teenagers who entered the International Math Olympiad between 1981 and 2000 and they follow their careers. Every additional point earned at the Olympiad increases the likelihood that a participant will later earn a math PhD, be heavily cited, even earn a Fields medal. But Olympians from poorer countries are less likely to contribute to the mathematical frontier than equally talented teens from richer countries. It could be that smart teens from poorer countries are less likely to pursue a math career–and that could well be optimal–but Agarwal and Gaule find that many of the talented kids from poorer countries simply disappear off the world’s radar. Their talent is wasted.
The post-Olympiad loss is not the largest loss. Most of the potentially great mathematicians from poorer countries are lost to the world long before the opportunity to participate in an Olympiad. But it is frustrating that even after talent has been identified, it does not always bloom. We are, however, starting to do better.
You can see from the graph that upper-middle income countries are as good at turning their talent into results as high-income countries. Agarwal and Gaule also find some evidence that the low-income penalty is diminishing over time.
As incomes increase around the world it’s as if the entire world’s processing power is coming online for the first time in human history. That, at least, is one reason for optimism.
Hat tip: Florian Ederer.
Scientific output is not a linear function of amounts of federal grant support to individual investigators. As funding per investigator increases beyond a certain point, productivity decreases. This study reports that such diminishing marginal returns also apply for National Institutes of Health (NIH) research project grant funding to institutions. Analyses of data (2006-2015) for a representative cross-section of institutions, whose amounts of funding ranged from $3 million to $440 million per year, revealed robust inverse correlations between funding (per institution, per award, per investigator) and scientific output (publication productivity and citation impact productivity). Interestingly, prestigious institutions had on average 65% higher grant application success rates and 50% larger award sizes, whereas less-prestigious institutions produced 65% more publications and had a 35% higher citation impact per dollar of funding. These findings suggest that implicit biases and social prestige mechanisms (e.g., the Matthew effect) have a powerful impact on where NIH grant dollars go and the net return on taxpayers’ investments. They support evidence-based changes in funding policy geared towards a more equitable, more diverse and more productive distribution of federal support for scientific research. Success rate/productivity metrics developed for this study provide an impartial, empirically based mechanism to do so.