Nick Beckstead’s conversation with Tyler Cowen

Nick is a philosopher at Oxford who has worked with Larry Temkin and Nick Bostrom.  He typed up his version of our conversation (pdf); it starts with this:

Purpose of the conversation: I contacted Tyler to learn about his perspectives on existential risk and other long-run issues for humanity, the long-run consequences of economic growth, and the effective altruism movement.

Here are a few excerpts:

Tyler is optimistic about growth in the coming decades, but he doesn’t think we’ll become uploads or survive for a million years. Some considerations in favor of his views were:

1. The Fermi paradox is some evidence that humans will not colonize the stars.
2. Almost all species go extinct.
3. Natural disasters—even a supervolcano—could destroy humanity.
4. Normally, it’s easier to destroy than to build. And, in the future, it will probably become increasingly possible for smaller groups to cause severe global damage (along the lines suggested by Martin Rees).

The most optimistic view that Tyler would entertain—though he doubts it—is that humans would survive at subsistence level for a very long time; that’s what we’ve had for most of human history.

And:

People doing philosophical work to try to reduce existential risk are largely wasting their time. Tyler doesn’t think it’s a serious effort, though it may be good publicity for something that will pay off later. A serious effort looks more like the parts of the US government that trained people to infiltrate the post-collapse Soviet Union and then locate and neutralize nuclear weapons. There was also a serious effort by the people who set up hotlines between leaders to be used to quickly communicate about nuclear attacks (e.g., to help quickly convince a leader in country A that a fishy object on their radar isn’t an incoming nuclear attack). This has been fixed in some cases (e.g., the US and China), but not in others (e.g., Israel and Iran). There is more that we could do in this area. In contrast, the philosophical side of this seems like ineffective posturing.

Tyler wouldn’t necessarily recommend that these people switch to other areas of focus because people’s motivation and personal interests are major constraints on getting anywhere. For Tyler, his own interest in these issues is a form of consumption, though one he values highly.

And:

Tyler thinks about the future and philosophical issues from a historicist perspective. When considering the future of humanity, this makes him focus on war, conquest, plagues, and the environment, rather than future technology.

He acquired this perspective by reading a lot of history and spending a lot of time around people in poor countries, including in rural areas. Spending time with people in poor countries shaped Tyler’s views a lot. It made him see rational choice ethics as more contingent. People in rural areas care most about things like fights with local villages over watermelon patches. And that’s how we are, but we’re living in a fog about it.

And:

The truths of literature and what you might call “the Straussian truths of the great books”—what you get from Homer or Plato—are at least as important as rational choice ethics. But the people who do rational choice ethics don’t think that. If the two perspectives aren’t integrated, it leads to absurdities—problems like fanaticism, the Repugnant Conclusion, and so on. Right now though, rational choice ethics is the best we have—the problems of, e.g., Kantian ethics seem much, much worse.

If rational choice ethics were integrated with the “Straussian truths of the great books,” would it lead to different decisions? Maybe not—maybe it would lead to the same decisions with a different attitude. We might come to see rational choice ethics as an imperfect construct, a flawed bubble of meaning that we created for ourselves, one we shouldn’t expect to keep working in unusual circumstances.

I’m on a plane for much of today, so you are getting Nick’s version of me, for a while at least.  You will find Nick’s other conversations here.
