Category: Science
Okie-dokie, solve for the equilibrium
One of the grand challenges of artificial general intelligence is developing agents capable of conducting scientific research and discovering new knowledge. While frontier models have already been used as aids to human scientists, e.g. for brainstorming ideas, writing code, or prediction tasks, they still conduct only a small part of the scientific process. This paper presents the first comprehensive framework for fully automatic scientific discovery, enabling frontier large language models to perform research independently and communicate their findings. We introduce The AI Scientist, which generates novel research ideas, writes code, executes experiments, visualizes results, describes its findings by writing a full scientific paper, and then runs a simulated review process for evaluation. In principle, this process can be repeated to iteratively develop ideas in an open-ended fashion, acting like the human scientific community. We demonstrate its versatility by applying it to three distinct subfields of machine learning: diffusion modeling, transformer-based language modeling, and learning dynamics. Each idea is implemented and developed into a full paper at a cost of less than $15 per paper. To evaluate the generated papers, we design and validate an automated reviewer, which we show achieves near-human performance in evaluating paper scores. The AI Scientist can produce papers that exceed the acceptance threshold at a top machine learning conference as judged by our automated reviewer. This approach signifies the beginning of a new era in scientific discovery in machine learning: bringing the transformative benefits of AI agents to the entire research process of AI itself, and taking us closer to a world where endless affordable creativity and innovation can be unleashed on the world’s most challenging problems. Our code is open-sourced at this https URL
That is from a new paper by Chris Lu, Cong Lu, Robert Tjarko Lange, Jakob Foerster, Jeff Clune, David Ha. Note this is related to some earlier work in economics by Benjamin Manning of MIT (with co-authors).
I’ve said it before, and I’ll say it again. The marginal product of LLMs is highest when they are interacting with well-prepared, intricately cooperating humans at their peak, not when you pose them random queries for fun.
Beware research in large teams
Teamwork has become more important in recent decades. We show that larger teams generate an unintended side effect: individuals who finish their PhD when the average team in their field is larger have worse career prospects. Our analysis combines data on career outcomes from the Survey of Doctorate Recipients with publication data that measures team size from ISI Web of Science. As average team size in a field increased over time, junior academic scientists became less likely to secure research funding or obtain tenure and were more likely to leave academia relative to their older counterparts. The team size effect can fully account for the observed decline in tenure prospects in academic science. The rise in team size was not associated with the end of mandatory retirement. However, the doubling of the NIH budget was associated with a significant increase in team size. Our results demonstrate that academic science has not adjusted its reward structure, which is largely individual, in response to team science. Failing to address these concerns means a significant loss as junior scientists exit after a costly and specialized education in science.
That is from a new NBER working paper by
My excellent Conversation with Paul Bloom
Here is the audio, video, and transcript. Here is part of the episode summary:
Together Paul and Tyler explore whether psychologists understand day-to-day human behavior any better than normal folk, how babies can tell if you’re a jerk, at what age children have the capacity to believe in God, why the trend in religion is toward monotheism, the morality of getting paid to strangle cats, whether disgust should be built into LLMs, the possibilities of AI therapists, the best test for a theory of mind, why people overestimate Paul’s (and Tyler’s) intelligence, why flattery is undersupplied, why we should train flattery and tax empathy, Carl Jung, Big Five personality theory, Principles of Psychology by William James, the social psychology of the Hebrew Bible, his most successful unusual work habit, what he’ll work on next, and more.
And here is one excerpt:
COWEN: I have some questions about intelligence for you. If we think of large language models, should we let them feel disgust so that they avoid left-wing bias?
BLOOM: [laughs] Why would disgust make them avoid left-wing bias?
COWEN: Maybe we’re not sure it would, but there are various claims in the literature that for people on the right, disgust is a more fundamental emotion, and that a greater capacity to feel disgust encourages people in some ways to be more socially conservative. Debatable, but I don’t think it’s a crazy view. So, if you build LLMs, and you give them, say, a lot of empathy and not much or any disgust, you’re going to get left-leaning LLMs, which you might say, “Well, that was my goal.” But obviously, not everyone will accept that conclusion either.
BLOOM: I wouldn’t want woke LLMs. I think there’s a lot in extreme —
COWEN: You’ve got them, of course.
BLOOM: I’ve got them. I think Gemini is the one, if I wanted to go — the woke LLM of choice. Because I think the doctrine called wokeness leads to a lot of moral problems and makes the world worse in certain ways, but I wouldn’t mind left-wing LLMs.
In fact, I’m not a fan of disgust. You’re right that disgust is often associated with right-wing, but in the very worst instantiation of it. Disgust is what drives hatred towards gay people. It involves hatred of interracial marriage, the exclusion of immigrants, the exclusion of other races. If there’s one emotion I would take away from people, it would be disgust, at least disgust in the moral realm. They could keep their disgust towards rotten food and that sort of thing. That’s the one thing I wouldn’t put into LLMs. I’d rather put anger, pity, gratitude. Disgust is the one thing I’d keep away.
COWEN: So, you wouldn’t just cut back on it at the margin. You would just take disgust out of people if you could?
And:
COWEN: I think at the margin, I’ve moved against empathy more being a podcast host, that I’ll ask a question —
BLOOM: Wait. Why being a podcast host?
COWEN: Well, I’ll ask a question, and a lot of guests think it’s high status simply to signal empathy rather than giving a substantive answer. The signaling-empathy answers I find quite uninteresting, and I think a lot of my listeners do, too. Yet people will just keep on doing this, and I get frustrated. Then I think, “Well, Tyler, you should turn a bit more against empathy for this reason.” And I think that’s correct.
Paul cannot be blamed for doing that, however. So substantive, interesting, and entertaining throughout.
Dark oxygen: jubilant for others, cry for yourself and your kin
To summarize the new results:
An international team of researchers recently discovered that oxygen is being made by potato-shaped metallic nodules deep under the surface of the Pacific Ocean. In July, their findings, which throw into dispute the concepts of oxygen production, were published in the Nature Geoscience journal. The discovery could lead to a reconsideration of the origins of complex life on Earth.
The findings, from a team of researchers led by Professor Andrew Sweetman at the U.K.’s Scottish Association for Marine Science, show that oxygen is being produced at around 4,000 metres below the surface of the ocean in complete darkness. This contradicts previous scientific assumptions that only living organisms, including plants and algae, can use energy to create oxygen through photosynthesis, using sunlight for the reaction.
As Julian Gough suggests, most life probably is on icy moons. This means a lot more life! Over a time slice, it could mean billions of additional lives out there. Did you pop open the champagne?
The bad news is that the chance that Robin Hanson’s “Great Filter” lies behind us is somewhat smaller. Which boosts the chance that it may lie in our near future. Did you pull out the tissues?
On net, did this news change your mood at all? Why or why not?
The Unseen Fallout: Chernobyl’s Deadly Air Pollution Legacy
A fascinating new paper, The Political Economic Determinants of Nuclear Power: Evidence from Chernobyl by Makarin, Qian, and Wang, was recently presented at the NBER Political Economy conference. The paper is nominally about how fossil fuel companies and coal miners in the US and UK used the Chernobyl disaster to successfully lobby against building more nuclear power plants. The data collection here is impressive, but lobbying of this kind is just how democracy works, so I found the political economy section less interesting than some of the background material.
First, the Chernobyl disaster ended nuclear power plant (NPP) construction in the United States (top-left panel), the country with the most NPPs in the world. Surprisingly, the Three Mile Island accident in 1979 (much less serious than Chernobyl) had very little effect on construction, although the 1-2 punch with Chernobyl in 1986 surely didn’t help. The same pattern is very clear across all countries and also all democracies (top-right panel). The bottom two panels show the same data but looking at new plants rather than the cumulative total: there was a sharp break in 1986, with growth quickly converging to zero new plants per year.
Fewer nuclear plants than otherwise would have been the case might have made a disaster less likely but there were countervailing forces:
We document that the decline in new NPPs in democracies after Chernobyl was accompanied by an increase in the average age of the NPPs in use. To satisfy the rise in energy demand, reactors built prior to Chernobyl continued operating past their initially scheduled retirement dates. Using data on NPP incident reports, we show that such plants are more likely to have accidents. The data imply that Chernobyl resulted in the continued operation of older and more dangerous NPPs in the democracies.
Moreover, safety declined not only because the existing plants got older but also because “the slowdown of new NPP construction…delayed the adoption of new safer plants.” This is a point about innovation that I have often emphasized (see also here):
The key to innovation is continuous refinement and improvement…. Learning by doing requires doing….Thus, when considering innovation today, it’s essential to think about not only the current state of technology but also about the entire trajectory of development. A treatment that’s marginally better today may be much better tomorrow.
Regulation increased costs substantially:
The U.S. NRC requires six to seven years to approve NPPs. The total construction time afterwards ranges from decades to indefinite. Cost overruns and changing regulatory requirements during the construction process sometimes force construction to be abandoned after billions of dollars of investment have already been sunk. Worldwide, companies have stopped construction on 90 reactors since the 1980s, 40 of those in the U.S. alone. For example, in 2017, two South Carolina utilities abandoned two unfinished Westinghouse AP1000 reactors due to significant construction delays and cost overruns. At the time, this left two other U.S. AP1000 reactors under construction in Georgia. The original cost estimate of $14 billion for these two reactors rose to $23 billion. Construction only continued when the U.S. federal government promised financial support. These were the first new reactors in the U.S. in decades. In contrast, recent NPPs in China have taken only four to six years and $2 billion per reactor. When considering the choice of investing in nuclear energy versus fossil fuel energy, note that a typical natural gas plant takes approximately two years to construct (Lovering et al., 2016).
Chernobyl, to be clear, was a very costly disaster:
The initial emergency response, together with later decontamination of the environment, required more than 500,000 personnel and an estimated US$68 billion (2019 USD). Between five and seven percent of government spending in Ukraine is still related to Chernobyl. (emphasis added, AT) In Belarus, Chernobyl-related expenses fell from twenty-two percent of the national budget in 1991 to six percent by 2002.
The biggest safety effect of the decline in nuclear power plants was the increase in air pollution. The authors first use satellite data on ambient particles to show that when a new nuclear plant comes online, pollution in nearby cities declines significantly. Second, they use the decline in pollution to create preliminary estimates of the effect of pollution on health:
According to our calculations, the construction of an additional NPP, by reducing the total suspended particles (TSP) in the ambient environment, could on average save 816,058 additional life years.
According to our baseline estimates (Table 1), over the past 38 years, Chernobyl reduced the total number of NPPs worldwide by 389, which is almost entirely driven by the slowdown of new construction in democracies. Our calculations thus suggest that, globally, more than 318 million expected life years have been lost in democratic countries due to the decline in NPP growth in these countries after Chernobyl.
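As a rough arithmetic check (not the paper’s own calculation, which is more detailed), the global figure is approximately the per-plant figure multiplied by the 389 forgone plants:

```python
# Rough check of the quoted headline numbers; the paper's own calculation is more detailed.
life_years_per_npp = 816_058   # expected life-years saved per additional NPP (quoted above)
npps_forgone = 389             # estimated worldwide reduction in NPPs after Chernobyl (quoted above)

total_life_years_lost = life_years_per_npp * npps_forgone
print(f"{total_life_years_lost:,} life-years lost")  # ~317 million, close to the quoted 318 million
```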
The authors use the Air Quality Life Index from the University of Chicago, which I think is on the high side of estimates. Nevertheless, as you know, I think the new air pollution literature is credible (also here), so I think the bottom line is almost certainly correct. Namely, Chernobyl caused many more deaths by reducing nuclear power plant construction and increasing air pollution than through its direct effects, which were small albeit not negligible.
Operation Warp Speed for Cows
The UK Health Security Agency has raised its pandemic threat level for H5N1 bird flu from 3 to 4 on a 6-point scale.
My takeaway is that we have completely failed to stem the outbreak in cattle and there has been animal-to-human transmission, which we are surely undercounting, but so far the virus has not mutated in a way that makes it very adaptable to humans.
The failure to stem the outbreak in cattle is concerning because it suggests we would not be able to stem a human outbreak. We can easily test, quarantine and cull cattle!
It is absolutely outrageous that dairy farmers are refusing to cooperate on testing:
To date dairy farmers have, in large measure, refused to cooperate with efforts to chart how deeply the virus has infiltrated U.S. herds, seeing the possible stigma of admitting they have H5N1-infected cows as a greater risk than the virus itself.
We should be testing at much higher rates and quarantining and culling. The dairy farmers should be and are being compensated but frankly the farmers should have no say in the matter of testing. Externalities! Preventing a pandemic is much cheaper both in resources and in restrictions on liberty than dealing with one.
And how about an Operation Warp Speed for a vaccine for cows? Vaccinate. Vacca! It’s right there in the name! If only we could come up with a clever acronym for an Operation Warp Speed for COWS.
Developing a vaccine for cows would also speed up a human vaccine if one were needed.
Here are some key points from the UK HSA:
There is ongoing transmission of influenza A(H5N1) in the US, primarily through dairy cattle but with multispecies involvement including poultry, wild birds, other mammals (cats, rodents, wild mammals) and humans (1, 2). There is high uncertainty regarding the trajectory of the outbreak and there is no apparent reduction in transmission in response to the biosecurity measures that have been introduced to date. There is ongoing debate about whether the current outbreak should be described as sustained transmission given that transmission is likely to be facilitated by animal farming activities (3). However, given that this is a permanent context, the majority of the group considered this outbreak as sustained transmission with the associated risks.
…There is evidence of zoonotic transmission (human cases acquired from animals). There is likely to be under-ascertainment of mild zoonotic cases.
…Overall, there is no evidence of change in HA which is suggestive of human adaptation through these acquired mutations. Although genomic surveillance data are likely to lag behind infections, the lack of evidence of viral adaptation to α2,6SA receptors after thousands of dairy cattle infected may suggest that transmission within cows does not strongly predispose to human receptor adaptation. Evidence of which sialic acid receptors are present in cows, which is needed to support this hypothesis, is still preliminary and requires confirmation.
Global warming, and rate effects vs. level effects
There is a very interesting new paper on this topic by Ishan B. Nath, Valerie A. Ramey, and Peter J. Klenow. Here is the abstract:
Does a permanent rise in temperature decrease the level or growth rate of GDP in affected countries? Differing answers to this question lead prominent estimates of climate damages to diverge by an order of magnitude. This paper combines indirect evidence on economic growth with new empirical estimates of the dynamic effects of temperature on GDP to argue that warming has persistent, but not permanent, effects on growth. We start by presenting a range of evidence that technology flows tether country growth rates together, preventing temperature changes from causing growth rates to diverge permanently. We then use data from a panel of countries to show that temperature shocks have large and persistent effects on GDP, driven in part by persistence in temperature itself. These estimates imply projected future impacts that are three to five times larger than level effect estimates and two to four times smaller than permanent growth effect estimates, with larger discrepancies for initially hot and cold countries.
Here is one key part of the intuition:
We present a range of evidence that global growth is tied together across countries, which suggests that country-specific shocks are unlikely to cause permanent changes in country-level growth rates…Relatedly, we find that differences in levels of income across countries persist strongly, while growth differences tend to be transitory.
Another way to make the point is that one’s model of the process should be consistent with a pre-carbon explosion model of income differences (have you ever seen those media articles about how heat from climate change supposedly is making us stupider, with no thought as to further possible implications of that result? Mood affiliation at work there, of course).
After the authors go through all of their final calculations, 3.7 degrees Centigrade of warming reduces global gdp by 7 to 12 percent by 2099, relative to no warming at all. For sub-Saharan Africa, gdp falls by 21 percent, but for Europe gdp rises by 0.6 percent, again by 2099.
The authors also work through just how sensitive the results are to what is a level effect and what is a growth effect. For instance, if a warmer Europe leads to a permanent growth-effect projection, Europe would see a near-doubling of income, compared to the no warming scenario. The reduction in African gdp would be 88 percent, not just 21 percent.
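To see why the level-versus-growth distinction swings the answers by an order of magnitude, here is a minimal sketch that compounds a hypothetical shock out to 2099. The inputs are illustrative only, not the paper’s estimates:

```python
# Illustrative only: a one-time "level" hit to GDP vs. the same-sized annual hit to the
# growth rate, compounded over ~75 years. Numbers are hypothetical, not Nath-Ramey-Klenow's.
years = 75                 # roughly 2024 to 2099
baseline_growth = 0.02     # assumed 2% annual growth with no warming
shock = 0.003              # assumed 0.3 percentage-point shock

no_warming = (1 + baseline_growth) ** years

# Level effect: GDP is permanently 0.3% lower, but the growth rate is unchanged.
level_effect = no_warming * (1 - shock)

# Growth effect: the growth rate is 0.3 percentage points lower every year.
growth_effect = (1 + baseline_growth - shock) ** years

print(f"loss from level effect:  {1 - level_effect / no_warming:.1%}")   # ~0.3%
print(f"loss from growth effect: {1 - growth_effect / no_warming:.1%}")  # ~20%
```

The point is mechanical: a growth-rate effect compounds every year, so even a small annual wedge dwarfs a one-time level shift by 2099.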
By the way, the authors suggest the growth bliss point for a country (have a guess!) is thirteen degrees Centigrade.
This paper has many distinct moving parts, and thus it is difficult to pin down what is exactly the right answer, a point the authors stress rather than try to hide. In any case it represents a major advance of thought in this very difficult area.
From Google DeepMind (it’s happening)
We’re presenting the first AI to solve International Mathematical Olympiad problems at a silver medalist level. It combines AlphaProof, a new breakthrough model for formal reasoning, and AlphaGeometry 2, an improved version of our previous system.
Here is further information.
From the NYT three days ago “A.I. Can Write Poetry, but It Struggles With Math.” From the NYT today: “Move Over, Mathematicians, Here Comes AlphaProof.” And here is one opinion: “This type of A.I. learns by itself and can scale indefinitely, said Dr. Silver, who is Google DeepMind’s vice-president of reinforcement learning.” Okie-dokie!
Not Lost In Translation: How Barbarian Books Laid the Foundation for Japan’s Industrial Revolution
Japan’s growth miracle after World War II is well known but that was Japan’s second miracle. The first was perhaps even more miraculous. At the end of the 19th century, under the Meiji Restoration, Japan transformed itself almost overnight from a peasant economy to an industrial powerhouse.
After centuries of resisting economic and social change, Japan transformed from a relatively poor, predominantly agricultural economy specialized in the exports of unprocessed, primary products to an economy specialized in the export of manufactures in under fifteen years.
In a remarkable new paper, Juhász, Sakabe, and Weinstein show how the key to this transformation was a massive effort to translate and codify technical information in the Japanese language. This state-led initiative made cutting-edge industrial knowledge accessible to Japanese entrepreneurs and workers in a way that was unparalleled among non-Western countries at the time.
Here’s an amazing graph which tells much of the story. In both 1870 and 1910, most of the technical knowledge of the world is in French, English, Italian, and German, but look at what happens in Japan: from basically no technical books in 1870 to on par with English by 1910. Moreover, no other country did this.
Translating a technical document today is much easier than in the past because the words already exist. Translating technical documents in the late 19th century, however, required the creation and standardization of entirely new words.
…the Institute of Barbarian Books (Bansho Torishirabesho)…was tasked with developing English-Japanese dictionaries to facilitate technical translations. This project was the first step in what would become a massive government effort to codify and absorb Western science. Linguists and lexicographers have written extensively on the difficulty of scientific translation, which explains why little codification of knowledge happened in languages other than English and its close cognates: French and German (c.f. Kokawa et al. 1994; Lippert 2001; Clark 2009). The linguistic problem was two-fold. First, no words existed in Japanese for canonical Industrial Revolution products such as the railroad, steam engine, or telegraph, and using phonetic representations of all untranslatable jargon in a technical book resulted in transliteration of the text, not translation. Second, translations needed to be standardized so that all translators would translate a given foreign word into the same Japanese one.
Solving these two problems became one of the Institute’s main objectives.
Here’s a graph showing the creation of new words in Japan by year. You can see the explosion in new words in the late 19th century. Note that this happened well after the Perry Mission. The words didn’t simply evolve; the authors argue new words were created as a form of industrial policy.
By the way, AstralCodexTen points us to an interesting biography of a translator at the time who works on economics books:
[Fukuzawa] makes great progress on a number of translations. Among them is the first Western economics book translated into Japanese. In the course of this work, he encounters difficulties with the concept of “competition.” He decides to coin a new Japanese word, kyoso, derived from the words for “race and fight.” His patron, a Confucian, is unimpressed with this translation. He suggests other renderings. Why not “love of the nation shown in connection with trade”? Or “open generosity from a merchant in times of national stress”? But Fukuzawa insists on kyoso, and now the word is the first result on Google Translate.
There is a lot more in this paper. In particular, it shows how the translation of documents led to productivity growth on an industry-by-industry basis and demonstrates the importance of this mechanism for economic growth across the world.
The bottom line for me is this: What caused the industrial revolution is a perennial question–was it coal, freedom, literacy?–but this is the first paper which gives what I think is a truly compelling answer for one particular case. Japan’s rapid industrialization under the Meiji Restoration was driven by its unprecedented effort to translate, codify, and disseminate Western technical knowledge in the Japanese language.
Disappearing polymorphs
Here’s a wild phenomenon I wasn’t previously aware of: In crystallography and materials science, a polymorph is a solid material that can exist in more than one crystal structure while maintaining the same chemical composition. Diamond and graphite are two polymorphs of carbon. Diamond is carbon crystallized with an isometric structure and graphite is carbon crystallized with a hexagonal structure. Now imagine that one day your spouse’s diamond ring turns to graphite! That’s unlikely with carbon, but it happens with other polymorphs when a metastable (only locally stable) version becomes seeded with a more stable version.
The drug ritonavir, originally used for AIDS (and also a component of the COVID medication Paxlovid), for example, was created in 1996, but by 1998 it could no longer be produced. Despite the best efforts of the manufacturer, Abbott, every time they tried to create the old ritonavir a new crystallized version (form II) was produced, which was not medically effective. The problem was that once form II exists it’s almost impossible to get rid of: microscopic particles of form II ritonavir seeded any attempt to create form I.
Form II was of sufficiently lower energy that it became impossible to produce Form I in any laboratory where Form II was introduced, even indirectly. Scientists who had been exposed to Form II in the past seemingly contaminated entire manufacturing plants by their presence, probably because they carried over microscopic seed crystals of the new polymorph.
Wikipedia continues:
In the 1963 novel Cat’s Cradle, by Kurt Vonnegut, the narrator learns about Ice-nine, an alternative structure of water that is solid at room temperature and acts as a seed crystal upon contact with ordinary liquid water, causing that liquid water to instantly freeze and transform into more Ice-nine. Later in the book, a character frozen in Ice-nine falls into the sea. Instantly, all the water in the world’s seas, rivers, and groundwater transforms into solid Ice-nine, leading to a climactic doomsday scenario.
Given the last point you will perhaps not be surprised to learn that the hat tip goes to Eliezer Yudkowsky who worries about such things.
Why isn’t there an economics of animal welfare field?
On Friday I was keynote speaker at a quite good Brown University conference on this topic. I, like some of the other people there, wondered why animal welfare does not have its own economics journal, own association, own JEL code, and own mini-field, much as cultural economics or defense economics developed several decades ago. How about its own blog or Twitter feed? You might even say there is a theorem of sorts: if an economics subfield can exist, it will. And so I think this subfield is indeed on the way. Perhaps it needs one or two name-recognized economists to publish a paper on the topic in a top five journal? Whoever writes such a breakthrough piece will be cited for a long time to come, even if many of those citations will not be in top-tier journals. Will that person be you?
I do understand there is plenty about animal welfare in ag econ journals and departments, but somehow, the way the world is tiered, that just doesn’t count. Yes that is unfair, but the point remains that this subfield remains an underexploited intellectual profit opportunity.
Addendum: Here is a new piece by Cass Sunstein.
The class gap in academic career progression
There is a new and excellent paper by Anna Stansbury and Kyra Rodriguez on this topic:
Unlike gender or race, class is rarely a focus of research or DEI efforts in elite US occupations. Should it be? In this paper, we document a large class gap in career progression in one labor market: US tenure-track academia. Using parental education to proxy for socioeconomic background, we compare career outcomes of people who got their PhDs in the same institution and field (excluding those with PhD parents). First-generation college graduates are 13% less likely to end up tenured at an R1, and are on average tenured at institutions ranked 9% lower, than their PhD classmates with a parent with a (non-PhD) graduate degree. We explore three sets of mechanisms: (1) research productivity, (2) networks, and (3) preferences. Research productivity can explain less than a third of the class gap, and preferences explain almost none. Our analyses of coauthor characteristics suggest networks likely play a role. Finally, examining PhDs who work in industry we find a class gap in pay and in managerial responsibilities which widens over the career. This means a class gap in career progression exists in other US occupations beyond academia.
Here is a first-rate tweet storm by Stansbury on the paper. Via Aidan Finley.
The intelligent chicken culture that is Canada
A British Columbia chicken earned a Guinness World Record by identifying different numbers, colors and letters.
Gabriola Island veterinarian Emily Carrington said she bought five hyline chickens last year to produce eggs, and she soon started training the hens to identify magnetic letters and numbers.
“Their job was to only peck the number or letter that I taught them to peck and ignore the other ones. Even if I add a whole bunch of other letters that aren’t the letter they are supposed to peck, they will just peck the letter that I trained them to peck,” Carrington told the Nanaimo News Bulletin.
Carrington decided to have all of her chickens attempt the Guinness World Records title for the most tricks by a chicken in one minute.
One of the chickens, Lacy, emerged as the clear winner of the flock, correctly identifying 6 letters, numbers and colors in one minute.
The focused nature of the tricks led Guinness World Records to create a new category for Lacy: the most identifications by a chicken in one minute.
Here is the full story, via the excellent Samir Varma.
How Many Workers Did It Take to Build the Great Pyramid of Giza?
The Great Pyramid of Giza was built circa 2600 BC and was the world’s tallest structure for nearly 4000 years. It consists of an estimated 2.3 million blocks with a weight on the order of 6-7 million tons. How many people did it take to construct the Great Pyramid? Vaclav Smil in Numbers Don’t Lie gives an interesting method of calculation:
The Great Pyramid’s potential energy (what is required to lift the mass above ground level) is about 2.4 trillion joules. Calculating this is fairly easy: it is simply the product of the acceleration due to gravity, the pyramid’s mass, and its center of mass (a quarter of its height)…I am assuming a mean of 2.6 tons per cubic meter and hence a total mass of about 6.75 million tons.
People are able to convert about 20 percent of food energy into useful work, and for hard-working men that amounts to about 440 kilojoules a day. Lifting the stones would thus require about 5.5 million labor days (2.4 trillion/440,000), or about 275,000 days a year during [a] 20 year period, and about 900 people could deliver that by working 10 hours a day for 300 days a year. A similar number might be needed to emplace the stones in the rising structure and then smooth the cladding blocks…And in order to cut 2.6 million cubic meters of stone in 20 years, the project would have required about 1,500 quarrymen working 300 days per year and producing 0.25 cubic meters of stone per capita daily…the grand total would then be some 3,300 workers. Even if we were to double that in order to account for designers, organizers and overseers etc. etc….the total would be still fewer than 7,000 workers.
…During the time of the pyramid’s construction, the total population of Egypt was 1.5-1.6 million people, and hence the deployed force of less than 10,000 would not have amounted to any extraordinary imposition on the country’s economy.
I was surprised at the low number and pleased at the unusual method of calculation. Archeological evidence from the nearby workers’ village suggests 4,000-5,000 on-site workers, not including the quarrymen, transporters, designers, and support staff. Thus, Smil’s calculation looks very good.
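Smil’s arithmetic is easy to reproduce. Here is a minimal sketch using the figures he quotes above; the pyramid’s original height of roughly 147 meters is the only number I have supplied:

```python
# Reproducing Vaclav Smil's back-of-the-envelope calculation with the figures quoted above.
g = 9.8                       # gravitational acceleration, m/s^2
mass = 6.75e9                 # kg, i.e. ~6.75 million tons
height = 146.6                # m, original height of the Great Pyramid (supplied, not in Smil's excerpt)
center_of_mass = height / 4   # m; the centroid of a solid uniform pyramid sits at a quarter of its height

potential_energy = mass * g * center_of_mass        # ~2.4e12 joules
useful_work_per_day = 440_000                        # joules of useful work per laborer per day (Smil)

labor_days = potential_energy / useful_work_per_day  # ~5.5 million labor-days
days_per_year = labor_days / 20                      # ~275,000 labor-days per year over 20 years
lifters = days_per_year / 300                        # ~900 workers at 300 working days per year

print(f"PE ~ {potential_energy:.2e} J, labor-days ~ {labor_days/1e6:.1f} million, lifters ~ {lifters:.0f}")
```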
What other unusual calculations do you know?

Nuclear is Not Best Everywhere
Australia is having a debate over nuclear power. Hamilton and Heeney weigh in with an important perspective:
On the basis of many conversations about Australian energy policy over the years, we can divide the proponents of nuclear energy into three groups.
The first might be called the “ideologues”. They favour nuclear not because of its zero emissions, but despite it. Indeed, many are climate sceptics. They hate renewables because the left loves them, and they favour nuclear because the left hates it.
The second might be called the “engineers”. They favour nuclear energy because it’s cool. Like a Ferrari, they marvel at its performance and stability. They see it as the energy source of the future. The stuff of science fiction.
The third might be called the “pragmatists”. They are not super attentive or highly informed about the intricacies of energy policy. They superficially believe nuclear can serve as a common-sense antidote to the practical shortcomings of renewables.
Conspicuously absent are those who might be called the “economists”. They couldn’t care less about exactly how electrons are produced. They simply want the cheapest possible energy that meets a minimum standard of reliability and emissions.
On the basis of the economics, Hamilton and Heeney conclude that nuclear is expensive for Australia:
The CSIRO estimates the cost of 90 per cent renewables, with firming, transmission, and integration costs included, at $109 per megawatt hour. Based on South Korean costs (roughly one-third of the US and Europe), a 60-year lifespan, a 60 per cent economic utilisation rate (as per coal today), and an eight-year build time (as per the global average), nuclear would cost $200 per megawatt hour – nearly double.
The same electrons delivered with the same reliability, just twice as expensive under what is a fairly optimistic scenario.
Note that this takes into account that nuclear is available when the sun doesn’t shine and the winds don’t blow; so are batteries.
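For intuition on how assumptions like a 60-year lifespan and a 60 per cent utilisation rate feed into a per-megawatt-hour figure, here is a minimal levelized-cost sketch. All the dollar inputs are hypothetical placeholders, not CSIRO’s or Hamilton and Heeney’s numbers:

```python
# Minimal levelized-cost-of-electricity (LCOE) sketch. The dollar inputs are hypothetical
# placeholders chosen to illustrate the mechanics, not CSIRO or Hamilton-Heeney figures.
capacity_mw = 1_000        # plant size in MW (assumption)
overnight_cost = 10_000e3  # $ per MW of capacity, i.e. $10,000/kW (hypothetical)
fixed_om = 120e3           # $ per MW per year, fixed operations and maintenance (hypothetical)
fuel_variable = 10         # $ per MWh, fuel plus variable costs (hypothetical)
lifetime_years = 60        # as in the quoted scenario
capacity_factor = 0.60     # the "economic utilisation rate" in the quoted scenario
discount_rate = 0.06       # real discount rate (assumption)

# Capital recovery factor: converts the upfront cost into an equivalent annual payment.
crf = discount_rate * (1 + discount_rate) ** lifetime_years / ((1 + discount_rate) ** lifetime_years - 1)

annual_mwh = capacity_mw * 8760 * capacity_factor
annual_cost = capacity_mw * overnight_cost * crf + capacity_mw * fixed_om + fuel_variable * annual_mwh

print(f"LCOE ~ ${annual_cost / annual_mwh:.0f}/MWh")
```

The output is extremely sensitive to the assumed capital cost and discount rate, which is one reason published nuclear cost estimates vary so widely.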
I suspect that Hamilton and Heeney are right on the numbers but it’s this argument that I find most compelling:
If you need external validation of these basic economics, look no further than the opposition’s own announcement. Rather than lift the moratorium and allow private firms to supply nuclear energy if it’s commercially viable, the opposition has opted for government to be the owner and operator. A smoking gun of economic unviability if ever there were one.
I am optimistic about the potential of small modular reactors (SMRs) based on innovative designs. These reactors can ideally be located near AI facilities. As I argued in the Marginal Revolution Theory of Innovation, innovation is a dynamic process; success rarely comes on the first attempt. The key to innovation is continuous refinement and improvement. These small reactors based on different technologies give us an opportunity to refine and improve. To achieve this, we must overhaul our regulatory framework, which has disproportionately burdened nuclear energy—our greenest power source—with excessive regulation compared to more hazardous and less environmentally friendly technologies.
Electrons are electrons. We should allow all electricity generation technologies to compete in the market on an equal footing. Let the best technologies win.