I read your piece on novels as models many years ago, and I’ve been reflecting on it with the advent of LLMs. I wrote a piece (Substack) making the case that the data required for AGIs is probably embedded within the human textual corpus, and leaned on your old writing as evidence. I think you would really like it. I would also be curious, for a future MR post, whether you have any retrospective thoughts on your 2005 article.
That is from @cauchyfriend. Putting the AGI issue aside, my longstanding view has been that there are more “models” embedded in text than most people realize, a point relevant for economic method as well. I see LLMs as having established this case, in fact far more definitively than I ever would have expected.
Big takeaway from the GPT paradigm is that the world of text is a far more complete description of the human experience than almost anyone anticipated.
— Greg Brockman (@gdb) January 6, 2023