One look at our future

The following comes from a deleted tweet, so I have paraphrased in some places and also removed references to a few specific individuals.  If the author wishes to reclaim ownership of these ideas, I will write a separate blog post crediting him:

1. AGI and fusion are what matter.  Fusion accelerates AGI, since cheap energy means cheap compute, and AGI is just exaflops spent on training.

2. GPT-3.5 (ChatGPT) is civilization-altering.  GPT-4, which is 10x better, will be launched in the second quarter of next year.

3. Google is worried, but Microsoft is all-in, and is building many more data centers to lead the charge.  Bing search is getting GPT integration next year.

4. Model configuration and training parameters don’t matter.  Intelligence is just GPU exaflops spent on training.

5. If #4 is true, civilization becomes a decentralized crypto network, where contributors supply computers for training and earn tokens, and querying the model costs tokens.

6. One last centralizing force: gradient descent is synchronous.  It needs tight GPU coordination and fast network bandwidth.  The current trend is toward centralization, with Microsoft building 10x-bigger data centers for OpenAI.

7. Gradient descent is the process of error correction: an N-layer model predicts an output, and when that prediction is far from the target, we correct all the layer weights slightly to re-aim it (a toy sketch follows after the list).

8. Some other batch of stuff I didn’t understand and cannot paraphrase.

9. Many configurations work about equally well.  If you throw the same GPU exaflops at the model, they perform more or less the same.  That is probably why the brain was evolutionarily easy to invent.  OpenAI is at roughly 10 exaflops right now vs. 1,000 for the human brain, and that will probably equalize in five years (a quick arithmetic check follows after the list).

10. Models are already so good that only expert training matters anymore.  Copilot-for-X is in play.  Anyone building a Copilot for my browser?  Browser content is largely text, which GPT understands well.

There you go!  Speculative, of course.
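
Since #7 is the one concrete mechanism in the list, here is a minimal toy sketch of that error-correction loop in plain NumPy.  The two-layer network, the random data, and the learning rate are my own illustrative choices, not anything from the tweet.

```python
import numpy as np

# Toy two-layer network trained by gradient descent (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 8))            # 64 examples, 8 input features
y = rng.normal(size=(64, 1))            # made-up regression targets

W1 = rng.normal(size=(8, 16)) * 0.1     # layer 1 weights
W2 = rng.normal(size=(16, 1)) * 0.1     # layer 2 weights
lr = 0.05                               # how "slightly" we re-aim each step

for step in range(200):
    # Forward pass: the N-layer model predicts an output.
    h = np.tanh(X @ W1)                 # hidden layer
    pred = h @ W2                       # prediction
    err = pred - y                      # how far we are from the target
    loss = float(np.mean(err ** 2))

    # Backward pass: measure how each weight contributed to the error...
    grad_W2 = h.T @ err / len(X)
    grad_h = err @ W2.T * (1 - h ** 2)  # tanh derivative
    grad_W1 = X.T @ grad_h / len(X)

    # ...and correct all the layer weights slightly to re-aim.
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2

    if step % 50 == 0:
        print(f"step {step}: loss {loss:.4f}")
```

The key point for #6 is that every update step uses gradients computed across the whole batch, which is why large-scale training wants the GPUs tightly coordinated.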
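
And a quick arithmetic check on #9, taking the tweet's figures (10 exaflops now, 1,000 for the brain) at face value; both numbers are the tweet's, not mine.

```python
# Back-of-the-envelope check of #9 (both exaflop figures come from the tweet).
openai_now = 10        # exaflops today, per the tweet
brain = 1000           # exaflops for the human brain, per the tweet
years = 5

annual_growth = (brain / openai_now) ** (1 / years)
print(f"Required growth: {annual_growth:.2f}x per year")   # about 2.5x per year
```

So "equalize in five years" amounts to compute growing roughly 2.5x every year, which at least is not crazy by recent standards.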
