If we find out in 2024, was o1's Transformer base trained on 10+x as much compute as GPT-4's?
Ṁ50 · Jan 2
48% chance
Related questions
Will an open source model beat GPT-4 in 2024?
49% chance
Will there be an LLM (as good as GPT-4) that was trained with 1/100th of the energy consumed to train GPT-4, by 2026?
53% chance
Will xAI release an LLM with BIG-Bench score as good as GPT-4 Turbo before the end of 2024?
70% chance
Will there be an OpenAI LLM known as GPT-4.5, by 2033?
35% chance
Will GPT-4 be trained (roughly) compute-optimally using the best-known scaling laws at the time?
30% chance
Will GPT-4 be trained on more than 10T text tokens?
36% chance
How much compute will be used to train GPT-5?
In yottaFLOPs (10^24), how much compute will GPT-4 be trained with?
22
Will Inflection AI have a model that is 10x the size of the original GPT-4 at the end of Q1 2025?
26% chance
Will it be possible to disentangle most of the features learned by a model comparable to GPT-4 this decade?
39% chance