
Will we find tree search in a vanilla transformer trained on chess games by 2026?
22% chance
Inspired by https://adamkarvonen.github.io/machine_learning/2024/01/03/chess-world-models.html
I won't bet in this market.
https://arxiv.org/abs/2402.04494
TLDR:
... we train a 270M parameter transformer model [on chess]
Lichess blitz Elo of 2895 against humans. We also show that our model outperforms AlphaZero's policy and value networks (**without MCTS**) and GPT-3.5-turbo-instruct.
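
For intuition on what "without MCTS" means here, below is a minimal sketch (not code from the paper) of searchless play: score every legal move with a learned value model and pick the argmax, with no lookahead or tree search. `score_position` is a hypothetical stand-in for the trained transformer's value head; the board handling uses the real python-chess API.

```python
# Minimal sketch of searchless (no-MCTS) chess play with a learned value model.
# score_position() is a hypothetical placeholder for the transformer described
# in the paper; only the python-chess calls here are real APIs.
import chess


def score_position(fen: str) -> float:
    """Hypothetical: return the model's value estimate (e.g. win probability)
    for the side to move in the given FEN. A real implementation would
    tokenize the FEN and run the trained transformer."""
    raise NotImplementedError


def pick_move(board: chess.Board) -> chess.Move:
    """Choose a move by one-ply evaluation only: no search tree is built."""
    best_move, best_value = None, float("-inf")
    for move in board.legal_moves:
        board.push(move)
        # After pushing, it is the opponent's turn, so negate their value.
        value = -score_position(board.fen())
        board.pop()
        if value > best_value:
            best_move, best_value = move, value
    return best_move
```

The market question is whether a model trained this way ends up implementing something like tree search internally, even though nothing like the explicit loop above is ever given a search procedure.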
Related questions
Before Feb 2026, will a transformer based reasoning model >1800 elo be able to explain 3+ chess lines at any position?
53% chance
Will Transformer based architectures still be SOTA for language modelling by 2026?
79% chance
Will chess be solved by 2100?
30% chance
Will chess be solved by 2040?
19% chance
Will AI be able to describe the state of a chessboard from an image before 2026?
95% chance
Will we get "chess AGI" by 2030?
80% chance
By the start of 2026, will I still think that transformers are the main architecture for tasks related to natural language processing?
68% chance
Will we have AI that can explain chess moves logic in human language by 2030?
94% chance
Will AI be able to generate correct images of a chess game in 2024?
9% chance
Will a Language Model under 10B parameters play chess at Grandmaster level by 2050?
63% chance