Will transformers still be the dominant DL architecture in 2026?
58% chance
Resolves true if I judge, based on the prevailing opinion among deep learning researchers, that transformers remain the most popular architecture in deep learning research at the start of 2026. If the answer is not clear, resolves true if at least 50% of 2025 arXiv papers on AI mention transformers, and false otherwise.
Apparently there is already a very similar market!
https://manifold.markets/LeoGao/will-transformer-based-architecture
Related questions
Will Transformer based architectures still be SOTA for language modelling by 2026?
69% chance
Will superposition in transformers be mostly solved by 2026?
62% chance
On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position on most benchmarks
59% chance
Will the most capable, public multimodal model at the end of 2027 in my judgement use a transformer-like architecture?
55% chance
6) An alternative to the transformer architecture will see meaningful adoption.
69% chance
Will a big transformer LM compose these facts without chain of thought by 2026?
75% chance
Will there be over 1000 Optimus robots working at Tesla before 2026?
35% chance
Will a big transformer LM compose these facts without chain of thought by 2026? (harder question version)
53% chance
By EOY 2026, will it seem as if deep learning hit a wall by EOY 2025?
23% chance
When will a non-Transformer model become the top open source LLM?