
A major ML paper demonstrates a symbolic-enhanced transformer successor outperforming standard transformers by March 2025
Ṁ2436 · Mar 31
16% chance
Will a published machine learning paper demonstrate a new architecture that combines transformers with symbolic methods (category theory, programming language theory, or logic theory) and achieves superior performance on standard benchmarks compared to traditional transformer-only architectures?
Resolution criteria: The paper must be published on arXiv (preferably also at a major ML conference or journal) and must show statistically significant improvements over baseline transformers on multiple standard tasks.
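As one illustration of how "statistically significant improvement" could be operationalized for resolution, here is a minimal paired-bootstrap sketch in Python. The score arrays, resample count, and 0.05 threshold are illustrative assumptions, not part of the market's criteria; a resolving paper could use any standard significance test.

```python
# A minimal sketch of a paired bootstrap test over per-example benchmark
# scores. All concrete numbers below are hypothetical examples.
import numpy as np

def paired_bootstrap_pvalue(baseline, candidate, n_resamples=10_000, seed=0):
    """Estimate the probability that the candidate fails to beat the
    baseline on average, under resampling of paired per-example scores."""
    rng = np.random.default_rng(seed)
    diffs = np.asarray(candidate) - np.asarray(baseline)  # per-example deltas
    n = len(diffs)
    # Resample example indices with replacement and compute the mean delta
    # for each resample; the p-value is the fraction of resamples where the
    # candidate does not improve on the baseline.
    means = diffs[rng.integers(0, n, size=(n_resamples, n))].mean(axis=1)
    return float((means <= 0).mean())

# Hypothetical 0/1 correctness on one benchmark:
baseline_scores = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 1])
candidate_scores = np.array([1, 1, 1, 1, 0, 1, 1, 1, 1, 1])
p = paired_bootstrap_pvalue(baseline_scores, candidate_scores)
print(f"p-value: {p:.3f}  (significant at 0.05: {p < 0.05})")
```

Per the criteria above, such a test would need to show significance on multiple standard tasks, not just one.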
Related questions
Will an AI achieve >85% performance on the FrontierMath benchmark before 2028?
81% chance
On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position on most benchmarks
80% chance
Will an AI achieve >80% performance on the FrontierMath benchmark before 2027?
82% chance
If OpenAI makes a transformer sized advancement in the next 5 years, will they publish an accompanying paper?
45% chance
Will Transformer based architectures still be SOTA for language modelling by 2026?
79% chance
Will a big transformer LM compose these facts without chain of thought by 2026?
64% chance
Will superposition in transformers be mostly solved by 2026?
73% chance
Will transformers still be the dominant DL architecture in 2026?
80% chance
Will the most capable, public multimodal model at the end of 2027 in my judgement use a transformer-like architecture?
55% chance
Will a big transformer LM compose these facts without chain of thought by 2026? (harder question version)
43% chance