
How many non-Transformer based models will be in the top 10 on HuggingFace Leaderboard in the 7B range by July?
Ṁ769 · closes Jul 2
0: 39%
1-2: 32%
3-5: 24%
6+: 4%
For resolution, I’ll go to the HuggingFace leaderboard, select the ~7B size filter, and uncheck everything else.
I’ll refrain from participating in the market to stay neutral in case a hybrid architecture comes up. I’d count both Mamba and StripedHyena as non-Transformers.
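The resolution rule above maps a count of non-Transformer models in the top 10 to one of the four market buckets. A minimal sketch of that tally, assuming hypothetical architecture labels (the family names and sample entries here are invented for illustration, not real leaderboard data):

```python
# Hypothetical sketch of the resolution tally: given architecture labels for
# the top-10 ~7B models, count non-Transformer entries and map the count to a
# market bucket. The family set below is an assumption for illustration.

NON_TRANSFORMER = {"mamba", "stripedhyena", "rwkv"}  # assumed non-Transformer families

def resolve_bucket(top10_architectures):
    """Return the market bucket for a list of ten architecture labels."""
    n = sum(arch.lower() in NON_TRANSFORMER for arch in top10_architectures)
    if n == 0:
        return "0"
    if n <= 2:
        return "1-2"
    if n <= 5:
        return "3-5"
    return "6+"

# Made-up example: nine Transformer models and one Mamba model.
sample = ["transformer"] * 9 + ["mamba"]
print(resolve_bucket(sample))  # -> "1-2"
```

Hybrid architectures (the case the author stays neutral on) would need a judgment call before being assigned to either set.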
@HanchiSun I’d say sliding window is a type of attention. I’d consider LongFormers as a type of Transformer.
@HanchiSun Out of curiosity, would you bet differently if it was for the 3B category rather than 7B?
@KLiamSmith Good question. It is definitely harder to experiment with 7B than 3B, but even for 3B, I doubt more than two non-attention architectures will be better.
Related questions
Will China have a model in the top 10 on LMSYS Chatbot Arena on March 1, 2025?
98% chance
Who will have the highest ranking model on web.lmarena.ai by March 2025?
Will the top model by OpenAI rank 3rd (or lower) behind 2 other model families at any point before 2026?
41% chance
Will Transformer based architectures still be SOTA for language modelling by 2026?
79% chance
Will any model get above human level on the Simple Bench benchmark before September 1st, 2025?
68% chance
On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position on most benchmarks
80% chance
When will a non-Transformer model become the top open source LLM?
Will Mistral's next model make it to the top 10 models in LLM Arena by the end of 2025?
45% chance
Who will have the highest ranking model on web.lmarena.ai by end of June 2025?
Number of public models on 🤗Hugging Face by EOY 2024