MANIFOLD
Will Aidan McLau's claim that very large models are "refusing instruction tuning" be validated by 2030?
42% chance · Closes 2030 · Ṁ55

https://x.com/aidan_mclau/status/1859444783850156258

According to Aidan McLau, the reason very large models are not being released is that they are resisting instruction tuning. Resolves YES if a current or former AI researcher at Google, OpenAI, Anthropic, or Meta validates this claim, or if it is confirmed independently by research.

#Technology
#AI
#OpenAI
#Technical AI Timelines
#LLMs

Related questions

AI: Will someone train a $10B model by 2030? · 85% chance
AI: Will someone train a $1T model by 2030? · 25% chance
AI: Will someone train a $1T model by 2050? · 81% chance
AI: Will someone train a $100B model by 2050? · 80% chance
Limits on AI model size by 2026? · 15% chance
AI: Will someone train a $10T model by 2100? · 59% chance
AI: Will someone train a $10B model by 2050? · 87% chance
AI: Will someone train a $1T model by 2080? · 69% chance
AI: Will someone train a $1B model by 2030? · 89% chance
Will a model costing >$30M be intentionally trained to be more mechanistically interpretable by end of 2027? (see desc) · 60% chance
