
Will there be a successful application of diffusion-like weight modification in LLMs before 2027?
50% chance
This market resolves YES if, by 1/1/2027, a peer-reviewed research paper or a major tech company publication (e.g., from Google, Meta, X.AI, OpenAI, or Anthropic) demonstrates a successful application of diffusion-like techniques (noising and trained denoising) that directly modifies LLM weights in a way that is useful for some safety- or capabilities-relevant downstream purpose. Resolution will be based on papers indexed on arXiv.org or published on major AI research blogs.
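The market text only names the general recipe: noise a model's weights, then apply a trained denoiser to them. As a concrete illustration of that recipe, here is a minimal toy sketch in PyTorch. Everything in it, including the flattened-weight stand-in, the MLP denoiser, the simple variance-preserving noise schedule, and all hyperparameters, is an illustrative assumption and is not drawn from any paper this market would resolve on.

```python
# Toy sketch of "diffusion-like weight modification": noise a flattened
# weight vector and train a small denoiser to recover it. All names and
# hyperparameters are illustrative assumptions, not from the market text.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for a (tiny) model's flattened weights.
dim = 256
clean_weights = torch.randn(dim)

# A toy denoiser conditioned on the noise level t.
denoiser = nn.Sequential(
    nn.Linear(dim + 1, 512), nn.SiLU(), nn.Linear(512, dim)
)
opt = torch.optim.Adam(denoiser.parameters(), lr=1e-3)

for step in range(2000):
    # Forward (noising) process under a simple variance-preserving schedule:
    # x_t = sqrt(1 - t) * x_0 + sqrt(t) * eps.
    t = torch.rand(1)
    eps = torch.randn(dim)
    noisy = (1 - t).sqrt() * clean_weights + t.sqrt() * eps

    # Train the denoiser to predict the clean weights from the noisy ones.
    pred = denoiser(torch.cat([noisy, t]))
    loss = (pred - clean_weights).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, the denoiser maps a perturbed weight vector back toward
# the clean one -- the noising/trained-denoising loop the market describes.
with torch.no_grad():
    t = torch.tensor([0.3])
    perturbed = (1 - t).sqrt() * clean_weights + t.sqrt() * torch.randn(dim)
    restored = denoiser(torch.cat([perturbed, t]))
    print("error before:", (perturbed - clean_weights).norm().item())
    print("error after: ", (restored - clean_weights).norm().item())
```

This sketch parameterizes the denoiser to predict the clean weights directly; predicting the added noise instead is a common alternative in the diffusion literature. A result that would plausibly resolve this market YES would need to go beyond such a toy, e.g. by showing the denoised weights measurably improve a safety- or capabilities-relevant metric on a real LLM.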
Related questions
Will researchers extract a novel program from the weights of an LLM into a Procedural/OO programming language by 2026?
27% chance
Will RL work for LLMs "spill over" to the rest of RL by 2026?
40% chance
Will LLMs be able to formally verify non-trivial programs by the end of 2025?
31% chance
Will the best LLM in 2027 have <1 trillion parameters?
26% chance
OpenAI to release model weights by EOY?
68% chance
Will the best LLM in 2025 have <1 trillion parameters?
38% chance
Will the best LLM in 2026 have <1 trillion parameters?
40% chance
Will top open-weight LLMs in 2025 reason opaquely?
37% chance
By 2028 will we be able to identify distinct submodules/algorithms within LLMs?
76% chance
Will the best LLM in 2025 have <500 billion parameters?
24% chance