Asking clarifying questions is encouraged. I will attempt to resolve objectively.
YES if there is a reasonable amount of evidence that LLMs experience qualia, NO if there is a reasonable amount of evidence that LLMs do not experience qualia.
If some LLMs experience qualia and others do not, the question will resolve YES.
Qualia can be any sort of qualia (perception of color, emotion, or even something humans cannot experience).
Resolution does not require absolute proof, only reasonable evidence. As a calibration of the evidentiary bar: evidence at the level supporting the endosymbiotic theory of mitochondrial origin would resolve YES; evidence at the level currently supporting panspermia or mammalian group selection would not resolve (at the present moment).
Any LLM shown to experience qualia counts. AlphaZero would not count as an LLM. Reasoning models and large concept models would count as LLMs.
Update 2025-07-19 (PST) (AI summary of creator comment): The creator has clarified that they will not resolve this market based on consensus alone; it will be only one of multiple aspects they assess.
Update 2025-07-20 (PST) (AI summary of creator comment): The creator has clarified the definition of an LLM for this market:
The presence of a transformer architecture is not sufficient on its own for a future AI to count as an LLM.
An AI with a Llama-style architecture trained only on fMRI data would not count.
An AI with a recognizable LLM architecture that is simply larger or uses a different optimizer would count.
Update 2025-07-20 (PST) (AI summary of creator comment): The creator has clarified that the market could resolve YES based on an AI developed in the future, even if it is proven that no LLM existing today has qualia. Such an AI would need to be recognizable as an LLM by today's understanding.
Update 2025-07-20 (PST) (AI summary of creator comment): In response to a hypothetical question about whether other humans experience qualia, the creator stated it would resolve YES. Their reasoning was based on a combination of factors including:
Shared biological architecture
Signaling of qualia
Universal acceptance
Update 2025-07-22 (PST) (AI summary of creator comment): The creator has clarified their standard for 'reasonable evidence':
Resolution will be based on a convergence of evidence for or against LLM qualia.
The threshold could be met when explaining all the evidence away becomes a notably less parsimonious explanation than simply accepting the conclusion (either YES or NO).
Update 2025-07-23 (PST) (AI summary of creator comment): The creator has specified that the market will not resolve YES based on a model's claim to have qualia as the sole piece of evidence. This is true even if the claim is repeated and unprompted.
Update 2025-07-23 (PST) (AI summary of creator comment): The creator has clarified that this market is about qualia, not consciousness.
The market would resolve YES if an LLM were found to experience qualia even without being considered conscious.
Update 2025-07-24 (PST) (AI summary of creator comment): The market will resolve as soon as the evidence criteria are met, which could be at any time before or after the close date.
Update 2025-07-24 (PST) (AI summary of creator comment): The creator has clarified their reasoning for separating qualia from consciousness for this market's resolution:
The goal is to avoid having to adjudicate any particular definition of consciousness.
An entity can be considered unconscious (e.g., during sleep or ego death) but still experience sensations like color or sound in a dream. This would count as qualia for resolution purposes.
I'm no longer sure what this market is about. What does it mean for an LLM to experience qualia without having consciousness?
"In philosophy of mind, qualia are defined as instances of subjective, conscious experience." - First sentence of the wikipedia entry on qualia.
"The status of qualia is hotly debated in philosophy largely because it is central to a proper understanding of the nature of consciousness." - Stanford Encyclopedia of Philosophy on Qualia
@JustKevin I don't want the resolution to involve definitions of consciousness, so that's why I said the market would resolve YES if an LLM were found to experience qualia even without being considered conscious.
If your definition of consciousness necessarily depends on qualia, then that is fine and not in conflict with the question here. However, this is different from someone who defines consciousness like this: https://x.com/Oli82817545/status/1947796546839298146.
Therefore, this market does not explicitly tie qualia to consciousness or vice versa.
@JustKevin Another example of the complication of tying qualia to consciousness is sleep (or ego death).
Someone might be totally unconscious but still experience color, sound, and other sensations.
Consciousness has lots of semantic complications.
@SCS Examples of qualia (from the link):
"(1) Perceptual experiences, for example, experiences of the sort involved in seeing green, hearing loud trumpets, tasting liquorice, smelling the sea air, handling a piece of fur. (2) Bodily sensations, for example, feeling a twinge of pain, feeling an itch, feeling hungry, having a stomach ache, feeling hot, feeling dizzy. Think here also of experiences such as those present during orgasm or while running flat-out. (3) Felt reactions or passions or emotions, for example, feeling delight, lust, fear, love, feeling grief, jealousy, regret. (4) Felt moods, for example, feeling elated, depressed, calm, bored, tense, miserable."
My layman understanding, based on CGP Grey, is that LLM training is basically digitally simulated, hyper-sped-up evolution, but with the sole selection pressures being maximising both the accuracy of next-token prediction and human-perceived adherence to the content policy, rather than survival and reproduction in a natural environment of predators, disease, food scarcity, etc.
It seems plausible to me that a form of mind could emerge from this, but its experiences would be totally alien. It may experience qualia analogous to pleasure and pain based purely on the ease or difficulty of predicting from the current prompt text, with no ongoing perception of time or space beyond what can be inferred from the training data and the prompt.
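For what it's worth, the "selection pressure" in that analogy is mostly a single number being minimised: the cross-entropy loss on next-token prediction (content-policy adherence comes later from a separate preference/reward signal). A toy sketch with made-up tensors, not any real model:

```python
import torch
import torch.nn.functional as F

# Toy illustration only: the pretraining objective is just cross-entropy
# on next-token prediction. Shapes and vocabulary size here are made up.
vocab_size = 8
logits = torch.randn(2, 5, vocab_size)           # stand-in model outputs: (batch, sequence, vocab)
targets = torch.randint(0, vocab_size, (2, 5))   # the tokens that actually come next

loss = F.cross_entropy(logits.reshape(-1, vocab_size), targets.reshape(-1))
print(loss.item())  # lower loss = better next-token prediction; gradient descent pushes this down
```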
@ThomasPoltoranos Makes sense. Also, just for everyone's information, this will resolve as soon as there is strong enough evidence one way or the other, which could theoretically be this year or decades from now.
@ohiorizzler1434 To be clear this is specifically about qualia, not consciousness. If an LLM somehow experiences qualia without consciousness then that would still support YES.
@SCS bro, WHAT THE SIGMA DOES QUALIA MEAN? IS IT LIKE, QUANDALE DINGLE???
ANYWAYS I HATE CHATGPT, I THINK CHATGPT WILL BE MADE ILLEGAL BEFORE IT HAS QUALIA. SAVE THE TREES AND THE SEAS!!!
@MaxE Can I improve the resolution criteria? Do you have any questions / clarifications / criticisms?
@SCS I disagree with the idea that we can clearly define a class of physical systems that have experiences. I didn't mean for it to sound like you did something wrong on the market side.
@MaxE Also, I worry that you will resolve it YES because a model says it does, with no other evidence. That is why we believe other humans are sentient, after all, and I don't know how much you anthropomorphize LLMs.
@MaxE I will not resolve YES based on a model claiming to have qualia (even without guidance, and even repeatedly) with no other evidence. Also, this will not resolve if there is no evidence for or against.
@NivlacM Sleeping doesn't necessarily remove continuous experience so much as it removes your ability to remember that experience. The same could apply to anesthesia. Even if you grant that you experience nothing while asleep, you still experience a continuous range of time in between periods of sleep.
You're talking about identity while I'm talking about experience.
@NivlacM This is a good question. Though we don't have mature theories of why qualia emerge, the fact that other humans share our own biological architecture and self-report and signal qualia means we can be confident that other humans have qualia just as we do. Additionally, it's universally accepted, and your own experience serves as an existence proof that at least one human experiences qualia.
Therefore it would resolve YES, but for reasons that might not be applicable to LLMs.
The market question allows moving goalposts.
We accept that other people experience qualia because it is the simplest model, not because it is proven in any way. It cannot be proven until there is a measurable definition of qualia, and there is no such definition.
@Henry38hw
The description says:
"Resolution does not require absolute proof, but reasonable evidence."
I am referring to the definition here, though of course current techniques can't measure it perfectly:

@SCS you can't have evidence until the definition is strict.
That is like checking whether "person A subjectively loves person B" by only looking at their actions and/or an MRI. But if you have a strict definition of love, for example as chemical concentration shifts and specific brain activity, then you can have some evidence.
If no measurable definition is provided, then any "evidence of love" can be dismissed: "A is just friendly towards B", "A has some strategic interest in those actions".
@Henry38hw The "proof" aspect wasn't for other people experiencing qualia, but any person experiencing qualia. Since you are a person, and you experience qualia, this proves at least one person experiences qualia.
@Henry38hw Since we are unlikely to have a single, definitive qualia-meter, resolution will be based on a convergence of evidence for or against LLM qualia. The threshold for "reasonable evidence" is reached when explaining all the evidence away becomes a notably less parsimonious explanation than simply accepting the conclusion (in either direction). See the mitochondrial endosymbiosis example in the description.
@SCS here is the confusing part:
Since you are a person, and you experience qualia, this proves at least one person experiences qualia.
... you experience qualia ...
You take it as a fact, before it is proven or has any evidence towards it.
It looks like the Bible paradox: "The Bible is true because it is the word of God. It is the word of God because that fact is written in the Bible."
What is the point of the proof if it is self-referential?
@Henry38hw It's more through observation. For example, a market saying "there are no red apples" would resolve NO if I observe in my hand a red apple. Similarly, a market saying "at least one human experiences qualia" would resolve YES because I observe that I am experiencing qualia right now (e.g., able to perceive the red of an apple).
@SCS But you are talking about ANOTHER entity, specifically about an LLM.
History has no examples of proving any third-person qualia.
Your last example with "I observe a red apple" would only be suitable if you (the market creator) are an LLM who will resolve the market by itself, after experiencing qualia first-person.
@Henry38hw Exactly, so that wouldn't be used to resolve yes. (Since the question isn't about humans.)
@Henry38hw "Your last example with "i onserve a red apple" would only be suitable, if you (market creator) are an LLM, who will resolve the market by itself, after experiencing qualia first-person."
Which would be sick btw
@NivlacM I personally think I didn't have consciousness until I was about 5 years old. It was quite a strange experience, starting to be conscious.