Consciousness and Subjective Experience
Internal Experience
Inverted Spectrum
The Inverted Spectrum thought experiment imagines a person who sees green where others see red, and red where others see green. Neither the person nor anyone else would be able to tell the difference.
This problem brings up a key concept: Qualia (subjective experience).
The colors that humans perceive involve two levels:
- The physical level, including the wavelength of light, retinal cone signals, and neural signal transmission
- The psychological level, including "the feeling of seeing red" and "the feeling of seeing green"
Let us organize the inverted spectrum experiment as follows:
| Stimulus | A's Experience | B's Experience |
|---|---|---|
| Leaves | Green | Red |
| Blood | Red | Green |
However, due to language learning:
- Both people call "leaves" "green"
- Both call "blood" "red"
| Stimulus | A's Experience | B's Experience | Word Both Use |
|---|---|---|---|
| Leaves | Green | Red | Green |
| Blood | Red | Green | Red |
We can see that behavior and language are entirely consistent, yet the internal experiences of A and B are completely different.
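The tables above can be sketched in code, as a minimal illustration (the "-feel" strings are placeholders for qualia, not real experiences): two agents whose internal mappings are inverted relative to each other, yet whose learned verbal reports agree on every stimulus.

```python
# A minimal sketch of the inverted spectrum: two agents whose internal
# experiences differ, yet whose learned verbal reports agree on every stimulus.
# (The "-feel" strings are illustrative placeholders, not real qualia.)

STIMULUS_COLOR = {"leaves": "green", "blood": "red"}

class Agent:
    def __init__(self, qualia_map):
        # qualia_map: physical color -> private experience
        self.qualia_map = qualia_map

    def experience(self, stimulus):
        # First-person: only this agent "accesses" the return value.
        return self.qualia_map[STIMULUS_COLOR[stimulus]]

    def report(self, stimulus):
        # Third-person: language is learned from shared stimuli, so the
        # word tracks the stimulus, not the private experience.
        return STIMULUS_COLOR[stimulus]

a = Agent({"green": "green-feel", "red": "red-feel"})  # ordinary mapping
b = Agent({"green": "red-feel", "red": "green-feel"})  # inverted mapping

for stim in STIMULUS_COLOR:
    assert a.report(stim) == b.report(stim)   # behavior: identical
    print(a.experience(stim), "vs", b.experience(stim))  # experiences: inverted
```

The point the sketch makes concrete: no experiment that only calls `report` can ever distinguish `a` from `b`.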
Qualia
Qualia are subjective experiences.
Definition:
What it is like to have a certain experience.
Examples:
- The visual experience of red
- The feeling of pain
- The feeling of listening to music
Characteristics:
- First-person in nature
- Subjective
The Problem of Other Minds
If the inverted spectrum is possible, then we can naturally infer further:
Every person's color experience may be different
But as long as the following conditions hold:
- Consistent stimulus-response relationships
- Consistent language learning
- Consistent behavioral patterns
Then:
Social communication still functions perfectly normally
Therefore, we can only be certain of:
- The structure of stimuli is consistent
- Behavior is consistent
But we cannot be certain whether experiences are consistent.
So how can we know whether others truly have consciousness?
Behaviorism holds that mental states are simply behavioral dispositions. However, the inverted spectrum problem shows that even when behavior is entirely consistent, internal experience may still differ.
From the inverted spectrum we can observe that subjective experience has the property of being private (privacy of experience). This is because experience can only be directly perceived by the person having it; others can only observe behavior. Thus, we cannot directly compare "the red that A sees" with "the red that B sees."
Internal experience is first-person in nature — you know that you have experiences, yet you still cannot directly observe the experiences of others.
The consciousness and experience of others cannot be fully proven through external behavior.
Behavioral Patterns and Internal Experience
Behavioral patterns are the observable regularities of response exhibited by a system under different inputs:
- Being pricked by a needle → saying "it hurts"
- Seeing red → saying "red"
Their characteristics are third-person observability and amenability to experimental study.
Internal experience (Subjective Experience) is the subjective feeling within a system:
- "The feeling of red"
- "The feeling of pain"
- "The feeling of heat"
Its characteristics are first-person and not directly observable.
In simple terms:
- Behavioral patterns answer how a system responds
- Internal experience answers what a system feels
The Turing Test (Emphasizing Behavior)
The basic idea of the Turing Test is: if a machine can make a human unable to distinguish it from another human in conversation, then it can be considered intelligent.
Human Judge
↓
Through text-based conversation
↓
Determine whether the other party is human or machine
If the machine consistently prevents the judge from identifying it, it passes the test. Therefore, the Turing Test only examines behavioral performance — more specifically, it only examines linguistic behavior.
The Turing Test does not concern itself with internal structure or implementation; it only requires that the output behavior be consistent with that of a human.
The Turing Test supports the behavioral-pattern approach, based on the assumption: if the behavior resembles that of a human, then understanding exists.
The Chinese Room (Critiquing Behavior)
A person who does not understand Chinese at all is inside a room, holding a rule book.
The process:
- People outside pass in Chinese questions
- The person inside matches symbols according to the rules
- The person outputs another string of Chinese symbols
Result: people outside will believe the person inside understands Chinese.
But in reality: the person inside does not understand Chinese.
Searle used this example to demonstrate that even when behavior is entirely correct, understanding does not necessarily exist.
The Chinese Room is a rebuttal of the behavioral-pattern approach: behavior may be mere symbol manipulation and does not imply understanding or experience.
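The room can be caricatured as a lookup table (the questions and replies below are invented examples, not from Searle): the program produces fluent output while no step in it involves the meaning of the symbols.

```python
# The Chinese Room as a caricature: a "rule book" mapping input symbol
# strings to output symbol strings. The example entries are invented.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",        # "How are you?" -> "I'm fine, thanks."
    "今天天气怎么样？": "今天天气很好。",  # "How's the weather?" -> "It's nice today."
}

def room(question: str) -> str:
    # Pure symbol matching: nothing here depends on what the symbols mean.
    return RULE_BOOK.get(question, "请再说一遍。")  # fallback: "Please repeat."

print(room("你好吗？"))  # fluent reply, zero understanding
```

From the outside, `room` is behaviorally indistinguishable from a (very limited) Chinese speaker; inside, there is only lookup.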
In other words, even if we completely solve the easy problems, the hard problem remains. Even if we knew exactly how the brain processes the wavelength of red light and how it comes to identify something as red, we still could not explain why the feeling of red arises.
The Alarm Problem and the Emergence of Experience
An alarm also has sensors — does it also "feel" temperature changes? Philosophically, this is typically divided into two levels:
- Information detection: the system receives input, processes it, and responds according to rules. An alarm clearly has this level.
- Subjective feeling: the system is not merely processing information but also has the "feeling" of heat. It is generally held that an alarm does not have this level.
In search of the dividing line between information detection and subjective feeling, philosophers and scientists have proposed two types of "emergence":
- Weak emergence: the properties of the whole arise from the complex interactions of lower-level parts, but can in principle still be explained by lower-level laws. Examples include the fluidity of water and ant colony behavior.
- Strong emergence: when a system reaches a certain level of complexity, a new property appears that cannot be fully explained by the underlying physics.
If consciousness is an instance of strong emergence, then subjective experience is a new level of phenomenon in nature.
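Weak emergence is easy to exhibit in code; a standard example is Conway's Game of Life, where whole-level patterns (here, the oscillating "blinker") follow entirely from local rules. This is offered only as an analogy for weak emergence; strong emergence, by definition, has no such reduction.

```python
# Weak emergence in miniature: Conway's Game of Life. The "blinker"
# oscillates as a whole, yet every step follows from purely local rules.
from collections import Counter

def step(live):
    """One Game of Life step on a set of live (x, y) cells."""
    # Count how many live neighbors each cell has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

blinker = {(1, 0), (1, 1), (1, 2)}    # horizontal bar of three cells
print(step(blinker))                  # becomes a vertical bar
assert step(step(blinker)) == blinker # period-2 oscillation
```

The oscillation is a property of the pattern as a whole, yet it is fully explained by the cell-level rules; that is what makes it weak, not strong, emergence.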
Philosophical Zombies (Same Behavior, Conceivably No Experience)
Philosopher David Chalmers proposed an even more extreme hypothesis: the Philosophical Zombie.
Imagine a being that:
- Behaves identically to a human
- Has an identical brain structure
- Exhibits identical language and emotional expression
But has no subjective experience whatsoever.
Regarding philosophical zombies, the mainstream views include:
- Anti-physicalism: philosophical zombies are possible, therefore consciousness is not a purely physical phenomenon — consciousness may be a fundamental property
- Physicalism: zombies are a product of mistaken imagination; replicating physical structure necessarily replicates consciousness, so zombies cannot exist
The core of this debate is essentially:
Is consciousness entirely determined by physical processes?
If the answer is:
- Consciousness is determined by physical processes: then philosophical zombies cannot exist
- Consciousness is not determined by physical processes: then philosophical zombies may exist
The philosophical zombie argument reveals that explaining behavior is not the same as explaining experience.
It is worth noting that the Chinese Room and the philosophical zombie may seem similar, but:
- The Chinese Room primarily addresses the absence of understanding of meaning
- The philosophical zombie primarily addresses the absence of any experience whatsoever
The logical chain is as follows:
Behavioral Capacity
↓
The Problem of Understanding (Chinese Room)
↓
The Problem of Experience (Philosophical Zombie)
↓
Hard Problem of Consciousness
The Problem of Consciousness
The Easy Problems of Consciousness
The so-called easy problems of consciousness refer to:
How to explain a system's cognitive functions and behavioral capacities.
For example:
- How the brain recognizes color
- How attention works
- Why a person says "I see red"
- How the brain integrates sensory information
- Why a person reports pain
Although these problems are technically difficult, they ultimately amount to one task: explaining information-processing mechanisms.
Structurally, they can be represented as: Stimulus → Neural Processing → Behavior / Report
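The stimulus → processing → report structure can be written as a toy pipeline (the wavelength threshold is a rough stand-in for "neural processing", not a physiological model): such a pipeline fully explains the report, and that is exactly the scope of the easy problems.

```python
# A toy stand-in for the easy-problem pipeline:
#   Stimulus -> Neural Processing -> Behavior / Report
def neural_processing(wavelength_nm: float) -> str:
    # Crude classification standing in for retinal and cortical processing.
    # (~620-750 nm is roughly the red band; the exact cutoffs are illustrative.)
    return "red" if 620 <= wavelength_nm <= 750 else "not red"

def behavioral_report(wavelength_nm: float) -> str:
    return f"I see {neural_processing(wavelength_nm)}"

print(behavioral_report(700.0))  # the pipeline produces the report "I see red"
# Nothing in this pipeline says whether a red *feeling* accompanies the
# report; that question lies outside it, and is the hard problem.
```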
Cognitive science and neuroscience primarily study these problems.
The Hard Problem of Consciousness
Through the inverted spectrum we know that identical behavior may harbor different experiences, and the philosophical zombie further explores the scenario of consistent behavior with no internal experience whatsoever. From these discussions, we derive the following conclusion:
Consciousness cannot be fully explained through physical or behavioral descriptions
This leads to the core question of modern philosophy of mind: the Hard Problem of Consciousness (a term coined by David Chalmers), which asks:
Why do physical processes give rise to subjective experience?
In other words, neural activity can explain information processing, behavioral responses, decision-making, and so on, but it cannot explain "why there is feeling at all." For instance: why does neural activity produce the feeling of red or the feeling of pain, rather than no feeling at all?