Making Sense of Consciousness Part 2: Attention

This article explores the distinction between conscious and subconscious perception, and how the brain handles information we don’t notice. Experiments show that stimuli can influence behavior even when people are not consciously aware of them. The article focuses on “inattentional blindness”, using techniques such as electroencephalography (EEG) and magnetoencephalography (MEG) to reveal how neural activity during visual and auditory processing differs between aware and unaware states. The findings indicate that activity in particular brain regions is closely tied to conscious experience, offering a new perspective on human consciousness.

👁️‍🗨️ **Dissociating consciousness and behavior:** The article distinguishes information we perceive “consciously” from information that influences behavior “subconsciously”. Experiments show that a stimulus can affect behavior even when people are not aware of it: in inattentional-blindness experiments, for instance, subjects may not notice the critical stimulus on the screen, yet their behavior still reflects a response to it.

🧠 **Differences in neural activity:** Studies using EEG and MEG compare brain activity in “aware” versus “unaware” states. For visual stimuli, the N170 signal (associated with face recognition) is stronger when a stimulus is noticed; for auditory stimuli, activity in auditory cortex is weakened when attention is diverted.

💡 **The role of key brain regions:** The research points to specific brain regions involved in conscious experience. For example, the left orbitofrontal cortex shows different activity in “aware” versus “unaware” conditions, while the occipital and extrastriate cortices appear linked to visual awareness. These findings help explain how different regions of the brain work together to produce conscious experience.

Published on July 3, 2025 9:20 PM GMT

Midjourney, “consciousness, attention, event-related potential, EEG, magnetoencephalogram, inattentional blindness, N170, fusiform gyrus”

In this part of the sequence I’m trying to distinguish between information we’re consciously aware of and information we’re merely exposed to: information that may still affect our behavior “subconsciously” or “unconsciously”, but that we don’t consciously perceive.

How can you tell those apart?

One decent proxy is communication — can a human subject indicate to the experimenter that they have noticed something? Then they’re probably consciously aware of it.

Of course, this isn’t a perfect indicator. What people say, and what they subjectively experience, aren’t exactly the same. It seems unlikely that many people would lie in a low-stakes psychology experiment with neutral sensory stimuli, but they might consider a very subtle perception “not worth mentioning” to an experimenter, or they might guess at or confabulate a perception they don’t actually experience.

One thing we observed in the last post is that, if you ask people to bet on their degree of confidence in making judgments about subtle sensory perceptions, they’re pretty well calibrated, while if you ask them to rate their confidence, they’re systematically underconfident.
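
To make “well calibrated” concrete, here’s a minimal sketch (my own illustration, not code from any of the studies discussed) of how you’d check calibration from betting data: bin trials by the confidence the subject wagered, then compare the wagered confidence to the actual hit rate in each bin.

```python
import numpy as np

def calibration_table(confidence, correct, edges=(0.5, 0.6, 0.7, 0.8, 0.9, 1.0)):
    """Compare wagered confidence to the empirical hit rate, bin by bin.

    confidence : per-trial confidence wagers in [0.5, 1.0]
    correct    : per-trial 0/1 flags, 1 if the judgment was right
    """
    confidence = np.asarray(confidence, dtype=float)
    correct = np.asarray(correct, dtype=float)
    rows = []
    for i, (lo, hi) in enumerate(zip(edges[:-1], edges[1:])):
        last = i == len(edges) - 2
        mask = (confidence >= lo) & ((confidence <= hi) if last else (confidence < hi))
        if mask.any():
            rows.append((lo, hi, confidence[mask].mean(), correct[mask].mean(), int(mask.sum())))
    return rows  # (bin low, bin high, mean wager, hit rate, n trials)

# A well-calibrated subject has mean wager ≈ hit rate in every bin;
# an underconfident one has hit rates consistently above the stated confidence.
```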

Think of two separate pieces of information:

1. the sensory stimulus one is exposed to

2. the correlation between one’s behavior and (1)

If your behavior correlates with (1), your brain is processing the stimulus and using it to shape behavior. But “you” might not be consciously aware of the stimulus (see blindsight, subthreshold sensations, etc.).

If your behavior can correlate with (2), as it does when someone bets correctly on their confidence, then your brain has access to meta-level or second-order information, and is using that to shape behavior. “Some part of you”, at least, “knows” when you see the stimulus and when you don’t.
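
As a toy illustration of the difference (my own simulation, not anything from the literature), compare a “blindsight-like” system whose forced choices track the stimulus but whose confidence wagers don’t track whether those choices were right, with an “aware” system whose wagers do:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 10_000
stimulus = rng.integers(0, 2, n_trials)   # the stimulus on each trial: item (1)

def simulate(p_correct, has_second_order):
    """Forced-choice responses plus a high/low confidence wager on each trial."""
    correct = rng.random(n_trials) < p_correct
    response = np.where(correct, stimulus, 1 - stimulus)
    if has_second_order:
        # wagers track correctness: mostly "high" on trials the system got right
        wager_high = np.where(correct, rng.random(n_trials) < 0.8,
                                        rng.random(n_trials) < 0.2)
    else:
        # blindsight-like: wagers are unrelated to accuracy
        wager_high = rng.random(n_trials) < 0.5
    return response, correct, wager_high

for label, second_order in [("aware", True), ("blindsight-like", False)]:
    resp, corr, wager = simulate(p_correct=0.75, has_second_order=second_order)
    accuracy = np.mean(resp == stimulus)                 # behavior correlates with (1)
    meta = corr[wager].mean() - corr[~wager].mean()      # wagers correlate with (2)
    print(f"{label:16s} accuracy={accuracy:.2f}  high-vs-low-wager accuracy gap={meta:.2f}")
```

Both systems are about 75% accurate, so both are “processing” the stimulus; only the aware one’s wagers carry information about when it’s right, which is the second-order access blindsight patients seem to lack.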

Is this second-order information a necessary-and-sufficient condition for consciousness? Not sure; maybe the betting is itself an “automatic” process without subjective awareness.

But it’s suggestive that people with blindsight don’t have this kind of second-order information; they don’t bet on their ability to “see”.

From a revealed-preference perspective, where you can be said to “know” things if you coherently shape your behavior around them, people with blindsight “know” where visual stimuli are but don’t “know they know”.

As best I can, I want to keep these distinctions straight going forward. I’ll only use words like “consciousness”, “awareness”, or “subjective perception” to refer to things people report experiencing, or can indicate (e.g. by betting on confidence) that they know they experienced; I’ll use “detection” or “processing” to refer to stimuli that measurably affect the brain or behavior but may not be consciously perceived.

Inattentional Blindness

We all have the subjective experience of not noticing things when we’re distracted, a phenomenon known in the literature as “inattentional blindness.”

I think it’s fair to say we are not conscious of stimuli we don’t notice due to inattention. The ambiguity around communication doesn’t really enter here; you can just observe, introspectively, that when you don’t notice a thing it isn’t in your awareness.

So what’s the difference, neurologically, between stimuli we notice and stimuli we don’t?

First of all, there’s a literature demonstrating that stimuli we don’t notice can affect our behavior.

In a standard experimental paradigm, inattentional blindness is induced by assigning people a “distractor task” that takes up a lot of attention — something like doing arithmetic problems or counting the number of baskets in a video of basketball — and then exposing them to a surprising stimulus, like a gorilla running across the basketball court. People focused on the task often don’t notice the gorilla. (Interestingly, they’re more likely to notice if they have ADHD.)[1] In this paradigm, the role of the gorilla — the surprising stimulus that distracted subjects don’t notice — is known as the “critical stimulus.”
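
Schematically, a block might look like this (hypothetical structure and wording on my part, not any particular study’s protocol):

```python
# One inattentional-blindness block, schematically.
distractor_task = "count how many times the black shapes bounce off the edges"

trials = [{"trial": i, "task": distractor_task, "critical_stimulus": None}
          for i in range(1, 7)]

# On the final trial the unexpected item appears while the subject is still
# busy with the distractor task; only afterwards are they asked whether they
# noticed anything new, and then quizzed about the critical stimulus.
trials.append({"trial": 7, "task": distractor_task,
               "critical_stimulus": "gorilla crossing the display"})
```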

People are significantly better than chance at answering questions about the critical stimulus, even though they don’t report noticing it.[2][3] Their behavior is also affected in other ways, such as faster reaction times when later recognizing the critical stimulus than when recognizing the same item without prior exposure.[4]
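
A quick sketch of what “better than chance” means here (illustrative numbers, not data from the cited studies): if unaware subjects are forced to guess between two options about the critical stimulus, you can test their hit rate against the 50% chance rate with an exact binomial test.

```python
from math import comb

n, k = 80, 52          # e.g. 80 forced-choice guesses, 52 correct (made-up numbers)
chance = 0.5

# One-sided exact binomial test: probability of >= k hits if subjects were guessing.
p_value = sum(comb(n, i) * chance**i * (1 - chance)**(n - i) for i in range(k, n + 1))
print(f"hit rate = {k / n:.2f}, one-sided p = {p_value:.4f}")   # well below 0.05
```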

These findings have survived the replication crisis: more recent studies have reproduced the classic results.[5]

What does inattentional blindness look like neurologically?

fMRI activity can be compared between three conditions: no critical stimulus; unaware (the critical stimulus is present but not noticed, i.e. inattentional blindness); and aware (the critical stimulus is present and noticed).

The only region that was found to be significantly more active in the aware vs. unaware condition was the left orbitofrontal cortex, a region in the prefrontal cortex just behind the eyes — and this was only after loosening the criteria for significance, so it’s ambiguous whether it’s a “real” effect.[6]

Lots of other brain areas, including the orbitofrontal cortex, other prefrontal cortex regions, and the cerebellum, are significantly more active in both the aware and unaware conditions than in the no-critical-stimulus condition; they may be involved in subconscious processing of the critical stimulus.
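
The logic of that comparison can be sketched in a few lines (toy numbers, not the actual analysis pipeline from the cited study): for each subject, take a region’s average response in each condition, then contrast stimulus-present vs. stimulus-absent and aware vs. unaware.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
n_subjects = 16

# Toy per-subject responses of one region (e.g. % BOLD signal change) per condition.
no_stimulus = rng.normal(0.00, 0.10, n_subjects)
unaware     = rng.normal(0.20, 0.10, n_subjects)   # critical stimulus present, not noticed
aware       = rng.normal(0.25, 0.10, n_subjects)   # critical stimulus present and noticed

# Contrast 1: does the region respond to the stimulus at all, even unnoticed?
print("unaware vs. no stimulus:", ttest_rel(unaware, no_stimulus))
# Contrast 2: does it respond more when the stimulus reaches awareness?
print("aware vs. unaware:      ", ttest_rel(aware, unaware))
```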

But fMRI isn’t the only tool we have. We can also tell apart aware (undistracted) recognition of images from inattentionally blind (unaware) exposure via EEG signals. Around 200 ms after exposure, there’s a spike in the posterior electrode signals (known as N170) when people recognize familiar shapes and faces, and not when they don’t.[7][8]

Notably, people who did not exhibit inattentional blindness — the subjects who noticed the critical stimulus — have the same N170 EEG spike pattern as people who initially did have inattentional blindness but were then verbally informed (made “aware”) of the critical stimulus. That is, the spike consistently seems to track the state of being aware of the critical stimulus, no matter how people got there (noticing on their own, or being told.)
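
For concreteness, here’s a bare-bones sketch of how an ERP like the N170 is extracted (plain NumPy; the published studies use full EEG pipelines with many electrodes and artifact rejection): cut the continuous recording into stimulus-locked epochs, baseline-correct, average within each condition, and compare the posterior-electrode amplitude in a window around 170-200 ms.

```python
import numpy as np

fs = 500                       # sampling rate in Hz (assumed)
pre, post = 0.2, 0.6           # epoch from 200 ms before to 600 ms after stimulus onset

def evoked_response(eeg, onsets):
    """Average stimulus-locked epochs of one (posterior) channel.

    eeg    : 1-D array, continuous signal from a posterior electrode
    onsets : sample indices of critical-stimulus onsets for one condition
    """
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = np.stack([eeg[t - n_pre : t + n_post] for t in onsets])
    epochs -= epochs[:, :n_pre].mean(axis=1, keepdims=True)   # baseline correction
    return epochs.mean(axis=0)

def n170_amplitude(evoked):
    """Mean amplitude 150-200 ms after onset (the N170 is a negative deflection)."""
    start = int(pre * fs) + int(0.15 * fs)
    stop = int(pre * fs) + int(0.20 * fs)
    return evoked[start:stop].mean()

# With real data: n170_amplitude(evoked_response(eeg, aware_onsets)) should be
# more negative than n170_amplitude(evoked_response(eeg, unaware_onsets)).
```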

The EEG response is faster than fMRI can resolve (the hemodynamic signal fMRI measures peaks about 4-6 seconds after the underlying neural activity), so it’s unsurprising that fMRI couldn’t pick up a difference in occipital lobe activity between aware and unaware states.

This suggests that the N170 pattern — likely caused by activity in the fusiform gyrus, responsible for visually recognizing familiar objects — is involved in conscious perception (as opposed to unconscious processing) of faces and shapes. What about other, less recognizable visual stimuli?

There’s still a difference between aware and unaware conditions in occipital EEG responses when the critical stimulus is a “contour shape” in a field of arrows, but it’s later than N170, more like 300 ms in.[9] Moreover, the intensity of this EEG marker of “awareness” was greater the more confident subjects were that they saw the contour shape.

This signal is inferred to be coming from the extrastriate cortex, aka V2-V6, the regions of the visual cortex that do “higher” visual processing (color, shape, texture, object recognition, motion — not just light/dark and location). This matches the finding in blindsight that you need “enough” signal in the extrastriate cortex before you can consciously “see” anything.

We get the same result in the rare cases where a patient is undergoing brain surgery and it’s possible to do intracranial EEG, which lets us find out exactly where in the brain the signal is coming from: visual distractions reduced activity in V2.[10]

As with blindsight, this obviously doesn’t mean that V2 is “where consciousness lives.” It’s a visual processing area; it’s not going to be the defining factor between conscious and unconscious processing of, say, auditory or tactile stimuli. But (enough) extrastriate cortex activity might be a necessary condition for conscious visual perception, while other senses might have other areas where (sufficient) activity is necessary for consciousness.

There’s also inattentional deafness, in which people miss auditory alarms when they’re distracted by a working-memory-heavy task. (This is a particularly big problem for pilots and healthcare workers, for whom immediately noticing a beeping alarm can be a life-and-death issue.)

In a flight-simulator task with auditory alarms, pilots missed more alarms when they were under higher cognitive load, and alarms they did notice evoked a much larger N100 potential, an EEG signal peaking over the fronto-central region of the scalp at around 116 ms after the alarm, than alarms they missed. The N100 is a well-known auditory processing signal, generated by the auditory cortex in the superior temporal gyrus.[11]

If you use magnetoencephalography (MEG), which localizes brain activity better in time than fMRI and better in space than EEG, you again find reduced auditory evoked responses at around 100 ms after an auditory stimulus when attention is tied up by a visually demanding task. The reduction is concentrated in the superior temporal sulcus and the posterior middle temporal gyrus: not exactly the primary auditory cortex, but nearby regions involved in auditory processing such as speech recognition and audiovisual integration.[12]

Upshots

In inattentional blindness, just like in subthreshold perception, people’s behavior is affected by sensory information they claim not to notice.

To some extent, people seem to be underconfident in what they’re willing to say they saw (though I couldn’t find a gambling study for inattentional blindness to identify whether people asked to bet on their confidence level can accurately gauge “how much” they’re perceiving.)

But brain states are also measurably different between cases when people “don’t notice” visual and auditory stimuli vs. cases when they do.

And the neural differences are in more or less the regions you’d expect: we see less activity in visual or auditory processing areas, respectively, shortly after a visual or auditory stimulus that goes unnoticed because of distraction.

This is still consistent with the hypothesis I formed in the last post that less activity in certain sensory processing regions reduces conscious awareness of the corresponding sensations, even if there’s still enough “signal” getting through to affect behavior somewhat, or to allow people to make better-than-chance guesses about the sensory stimulus.

[1] Oktay, Bahadır, and Banu Cangöz. "I thought I saw “Zorro”: An inattentional blindness study." Archives of Neuropsychiatry 55.1 (2018): 59.

[2] Kreitz, Carina, Giulia Pugnaghi, and Daniel Memmert. "Guessing right: Preconscious processing in inattentional blindness." Quarterly Journal of Experimental Psychology 73.7 (2020): 1055-1065.

[3] Nartker, Makaela, et al. "Sensitivity to visual features in inattentional blindness." eLife 13 (2025): RP100337.

[4] Mack, Arien. "Inattentional blindness: Looking without seeing." Current Directions in Psychological Science 12.5 (2003): 180-184.

[5] Wood, Katherine, and Daniel J. Simons. "Processing without noticing in inattentional blindness: A replication of Moore and Egeth (1997) and Mack and Rock (1998)." Attention, Perception, & Psychophysics 81 (2019): 1-11.

[6] Thakral, Preston P. "The neural substrates associated with inattentional blindness." Consciousness and Cognition 20.4 (2011): 1768-1775.

[7] Shafto, Juliet P., and Michael A. Pitts. "Neural signatures of conscious face perception in an inattentional blindness paradigm." Journal of Neuroscience 35.31 (2015): 10940-10948.

[8] Dellert, Torge, et al. "Dissociating the neural correlates of consciousness and task relevance in face perception using simultaneous EEG-fMRI." Journal of Neuroscience 41.37 (2021): 7864-7875.

[9] Pitts, Michael A., Antígona Martínez, and Steven A. Hillyard. "Visual processing of contour patterns under conditions of inattentional blindness." Journal of Cognitive Neuroscience 24.2 (2012): 287-303.

[10] Chatard, Benoit, et al. "Evidence from iEEG of an adaptative transient constriction of spatial attention: Attentional suppression of peripheral vision." Annals of Neurosciences (2024).

[11] Dehais, Frédéric, Raphaëlle N. Roy, and Sébastien Scannella. "Inattentional deafness to auditory alarms: Inter-individual differences, electrophysiological signature and single trial classification." Behavioural Brain Research 360 (2019): 51-59.

[12] Molloy, Katharine, et al. "Inattentional deafness: visual load leads to time-specific suppression of auditory evoked responses." Journal of Neuroscience 35.49 (2015): 16046-16054.



