    Attention modulates the processing of emotional expression triggered by foveal faces

    To investigate whether the processing of emotional expression for faces presented within foveal vision is modulated by spatial attention, event-related potentials (ERPs) were recorded in response to stimulus arrays containing one fearful or neutral face at fixation, which was flanked by a pair of peripheral bilateral lines. When attention was focused on the central face, an enhanced positivity was elicited by fearful as compared to neutral faces. This effect started at 160 ms post-stimulus and remained present for the remainder of the 700 ms analysis interval. When attention was directed away from the face towards the line pair, the initial phase of this emotional positivity remained present, but emotional expression effects beyond 220 ms post-stimulus were completely eliminated. These results demonstrate that when faces are presented foveally, the initial rapid stage of emotional expression processing is unaffected by attention. In contrast, attentional task instructions are effective in inhibiting later, more controlled stages of expression analysis.

    Why the item will remain the unit of attentional selection in visual search

    Hulleman & Olivers reject item-based serial models of visual search and suggest that items are processed equally and globally during each fixation period. However, neuroscientific studies have shown that attentional biases can emerge in parallel but in a spatially selective, item-based fashion. Even within a parallel architecture for visual search, the item remains the critical unit of selection.

    Lateralized delay period activity marks the focus of spatial attention in working memory: evidence from somatosensory event-related brain potentials

    The short-term retention of sensory information in working memory (WM) is known to be associated with a sustained enhancement of neural activity. What remains controversial is whether this neural trace indicates the sustained storage of information or the allocation of attention. To evaluate the storage and attention accounts, we examined sustained tactile contralateral delay activity (tCDA component) of the event-related potential. The tCDA manifests over somatosensory cortex contralateral to task-relevant tactile information during stimulus retention. Two tactile sample sets (S1, S2) were presented sequentially, separated by 1.5 s. Each set comprised two stimuli, one per hand. Human participants memorized the location of one task-relevant stimulus per sample set and judged whether one of these locations was stimulated again at memory test. The two relevant pulses were unpredictably located on the same hand (stay trials) or on different hands (shift trials). Initially, tCDA components emerged contralateral to the relevant S1 pulse. Sequential loading of WM enhanced the tCDA after S2 was presented on stay trials. On shift trials, the tCDA's polarity reversed after S2 presentation, resulting in delay activity that was now contralateral to the task-relevant S2 pulse. The disappearance of a lateralized neural trace for the relevant S1 pulse did not impair memory accuracy for this stimulus on shift trials. These results contradict the storage account and suggest that delay period activity indicates the sustained engagement of an attention-based rehearsal mechanism. In conclusion, somatosensory delay period activity marks the current focus of attention in tactile WM.
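
    As a rough, non-authoritative illustration of how a lateralized delay activity of this kind is typically derived, the sketch below computes a contralateral-minus-ipsilateral difference wave from epoched ERP data. The electrode pairing, array shapes, and function names are assumptions for the example, not details taken from the study.

```python
import numpy as np

def lateralized_delay_activity(left_hemi, right_hemi, relevant_hand):
    """Contralateral-minus-ipsilateral difference wave (tCDA-style sketch).

    left_hemi, right_hemi : (n_trials, n_times) ERP epochs from two
                            homologous electrodes over the left and right
                            hemisphere (e.g. over somatosensory cortex).
    relevant_hand         : (n_trials,) array of 'L' / 'R' labels giving
                            the hand holding the task-relevant stimulus.
    """
    is_left = (relevant_hand == 'L')[:, None]           # broadcast over time
    contra = np.where(is_left, right_hemi, left_hemi)   # opposite hemisphere
    ipsi = np.where(is_left, left_hemi, right_hemi)     # same hemisphere
    return (contra - ipsi).mean(axis=0)                 # trial-averaged wave
```

    On this reading, a sustained negative-going difference during the retention interval would correspond to the tCDA, and the polarity reversal reported for shift trials would appear as a sign change of the difference once it is computed relative to the S1-relevant hand.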

    Object-based target templates guide attention during visual search

    During visual search, attention is believed to be controlled in a strictly feature-based fashion, without any guidance by object-based target representations. To challenge this received view, we measured electrophysiological markers of attentional selection (N2pc component) and working memory (SPCN) in search tasks where two possible targets were defined by feature conjunctions (e.g., blue circles and green squares). Critically, some search displays also contained nontargets with two target features (incorrect conjunction objects, e.g., blue squares). Because feature-based guidance cannot distinguish these objects from targets, any selective bias for targets will reflect object-based attentional control. In Experiment 1, where search displays always contained only one object with target-matching features, targets and incorrect conjunction objects elicited identical N2pc and SPCN components, demonstrating that attentional guidance was entirely feature-based. In Experiment 2, where targets and incorrect conjunction objects could appear in the same display, clear evidence for object-based attentional control was found. The target N2pc became larger than the N2pc to incorrect conjunction objects from 250 ms post-stimulus, and only targets elicited SPCN components. This demonstrates that after an initial feature-based guidance phase, object-based templates are activated when they are required to distinguish target and nontarget objects. These templates modulate visual processing and control access to working memory, and their activation may coincide with the start of feature integration processes. Results also suggest that while multiple feature templates can be activated concurrently, only a single object-based target template can guide attention at any given time.

    Shifts of attention in the early blind: an ERP study of attentional control processes in the absence of visual spatial information

    To investigate the role of visual spatial information in the control of spatial attention, event-related brain potentials (ERPs) were recorded during a tactile attention task for a group of totally blind participants who were either congenitally blind or had lost vision during infancy, and for an age-matched, sighted control group who performed the task in the dark. Participants had to shift attention to the left or right hand (as indicated by an auditory cue presented at the start of each trial) in order to detect infrequent tactile targets delivered to this hand. Effects of tactile attention on the processing of tactile events, as reflected by attentional modulations of somatosensory ERPs to tactile stimuli, were very similar for early blind and sighted participants, suggesting that the capacity to selectively process tactile information from one hand versus the other does not differ systematically between the blind and the sighted. ERPs measured during the cue–target interval revealed an anterior directing attention negativity (ADAN) that was present for the early blind group as well as for the sighted control group. In contrast, the subsequent posterior late directing attention positivity (LDAP) was absent in both groups. These results suggest that these two components reflect functionally distinct attentional control mechanisms which differ in their dependence on the availability of visually coded representations of external space.

    Effects of contrast inversion on face perception depend on gaze location: evidence from the N170 component

    Face recognition is known to be impaired when the contrast polarity of the eyes is inverted. We studied how contrast affects early perceptual face processing by measuring the face-sensitive N170 component to face images when the contrast of the eyes and of the rest of the face was independently manipulated. Fixation was located either on the eye region or on the lower part of the face. Contrast-reversal of the eyes triggered delayed and enhanced N170 components independently of the contrast of other face parts, and regardless of gaze location. Similar N170 modulations were observed when the rest of the face was contrast-inverted, but only when gaze was directed away from the eyes. Results demonstrate that the contrast of the eyes and of other face parts can both affect face perception, but that the contrast polarity of the eye region has a privileged role during early stages of face processing.

    Sustained maintenance of somatotopic information in brain regions recruited by tactile working memory

    To adaptively guide ongoing behavior, representations in working memory (WM) often have to be modified in line with changing task demands. We used event-related potentials (ERPs) to demonstrate that tactile WM representations are stored in modality-specific cortical regions, that the goal-directed modulation of these representations is mediated through hemispheric-specific activation of somatosensory areas, and that the rehearsal of somatotopic coordinates in memory is accomplished by modality-specific spatial attention mechanisms. Participants encoded two tactile sample stimuli presented simultaneously to the left and right hands, before visual retro-cues indicated which of these stimuli had to be retained to be matched with a subsequent test stimulus on the same hand. Retro-cues triggered a sustained tactile contralateral delay activity component with a scalp topography over somatosensory cortex contralateral to the cued hand. Early somatosensory ERP components to task-irrelevant probe stimuli (that were presented after the retro-cues) and to subsequent test stimuli were enhanced when these stimuli appeared at the currently memorized location relative to other locations on the cued hand, demonstrating that a precise focus of spatial attention was established during the selective maintenance of tactile events in WM. These effects were observed regardless of whether participants performed the matching task with uncrossed or crossed hands, indicating that WM representations in this task were based on somatotopic rather than allocentric spatial coordinates. In conclusion, spatial rehearsal in tactile WM operates within somatotopically organized sensory brain areas that have been recruited for information storage.

    The guidance of visual search by shape features and shape configurations

    Representations of target features (attentional templates) guide attentional object selection during visual search. In many search tasks, target objects are defined not by a single feature but by the spatial configuration of their component shapes. We used electrophysiological markers of attentional selection processes to determine whether the guidance of shape configuration search is entirely part-based or sensitive to the spatial relationship between shape features. Participants searched for targets defined by the spatial arrangement of two shape components (e.g., hourglass above circle). N2pc components were triggered not only by targets but also by partially matching distractors with one target shape (e.g., hourglass above hexagon) and by distractors that contained both target shapes in the reverse arrangement (e.g., circle above hourglass), in line with part-based attentional control. Target N2pc components were delayed when a reverse distractor was present on the opposite side of the same display, suggesting that early shape-specific attentional guidance processes could not distinguish between targets and reverse distractors. The control of attention then became sensitive to spatial configuration, which resulted in a stronger attentional bias for target objects relative to reverse and partially matching distractors. Results demonstrate that search for target objects defined by the spatial arrangement of their component shapes is initially controlled in a feature-based fashion but can later be guided by templates for spatial configurations.

    Independent attention mechanisms control the activation of tactile and visual working memory representations

    Working memory (WM) is limited in capacity, but it is controversial whether these capacity limitations are domain-general or are generated independently within separate modality-specific memory systems. These alternative accounts were tested in bimodal visual/tactile WM tasks. In Experiment 1, participants memorized the locations of simultaneously presented task-relevant visual and tactile stimuli. Visual and tactile WM load was manipulated independently (1, 2 or 3 items per modality), and one modality was unpredictably tested after each trial. To track the activation of visual and tactile WM representations during the retention interval, the visual and tactile contralateral delay activity (CDA and tCDA) were measured over visual and somatosensory cortex, respectively. CDA and tCDA amplitudes were selectively affected by WM load in the corresponding (tactile or visual) modality. The CDA parametrically increased when visual load increased from 1 to 2 and to 3 items. The tCDA was enhanced when tactile load increased from 1 to 2 items, and showed no further enhancement for 3 tactile items. Critically, these load effects were strictly modality-specific, as substantiated by Bayesian statistics. Increasing tactile load did not affect the visual CDA, and increasing visual load did not modulate the tCDA. Task performance at memory test was also unaffected by WM load in the other (untested) modality. This was confirmed in a second behavioral experiment where tactile and visual loads were either two or four items, unimodal baseline conditions were included, and participants performed a color change detection task in the visual modality. These results show that WM capacity is not limited by a domain-general mechanism that operates across sensory modalities. They suggest instead that WM storage is mediated by distributed modality-specific control mechanisms that are activated independently and in parallel during multisensory WM.
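
    As a purely illustrative sketch (the function names, measurement window, and data layout are assumptions, not details from the study), load effects like those described here could be quantified by averaging each condition's contralateral-minus-ipsilateral difference wave within a retention-interval window and comparing the resulting amplitudes across memory loads:

```python
import numpy as np

def mean_amplitude(diff_wave, times, window=(0.5, 1.0)):
    """Mean amplitude of a difference wave inside a retention-interval window.

    diff_wave : (n_times,) contralateral-minus-ipsilateral difference wave.
    times     : (n_times,) time axis in seconds relative to memory-array onset.
    """
    mask = (times >= window[0]) & (times <= window[1])
    return diff_wave[mask].mean()

def load_profile(diff_waves_by_load, times, loads=(1, 2, 3)):
    """CDA/tCDA amplitude per memory load for one modality.

    diff_waves_by_load maps each load level to its difference wave.
    """
    return {load: mean_amplitude(diff_waves_by_load[load], times)
            for load in loads}
```

    A strictly modality-specific capacity limit of the kind argued for above would then show up as amplitudes that vary with load in the measured modality while remaining flat across load levels of the other modality.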