9 research outputs found
Selective Attention Increases Both Gain and Feature Selectivity of the Human Auditory Cortex
Background. An experienced car mechanic can often deduce what is wrong with a car by carefully listening to the sound of the ailing engine, despite the presence of multiple sources of noise. Indeed, the ability to select task-relevant sounds for awareness, whilst ignoring irrelevant ones, constitutes one of the most fundamental of human faculties, but the underlying neural mechanisms have remained elusive. While most of the literature explains the neural basis of selective attention by means of an increase in neural gain, a number of papers propose enhancement of neural selectivity as an alternative or complementary mechanism.

Methodology/Principal Findings. Here, to address the question of whether a pure gain increase alone can explain auditory selective attention in humans, we quantified auditory cortex frequency selectivity in 20 healthy subjects by masking 1000-Hz tones with a continuous noise masker containing parametrically varying frequency notches around the tone frequency (i.e., a notched-noise masker). The task of the subjects was, in different conditions, to selectively attend either to occasionally occurring slight increments in tone frequency (1020 Hz) or to tones of slightly longer duration, or to ignore the sounds. In line with previous studies, in the ignore condition the global field power (GFP) of event-related brain responses to the 1000-Hz tones at 100 ms from stimulus onset was suppressed as a function of the narrowing of the notch width. During the selective attention conditions, the suppressant effect of the noise notch width on GFP was decreased, but as a function significantly different from the multiplicative one expected on the basis of a simple gain model of selective attention.

Conclusions/Significance. Our results suggest that auditory selective attention in humans cannot be explained by a gain model alone.
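For readers unfamiliar with the paradigm, the stimulus described above can be illustrated with a minimal synthesis sketch: white noise is band-stop filtered to carve a notch of parametric width around the tone frequency, then mixed with the 1000-Hz probe tone. The sample rate, duration, filter order, and mixing level below are illustrative assumptions, not the study's actual stimulus parameters.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def notched_noise_tone(notch_hz, fs=44100, dur=0.5, tone_hz=1000.0, seed=0):
    """White noise with a band-stop notch of width `notch_hz` centred on
    `tone_hz`, mixed with a `tone_hz` pure tone (illustrative levels)."""
    rng = np.random.default_rng(seed)
    t = np.arange(int(fs * dur)) / fs
    noise = rng.standard_normal(t.size)
    if notch_hz > 0:
        low, high = tone_hz - notch_hz / 2, tone_hz + notch_hz / 2
        # 4th-order Butterworth band-stop, applied forward-backward
        # so the notch is zero-phase.
        sos = butter(4, [low, high], btype="bandstop", fs=fs, output="sos")
        noise = sosfiltfilt(sos, noise)
    tone = np.sin(2 * np.pi * tone_hz * t)
    return tone + 0.5 * noise  # mixing ratio is arbitrary here

stim = notched_noise_tone(notch_hz=200.0)
```

Narrowing `notch_hz` toward zero lets noise energy encroach on the tone frequency, which is what drives the GFP suppression measured in the study.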
Identifying object categories from event-related EEG: Toward decoding of conceptual representations
Multivariate pattern analysis is a technique that allows the decoding of conceptual information, such as the semantic category of a perceived object, from neuroimaging data. Impressive single-trial classification results have been reported in studies that used fMRI. Here, we investigate the possibility of identifying conceptual representations from event-related EEG based on the presentation of an object in different modalities: its spoken name, its visual representation, and its written name. We used Bayesian logistic regression with a multivariate Laplace prior for classification. Marked differences in classification performance were observed across the tested modalities. The highest accuracies (89% of trials classified correctly) were attained when classifying object drawings. In the auditory and orthographic modalities, results were lower, though still significant for some subjects. The employed classification method allowed for a precise temporal localization of the features that contributed to the performance of the classifier for three modalities. These findings could help to further understand the mechanisms underlying conceptual representations. The study also provides a first step towards the use of concept decoding in the context of real-time brain-computer interface applications.
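The single-trial decoding pipeline described above can be sketched as follows. The paper's multivariate Laplace prior is not reproduced here; as a stand-in, an ordinary L1-penalized logistic regression is used, since an L1 penalty is the MAP counterpart of an independent (univariate) Laplace prior. The data, dimensions, and regularization strength are synthetic illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for single-trial EEG: n_trials x (channels * time points).
rng = np.random.default_rng(0)
n_trials, n_features = 200, 64 * 30          # e.g. 64 channels x 30 samples
y = rng.integers(0, 2, n_trials)             # two object categories
X = rng.standard_normal((n_trials, n_features))
X[y == 1, :50] += 0.4                        # class signal in a few features

# L1 penalty ~ MAP estimate under an independent Laplace prior on the weights.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
acc = cross_val_score(clf, X, y, cv=5).mean()
```

Because the L1 penalty drives most weights to exactly zero, inspecting which (channel, time point) features receive nonzero weights gives the kind of temporal localization of informative features the abstract mentions.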
Processing of auditory stimuli during auditory and visual attention as revealed by event-related potentials
Regularity of unit length boosts statistical learning in verbal and nonverbal artificial languages
Attentional modulation of human auditory cortex
Attention powerfully influences auditory perception, but little is understood about the mechanisms whereby attention sharpens responses to attended sounds. We used high-resolution surface mapping techniques (using functional magnetic resonance imaging, fMRI) to examine activity in human auditory cortex during an intermodal selective attention task. Stimulus-dependent activations (SDAs), evoked by unattended sounds during demanding visual tasks, were maximal over mesial auditory cortex. They were tuned to sound frequency and location, and showed rapid adaptation to repeated sounds. Attention-related modulations (ARMs) were isolated as response enhancements that occurred when subjects performed pitch-discrimination tasks. In contrast to SDAs, ARMs were localized to lateral auditory cortex, showed broad frequency and location tuning, and increased in amplitude with sound repetition. The results suggest a functional dichotomy of auditory cortical fields: stimulus-determined mesial fields that faithfully transmit acoustic information, and attentionally labile lateral fields that analyze acoustic features of behaviorally relevant sounds.
