72 research outputs found

    Using neurophysiological signals that reflect cognitive or affective state: Six recommendations to avoid common pitfalls

    Estimating cognitive or affective state from neurophysiological signals, and designing applications that make use of this information, requires expertise in many disciplines, such as neurophysiology, machine learning, experimental psychology, and human factors. This makes it difficult to perform research that is strong in all of its aspects, as well as to judge a study or application on its merits. On the occasion of the special topic “Using neurophysiological signals that reflect cognitive or affective state”, we summarize frequently occurring pitfalls and recommendations on how to avoid them, both for authors (researchers) and readers. They relate to defining the state of interest, the neurophysiological processes expected to be involved in that state, confounding factors, inadvertently “cheating” with classification analyses, insight into what underlies successful state estimation, and, finally, the added value of neurophysiological measures in the context of an application. We hope that this paper will support the community in producing high-quality studies and well-validated, useful applications.
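    One of the pitfalls named above, inadvertently “cheating” with classification analyses, often comes down to information leaking from test data into training. As a purely illustrative sketch (not taken from the paper; the variable names and simulated data below are hypothetical), the Python snippet shows one common safeguard: grouping cross-validation folds by participant and fitting preprocessing inside the pipeline, so no test-fold information reaches the training procedure.

    ```python
    # Illustrative sketch only: demonstrates leakage-free evaluation for a
    # state-classification analysis. All data here are simulated.
    import numpy as np
    from sklearn.model_selection import GroupKFold, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n_subjects, trials_per_subject, n_features = 10, 40, 16
    features = rng.normal(size=(n_subjects * trials_per_subject, n_features))
    labels = rng.integers(0, 2, size=n_subjects * trials_per_subject)
    subject_ids = np.repeat(np.arange(n_subjects), trials_per_subject)

    # Scaling inside the pipeline means normalization statistics are fitted
    # on training folds only; fitting them on all data is a common leak.
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))

    # GroupKFold keeps all trials from a given participant in the same fold,
    # so the classifier is always tested on unseen participants.
    scores = cross_val_score(clf, features, labels,
                             groups=subject_ids, cv=GroupKFold(n_splits=5))
    print(f"Subject-wise CV accuracy: {scores.mean():.2f}")
    ```

    Evaluating on held-out participants typically yields lower, but more honest, accuracy estimates than pooling all trials across participants before splitting.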

    Competition between auditory and visual spatial cues during visual task performance

    There is debate in the crossmodal cueing literature as to whether the capture of visual attention by sound is a fully automatic process. Recent studies show that sound still captures attention even when visual attention is endogenously focused. The current study investigated whether exogenous auditory and visual capture interact. Participants performed an orthogonal cueing task in which the visual target was preceded by both a peripheral visual cue and an auditory cue. When both cues were valid at chance level, both visual and auditory capture were observed. However, when the validity of the visual cue was increased to 80%, only visual capture, and no auditory capture, was observed. Furthermore, a highly predictive (80% valid) auditory cue was not able to prevent visual capture. These results demonstrate that crossmodal auditory capture does not occur when a competing predictive visual event is presented, and is therefore not a fully automatic process.

    Testing the Applicability of a Checklist-Based Startle Management Method in the Simulator

    Several checklist-based methods have been proposed to help pilots manage startle in unexpected situations. In the current experiment, we tested how pilots reacted to using such a method, which featured the mnemonic COOL: Calm down – Observe – Outline – Lead. Using a motion-based simulator outfitted with a non-linear aerodynamic model of a small twin-propeller aircraft, twelve pilots practiced using the COOL method before performing four test scenarios involving startling events. Application of the full method in the test scenarios was high (90-100%), and pilots on average rated the method as useful (4 on a 1-5 point Likert scale). The first two steps were seen as the “core” of the method. However, pilots also had difficulty prioritizing immediate threats over executing the method. The results are promising, but they also warn us to be cautious when introducing a startle management method.

    The absence of an auditory-visual attentional blink is not due to echoic memory.

    When two visual items must be selected from a serially presented stream within half a second of each other, performance on the second item is often relatively poor (an “attentional blink” occurs); however, when the first item is presented auditorily, the blink usually disappears. We demonstrated that the latter is not caused by participants using their echoic memory to postpone processing of the auditory item until after the end of the visual stream.

    Challenges in understanding and modeling speech perception


    Auditory distance perception in rooms


    Effects of pitch, level, and tactile cues on speech segregation
