
    Motor simulation without motor expertise: enhanced corticospinal excitability in visually experienced dance spectators

    The human “mirror-system” is suggested to play a crucial role in action observation and execution, and is characterized by activity in the premotor and parietal cortices during the passive observation of movements. The previous motor experience of the observer has been shown to enhance the activity in this network. Yet visual experience could also have a determinant influence when watching more complex actions, as in dance performances. Here we tested the impact of visual experience on motor simulation when watching dance, by measuring changes in corticospinal excitability. We also tested the effects of empathic abilities. To fully match the participants' long-term visual experience with the present experimental setting, we used three live solo dance performances: ballet, Indian dance, and non-dance. Participants were either frequent dance spectators of ballet or Indian dance, or “novices” who had never watched dance. None of the spectators had been physically trained in these dance styles. Transcranial magnetic stimulation was used to measure corticospinal excitability by means of motor-evoked potentials (MEPs) in both the hand and the arm, because the hand is specifically used in Indian dance and the arm is frequently engaged in ballet dance movements. We observed that frequent ballet spectators showed larger MEP amplitudes in the arm muscles when watching ballet than when they watched the other performances. We also found that the higher Indian dance spectators scored on the fantasy subscale of the Interpersonal Reactivity Index, the larger their MEPs were in the arms when watching Indian dance. Our results show that even without physical training, corticospinal excitability can be enhanced as a function of either visual experience or the tendency to imaginatively transpose oneself into fictional characters. We suggest that spectators covertly simulate the movements for which they have acquired visual experience, and that empathic abilities heighten motor resonance during dance observation.

    Using humanoid robots to study human behavior

    Our understanding of human behavior advances as our humanoid robotics work progresses, and vice versa. This team's work focuses on trajectory formation and planning, learning from demonstration, oculomotor control, and interactive behaviors. They are programming robotic behavior based on how we humans “program” behavior in, or train, each other.

    A Psychophysical Investigation of Differences between Synchrony and Temporal Order Judgments.

    Synchrony judgments involve deciding whether cues to an event are in synch or out of synch, while temporal order judgments involve deciding which of the cues came first. When the cues come from different sensory modalities, these judgments can be used to investigate multisensory integration in the temporal domain. However, evidence indicates that these two tasks should not be used interchangeably, as it is unlikely that they measure the same perceptual mechanism. The current experiment further explores this issue across a variety of different audiovisual stimulus types.
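
    The two judgments are commonly summarized with different psychometric models; a minimal illustrative sketch (our own, not this paper's analysis, with invented response proportions) is below: temporal order judgments fitted with a cumulative Gaussian over stimulus onset asynchrony (SOA), and synchrony judgments fitted with a Gaussian-shaped synchrony window.

        # Illustrative sketch only: TOJ as a cumulative Gaussian, SJ as a Gaussian window.
        # Response proportions below are invented for demonstration.
        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import norm

        def toj_model(soa, pss, sigma):
            # Proportion of "visual first" responses as a function of SOA (ms).
            return norm.cdf(soa, loc=pss, scale=sigma)

        def sj_model(soa, centre, width, peak):
            # Proportion of "synchronous" responses, peaking near objective synchrony.
            return peak * np.exp(-0.5 * ((soa - centre) / width) ** 2)

        # SOA convention here: negative = auditory leads, positive = visual leads.
        soas = np.array([-300, -200, -100, 0, 100, 200, 300], dtype=float)
        p_visual_first = np.array([0.05, 0.15, 0.35, 0.55, 0.80, 0.90, 0.97])
        p_synchronous = np.array([0.10, 0.30, 0.75, 0.95, 0.70, 0.35, 0.12])

        toj_params, _ = curve_fit(toj_model, soas, p_visual_first, p0=[0.0, 100.0])
        sj_params, _ = curve_fit(sj_model, soas, p_synchronous, p0=[0.0, 150.0, 1.0])
        print("TOJ point of subjective simultaneity and slope (ms):", toj_params)
        print("SJ window centre, width, peak:", sj_params)

    The point is simply that the two tasks are described by different functions with different parameters, which is one reason they cannot be treated as interchangeable measures.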

    Event-related alpha suppression in response to facial motion

    While biological motion refers to both face and body movements, little is known about the visual perception of facial motion. We therefore examined alpha wave suppression, as a reduction in alpha power is thought to reflect visual activity, in addition to attentional reorienting and memory processes. Nineteen neurologically healthy adults were tested on their ability to discriminate between successive facial motion captures. These animations exhibited both rigid and non-rigid facial motion, as well as speech expressions. The structural and surface appearance of these facial animations did not differ, so participants' decisions were based solely on differences in facial movements. Upright, orientation-inverted and luminance-inverted facial stimuli were compared. At occipital and parieto-occipital regions, upright facial motion evoked a transient increase in alpha power, which was then followed by a significant reduction. This finding is discussed in terms of neural efficiency, gating mechanisms and neural synchronization. Moreover, there was no difference in the amount of alpha suppression evoked by each facial stimulus at occipital regions, suggesting that early visual processing remains unaffected by these stimulus manipulations. However, upright facial motion evoked greater suppression at parieto-occipital sites, and did so at the shortest latency. Increased activity within this region may reflect greater attentional reorienting to natural facial motion, but also involvement of areas associated with the visual control of body effectors.
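
    As a rough illustration of what "alpha suppression" means operationally, the sketch below (our own, assuming a single-channel epoch and a pre-stimulus baseline; not the authors' pipeline) computes alpha-band power over time and expresses it as percent change from baseline, where negative values indicate suppression.

        # Minimal sketch: event-related alpha change as percent change from a
        # pre-stimulus baseline. Hypothetical single-channel example.
        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def alpha_change(eeg, fs, t0=1.0, baseline=(-0.5, 0.0), band=(8.0, 12.0)):
            """eeg: 1-D epoch; fs: sampling rate (Hz); t0: stimulus onset (s) within the epoch."""
            # Band-pass filter to the alpha range (normalized cut-offs).
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
            alpha = filtfilt(b, a, eeg)
            # Instantaneous alpha power from the analytic-signal envelope.
            power = np.abs(hilbert(alpha)) ** 2
            times = np.arange(len(eeg)) / fs - t0
            base = power[(times >= baseline[0]) & (times < baseline[1])].mean()
            # Negative values = alpha suppression relative to baseline.
            return 100.0 * (power - base) / base, times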

    Perceiving animacy and arousal in transformed displays of human interaction

    When viewing a moving abstract stimulus, people tend to attribute social meaning and purpose to the movement. The classic work of Heider and Simmel [1] investigated how observers would describe the movement of simple geometric shapes (circle, triangles, and a square) around a screen. A high proportion of participants reported seeing some form of purposeful interaction between the three abstract objects and defined this interaction as a social encounter. Various papers have subsequently found similar results [2,3] and gone on to show that, as Heider and Simmel suggested, the phenomenon was due more to the relationship of the objects in space and time than to any particular object characteristic. The research of Tremoulet and Feldman [4] has shown that the percept of animacy may be elicited with a solitary moving object. They asked observers to rate the movement of a single dot or rectangle for whether it was under the influence of an external force or in control of its own motion. At mid-trajectory the shape would change speed, direction, or both. They found that shapes that either changed direction by more than 25 degrees from the original trajectory, or changed speed, were judged to be "more alive" than others. Further discussion and evidence of animacy with one or two small dots can be found in Gelman, Durgin and Kaufman [5]. Our aim was to further study this phenomenon by using a different method of stimulus production. Previous methods for producing displays of animate objects have relied either on handcrafted stimuli or on parametric variations of simple motion patterns. It is our aim to work towards a new automatic approach by taking actual human movements, transforming them into basic shapes, and exploring what motion properties need to be preserved to obtain animacy. Though the phenomenon of animacy has been demonstrated for many years using a variety of displays, few specific criteria have been established for the essential characteristics of those displays. Part of this research is to establish which movements result in percepts of animacy and, in turn, to further our understanding of the essential characteristics of human movement and social interaction. In this paper we discuss two experiments in which we examine how different transformations of an original video of a dance influence the perception of animacy. We also examine reports of arousal (Experiment 1) and emotional engagement (Experiment 2).
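
    The direction-change and speed-change cues mentioned above can be read directly off a trajectory; the sketch below (our own simplification of that criterion, applied frame by frame to invented data) shows the kind of motion features involved.

        # Heading and speed changes along a 2-D trajectory; the 25-degree figure
        # follows Tremoulet and Feldman's report, applied here frame by frame.
        import numpy as np

        def motion_changes(xy, dt):
            """xy: (N, 2) array of positions; dt: sampling interval in seconds."""
            v = np.diff(xy, axis=0) / dt                   # frame-to-frame velocities
            speed = np.linalg.norm(v, axis=1)
            heading = np.degrees(np.arctan2(v[:, 1], v[:, 0]))
            # Wrap heading differences into [-180, 180) degrees.
            dheading = (np.diff(heading) + 180.0) % 360.0 - 180.0
            return dheading, np.diff(speed)

        # Invented example: a dot that turns sharply mid-trajectory.
        t = np.arange(20)
        xy = np.c_[t.astype(float), np.where(t < 10, 0.0, (t - 10) * 0.7)]
        dheading, dspeed = motion_changes(xy, dt=0.1)
        print("frames with direction change > 25 deg:", np.flatnonzero(np.abs(dheading) > 25.0))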

    Self domestication and the evolution of language

    We set out an account of how self-domestication plays a crucial role in the evolution of language. In doing so, we focus on the growing body of work that treats language structure as emerging from the process of cultural transmission. We argue that a full recognition of the importance of cultural transmission fundamentally changes the kind of questions we should be asking regarding the biological basis of language structure. If we think of language structure as reflecting an accumulated set of changes in our genome, then we might ask something like, "What are the genetic bases of language structure and why were they selected?" However, if cultural evolution can account for language structure, then this question no longer applies. Instead, we face the task of accounting for the origin of the traits that enabled that process of structure-creating cultural evolution to get started in the first place. In light of work on cultural evolution, then, the new question for biological evolution becomes, "How did those precursor traits evolve?" We identify two key precursor traits: (1) the transmission of the communication system through learning; and (2) the ability to infer the communicative intent associated with a signal or action. We then describe two comparative case studies, the Bengalese finch and the domestic dog, in which parallel traits can be seen emerging following domestication. Finally, we turn to the role of domestication in human evolution. We argue that the cultural evolution of language structure has its origin in an earlier process of self-domestication.

    Wild chimpanzees modify modality of gestures according to the strength of social bonds and personal network size

    Primates form strong and enduring social bonds with others, and these bonds have important fitness consequences. However, how different types of communication are associated with different types of social bonds is poorly understood. Wild chimpanzees have a large repertoire of gestures, from visual gestures to tactile and auditory gestures. We used social network analysis to examine the association between proximity bonds (time spent in close proximity) and rates of gestural communication in pairs of chimpanzees when the intended recipient was within 10 m of the signaller. Pairs of chimpanzees with strong proximity bonds had higher rates of visual gestures, but lower rates of auditory long-range and tactile gestures. However, individual chimpanzees that had a larger number of proximity bonds had higher rates of auditory and tactile gestures and lower rates of visual gestures. These results suggest that visual gestures may be an efficient way to communicate with a small number of regular interaction partners, whereas tactile and auditory gestures may be more effective for communicating with a larger number of more weakly bonded partners. Increasing flexibility of communication may have played an important role in managing differentiated social relationships in groups of increasing size and complexity in both primate and human evolution.
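
    A dyad-level analysis of this kind can be sketched as follows (our own toy illustration with invented numbers, not the study's dataset or code): build a weighted proximity network and test whether dyads with stronger proximity bonds show higher visual-gesture rates.

        # Toy sketch with invented data: proximity bond strength vs visual-gesture rate.
        import networkx as nx
        from scipy.stats import spearmanr

        # Hypothetical dyads: (individual, individual, proportion of time in close proximity).
        proximity = [("A", "B", 0.40), ("A", "C", 0.10), ("B", "C", 0.25), ("C", "D", 0.05)]
        # Hypothetical visual gestures per hour for the same dyads, in the same order.
        visual_rate = [3.1, 0.8, 2.0, 0.4]

        G = nx.Graph()
        G.add_weighted_edges_from(proximity)

        # Each individual's total proximity network strength (sum of its bond weights).
        print(dict(G.degree(weight="weight")))

        # Do dyads with stronger proximity bonds gesture visually at higher rates?
        rho, p = spearmanr([w for _, _, w in proximity], visual_rate)
        print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")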

    Differences in audiovisual temporal processing in autistic adults are specific to simultaneity judgments

    Research has shown that children on the autism spectrum and adults with high levels of autistic traits are less sensitive to audiovisual asynchrony than their neurotypical peers. However, this evidence has been limited to simultaneity judgments (SJ), which require participants to consider the timing of two cues together. Given evidence of partly divergent perceptual and neural mechanisms involved in making temporal order judgments (TOJ) and SJ, and given that SJ require a more global type of processing which may be impaired in autistic individuals, here we ask whether the observed differences in audiovisual temporal processing are task and stimulus specific. We examined the ability to detect audiovisual asynchrony in a group of 26 autistic adult males and a group of age- and IQ-matched neurotypical males. Participants were presented with beep-flash, point-light drumming, and face-voice displays with varying degrees of asynchrony and asked to make SJ and TOJ. The results indicated that autistic participants were less able to detect audiovisual asynchrony than the control group, but this effect was specific to SJ and to more complex social stimuli (e.g., face-voice) with stronger semantic correspondence between the cues, which require a more global type of processing. This indicates that audiovisual temporal processing is not generally different in autistic individuals and that a similar level of performance can be achieved by using a more local type of processing, thus informing multisensory integration theory as well as multisensory training aimed at aiding perceptual abilities in this population.

    Thermal in-car interaction for navigation

    In this demonstration we show a thermal interaction design on the steering wheel for navigational cues in a car. Participants will be able to use a thermally enhanced steering wheel to follow instructions given in a turn-by-turn navigation task in a virtual city. The thermal cues will be provided on both sides of the steering wheel and will indicate the turning direction by warming the corresponding side while the opposite side is cooled.
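
    The cue logic described above is simple enough to sketch; the following is a hypothetical illustration (the set_temperature call is a placeholder, not the demonstration's actual hardware API), warming the side of the wheel that matches the upcoming turn and cooling the opposite side.

        # Hypothetical sketch of the thermal turn-cue logic; set_temperature() is a
        # placeholder for driving a thermal element (e.g., a Peltier) on one side.
        NEUTRAL_C = 32.0   # assumed skin-neutral baseline temperature (deg C)
        DELTA_C = 6.0      # assumed warming/cooling offset used for a cue

        def set_temperature(side: str, target_c: float) -> None:
            print(f"{side} element -> {target_c:.1f} C")   # stand-in for hardware control

        def thermal_turn_cue(direction: str) -> None:
            """direction: 'left' or 'right' from the turn-by-turn navigation system."""
            warm, cool = ("left", "right") if direction == "left" else ("right", "left")
            set_temperature(warm, NEUTRAL_C + DELTA_C)   # warm the side of the turn
            set_temperature(cool, NEUTRAL_C - DELTA_C)   # cool the opposite side

        def clear_cue() -> None:
            for side in ("left", "right"):
                set_temperature(side, NEUTRAL_C)

        thermal_turn_cue("left")   # e.g., issued when a left turn is approaching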

    Perceived motion in structure from motion: Pointing responses to the axis of rotation

    We investigated the ability to match finger orientation to the direction of the axis of rotation in structure-from-motion displays. Preliminary experiments verified that subjects could accurately use the index finger to report direction. The remainder of the experiments studied the perception of the axis of rotation from full rotations of a group of discrete points, the profiles of a rotating ellipsoid, and two views of a group of discrete points. Subjects' responses were analyzed by decomposing the pointing directions into their slant and tilt components. Overall, the results indicated that subjects were sensitive to both slant and tilt. However, when the axis of rotation was near the viewing direction, subjects had difficulty reporting tilt with profiles and two views, and showed a large bias in their slant judgments with two views and full rotations. These results are not entirely consistent with theoretical predictions. The results, particularly for two views, suggest that additional constraints are used by humans in the recovery of structure from motion.
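
    One standard way to decompose a reported axis direction into slant and tilt is sketched below (our own illustration, assuming the viewing direction is the z-axis; the paper's exact conventions may differ). When the axis lies near the viewing direction, its image-plane projection shrinks, which is consistent with tilt being hard to report in that case.

        # Slant/tilt decomposition of a 3-D axis direction, with the viewing direction taken as the z-axis.
        import numpy as np

        def slant_tilt(direction):
            """direction: 3-vector (x, y, z) giving the reported axis of rotation."""
            x, y, z = np.asarray(direction, dtype=float) / np.linalg.norm(direction)
            slant = np.degrees(np.arccos(abs(z)))        # angle away from the line of sight
            tilt = np.degrees(np.arctan2(y, x)) % 180.0  # orientation of the image-plane projection
            return slant, tilt

        # Example: an axis pointing mostly toward the viewer, tipped slightly up and to the right.
        print(slant_tilt((0.2, 0.3, 0.93)))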