
    Pointing to visible and invisible targets

    We investigated how the visibility of targets influenced the type of point used to provide directions. In Study 1 we asked 605 passersby in three localities for directions to well-known local landmarks. When that landmark was in plain view behind the requester, most respondents pointed with their index fingers, and few respondents pointed more than once. In contrast, when the landmark was not in view, respondents pointed initially with their index fingers, but often elaborated with a whole-hand point. In Study 2, we covertly filmed the responses from 157 passersby we approached for directions, capturing both verbal and gestural responses. As in Study 1, few respondents produced more than one gesture when the target was in plain view, and initial points were most likely to be index finger points. Thus, in a Western geographical context in which pointing with the index finger is the dominant form of pointing, a slight change in circumstances elicited a preference for pointing with the whole hand when it was the second or third manual gesture in a sequence.

    Increased pain intensity is associated with greater verbal communication difficulty and increased production of speech and co-speech gestures

    Effective pain communication is essential if adequate treatment and support are to be provided. Pain communication is often multimodal, with sufferers utilising speech, nonverbal behaviours (such as facial expressions), and co-speech gestures (bodily movements, primarily of the hands and arms, that accompany speech and can convey semantic information) to communicate their experience. Research suggests that the production of nonverbal pain behaviours is positively associated with pain intensity, but it is not known whether this is also the case for speech and co-speech gestures. The present study explored whether increased pain intensity is associated with greater speech and gesture production during face-to-face communication about acute, experimental pain. Participants (N = 26) were exposed to experimentally elicited pressure pain to the fingernail bed at high and low intensities and took part in video-recorded semi-structured interviews. Despite rating more intense pain as more difficult to communicate (t(25) = 2.21, p = .037), participants produced significantly longer verbal pain descriptions and more co-speech gestures in the high intensity pain condition (Words: t(25) = 3.57, p = .001; Gestures: t(25) = 3.66, p = .001). This suggests that spoken and gestural communication about pain is enhanced when pain is more intense. Thus, in addition to conveying detailed semantic information about pain, speech and co-speech gestures may provide a cue to pain intensity, with implications for the treatment and support received by pain sufferers. Future work should consider whether these findings are applicable within the context of clinical interactions about pain.
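    The statistics reported above are paired-samples t-tests (each participant contributes a score in both the high- and low-intensity conditions, hence t(25) for N = 26). A minimal sketch of that computation, using made-up word counts rather than the study's data:

    ```python
    import math

    def paired_t(x, y):
        """Paired-samples t statistic: t = mean(d) / (sd(d) / sqrt(n)),
        where d is the within-participant difference. Returns (t, df)."""
        d = [a - b for a, b in zip(x, y)]
        n = len(d)
        mean_d = sum(d) / n
        # Sample variance of the differences (n - 1 in the denominator)
        var_d = sum((v - mean_d) ** 2 for v in d) / (n - 1)
        return mean_d / math.sqrt(var_d / n), n - 1

    # Hypothetical per-participant word counts (illustrative only)
    high_pain = [120, 98, 150, 110, 130, 105]
    low_pain = [100, 95, 120, 100, 115, 90]
    t, df = paired_t(high_pain, low_pain)  # df = n - 1, mirroring t(25) for N = 26
    ```

    A positive t here indicates longer descriptions in the high-intensity condition, matching the direction of the effects reported in the abstract.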

    An Evolutionary Upgrade of Cognitive Load Theory: Using the Human Motor System and Collaboration to Support the Learning of Complex Cognitive Tasks

    Cognitive load theory is intended to provide instructional strategies derived from experimentally established cognitive load effects. Each effect is based on our knowledge of human cognitive architecture, primarily the limited capacity and duration of human working memory. These limitations are ameliorated by changes in long-term memory associated with learning. Initially, cognitive load theory's view of human cognitive architecture was assumed to apply to all categories of information. Based on Geary's (Educational Psychologist, 43, 179–195, 2008; 2011) evolutionary account of educational psychology, this interpretation of human cognitive architecture requires amendment. Working memory limitations may be critical only when acquiring novel information based on culturally important knowledge that we have not specifically evolved to acquire. Cultural knowledge is known as biologically secondary information. Working memory limitations may have reduced significance when acquiring novel, biologically primary information that we have evolved to acquire.

    Designing 'Embodied' Science Learning Experiences for Young Children

    Research in embodied cognition emphasises the importance of meaningful ‘bodily’ experience, or congruent action, in learning and development. This highlights the need for evidence-based design guidelines for sensorimotor interactions that meaningfully exploit action-based experiences, which are instrumental in shaping the way we conceptualise the world. These sensorimotor experiences are particularly important for young children, as they can provide an embodied toolkit of resources (independent of language skills or subject-specific vocabulary) that children can draw upon to support science ‘think’ and ‘talk’, using their own bodies to develop and express ideas through gesture grounded in sensorimotor representations from action experiences. Taking an iterative design-based research (DBR) approach, this paper reports the design, development and deployment of a programme of outdoor activities for children aged 4–6 years that drew on embodied cognition theory to foster meaningful action in relation to ideas of air resistance. This research is relevant to researchers, practitioners and designers. It makes a contribution to learning experience design by making explicit the process of applying key components of embodied cognition theory to the design of science learning activities for the early years, and how this can effectively inform digital design.

    Telerobotic Pointing Gestures Shape Human Spatial Cognition

    This paper aimed to explore whether human beings can understand gestures produced by telepresence robots and, if so, derive the meaning conveyed in telerobotic gestures when processing spatial information. We conducted two experiments over Skype in the present study. Participants were presented with a robotic interface that had arms, which were teleoperated by an experimenter. The robot could point to virtual locations that represented certain entities. In Experiment 1, the experimenter described spatial locations of fictitious objects sequentially in two conditions: a speech condition (SO; verbal descriptions clearly indicated the spatial layout) and a speech-and-gesture condition (SR; verbal descriptions were ambiguous but accompanied by robotic pointing gestures). Participants were then asked to recall the objects' spatial locations. We found that the number of spatial locations recalled in the SR condition was on par with that in the SO condition, suggesting that telerobotic pointing gestures compensated for ambiguous speech during the processing of spatial information. In Experiment 2, the experimenter described spatial locations non-sequentially in the SR and SO conditions. Surprisingly, the number of spatial locations recalled in the SR condition was even higher than that in the SO condition, suggesting that telerobotic pointing gestures were more powerful than speech in conveying spatial information when information was presented in an unpredictable order. The findings provide evidence that human beings are able to comprehend telerobotic gestures and, importantly, integrate these gestures with co-occurring speech. This work promotes engaging remote collaboration among humans through a robot intermediary.

    Tense and aspect in word problems about motion: diagram, gesture, and the felt experience of time

    © 2014, Mathematics Education Research Group of Australasia, Inc. Word problems about motion contain various conjugated verb forms. As students and teachers grapple with such word problems, they jointly operationalize diagrams, gestures, and language. Drawing on findings from a 3-year research project examining the social semiotics of classroom interaction, we show how teachers and students use gesture and diagram to make sense of complex verb forms in such word problems. We focus on the grammatical category of “aspect” and how it broadens the concept of verb tense. Aspect conveys the duration, completion, or frequency of an event. The aspect of a verb defines its temporal flow (or lack thereof) and the location of a vantage point for making sense of this durational process.

    From hands to minds: Gestures promote understanding

    Gestures serve many roles in communication, learning and understanding, both for those who view them and those who create them. Gestures are especially effective when they bear resemblance to the thought they represent, an advantage they have over words. Here, we examine the role of conceptually congruent gestures in deepening understanding of dynamic systems. Understanding the structure of dynamic systems is relatively easy, but understanding the actions of dynamic systems can be challenging. We found that seeing gestures representing actions enhanced understanding of the dynamics of a complex system, as revealed in invented language, gestures and visual explanations. Gestures can map many meanings more directly than language, representing many concepts congruently. Designing and using gestures congruent with meaning can augment comprehension and learning.