
    Timing and correction of stepping movements with a virtual reality avatar

    Research into the ability to coordinate one’s movements with external cues has focussed on simple rhythmic auditory and visual stimuli, or on interpersonal coordination with another person. Coordinating movements with a virtual avatar has not been explored in the context of responses to temporal cues. To determine whether cueing of movements with a virtual avatar is effective, people’s ability to coordinate accurately with the stimulus needs to be investigated. Here we focus on temporal cues, as timing studies show that visual cues can be difficult to follow in a timing context. Real stepping movements were mapped onto an avatar using motion capture data. Healthy participants were then motion captured whilst stepping in time with the avatar’s movements, viewed through a virtual reality headset. The timing of one of the avatar’s step cycles was accelerated or decelerated by 15% to create a temporal perturbation, which participants needed to correct for in order to remain in time. Timing errors (asynchronies) were measured as participants’ step onset times relative to the corresponding step onsets of the avatar. Participants completed either a visual-only condition or an auditory-visual condition with footstep sounds included, at two stepping tempos (Fast: 400 ms interval; Slow: 800 ms interval). Participants’ asynchronies exhibited slow drift in the Visual-Only condition but became stable in the Auditory-Visual condition. Moreover, we observed a clear corrective response to the phase perturbation in both the fast and slow tempo auditory-visual conditions. We conclude that an avatar’s movements can be used to influence a person’s own motion, but should be accompanied by auditory cues congruent with the movement to ensure a suitable level of entrainment is achieved. This approach has applications in physiotherapy, where virtual avatars offer an opportunity to provide guidance that helps patients adhere to prescribed exercises.
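    The asynchrony measure described above compares each participant step onset with the corresponding avatar step onset. As a minimal illustration, not the authors' actual pipeline, the Python sketch below pairs each participant onset with the nearest avatar onset and returns the signed timing errors; the function name, the nearest-onset pairing rule, and the example data are assumptions made for illustration.

```python
import numpy as np

def step_asynchronies(participant_onsets, avatar_onsets):
    """Pair each participant step onset with the nearest avatar step onset
    and return the signed timing errors (asynchronies) in seconds.
    Negative values mean the participant stepped ahead of the avatar."""
    participant_onsets = np.asarray(participant_onsets, dtype=float)
    avatar_onsets = np.asarray(avatar_onsets, dtype=float)
    # Index of the nearest avatar onset for each participant onset
    idx = np.abs(participant_onsets[:, None] - avatar_onsets[None, :]).argmin(axis=1)
    return participant_onsets - avatar_onsets[idx]

# Hypothetical example: avatar steps every 0.8 s (Slow condition),
# participant stepping slightly ahead of the cue with some jitter
avatar = np.arange(0.0, 8.0, 0.8)
participant = avatar - 0.05 + np.random.normal(0.0, 0.02, size=avatar.size)
asyn = step_asynchronies(participant, avatar)
print(asyn.mean(), asyn.std())
```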

    Multisensory cues facilitate coordination of stepping movements with a virtual reality avatar

    The effectiveness of simple sensory cues for retraining gait has been demonstrated, yet the feasibility of humanoid avatars for entrainment has yet to be investigated. Here, we describe the development of a novel method of visually cued training, in the form of a virtual partner, and investigate its ability to provide movement guidance in the form of stepping. Real stepping movements were mapped onto an avatar using motion capture data. The trajectory of one of the avatar's step cycles was then accelerated or decelerated by 15% to create a perturbation. Healthy participants were motion captured while instructed to step in time with the avatar's movements, viewed through a virtual reality headset. Step onset times were used to measure the timing errors (asynchronies) between participant and avatar. Participants completed either a visual-only condition or an auditory-visual condition with footstep sounds included. Participants' asynchronies exhibited slow drift in the Visual-Only condition, but became stable in the Auditory-Visual condition. Moreover, we observed a clear corrective response to the phase perturbation in both auditory-visual conditions. We conclude that an avatar's movements can be used to influence a person's own gait, but should include relevant auditory cues congruent with the movement to ensure suitable accuracy is achieved.
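    The perturbation is described as a 15% acceleration or deceleration of a single avatar step cycle. Below is a minimal sketch of how such a timing perturbation could be generated from a sequence of step onset times, assuming that shortening or lengthening one cycle shifts all subsequent onsets accordingly; the function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def perturb_step_cycle(onset_times, cycle_index, scale):
    """Scale the duration of one step cycle (scale=0.85 shortens it by 15%,
    scale=1.15 lengthens it) and shift all later onsets by the same amount,
    so the sequence stays continuous after the phase perturbation."""
    onsets = np.asarray(onset_times, dtype=float).copy()
    original = onsets[cycle_index + 1] - onsets[cycle_index]
    shift = original * (scale - 1.0)       # change in that cycle's duration
    onsets[cycle_index + 1:] += shift      # propagate the shift to later steps
    return onsets

# Hypothetical example: 400 ms (Fast) inter-step interval,
# fifth cycle shortened by 15%
base = np.arange(0.0, 8.0, 0.4)
perturbed = perturb_step_cycle(base, cycle_index=5, scale=0.85)
```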

    Mean relative asynchronies before and after the perturbation (vertical grey bar) at time T for Auditory-Visual cues.

    The dotted horizontal line shows zero relative asynchrony, towards which participants were expected to correct following the perturbation. Separate plots are shown for Fast, shortened (a) and lengthened (b) intervals, and Slow, shortened (c) and lengthened (d) intervals. Error bars show SEM.

    Asynchrony between participant step onsets and corresponding Avatar steps pre-perturbation.

    a). Mean asynchrony between participant step onsets and corresponding Avatar steps, for all conditions. This asynchrony reflects baseline performance. Negative asynchronies indicate that participants are, on average, stepping ahead of the Avatar cue. Error bars show SEM. b). Mean standard deviation of asynchrony for all conditions. Error bars show SEM. * indicates significance at p < .05, ** at p < .01.

    Example of asynchronies before and after removal of phase wrapping.

    a). Asynchronies observed for a fast tempo trial when matching participant onsets to the nearest Avatar step onsets. Wrapping of the asynchrony can be seen between ±0.4 seconds, which does not correspond to an attempt to regain synchrony. b). Asynchronies for the same trial with phase wrapping removed and steps reassigned so that the asynchronies are continuous. This is a severe example of the drift observed in some fast tempo visual-only trials, shown to demonstrate the unwrapping procedure.
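    The caption does not spell out the unwrapping procedure, but it implies the standard approach: whenever consecutive asynchronies jump by more than half a cue interval, whole intervals are added or subtracted so the drift appears as a continuous trace. A minimal sketch under that assumption follows; the function name and the interval value are illustrative.

```python
import numpy as np

def unwrap_asynchronies(asynchronies, interval):
    """Remove phase wrapping from a series of asynchronies: whenever two
    consecutive values jump by more than half an interval, shift all later
    values by whole intervals so the series is continuous."""
    asyn = np.asarray(asynchronies, dtype=float).copy()
    for i in range(1, asyn.size):
        jump = asyn[i] - asyn[i - 1]
        if jump > interval / 2:
            asyn[i:] -= interval
        elif jump < -interval / 2:
            asyn[i:] += interval
    return asyn

# Hypothetical example with a 0.8 s cycle: a wrapped jump from +0.38 to -0.38
# is restored to a continuous drift of +0.38, +0.42, ...
print(unwrap_asynchronies([0.30, 0.38, -0.38, -0.34], interval=0.8))
```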