
    Vibrotactile feedback as a countermeasure for spatial disorientation

    Spaceflight can make astronauts susceptible to spatial disorientation, which is one of the leading causes of fatal aircraft accidents. In our experiment, blindfolded participants used a joystick to balance themselves inside a multi-axis rotation device (MARS) in either the vertical or the horizontal roll plane. On Day 1, in the vertical roll plane (Earth analog condition), participants could use gravitational cues and therefore had a good sense of their orientation. On Day 2, in the horizontal roll plane (spaceflight analog condition), participants could not use gravitational cues; they rapidly became disoriented, showed minimal learning, and performed poorly. One potential countermeasure for spatial disorientation is vibrotactile feedback that conveys body orientation through small vibrating devices applied to the skin. Orientation-dependent vibrotactile feedback provided to one group enhanced performance in the spaceflight condition, but the participants reported a conflict between the accurate vibrotactile cues and their erroneous perception of their orientation. Specialized vibrotactile training on Day 1 provided to another group resulted in significantly better learning and performance in the spaceflight analog task with vibrotactile cueing. In this training, participants in the Earth analog condition on Day 1 were required to disengage from the task of aligning with the gravitational vertical, encoded by natural vestibular/somatosensory afference, and instead had to align with randomized non-vertical directions of balance signaled by vibrotactile feedback. At the end of Day 2, after both vibration-cued groups had practiced with the vibrotactile feedback in the spaceflight analog condition, we deactivated it; they then performed as well as the group that had no vibrotactile feedback. We conclude that, after appropriate training, vibrotactile orientation feedback augments dynamic spatial orientation and does not lead to any negative dependence.

    Eye-Hand Coordination during Dynamic Visuomotor Rotations

    Background: For many technology-driven visuomotor tasks, such as tele-surgery, human operators face situations in which the frames of reference for vision and action are misaligned and must be compensated for in order to perform the task with the necessary precision. The cognitive mechanisms for the selection of appropriate frames of reference are still not fully understood. This study investigated the effect of changing visual and kinesthetic frames of reference during wrist pointing, simulating activities typical of tele-operations. Methods: Using a robotic manipulandum, subjects performed center-out pointing movements to visual targets presented on a computer screen by coordinating wrist flexion/extension with abduction/adduction. We compared movements in which the frames of reference were aligned (unperturbed condition) with movements performed under different combinations of dynamic visual/kinesthetic perturbations. The visual frame of reference was centered on the computer screen, while the kinesthetic frame was centered on the wrist joint. Both frames changed their orientation dynamically (angular velocity = 36°/s) with respect to the head-centered frame of reference (the eyes). Perturbations were either unimodal (visual or kinesthetic) or bimodal (visual + kinesthetic). As expected, pointing performance was best in the unperturbed condition. The spatial pointing error worsened dramatically during both unimodal and most bimodal conditions. However, in the bimodal condition in which both disturbances were in phase, adaptation was very fast and kinematic performance indicators approached the values of the unperturbed condition. Conclusions: This result suggests that subjects learned to exploit an “affordance” made available by the invariant phase relation between the visual and kinesthetic frames. It seems that after detecting this invariance, subjects used the kinesthetic input as an informative signal rather than a disturbance, compensating for the visual rotation without going through the lengthy process of building an internal adaptation model. Practical implications are discussed as regards the design of advanced, high-performance man-machine interfaces.
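The frame manipulation described above amounts to a time-varying 2-D rotation between screen and hand coordinates. As a minimal sketch (not the authors' analysis code; only the 36°/s rotation rate is taken from the abstract, and the helper names are illustrative), the compensation a subject must learn is the inverse rotation applied to the on-screen target:

```python
import numpy as np

OMEGA_DEG_S = 36.0  # frame rotation rate quoted in the abstract [deg/s]

def rotate(vec, angle_deg):
    """Rotate a 2-D vector counterclockwise by angle_deg."""
    a = np.radians(angle_deg)
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    return R @ vec

def compensated_target(target_screen, t):
    """Hand-frame aim point for a target drawn in a visual frame that has
    rotated by OMEGA_DEG_S * t degrees at time t (hypothetical helper)."""
    return rotate(target_screen, -OMEGA_DEG_S * t)
```

In the in-phase bimodal condition the kinesthetic frame carries the same angle, which is consistent with subjects treating it as an informative signal rather than a disturbance.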

    Measuring Multi-Joint Stiffness during Single Movements: Numerical Validation of a Novel Time-Frequency Approach

    This study presents and validates a Time-Frequency technique for measuring 2-dimensional multijoint arm stiffness throughout a single planar movement as well as during static posture. It is proposed as an alternative to current regressive methods, which require numerous repetitions to obtain average stiffness on a small segment of the hand trajectory. The method is based on the analysis of the reassigned spectrogram of the arm's response to impulsive perturbations and can estimate arm stiffness on a trial-by-trial basis. Analytic and empirical methods are first derived and tested through modal analysis on synthetic data. The technique's accuracy and robustness are assessed by modeling the estimation of stiffness time profiles changing at different rates and affected by different noise levels. Our method obtains results comparable with two well-known regressive techniques. We also test how the technique can identify the viscoelastic component of non-linear and higher than second order systems with a non-parametric approach. The technique proposed here is highly robust to noise and can be used easily for both postural and movement tasks. Estimations of stiffness profiles are possible with only one perturbation, making our method a useful tool for estimating limb stiffness during motor learning and adaptation tasks, and for understanding the modulation of stiffness in individuals with neurodegenerative diseases.
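For intuition, the modal-analysis idea behind the technique can be sketched on a 1-DOF mass-spring system: the dominant frequency of the impulse response, read off a spectrogram, yields stiffness via k = m·ωn². All numbers below are assumed for illustration; the paper itself treats the 2-D multijoint case with reassigned spectrograms and time-varying stiffness profiles.

```python
import numpy as np
from scipy.signal import spectrogram

m = 1.5          # effective limb mass [kg] (assumed)
k_true = 300.0   # stiffness [N/m] (assumed)
zeta = 0.05      # light damping ratio (assumed)
fs = 500.0
t = np.arange(0, 8.0, 1 / fs)

wn = np.sqrt(k_true / m)                       # undamped natural frequency [rad/s]
wd = wn * np.sqrt(1 - zeta ** 2)               # damped frequency seen in the response
x = np.exp(-zeta * wn * t) * np.sin(wd * t)    # impulse response of the limb model

f, _, Sxx = spectrogram(x, fs=fs, nperseg=2048)
f_peak = f[np.argmax(Sxx.mean(axis=1))]        # dominant modal frequency [Hz]
k_est = m * (2 * np.pi * f_peak) ** 2          # k = m * wn^2 (light damping)
```

The reassignment step in the actual method sharpens this time-frequency picture enough to track a stiffness profile within a single movement rather than a single static value.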

    The Proprioceptive Map of the Arm Is Systematic and Stable, but Idiosyncratic

    Visual and somatosensory signals participate together in providing an estimate of the hand's spatial location. While the ability of subjects to identify the spatial location of their hand based on visual and proprioceptive signals has previously been characterized, relatively few studies have examined in detail the spatial structure of the proprioceptive map of the arm. Here, we reconstructed and analyzed the spatial structure of the estimation errors that resulted when subjects reported the location of their unseen hand across a 2D horizontal workspace. Hand position estimation was mapped under four conditions: with and without tactile feedback, and with the right and left hands. In the task, we moved each subject's hand to one of 100 targets in the workspace while their eyes were closed. Then, we either a) applied tactile stimulation to the fingertip by allowing the index finger to touch the target or b) as a control, hovered the fingertip 2 cm above the target. After returning the hand to a neutral position, subjects opened their eyes to verbally report where their fingertip had been. We measured and analyzed both the direction and magnitude of the resulting estimation errors. Tactile feedback reduced the magnitude of these estimation errors but did not change their overall structure. In addition, the spatial structure of these errors was idiosyncratic: each subject had a unique pattern of errors that was stable between hands and over time. Finally, we found that at the population level the magnitude of the estimation errors had a characteristic distribution over the workspace: errors were smallest closer to the body. The stability of estimation errors across conditions and time suggests the brain constructs a proprioceptive map that is reliable, even if it is not necessarily accurate. The idiosyncrasy across subjects emphasizes that each individual constructs a map that is unique to their own experience.
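The error analysis described above (direction and magnitude of reported-minus-actual positions over a workspace) is straightforward to sketch. The data below are synthetic placeholders, not the study's measurements, and placing the body origin at (0, 0) is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-ins for 100 target locations [m] and the reported positions.
targets = rng.uniform([-0.2, 0.2], [0.2, 0.5], size=(100, 2))
reports = targets + rng.normal(0.0, 0.02, size=(100, 2))

errors = reports - targets
magnitude = np.linalg.norm(errors, axis=1)          # error size per target
direction = np.arctan2(errors[:, 1], errors[:, 0])  # error angle [rad]

# Probe the reported population-level trend ("errors were smallest closer
# to the body"): correlate magnitude with target distance from the body,
# taken here to be the origin (an assumption).
dist = np.linalg.norm(targets, axis=1)
r = np.corrcoef(dist, magnitude)[0, 1]
```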

    Velocity storage: its multiple roles

    Our research described in this article was motivated by the puzzling finding of the Skylab M131 experiments: head movements made while rotating, which are nauseogenic and disorienting on Earth, are innocuous in a weightless, 0-g environment. We describe a series of parabolic flight experiments that directly addressed this puzzle and discovered the gravity-dependent responses to semicircular canal stimulation, consistent with the principles of velocity storage. We describe a line of research that started in a different direction, investigating dynamic balancing, but ended up pointing to the gravity dependence of the angular velocity-to-position integration of semicircular canal signals. Together, these lines of research and the theoretical framework of velocity storage provide an answer to at least part of the M131 puzzle. We also describe recently discovered neural circuits by which active, dynamic vestibular, multisensory, and motor signals are interpreted either as appropriate for action and orientation or as conflicts evoking motion sickness and disorientation.
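Velocity storage is classically modeled as a leaky integrator that prolongs the decaying semicircular-canal afference, extending the brain's angular-velocity estimate well beyond the canals' own time constant. A minimal sketch of that idea (the time constants here are typical textbook values, assumed rather than taken from the article):

```python
import numpy as np

def velocity_storage(canal, dt, tc=15.0):
    """Leaky integrator dv/dt = -v/tc + canal(t), Euler-stepped.
    tc is an assumed storage time constant [s]."""
    v = np.zeros_like(canal)
    for n in range(len(canal) - 1):
        v[n + 1] = v[n] + dt * (-v[n] / tc + canal[n])
    return v

dt = 0.01
t = np.arange(0.0, 60.0, dt)
canal = np.exp(-t / 6.0)   # raw canal afference after a velocity step
                           # (6 s canal time constant, assumed)
stored = velocity_storage(canal, dt)
```

The stored signal peaks and decays far later than the raw canal input, which is the behavioral signature (prolonged rotation sensation and nystagmus) that the velocity-storage framework accounts for.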

    The Role of Reafference in Recalibration of Limb Movement Control and Locomotion

    The reafference model has frequently been used to explain spatial constancy during eye and head movements. We have found that its basic concepts also form part of the information processing necessary for the control and recalibration of reaching movements. Reaching was studied in a novel force environment: a rotating room that creates centripetal forces of the type that could someday substitute for gravity in spaceflight, and Coriolis forces, which are side effects of rotation. We found that inertial, noncontacting Coriolis forces deviate the path and endpoint of reaching movements, a finding that shows the inadequacy of equilibrium-position models of movement control. Repeated movements in the rotating room quickly lead to normal movement patterns and to a failure to perceive the perturbing forces. The first movements made after rotation stops, without Coriolis forces present, show mirror-image deviations and evoke the perception of a perturbing force even though none is present. These patterns of sensorimotor control and adaptation can largely be explained on the basis of comparisons of efference copy, reafferent muscle spindle, and cutaneous mechanoreceptor signals. We also describe experiments on human locomotion using an apparatus similar to the one Mittelstaedt used to study the optomotor response of the Eristalis fly. These results show that the reafference principle relates as well to the perception of the forces acting on and exerted by the body during voluntary locomotion.
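The inertial Coriolis force that deviates the reaches follows the standard cross-product formula F = -2m(Ω × v). A minimal numeric sketch with assumed values (the abstract gives neither the room's rotation rate nor the arm parameters):

```python
import numpy as np

def coriolis_force(omega, v, m):
    """F = -2 m (omega x v) for angular velocity omega and limb velocity v."""
    return -2.0 * m * np.cross(omega, v)

omega = np.array([0.0, 0.0, 2 * np.pi / 6.0])  # room turning once per 6 s (assumed)
v = np.array([0.0, 1.0, 0.0])                  # 1 m/s forward reach (assumed)
m = 2.0                                        # effective arm mass [kg] (assumed)
F = coriolis_force(omega, v, m)                # sideways push on the reaching arm
```

Because the force exists only while the arm is moving and vanishes at rest, it leaves no static cue, which is consistent with subjects ceasing to perceive it after adaptation.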

    Multisensory, Cognitive, and Motor Influences on Human Spatial Orientation in Weightlessness

    Exposure to weightlessness affects the control and appreciation of body position and orientation. In free fall, the perception of one’s own orientation and that of the surroundings depends on the presence or absence of contact cues, on whether part of the body is visible in relation to the architecturally defined verticals of the spacecraft, on cognitive factors, and on exposure history. Sensations of falling are not elicited in free fall when the eyes are closed or the visual field is stabilized. This indicates that visual and cognitive factors, as well as vestibular ones, must be implicated in the genesis of such sensations under normal circumstances. Position sense of the limbs is also degraded in free fall. This may be due to alterations in skeletal muscle spindle gain owing to decreased otolith-spinal activation. We provide evidence that during initial exposure to weightlessness there is a decrease in muscle stiffness, which affects movement accuracy. The altered loading of the skeletal muscles due to the head and body being weightless is shown to be a significant etiological factor in space motion sickness.