
    Dual Extended Kalman Filter for the Identification of Time-Varying Human Manual Control Behavior

    A Dual Extended Kalman Filter was implemented for the identification of time-varying human manual control behavior. Two concurrently running filters were used: a state filter that estimates the equalization dynamics, and a parameter filter that estimates the neuromuscular parameters and time delay. Time-varying parameters were modeled as a random walk. The filter successfully estimated time-varying human control behavior in both simulated and experimental data. Simple guidelines are proposed for tuning the process and measurement covariance matrices and the initial parameter estimates. The tuning was performed on simulation data; when applied to experimental data, only an increase in the measurement process noise power was required for the filter to converge and estimate all parameters. A sensitivity analysis of the initial parameter estimates showed that the filter is more sensitive to poor initial choices of the neuromuscular parameters than of the equalization parameters, and that poor initial values can result in divergence, slow convergence, or parameter estimates without a real physical interpretation. The promising results on experimental data, together with the simple tuning and the low dimension of the state space, make the Dual Extended Kalman Filter a viable option for identifying time-varying human control parameters in manual tracking tasks, which could be used for real-time human state monitoring and adaptive human-vehicle haptic interfaces.
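    The abstract describes the dual-filter structure and the random-walk parameter model, but not the implementation. The sketch below only illustrates that structure on a hypothetical first-order plant; the plant, noise levels, and tuning values are placeholders, not the authors' human-operator model.

```python
import numpy as np

# Illustrative dual EKF sketch (not the authors' implementation).
# Plant: first-order lag  x[k+1] = x[k] + dt*(-a*x[k] + b*u[k]),
# measurement y[k] = x[k] + noise. Parameters theta = [a, b] are
# modeled as a random walk and estimated by a second, concurrent EKF.

dt = 0.01
a_true, b_true = 2.0, 1.5
rng = np.random.default_rng(0)

def f(x, u, theta):
    a, b = theta
    return x + dt * (-a * x + b * u)

x_hat, P_x = 0.0, 1.0                       # state filter
theta_hat, P_th = np.array([1.0, 1.0]), np.eye(2)   # parameter filter
Q_x, R = 1e-4, 1e-2                         # state process / measurement noise
Q_th = 1e-6 * np.eye(2)                     # random-walk (parameter drift) covariance

x_true = 0.0
for k in range(5000):
    t = dt * k
    u = np.sin(0.5 * t) + 0.5 * np.sin(2.3 * t)   # multisine excitation
    x_true = f(x_true, u, (a_true, b_true)) + np.sqrt(Q_x) * rng.standard_normal()
    y = x_true + np.sqrt(R) * rng.standard_normal()

    # --- parameter filter: time update (random walk) ---
    P_th = P_th + Q_th

    # --- state filter: time update with current parameter estimate ---
    x_prev = x_hat
    x_pred = f(x_prev, u, theta_hat)
    A = 1.0 - dt * theta_hat[0]             # df/dx
    P_x = A * P_x * A + Q_x
    innov = y - x_pred

    # --- state filter: measurement update ---
    K_x = P_x / (P_x + R)
    x_hat = x_pred + K_x * innov
    P_x = (1.0 - K_x) * P_x

    # --- parameter filter: measurement update ---
    # Sensitivity of the predicted measurement to theta (numerical Jacobian).
    eps = 1e-6
    C_th = np.array([(f(x_prev, u, theta_hat + eps * e) -
                      f(x_prev, u, theta_hat - eps * e)) / (2 * eps)
                     for e in np.eye(2)])
    S = C_th @ P_th @ C_th + R
    K_th = P_th @ C_th / S
    theta_hat = theta_hat + K_th * innov
    P_th = P_th - np.outer(K_th, C_th @ P_th)

# With sufficient excitation the estimates should drift toward [2.0, 1.5].
print("estimated [a, b]:", theta_hat)
```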

    Effects of Eye Measures on Human Controller Remnant and Control Behavior

    The aim of this research was to investigate the possible relation between changes in eye activity parameters, variations in human remnant at the perceptual level, and changes in human operator model parameters. Fourteen subjects performed a pitch tracking task in which the display brightness was varied by changing the background color around a simplified primary flight display, in order to create a controlled, quasi-linear change in pupil diameter through the pupillary light reflex. Pupil diameter, blinks, eye opening, and eye opening and closing amplitudes and speeds were recorded with an eye tracker. Participants controlled single-integrator-like and double-integrator-like dynamics. The variation in pupil diameter did not introduce significant differences in either the remnant characteristics or the human operator model parameters. An interesting effect occurred in the human controller's time delay for the single-integrator task, where the time delay was significantly higher for the darkest brightness level than for the other levels; this effect was not observed for the double-integrator dynamics. The data suggest that the more difficult controlled dynamics induced a squinting effect, visible as a smaller eye opening and smaller eye opening and closing amplitudes. These results suggest that performance and control behavior are invariant to display brightness. Moreover, monitoring changes in eye activity could provide a way to predict variations in human remnant characteristics and human controller model parameters introduced by task difficulty.
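    For context, the "human operator model parameters" referred to here are typically the parameters of a quasi-linear describing-function model of the human controller. One commonly used form in compensatory tracking tasks (assumed here purely for illustration; the study may have used a different parameterization) is

    \[
    Y_p(j\omega) = K_p\,(1 + j\omega T_L)\, e^{-j\omega\tau}\,
    \frac{\omega_{nm}^2}{(j\omega)^2 + 2\zeta_{nm}\omega_{nm}\,j\omega + \omega_{nm}^2},
    \qquad u = Y_p\, e + n,
    \]

    where \(e\) is the tracking error shown on the display, \(u\) is the control output, \(K_p\), \(T_L\), and \(\tau\) are the operator's gain, lead time constant, and time delay, \(\omega_{nm}\) and \(\zeta_{nm}\) describe the neuromuscular dynamics, and \(n\) is the remnant, i.e. the part of \(u\) not linearly correlated with \(e\).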

    Modelling individual motion sickness accumulation in vehicles and driving simulators

    Users of automated vehicles will move from being drivers to passengers, preferably engaged in other activities such as reading or using laptops and smartphones, which will strongly increase susceptibility to motion sickness. Similarly, in driving simulators, the presented visual motion with scaled or even without any physical motion causes an illusion of passive motion, creating a conflict between perceived and expected motion and eliciting motion sickness. Given the very large differences in sickness susceptibility between individuals, sickness needs to be considered at an individual level. This paper combines a group-averaged sensory conflict model with an individualized accumulation model to capture individual differences in motion sickness susceptibility across various vision conditions. The model framework can be used to develop personalized models for users of automated vehicles and to improve the design of new motion cueing algorithms for simulators. The feasibility and accuracy of this framework are verified using two existing datasets with sickening conditions, both involving passive motion representative of being driven by an automated vehicle. The model is able to fit an individual's motion sickness responses using only 2 parameters (gain K1 and time constant T1), as opposed to the 5 parameters in the original model, which ensures unique parameters for each individual. Compared to using only the group-averaged model, fits of an individual's motion sickness levels improve on average by a factor of 1.7. Thus, models predicting group-averaged sickness incidence cannot be used to predict sickness at an individual level, whereas the proposed combined model approach predicts individual motion sickness levels and can thus be used to control sickness.
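    The accumulation stage referred to above is, in much of this literature, a low-pass filtering of the sensory-conflict signal. A minimal discrete-time sketch using only the two individualized parameters named in the abstract (gain K1 and time constant T1) could look as follows; the first-order structure, the conflict input, and the parameter values are assumptions for illustration, not the authors' model.

```python
import numpy as np

def motion_sickness_accumulation(conflict, dt, K1, T1):
    """Leaky-integrator accumulation of a sensory-conflict signal.

    Illustrative only: a first-order filter with individualized gain K1
    and time constant T1; the actual accumulation model may differ.
    """
    msi = np.zeros_like(conflict)
    for k in range(1, len(conflict)):
        # dMSI/dt = (K1 * conflict - MSI) / T1, forward-Euler discretization
        msi[k] = msi[k - 1] + dt * (K1 * conflict[k - 1] - msi[k - 1]) / T1
    return msi

# Hypothetical usage: 30 minutes of a constant-magnitude conflict signal.
dt = 1.0                                   # s
t = np.arange(0.0, 1800.0, dt)
conflict = np.ones_like(t)                 # placeholder conflict magnitude
sickness = motion_sickness_accumulation(conflict, dt, K1=8.0, T1=600.0)
print(sickness[-1])                        # accumulated sickness after 30 min
```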

    Impact of whole-body vibrations on electrovibration perception varies with target stimulus duration

    This study explores the impact of whole-body vibrations induced by external vehicle perturbations, such as aircraft turbulence, on the perception of electrovibration displayed on touchscreens. Electrovibration holds promise as a technology for providing tactile feedback on future touchscreens, addressing usability challenges in vehicle cockpits. However, its performance under dynamic conditions, such as during whole-body vibrations induced by turbulence, still needs to be explored. We measured the absolute detection thresholds of 15 human participants for short- and long-duration electrovibration stimuli displayed on a touchscreen, both in the absence and in the presence of two types of turbulence motion generated by a motion simulator. Concurrently, we measured participants' applied contact force and finger scan speeds. Significantly higher (38%) absolute detection thresholds were observed for short electrovibration stimuli than for long stimuli. Finger scan speeds in the direction of turbulence, applied forces, and force fluctuation rates increased during whole-body vibrations due to biodynamic feedthrough. As a result, turbulence also significantly increased the perception thresholds, but only for short-duration electrovibration stimuli. The results reveal that whole-body vibrations can impede the perception of short-duration electrovibration stimuli due to involuntary finger movements and increased normal force fluctuations. Our findings offer valuable insights for the future design of touchscreens with tactile feedback in vehicle cockpits.

    Personalizing motion sickness models: estimation and statistical modeling of individual-specific parameters

    As users transition from drivers to passengers in automated vehicles, they often take their eyes off the road to engage in non-driving activities. In driving simulators, visual motion is presented with scaled physical motion or without physical motion, leading to a mismatch between expected and perceived motion. Both conditions elicit motion sickness, calling for enhanced vehicle and simulator motion control strategies. Given the large differences in sickness susceptibility between individuals, effective countermeasures must address sickness at a personal level. This paper combines a group-averaged sensory conflict model with an individualized Accumulation Model (AM) to capture individual differences in motion sickness susceptibility across various conditions. The feasibility of this framework is verified using three datasets involving sickening conditions: (1) vehicle experiments with and without outside vision, (2) corresponding vehicle and driving simulator experiments, and (3) vehicle experiments with various non-driving-related tasks. All datasets involve passive motion, mirroring the experience of being driven in an automated vehicle. The preferred model (AM2) can fit individual motion sickness responses across conditions using only two individualized parameters (gain K1 and time constant T1) instead of the original five, ensuring unique parameters for each participant and generalisability across conditions. An average improvement factor of 1.7 in fitting individual motion sickness responses is achieved with the AM2 model compared to the group-averaged AM0 model. The framework demonstrates robustness by accurately modeling distinct motion and vision conditions. A Gaussian mixture model of the parameter distribution across the population is developed, which predicts motion sickness in an unseen dataset with an average RMSE of 0.47. This model reduces the need for large-scale population experiments, accelerating research and development.
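    The Gaussian mixture model mentioned at the end describes the joint distribution of the individualized parameters (K1, T1) over the population. Assuming, purely for illustration, that such a model is fitted with scikit-learn's GaussianMixture on per-participant parameter estimates, a sketch of fitting and sampling could look like this; the data, log-transform, and component count are placeholders, not the authors' choices.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical per-participant parameters fitted beforehand: columns are
# log(K1) and log(T1); the log-transform keeps both parameters positive
# when sampling new virtual individuals.
rng = np.random.default_rng(1)
params = np.column_stack([
    rng.normal(np.log(8.0), 0.5, size=40),     # log K1, placeholder values
    rng.normal(np.log(600.0), 0.4, size=40),   # log T1, placeholder values
])

# Fit a Gaussian mixture over the population parameter distribution.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(params)

# Sample virtual individuals, e.g. to predict the spread of sickness
# responses in an unseen condition without a new large-scale experiment.
samples, _ = gmm.sample(1000)
K1_samples, T1_samples = np.exp(samples[:, 0]), np.exp(samples[:, 1])
print(K1_samples.mean(), T1_samples.mean())
```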

    Multimodal Pilot Behavior in Multi-Axis Tracking Tasks with Time-Varying Motion Cueing Gains
