A Pilot Study on Facial Expression Recognition Ability of Autistic Children Using Ryan, a Rear-Projected Humanoid Robot
Rear-projected robots use computer graphics technology to create facial animations and project them onto a mask to display the robot's facial cues and expressions. These types of robots are becoming commercially available, though more research is required to understand how they can be used effectively as socially assistive robotic agents. This paper presents the results of a pilot study comparing the facial expression recognition abilities of children with Autism Spectrum Disorder (ASD) and typically developing (TD) children using a rear-projected humanoid robot called Ryan. Six children with ASD and six TD children participated in this research, in which Ryan showed them six basic expressions (i.e., anger, disgust, fear, happiness, sadness, and surprise) at different intensity levels. Participants were asked to identify the expressions portrayed by Ryan. The results of our study show no general impairment in the expression recognition ability of the ASD group compared to the TD control group; however, both groups showed deficiencies in identifying disgust and fear. Increasing the intensity of Ryan's facial expressions significantly improved expression recognition accuracy. Both groups recognized the expressions demonstrated by Ryan with high average accuracy.
Simultaneous shape repulsion and global assimilation in the perception of aspect ratio
Although local interactions involving orientation and spatial frequency are well understood, less is known about spatial interactions involving higher-level pattern features. We examined interactive coding of aspect ratio, a prevalent two-dimensional feature. We measured perception of two simultaneously flashed ellipses by randomly post-cueing one of them and having observers indicate its aspect ratio. Aspect ratios interacted in two ways. One manifested as an aspect-ratio repulsion effect. For example, when a slightly tall ellipse and a taller ellipse were simultaneously flashed, the less tall ellipse appeared flatter and the taller ellipse appeared even taller. This repulsive interaction was long range, occurring even when the ellipses were presented in different visual hemifields. The other interaction manifested as a global assimilation effect. An ellipse appeared taller when it was part of a global vertical organization than when it was part of a global horizontal organization. The repulsion and assimilation effects dissociated over time: the former slightly strengthened, and the latter disappeared, when the ellipse-to-mask stimulus onset asynchrony was increased from 40 to 140 ms. These results are consistent with the idea that shape perception emerges from rapid lateral and hierarchical neural interactions.
Sensitive perception of a person’s direction of walking by 4-year-old children.
Watch any crowded intersection, and you will see how adept people are at reading the subtle movements of one another. While adults can readily discriminate small differences in the direction of a moving person, it is unclear whether this sensitivity is in place early in development. Here, we present evidence that 4-year-old children are sensitive to small differences in a person's direction of walking (∼7°), far beyond what has been previously shown. This sensitivity occurred only for perception of an upright walker, consistent with the recruitment of high-level visual areas. Even at 4 years of age, children's sensitivity approached that of adults. This suggests that the sophisticated mechanisms adults use to perceive a person's direction of movement are in place and developing early in childhood. Although the neural mechanisms for perceiving biological motion develop slowly, they are refined enough by age 4 to support subtle perceptual judgments of heading. These judgments may be useful for predicting a person's future location, or even their intentions and goals.
Perceiving Crowd Attention
In nearly every interpersonal encounter, people readily gather socio-visual cues to guide their behavior. Intriguingly, social information is most effective in directing behavior when it is perceived in crowds. For example, the shared gaze of a crowd is more likely to direct attention than is a single person's gaze. Are people equipped with mechanisms to perceive a crowd's gaze as an ensemble? Here, we provide the first evidence that the visual system extracts a summary representation of a crowd's attention; observers rapidly pooled information from multiple crowd members to perceive the direction of a group's collective gaze. This pooling occurred in high-level stages of visual processing, with gaze perceived as a global-level combination of information from head and pupil rotation. These findings reveal an important and efficient mechanism for assessing crowd gaze, which could underlie the ability to perceive group intentions, orchestrate joint attention, and guide behavior.
The center of attention: Metamers, sensitivity, and bias in the emergent perception of gaze.
A person's gaze reveals much about their focus of attention and intentions. Sensitive perception of gaze is thus highly relevant for social interaction, especially when gaze is directed toward the viewer. Yet observers also tend to overestimate the likelihood that gaze is directed toward them. How might the visual system balance these competing goals, maximizing sensitivity for discriminating gazes that are relatively direct while at the same time allowing many gazes to appear as if they look toward the viewer? Perceiving gaze is an emergent visual process that involves integrating information from the eyes with the rotation of the head. Here, we examined whether the visual system leverages emergent representation to balance these competing goals. We measured perceived gaze for a large range of pupil and head combinations and found that head rotation has a nonlinear influence on a person's apparent direction of looking, especially when pupil rotations are relatively direct. These perceptual distortions could serve to expand representational space and thereby enhance discriminability of gazes that are relatively direct. We also found that the emergent perception of gaze supports an abundance of direct gaze metamers: different combinations of head and pupil rotations that generate the appearance of gaze directed toward the observer. Our results thus demonstrate a way in which the visual system flexibly integrates information from facial features to optimize social perception. Many gazes can be made to look toward you, yet similar gazes need not appear alike.
