Doctor of Philosophy, The Ohio State University, 2007, Biomedical Engineering
Faces provide a wide range of information about a person's identity, race, sex, age, and emotional state. The perception of facial expressions of emotion is generally assumed to correspond to underlying muscle movement. However, the current work demonstrates that the static configuration of facial components also biases the perception of the face, and that configural deviations from the population average influence the attribution of emotion to neutral faces. Specifically, it is shown that changes in the relative position of the nose, mouth, eyes, and eyebrows affect the perceived emotional expression of an otherwise neutral face. Several experiments are presented in which subjects viewed pairs of face images whose configural relationship among the eyes, mouth, nose, and eyebrows had been modified, and responded by key-press to indicate whether they perceived a difference in emotional expression between the paired images. Results consistently showed that increasing the distance between the eyes and mouth increases the perceived sadness of a face, while decreasing that distance increases its perceived anger. These perceptions occur in the absence of any change in the underlying facial musculature. The experiments confirm that this perceptual bias is mediated by configural, rather than featural, processing. The results also support the contention that the center of the psychological face space for emotional expression represents the average of the faces most frequently experienced by the individual. A computational model is derived to simulate human perception of emotion in neutral faces.
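The abstract does not specify the form of the computational model. Below is a minimal norm-based sketch in Python, assuming perceived emotion scales with the signed deviation of the eye-to-mouth distance from the population average, consistent with the reported findings; the function name, the normalized units, and the linear gain are illustrative assumptions, not the dissertation's actual model.

```python
# Hypothetical norm-based sketch of the configural bias described in the
# abstract: an eye-to-mouth distance above the population average biases
# perception toward sadness, below it toward anger. All constants and the
# linear form are assumptions for illustration.

POPULATION_MEAN_DISTANCE = 1.0  # average eye-mouth distance (normalized units)

def perceived_emotion(eye_mouth_distance: float, gain: float = 1.0):
    """Map a neutral face's eye-mouth distance to a perceived emotion label
    and a strength proportional to its deviation from the population norm."""
    deviation = eye_mouth_distance - POPULATION_MEAN_DISTANCE
    strength = gain * abs(deviation)
    if deviation > 0:
        return "sad", strength
    if deviation < 0:
        return "angry", strength
    return "neutral", 0.0

# Example: eyes and mouth 15% farther apart than average reads as slightly
# sad; 15% closer reads as slightly angry.
print(perceived_emotion(1.15))  # ('sad', 0.15...)
print(perceived_emotion(0.85))  # ('angry', 0.15...)
```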
Committee: Aleix Martinez (Advisor)