Search Results

(Total results 6)

Search Report

  • 1. Ayyalasomayajula, Meghana Image Emotion Analysis: Facial Expressions vs. Perceived Expressions

    Master of Computer Science (M.C.S.), University of Dayton, 2022, Computer Science

    A picture is worth a thousand words. A single image has the power to influence individuals and change their behaviour in a way that a single word does not. Even a barely visible image, displayed on a screen for only a few milliseconds, appears capable of changing one's behaviour. In this thesis, we experimentally investigated the relationship between facial expressions and perceived emotions. To this end, we built two datasets: an image dataset for image emotion analysis and a face dataset for expression recognition. During annotation of the image dataset, both facial expressions and perceived emotions were recorded via a mobile application. We then used a classifier trained on the face dataset to recognize the user's expression and compared it with the perceived emotion.

    Committee: Tam Nguyen (Advisor) Subjects: Computer Science
  • 2. Liu, Xiao AUTOMATED FACIAL EMOTION RECOGNITION: DEVELOPMENT AND APPLICATION TO HUMAN-ROBOT INTERACTION

    Master of Sciences, Case Western Reserve University, 0, EMC - Mechanical Engineering

    This thesis presents two image processing algorithms for facial emotion recognition (FER). The first method uses two pre-processing filters (pre-filters), i.e., a brightness and contrast filter and an edge extraction filter, combined with a Convolutional Neural Network (CNN) and a Support Vector Machine (SVM). By using optimal pre-filter parameters in the pre-processing of the training images, the classification of FER reached 98.19% accuracy using a CNN with 3,500 epochs on 3,589 face images from the FER2013 dataset. The second approach introduces two geometrical facial features based on action units -- landmark curvatures and vectorized landmarks. This method first detects facial landmarks and extracts action unit (AU) features. The extracted facial segments based on the action units are classified into five groups and input to an SVM. The presented method shows how individual parameters, including detected landmarks, AU group selection, and parameters used in the SVM, can be examined and systematically selected for optimal performance in FER. After parameter optimization, the results showed 98.38% test accuracy when training on 1,479 labeled frames of the Cohn-Kanade (CK+) database, and 98.11% test accuracy when training on 1,710 labeled frames of the Multimedia Understanding Group (MUG) database, for 6-emotion classification. This technique also achieves a real-time processing speed of 6.67 frames per second (fps) for images with a 640x480 resolution. The novelty of the first approach is combining image processing filters with a CNN to enhance CNN performance. The second approach systematically analyzes the effectiveness of the proposed geometric features and implements FER in real time. The demonstrated algorithms have been applied to a human-robot interaction (HRI) platform, the social robot "Woody", for testing. The presented algorithms have been made publicly available.

    Committee: Kiju Lee (Committee Chair); Kathryn Daltorio (Committee Member); Frank Merat (Committee Member) Subjects: Computer Science; Mechanical Engineering; Robotics
  • 3. Copps, Emily Interpersonal Functions of Non-Suicidal Self-Injury and Their Relationship to Facial Emotion Recognition and Social Problem-Solving

    Doctor of Psychology (Psy.D.), Xavier University, 2019, Psychology

    Non-suicidal self-injury (NSSI) is a growing area of concern in both clinical and non-clinical populations. Understanding the motivations for engaging in this behavior as well as the characteristics of individuals who engage in NSSI are crucial for developing maximally effective interventions. Previous research has indicated that while nearly all self-injurers report doing so as a way of regulating emotions, a slightly smaller proportion (approximately 85%) of individuals who engage in NSSI report doing so for interpersonal reasons – for example, as a way of communicating with others (Turner, Chapman, & Layden, 2012). The current study sought to examine characteristics of individuals who endorse interpersonal functions of self-injury in comparison to self-injurers who do not endorse interpersonal functions of self-injury and to non-self-injuring control participants. It was hypothesized that individuals who endorsed interpersonal NSSI would have greater deficits in social problem-solving and facial emotion recognition compared to self-injurers who do not endorse interpersonal NSSI and to control participants. There were no significant differences between the three groups on facial emotion recognition abilities. A one-way MANOVA indicated that both groups of self-injuring participants had poorer social problem-solving abilities compared to control participants. It may be that individuals with NSSI utilize self-injury as a coping mechanism to the detriment of more effective social problem-solving strategies.

    Committee: Nicholas Salsman Ph.D. (Advisor) Subjects: Clinical Psychology; Mental Health; Psychology
  • 4. Mehling, Margaret Differential Impact of Drama-Based versus Traditional Social Skills Intervention on the Brain-Basis and Behavioral Expression of Social Communication Skills in Children with Autism Spectrum Disorder

    Doctor of Philosophy, The Ohio State University, 2017, Psychology

    This study examines the differential impact of a traditional social skills curriculum, SkillStreaming, and a novel, drama-based social skills intervention, the Hunter Heartbeat Method (HHM), on the core social skills deficits associated with autism spectrum disorder (ASD) and the brain-basis of these deficits. Forty children aged 8-14 years with ASD were recruited to participate in a 12-week social skills intervention. Participants were randomly assigned to receive drama-based or traditional social skills intervention once weekly. Previous research on SkillStreaming and HHM in children with ASD had reported improvement in behavioral measures of social functioning from pre- to post-intervention. Despite evidence of treatment response for both interventions, clear differences between these two types of social skills intervention exist. SkillStreaming is a highly structured, curriculum-based intervention during which specific social skills, such as making a friend or having a conversation, are explicitly taught via didactic instruction, modeling, and rehearsal. Conversely, in the HHM, although core elements of modeling and repeated practice with feedback are used, no "skills" are taught; rather, children learn drama-games that implicitly target core deficits associated with ASD (e.g., eye contact, facial emotion recognition, integration of speech and gesture). Research on drama-based interventions for children with ASD is an emerging literature; previous research has indicated that this treatment is well liked by children and, like traditional treatments, is associated with measurable improvement from pre- to post-intervention. Little is known, however, about how differences in treatment modality (didactic versus experiential) and skills taught (higher-level versus foundational) impact skill acquisition and generalization.
It is possible that these key differences impact the neurological substrate of social learning, which may have downstream consequences for skill (open full item for complete abstract)

    Committee: Marc Tassé PhD (Advisor); Zhong-Lin Lu PhD (Committee Member); Luc Lecavalier PhD (Committee Member) Subjects: Psychology
  • 5. Getz, Glen FACIAL AFFECT RECOGNITION DEFICITS IN BIPOLAR DISORDER

    MA, University of Cincinnati, 2001, Arts and Sciences : Psychology

    Patients diagnosed with bipolar disorder (BPD), by definition, have problems with emotional regulation. However, it remains uncertain whether these patients are also deficient at processing other people's emotions, particularly while in the manic state. The present study examined the ability of 25 manic patients and 25 healthy participants on tasks of facial recognition and facial affect recognition at three different presentation durations: 500ms, 750ms, and 1000ms. The groups did not differ in terms of age, education, sex, race, or estimated IQ. In terms of facial recognition, the groups did not differ significantly on either a novel computerized facial recognition task or the Benton Facial Recognition task. In contrast, the BPD group performed significantly more poorly than the comparison group on a novel facial affect discrimination task and a novel facial affect labeling task at the 500ms presentation duration. Facial affect processing was not impaired at longer presentation durations. Further, the patient group was slower on all three computerized tasks. This study indicates that patients with BPD may need more time to examine facial affect, but exhibit a normal ability to recognize faces.

    Committee: Stephen Strakowski (Advisor) Subjects: Psychology, General
  • 6. Linardatos, Eftihia FACIAL EMOTION RECOGNITION IN GENERALIZED ANXIETY DISORDER AND DEPRESSION: ASSESSING FOR UNIQUE AND COMMON RESPONSES TO EMOTIONS AND NEUTRALITY

    PHD, Kent State University, 2011, College of Arts and Sciences / Department of Psychological Sciences

    Facial emotion recognition has a central role in human communication. Generalized anxiety disorder (GAD) and major depressive disorder (MDD) have been associated with deficits in social and interpersonal functioning, raising the question of whether these conditions are also associated with deficits in facial emotion recognition. In addition to being associated with interpersonal difficulties, GAD and MDD overlap substantially at the genotypic and phenotypic level. However, these mental health conditions differ at the cognitive level in that GAD is associated with thoughts revolving around threatening information, whereas thoughts in depression are related to loss, failure, and sadness. These unique cognitive mechanisms may also play a role in the process of facial emotion recognition, resulting in differential patterns of responses to facial expressions of emotions for GAD and depression. Although facial emotion recognition has been investigated in MDD, no studies to date have examined this process in GAD. The goals of the present study were threefold: 1) Examine the overall accuracy of facial emotion recognition as well as that for specific emotions in GAD, MDD, and comorbid MDD+GAD, 2) Examine misattributions in facial expression recognition in response to anger, sadness, and neutral expressions in GAD, MDD, and comorbid MDD+GAD, and 3) Investigate the relationship of facial emotion recognition and interpersonal functioning in the context of GAD and MDD. A sample of 90 participants with GAD, MDD, comorbid MDD+GAD, and healthy controls completed a facial emotion recognition task and a battery of self-report measures. The findings did not support a general or specific deficit in facial emotion recognition in MDD and GAD. Further, individuals with MDD and GAD did not differ in their responses to neutral facial expressions or to other basic emotions. The findings are discussed in the context of future clinical and research directions.

    Committee: David Fresco PhD (Advisor); John Gunstad PhD (Committee Member); John Updegraff PhD (Committee Member); David Hussey PhD (Committee Member); William Kalkhoff PhD (Committee Member)
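
The pre-processing pipeline described in result 2 (a brightness/contrast filter followed by an edge-extraction filter, applied before a CNN or SVM classifier) could be sketched roughly as follows. This is an illustrative assumption, not the thesis's actual implementation: the function names, the specific filter formulas, and the parameter values (alpha, beta) are all hypothetical.

```python
# Hypothetical sketch of the two pre-filters from result 2, operating on a
# grayscale image given as a list of rows of 0-255 pixel values.

def brightness_contrast(img, alpha=1.2, beta=10):
    """Scale contrast by alpha, shift brightness by beta, clamp to [0, 255].
    alpha and beta stand in for the 'optimal pre-filter parameters' the
    thesis tunes; the values here are arbitrary."""
    return [[min(255, max(0, int(alpha * p + beta))) for p in row]
            for row in img]

def edge_extract(img):
    """Crude gradient-magnitude edge filter using forward differences
    (a stand-in for whatever edge filter the thesis actually uses)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = img[y][x + 1] - img[y][x]   # horizontal gradient
            gy = img[y + 1][x] - img[y][x]   # vertical gradient
            out[y][x] = min(255, abs(gx) + abs(gy))
    return out

def preprocess(img):
    """Apply both pre-filters in sequence; the result would then be fed
    to the CNN/SVM classifier."""
    return edge_extract(brightness_contrast(img))
```

The point of the sketch is only the ordering: pixel-level normalization first, then edge emphasis, so the downstream classifier trains on images with consistent intensity statistics and accentuated facial contours.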