Blended Emotions can be Accurately Recognized from Dynamic Facial and Vocal Expressions
Stockholm University, Faculty of Social Sciences, Department of Psychology, Perception and psychophysics.
Stockholm University, Faculty of Social Sciences, Department of Psychology.
Stockholm University, Faculty of Social Sciences, Department of Psychology, Perception and psychophysics. ORCID iD: 0000-0001-8771-6818
Number of Authors: 3
2023 (English) In: Journal of Nonverbal Behavior, ISSN 0191-5886, Vol. 47, no 3, p. 267-284
Article in journal (Refereed) Published
Abstract [en]

People frequently report feeling more than one emotion at the same time (i.e., blended emotions), but studies on nonverbal communication of such complex states remain scarce. Actors (N = 18) expressed blended emotions consisting of all pairwise combinations of anger, disgust, fear, happiness, and sadness – using facial gestures, body movement, and vocal sounds – with the intention that both emotions should be equally prominent in the resulting expression. Accuracy of blended emotion recognition was assessed in two preregistered studies using a combined forced-choice and rating scale task. For each recording, participants were instructed to choose two scales (out of 5 available scales: anger, disgust, fear, happiness, and sadness) that best described their perception of the emotional content and judge how clearly each of the two chosen emotions was perceived. Study 1 (N = 38) showed that all emotion combinations were accurately recognized from multimodal (facial/bodily/vocal) expressions, with significantly higher ratings on scales corresponding to intended vs. non-intended emotions. Study 2 (N = 51) showed that all emotion combinations were also accurately perceived when the recordings were presented in unimodal visual (facial/bodily) and auditory (vocal) conditions, although accuracy was lower in the auditory condition. To summarize, results suggest that blended emotions, including combinations of both same-valence and other-valence emotions, can be accurately recognized from dynamic facial/bodily and vocal expressions. The validated recordings of blended emotion expressions are freely available for research purposes.

Place, publisher, year, edition, pages
Springer Nature, 2023. Vol. 47, no 3, p. 267-284
Keywords [en]
blended emotions, compound emotions, facial expression, mixed emotions, multimodal expression, non-linguistic vocalizations
National Category
Psychology (excluding Applied Psychology)
Research subject
Psychology
Identifiers
URN: urn:nbn:se:su:diva-220275
DOI: 10.1007/s10919-023-00426-9
ISI: 000988996500001
Scopus ID: 2-s2.0-85159703037
OAI: oai:DiVA.org:su-220275
DiVA, id: diva2:1789911
Note

This research was supported by the Marianne and Marcus Wallenberg Foundation (MMW 2018.0059). Open access funding provided by Stockholm University.

Available from: 2023-08-21. Created: 2023-08-21. Last updated: 2025-06-24. Bibliographically approved.
In thesis
1. Understanding the Recognition of Dynamic Multimodal Expressions of Single and Blended Emotions
2025 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Nonverbal emotion expressions are multimodal patterns of behavior that unfold dynamically. However, several key questions remain about the nature of dynamic multimodal emotion expressions. This thesis aimed to better understand how such nonverbal expressions—both single and blended emotions—are recognized.

Study I investigated individual differences in the ability to recognize single emotions in the general population using the ERAM test. Study 1 focused on emotion recognition in relation to emotional competencies, personality, and socio-emotional dysfunction, and evaluated the psychometric properties of the ERAM test. Study 2 examined emotion recognition in relation to metacognitive judgments using an online version of the ERAM test. Overall, results suggested that emotion recognition is related to empathy, emotion understanding, openness, and alexithymia, as well as to metacognitive skills. The results further revealed that accuracy was highest in the multimodal condition and positively correlated across modalities. Lastly, no differences were found between the in-lab and online versions of the test, indicating that the ERAM can be reliably administered online.

Study II investigated the ability to recognize blended emotions (pairwise combinations of 5 emotions). Recordings of actors displaying two emotions in equal proportions (50:50 expressions) were used. The test combined a rating scale and a forced-choice task, and participants were instructed to choose 2 out of 5 available scales. Study 1 examined multimodal emotion recognition, whereas Study 2 examined emotion recognition in unimodal conditions (video-only and audio-only). Recognition accuracy was highest when expressions were presented multimodally, and higher in the video-only than in the audio-only condition, consistent with the findings on single emotions in Study I. Both studies further showed that all combinations were recognized above chance level, regardless of presentation modality.

Study III investigated the recognition of blended emotions with varying proportions (70:30, 50:50, 30:70 expressions), using recordings from the 6 best-recognized actors in Study II. Study 1 examined emotion recognition using a restricted response format (prompted to use 2 out of 5 scales, as in Study II), while Study 2 used an unrestricted version of the same test (free to choose any number of the 5 scales). The results showed that the majority of blends across all proportions were recognized above chance level in both formats (restricted vs. unrestricted). Results further revealed that the more prominent emotion received higher ratings than the less prominent one in most combinations. These findings replicated and extended those in Study II, and suggest that both the quality and quantity of emotions can be recognized.

Together, the results showed that, in the general population, individual differences in emotion recognition of single emotions are related to broader affective, personality, and metacognitive processes. They also revealed that recognition accuracy of both single and blended emotions is highest when dynamic multimodal expressions are presented multimodally. This thesis contributes to a growing body of work that underscores the significance of studying emotions conveyed dynamically through the face, voice, and body, and emphasizes the need to increase the number and complexity of emotions under study, as emotion recognition ability appears to be more nuanced and flexible than previously thought.

Place, publisher, year, edition, pages
Stockholm: Department of Psychology, Stockholm University, 2025. p. 71
Keywords
blended emotions, emotion recognition ability, facial expression, individual difference, multimodal expression, non-linguistic vocalizations, single emotions
National Category
Psychology
Research subject
Psychology
Identifiers
urn:nbn:se:su:diva-244601 (URN)
978-91-8107-314-0 (ISBN)
978-91-8107-315-7 (ISBN)
Public defence
2025-09-12, 13:00, Lärosal 19, Albano, hus 2, Albanovägen 18, Stockholm, and online via Zoom (public link available at the department website) (English)
Available from: 2025-08-20. Created: 2025-06-24. Last updated: 2025-09-04. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Israelsson, Alexandra; Laukka, Petri
