Decisional-Emotional Support System for a Synthetic Agent: Influence of Emotions in Decision-Making Toward the Participation of Automata in Society
Stockholm University, Faculty of Social Sciences, Department of Computer and Systems Sciences. (DECIDE research group meeting)
2015 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Emotion influences our actions, which means that emotion has subjective decision value. The emotions of those affected by decisions, properly interpreted and understood, provide feedback on actions and, as such, serve as a basis for decisions. Accordingly, "affective computing" represents a wide range of technological opportunities for incorporating emotions to improve human-computer interaction, including insights from across the computational sciences into how we can design computer systems that communicate with humans and recognize the emotional states they express. Today, emotional systems such as software-only agents and embodied robots seem to improve every day at managing large volumes of information, yet they remain emotionally incapable of reading our feelings and reacting to them. From a computational viewpoint, technology has made significant steps toward determining how an emotional behavior model could be built; such a model is intended to be used for intelligent assistance and support to humans. Human emotions are engines that allow people to generate useful responses to the current situation, taking into account the emotional states of others. Recovering the emotional cues emanating from the natural behavior of humans, such as facial expressions and bodily kinetics, could help to develop systems that recognize, interpret, process, and simulate human emotions and base decisions on them.

Currently, there is a need to create emotional systems able to develop an emotional bond with users, reacting emotionally to the situations they encounter and assisting users to make their daily lives easier. Handling emotions and their influence on decisions can improve human-machine communication with a wider vision. The present thesis strives to provide an emotional architecture applicable to an agent, based on a group of decision-making models influenced by external emotional information provided by humans and acquired through a group of classification techniques from machine learning algorithms. The system can form positive bonds with the people it encounters when proceeding according to their emotional behavior. The agent embodied in the emotional architecture will interact with a user, facilitating its adoption in application areas such as caregiving, to provide emotional support to the elderly.

The agent's architecture uses an adversarial structure based on an Adversarial Risk Analysis framework with a decision-analytic flavor that includes models forecasting a human's behavior and its impact on the surrounding environment. The agent perceives its environment and the actions performed by an individual, which constitute the resources needed to execute the agent's decision during the interaction. The agent's decision, carried out through the adversarial structure, is also affected by information about emotional states provided by a classifier-ensemble system, giving rise to a "decision with emotional connotation" belonging to the group of affective decisions. The performance of different well-known classifiers was compared in order to select the best result and build the ensemble system, based on feature selection methods introduced to predict the emotion. These methods draw on facial expressions, bodily gestures, and speech, and achieved satisfactory accuracy well before their integration into the final system.
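To make the coupling between the classifier-ensemble output and the adversarial decision concrete, the following minimal sketch (in Python, with invented action names, utilities, and probabilities; it is not the thesis implementation) shows how emotion probabilities could modulate an expected-utility choice of the kind described above.

```python
import numpy as np

# Hypothetical illustration (not the thesis implementation): the agent picks
# the action with the highest expected utility, where the utility of each
# (action, user-reaction) pair is modulated by the emotion probabilities
# produced by an emotion-classifier ensemble.

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]
ACTIONS = ["approach_and_comfort", "offer_help", "stay_quiet"]
REACTIONS = ["accepts", "ignores", "rejects"]

# p_reaction[a][r]: forecast of the user's reaction r to agent action a
# (assumed numbers for illustration).
p_reaction = np.array([
    [0.6, 0.3, 0.1],
    [0.5, 0.4, 0.1],
    [0.2, 0.6, 0.2],
])

# utility[a][r]: baseline utility of each action/reaction pair (assumed).
utility = np.array([
    [1.0, 0.2, -0.5],
    [0.8, 0.1, -0.3],
    [0.3, 0.0, -0.1],
])

# emotion_weight[e][a]: how appropriate each action is for each detected
# emotion (assumed values; e.g. "stay_quiet" suits anger better).
emotion_weight = np.array([
    [0.3, 0.5, 1.0],   # anger
    [0.4, 0.6, 0.9],   # disgust
    [0.9, 0.8, 0.4],   # fear
    [1.0, 0.7, 0.5],   # happiness
    [1.0, 0.9, 0.3],   # sadness
    [0.8, 0.9, 0.6],   # surprise
])


def decide(p_emotion: np.ndarray) -> str:
    """Return the action with maximal emotion-weighted expected utility."""
    eu = (p_reaction * utility).sum(axis=1)       # expected utility per action
    modulation = p_emotion @ emotion_weight       # emotional appropriateness
    return ACTIONS[int(np.argmax(eu * modulation))]


if __name__ == "__main__":
    # Emotion probabilities as they might come from the classifier ensemble.
    p_emotion = np.array([0.05, 0.05, 0.10, 0.05, 0.70, 0.05])  # mostly sadness
    print(decide(p_emotion))  # -> "approach_and_comfort" under these numbers
```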

Place, publisher, year, edition, pages
Stockholm: Department of Computer and Systems Sciences, Stockholm University, 2015. 146 p.
Series
Report Series / Department of Computer & Systems Sciences, ISSN 1101-8526 ; 15-019
Keyword [en]
Affective Computing; Machine Learning; Adversarial Risk Analysis; Broaden and Build Theory; Facial Expression Recognition; Speech Emotion Recognition; Detection of Emotional Information; Emotional self-regulation
National Category
Human Computer Interaction
Research subject
Computer and Systems Sciences
Identifiers
URN: urn:nbn:se:su:diva-122084
ISBN: 978-91-7649-291-8 (print)
OAI: oai:DiVA.org:su-122084
DiVA: diva2:864043
Public defence
2015-12-14, room L70, NOD Building, Borgarfjordsgatan 12, Kista, 13:00 (English)
Note

At the time of the doctoral defense, the following paper was unpublished and had a status as follows: Paper 8: Accepted.

Available from: 2015-11-20 Created: 2015-10-23 Last updated: 2015-11-30 Bibliographically approved
List of papers
1. An Adversarial Risk Analysis Model for an Emotional Based Decision Agent
2011 (English) In: The 2nd International Workshop on Decision Making with Multiple Imperfect Decision Makers, Institute of Information Theory and Automation, 2011, 1-6 p. Conference paper, Published paper (Refereed)
Abstract [en]

We introduce a model that describes the decision making process of an autonomous synthetic agent which interacts with another agent and is influenced by affective mechanisms. This model would reproduce patterns similar to humans and regulate the behavior of agents providing them with some kind of emotional intelligence and improving interaction experience. We sketch the implementation of our model with an edutainment robot.

Place, publisher, year, edition, pages
Institute of Information Theory and Automation, 2011
Keyword
Adversarial Risk Analysis (ARA), Expected Utility, robotics
National Category
Human Computer Interaction
Research subject
Computer and Systems Sciences
Identifiers
urn:nbn:se:su:diva-122135 (URN)
978-80-903834-6-3 (ISBN)
Conference
The 25th Annual Conference on Neural Information Processing Systems (NIPS-11), December 16, 2011, Sierra Nevada, Spain
Available from: 2015-10-26 Created: 2015-10-26 Last updated: 2015-11-02 Bibliographically approved
2. An Adversarial Risk Analysis Model for an Autonomous Imperfect Decision Agent
2013 (English) In: Decision Making and Imperfection / [ed] Tatiana V. Guy, Miroslav Karny, David Wolpert, Springer Berlin/Heidelberg, 2013, 163-187 p. Chapter in book (Refereed)
Abstract [en]

Machines that perform intelligent tasks interacting with humans in a seamless manner are becoming a reality. A key element in their design is their ability to make decisions based on a reasonable value system, and the perception of the surrounding environment, including the incumbent persons. In this chapter, we provide a model that supports the decision making process of an autonomous agent that imperfectly perceives its environment and the actions performed by a person, which we shall designate user. The approach has a decision analytic flavour, but includes models forecasting the user’s behaviour and its impact over the surrounding environment. We describe the implementation of the model with an edutainment robot with sensors that capture information about the world around it, which may serve as a cognitive personal assistant, may be used with kids for educational, recreational and therapeutic purposes and with elderly people for companion purposes.
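As an illustration of the imperfect-perception ingredient, the sketch below (hypothetical states, observations, and numbers; not the chapter's model) shows an agent that first updates a belief over the user's true state from a noisy observation and then maximizes expected utility under that belief.

```python
import numpy as np

# Illustrative sketch only (assumed numbers): the agent imperfectly perceives
# the user's true state, so it updates a belief from a noisy observation via
# Bayes' rule and then maximizes expected utility under that belief.

STATES = ["user_needs_help", "user_is_fine"]
OBSERVATIONS = ["raised_hand", "no_gesture"]
ACTIONS = ["assist", "wait"]

prior = np.array([0.3, 0.7])                 # p(s)
# likelihood[s][y]: p(observation y | true state s) -- the imperfect sensor.
likelihood = np.array([
    [0.8, 0.2],
    [0.1, 0.9],
])
# utility[a][s]: payoff of taking action a when the true state is s.
utility = np.array([
    [1.0, -0.2],   # assist
    [-1.0, 0.5],   # wait
])


def posterior(obs: str) -> np.ndarray:
    """Bayes update of the belief over user states given one observation."""
    y = OBSERVATIONS.index(obs)
    unnorm = prior * likelihood[:, y]
    return unnorm / unnorm.sum()


def decide(obs: str) -> str:
    """Pick the action maximizing expected utility under the updated belief."""
    belief = posterior(obs)
    expected = utility @ belief
    return ACTIONS[int(np.argmax(expected))]


if __name__ == "__main__":
    print(decide("raised_hand"))  # belief tilts toward "needs help" -> "assist"
    print(decide("no_gesture"))   # -> "wait" under these assumed numbers
```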

Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2013
Series
Studies in Computational Intelligence, ISSN 1860-949X ; 474
Keyword
Adversarial Risk Analysis (ARA), Game theory, Expected Utility, Maslow’s hierarchy of needs, robotics, multi-attribute utility function
National Category
Computer Science
Research subject
Computer and Systems Sciences
Identifiers
urn:nbn:se:su:diva-122136 (URN)
10.1007/978-3-642-36406-8_6 (DOI)
978-3-642-36405-1 (ISBN)
978-3-642-36406-8 (ISBN)
Available from: 2015-10-26 Created: 2015-10-26 Last updated: 2015-11-02 Bibliographically approved
3. An Adversarial Risk Analysis Model for a Decision Agent facing Multiple Users
2012 (English) In: 2012 3rd International Workshop on Cognitive Information Processing (CIP), IEEE Computer Society, 2012, 1-6 p. Conference paper, Published paper (Refereed)
Abstract [en]

We provide a model supporting the decision making process of an autonomous synthetic agent which interacts with several users. The approach is decision analytic and incorporates models forecasting the users' behavior. We sketch the implementation of our model with an edutainment robot.

Place, publisher, year, edition, pages
IEEE Computer Society, 2012
National Category
Computer Science
Research subject
Computer and Systems Sciences
Identifiers
urn:nbn:se:su:diva-122137 (URN)
10.1109/CIP.2012.6232915 (DOI)
978-1-4673-1877-8 (ISBN)
Conference
2012 3rd International Workshop on Cognitive Information Processing (CIP), 28-30 May 2012, Baiona, Spain
Available from: 2015-10-26 Created: 2015-10-26 Last updated: 2015-11-02 Bibliographically approved
4. Automatic Emotion Recognition through Facial Expression Analysis in Merged Images Based on an Artificial Neural Network
2013 (English) In: 2013 12th Mexican International Conference on Artificial Intelligence (MICAI): Proceedings, IEEE Computer Society, 2013, 85-96 p. Conference paper, Published paper (Refereed)
Abstract [en]

This paper focuses on a system for recognizing a human's emotion from a detected face. The analyzed information, conveyed by the eye and mouth regions across various facial expressions pertaining to the six universal basic facial emotions, is combined into a new merged image. The output information obtained could be fed as input to a machine capable of interacting with social skills, in the context of building socially intelligent systems. The methodology classifies information from a new fused image composed of two blocks, the eye and mouth areas, which are very sensitive to changes in human expression and particularly relevant for decoding emotional expressions. Finally, we use the merged image as input to a feed-forward neural network trained by back-propagation. Such analysis of merged images makes it possible to obtain relevant information by combining the proper data in the same image and to reduce training time while preserving the classification rate. Experimental results show that the proposed algorithm can detect emotion with good accuracy.
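A minimal sketch of the merge-then-classify idea follows, using random stand-in data and scikit-learn's MLPClassifier as a substitute for the paper's back-propagation network; the shapes, crop sizes, and data are assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Sketch of the general idea (assumed shapes, random stand-in data): crop the
# eye and mouth regions of each face, stack them into one merged image,
# flatten it, and train a feed-forward network.

rng = np.random.default_rng(0)
N_SAMPLES, H, W = 300, 24, 48          # assumed crop size for each region
N_EMOTIONS = 6                          # the six universal basic emotions

# Stand-in data: in practice these crops would come from a face detector.
eye_regions = rng.random((N_SAMPLES, H, W))
mouth_regions = rng.random((N_SAMPLES, H, W))
labels = rng.integers(0, N_EMOTIONS, size=N_SAMPLES)

# Merge the two regions vertically into one image per sample, then flatten.
merged = np.concatenate([eye_regions, mouth_regions], axis=1)   # (N, 2H, W)
X = merged.reshape(N_SAMPLES, -1)

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

# Feed-forward network trained by gradient back-propagation.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

# With random stand-in data the score is near chance; with real crops it
# would reflect the merged-image representation.
print("held-out accuracy:", clf.score(X_test, y_test))
```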

Place, publisher, year, edition, pages
IEEE Computer Society, 2013
Keyword
Artificial Neural Network, Merged Images, Facial Expression Recognition, Emotions, Detection of Emotional Information.
National Category
Information Systems
Research subject
Computer and Systems Sciences
Identifiers
urn:nbn:se:su:diva-97708 (URN)
10.1109/MICAI.2013.16 (DOI)
978-1-4799-2605-3 (ISBN)
Conference
12th Mexican International Conference on Artificial Intelligence, November 24-30, 2013, Mexico City, Mexico
Available from: 2013-12-17 Created: 2013-12-17 Last updated: 2015-12-14 Bibliographically approved
5. Effect of emotional feedback in a decision-making system for an autonomous agent
2014 (English) In: Advances in Artificial Intelligence - IBERAMIA 2014: Proceedings / [ed] Ana L.C. Bazzan, Karim Pichara, Springer, 2014, 613-624 p. Conference paper, Published paper (Refereed)
Abstract [en]

Isaac Asimov's vision is unlikely in the near future, but machines that perform tasks in a sensible manner are already a fact. In light of this, recent research tries to understand the requirements and design options involved in providing an autonomous agent with means for detecting emotions. If such a model is exported to machines, it is possible that they become capable of evolving emotionally according to it and take part in society more or less cooperatively, according to the perceived emotional state. The main purpose of this research is the implementation of a decision model affected by emotional feedback in a cognitive robotic assistant that can capture information about the world around it. The robot will use multi-modal communication to assist the societal participation of persons deprived of conventional modes of communication. The aim is a machine that can predict what the user will do next and be ready to give the best possible assistance, taking into account the emotional factor. The results indicate the benefits and importance of emotional feedback in the closed-loop human-robot interaction framework. Cognitive agents are shown to be capable of adapting to emotional information from humans.

Place, publisher, year, edition, pages
Springer, 2014
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 8864
Keyword
Affective Computing, Artificial Neural Network, Facial Expression Recognition, Detection of Emotional Information, Adversarial Risk Analysis, Broaden and Build Theory
National Category
Information Systems
Research subject
Computer and Systems Sciences
Identifiers
urn:nbn:se:su:diva-108681 (URN)
10.1007/978-3-319-12027-0_49 (DOI)
978-3-319-12026-3 (ISBN)
978-3-319-12027-0 (ISBN)
Conference
14th Ibero-American Conference on AI, Santiago de Chile, Chile, November 24-27, 2014
Available from: 2014-11-03 Created: 2014-11-03 Last updated: 2015-12-14 Bibliographically approved
6. Recognition of emotions by the emotional feedback through behavioral human poses
2015 (English) In: International Journal of Computer Science Issues, ISSN 1694-0784, E-ISSN 1694-0814, Vol. 12, no 1, 7-17 p. Article in journal (Refereed) Published
Abstract [en]

The sensory perceptions from humans are intertwined channels, which assemble diverse data in order to decrypt emotional information. Just by associations, humans can mix emotional information, i.e. emotion detection through facial expressions criteria, emotional speech, and the challenging field of emotional body language over the body poses and motion. In this work, we present an approach that can predict six basic universal emotions collected by responses linked to human body poses, from a computational perspective. The emotional outputs could be fed as inputs to a synthetic socially skilled agent capable of interaction, in the context of socially intelligent systems. The methodology uses a classification technique of information from six images extracted from a video, entirely developed using the motion sensing input device of Xbox 360 by Microsoft. We are taking into account that the emotional body language contains advantageous information about the emotional state of humans, especially when bodily reaction brings about conscious emotional experiences. The body parts are windows that show emotions and they would be particularly suitable to decoding affective states. The group of extracted images is merged in one image with all the relevant information. The recovered image will serve as input to the classifiers. The analysis of images from human body poses makes it possible to obtain relevant information through the combination of proper data in the same image. It is shown by experimental results that the SVM can detect emotion with good accuracy compared to other classifiers.
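The sketch below illustrates the frame-merging and SVM step with random stand-in data (assumed joint counts and feature shapes; not the article's Kinect pipeline or dataset).

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Sketch under assumed shapes: six frames of skeletal joint coordinates are
# extracted from a short video, merged into a single feature vector per
# sample, and classified with an SVM.

rng = np.random.default_rng(1)
N_SAMPLES, N_FRAMES, N_JOINTS = 240, 6, 20   # the Kinect SDK tracks 20 joints
N_EMOTIONS = 6

# Stand-in data: (x, y, z) coordinates of each joint in each of the six frames.
poses = rng.random((N_SAMPLES, N_FRAMES, N_JOINTS, 3))
labels = rng.integers(0, N_EMOTIONS, size=N_SAMPLES)

# "Merge" the six frames into one representation per sample by flattening.
X = poses.reshape(N_SAMPLES, -1)

# SVM classifier, as the article reports it compared favourably to others.
svm = SVC(kernel="rbf", C=1.0, gamma="scale")
scores = cross_val_score(svm, X, labels, cv=5)
print("cross-validated accuracy (near chance on random stand-in data):",
      scores.mean())
```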

Keyword
Detection of Emotional Information, Affective Computing, Body Gesture Analysis, Robotics, Classification, Machine Learning.
National Category
Computer Science
Research subject
Computer and Systems Sciences
Identifiers
urn:nbn:se:su:diva-122138 (URN)
Available from: 2015-10-26 Created: 2015-10-26 Last updated: 2017-12-01 Bibliographically approved
7. Speech emotion recognition in emotional feedback for Human-Robot Interaction
2015 (English) In: International Journal of Advanced Research in Artificial Intelligence (IJARAI), ISSN 2165-4050, E-ISSN 2165-4069, Vol. 4, no 2, 20-27 p. Article in journal (Refereed) Published
Abstract [en]

For robots to plan their actions autonomously and interact with people, recognizing human emotions is crucial. For most humans, nonverbal cues such as pitch, loudness, spectrum, and speech rate are efficient carriers of emotions. The features of the sound of a spoken voice probably contain crucial information on the emotional state of the speaker; within this framework, a machine might use such properties of sound to recognize emotions. This work evaluated six different kinds of classifiers to predict six basic universal emotions from non-verbal features of human speech. The classification techniques used information from six audio files extracted from the eNTERFACE05 audio-visual emotion database. The information gain from a decision tree was also used to choose the most significant speech features from a set of acoustic features commonly extracted in emotion analysis. The classifiers were evaluated with the proposed features and with the features selected by the decision tree. With this feature selection, it could be observed that each of the compared classifiers increased its global accuracy and recall. The best performance was obtained with Support Vector Machine and BayesNet.
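The following sketch reproduces the selection-then-comparison idea on random stand-in data; scikit-learn's GaussianNB is only a rough substitute for the BayesNet classifier mentioned in the paper, and all feature values and counts are invented.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Sketch of decision-tree-based feature selection followed by classifier
# comparison (assumed data; the paper uses acoustic features from eNTERFACE'05).

rng = np.random.default_rng(2)
N_SAMPLES, N_FEATURES, N_EMOTIONS = 300, 40, 6

X = rng.random((N_SAMPLES, N_FEATURES))      # e.g. pitch, energy, MFCC stats
y = rng.integers(0, N_EMOTIONS, size=N_SAMPLES)

# Rank features by their importance in an information-gain (entropy) tree.
tree = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)
top = np.argsort(tree.feature_importances_)[::-1][:10]   # keep 10 best
X_sel = X[:, top]

# Compare classifiers on the full and the selected feature sets.
for name, clf in [("SVM", SVC(kernel="rbf")), ("NaiveBayes", GaussianNB())]:
    full = cross_val_score(clf, X, y, cv=5).mean()
    selected = cross_val_score(clf, X_sel, y, cv=5).mean()
    print(f"{name}: full={full:.3f}  selected={selected:.3f}")
```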

Keyword
Affective Computing, Detection of Emotional Information, Machine Learning, Speech Emotion Recognition
National Category
Computer Science
Research subject
Computer and Systems Sciences
Identifiers
urn:nbn:se:su:diva-122139 (URN)
10.14569/IJARAI.2015.040204 (DOI)
Available from: 2015-10-26 Created: 2015-10-26 Last updated: 2017-12-01 Bibliographically approved
8. Decision-making content of an agent affected by emotional feedback provided by capture of human’s emotions through a Bimodal System
2015 (English) In: International Journal of Computer Science Issues, ISSN 1694-0784, E-ISSN 1694-0814, Vol. 12, no 6. Article in journal (Refereed) Accepted
Abstract [en]

Affective computing widens the view of the complex world of human-machine interaction through the comprehension of emotions, allowing an enriched coexistence of natural interactions between humans and machines. Corporal features such as facial expression, kinetics, and structural components of the voice or vision, to mention just a few, provide valid information on how a human behaves. Among all the carriers of emotional information, two stand out, voice and facial gestures, as holding ample potential for identifying emotions with a high degree of accuracy. This paper focuses on the development of a system that tracks a human's affective state using facial expressions and speech signals with the purpose of modifying the actions of an autonomous agent. The system fuses two baseline unimodal classifiers based on BayesNet, giving rise to a multi-classifier; the union of the three classifiers forms a bimodal scheme of emotion classification. The outputs from the baseline unimodal classifiers are combined through a probability fusion framework applied in the general multi-classifier. The system classifies six universal basic emotions using audiovisual data extracted from the eNTERFACE05 audiovisual emotion database. The emotional information obtained could provide an agent with the basis for taking an affective decision. Experimental results show that the proposed system can detect emotions with good accuracy, enabling a change in the emotional behavior of the agent faced with a human.
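A minimal sketch of probability-level fusion is shown below; the weights, example probability vectors, and the weighted product rule are illustrative assumptions rather than the paper's trained fusion framework.

```python
import numpy as np

# Illustrative probability-level fusion (assumed weights and numbers): each
# unimodal classifier outputs a distribution over the six emotions, and the
# bimodal decision combines them with a weighted product rule.

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]


def fuse(p_face: np.ndarray, p_speech: np.ndarray,
         w_face: float = 0.5, w_speech: float = 0.5) -> np.ndarray:
    """Weighted product fusion of two class-probability vectors."""
    combined = (p_face ** w_face) * (p_speech ** w_speech)
    return combined / combined.sum()


if __name__ == "__main__":
    # Example unimodal outputs: the face model leans toward surprise, the
    # speech model toward happiness; fusion weighs both sources of evidence.
    p_face = np.array([0.05, 0.05, 0.10, 0.25, 0.10, 0.45])
    p_speech = np.array([0.05, 0.05, 0.05, 0.55, 0.10, 0.20])
    p_bimodal = fuse(p_face, p_speech)
    print(EMOTIONS[int(np.argmax(p_bimodal))], p_bimodal.round(3))  # happiness
```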

Keyword
Affective Computing, Machine Learning, Adversarial Risk Analysis, Broaden and Build Theory, Facial Expression Recognition, Speech Emotion Recognition, Detection of Emotional Information, Emotional self-regulation
National Category
Computer Science
Research subject
Computer and Systems Sciences
Identifiers
urn:nbn:se:su:diva-122142 (URN)
Available from: 2015-10-26 Created: 2015-10-26 Last updated: 2017-12-01 Bibliographically approved

Open Access in DiVA

Decisional-Emotional Support System for a Synthetic Agent (4193 kB), 338 downloads
File information
File name: FULLTEXT02.pdf
File size: 4193 kB
Checksum (SHA-512): 94a5c4e6ed2b4bba016aca338a8a7bd617b0107d89517536d96302f3acc234a266050ba340a62874a38bb58299ff2721612c50f0b4ad2e27cae734e3f8c33e78
Type: fulltext
Mimetype: application/pdf

Search in DiVA

By author/editor
Guerrero Razuri, Javier Francisco
By organisation
Department of Computer and Systems Sciences
Human Computer Interaction
