Emotion assessment for affective computing based on physiological responses

Towards a standardization in the use of physiological signals for affective recognition systems

2008

The use of physiological signals as an approach to emotion recognition in computer systems is not a straightforward task. This paper discusses five main areas that lack standards and guiding principles, which have led Human-Computer Interaction (HCI) researchers to make critical decisions about (i) models, (ii) stimuli, (iii) measures, (iv) features, and (v) algorithms with some degree of uncertainty about their results. Methodological standardization would allow comparison of results, reusability of findings, and easier integration of the various affective recognition systems created. The background theory is given for each of the five areas, and the related work from psychology is briefly reviewed. A comparison table of the common HCI approaches to the five discussed areas is presented, and finally some considerations for making the best decisions are discussed. The aim of this paper is to provide directions on which future research efforts for affective recognition in HCI should focus.

Emotion estimation using physiological signals

TENCON 2008 - 2008 IEEE Region 10 Conference, 2008

Human thought is inherently emotional, and emotions are an essential and productive aspect of human thought and action. Although emotions were long assumed to be erratic, they have proved to be a mechanism that encompasses readiness to act, expectations, focus on goals, appraisal of self and others, and the resulting reactions. Cognition itself involves the perception, experience, and expression of emotions. While an emotion is experienced, physiological changes [1,7] also take place in the human body, such as variations in heart rate (ECG/HRV), skin conductance (GSR), breathing rate (BR), blood volume pulse (BVP), brain waves (EEG), temperature, and muscle tension; these are some of the metrics used to sense the emotive coefficient. Subjects who experience emotions with high magnitude differ from those who can regulate these emotional experiences, and this capacity is termed Emotional Intelligence. The difference in experience is due to a triad of value, expectation, and reality, which forms an emotional self-structure. Though there is always an emotional experience of some magnitude, its expression may be evident or completely concealed, or the subject may be deceptive or try to hide the emotion by not expressing it. This paper looks at the ways in which the stimuli for triggering the emotional state of the subject are identified, applied, and hence perceived by the subject(s), and at how parameters like GSR and BVP or PR [23] and their variations are analyzed to conclude the emotional state of the subject. The subjects also give feedback about the emotions they are undergoing, which facilitates validation.
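The abstract names GSR and BVP/PR as the measured parameters but does not give its analysis procedure. A minimal sketch of the kind of feature extraction it describes might look like the following; the sampling rate, window length, and helper names are illustrative assumptions, not the authors' method.

```python
import numpy as np
from scipy.signal import find_peaks

FS = 128  # assumed sampling rate in Hz; the paper does not state one

def gsr_features(gsr: np.ndarray) -> dict:
    """Simple tonic/phasic descriptors of one GSR window."""
    return {
        "gsr_mean": float(np.mean(gsr)),    # tonic level
        "gsr_std": float(np.std(gsr)),      # overall variability
        "gsr_slope": float((gsr[-1] - gsr[0]) / (len(gsr) / FS)),
    }

def bvp_pulse_rate(bvp: np.ndarray) -> float:
    """Estimate pulse rate (beats/min) from BVP peak intervals."""
    peaks, _ = find_peaks(bvp, distance=FS * 0.4)  # >= 0.4 s between beats
    if len(peaks) < 2:
        return float("nan")
    ibi = np.diff(peaks) / FS                      # inter-beat intervals (s)
    return float(60.0 / np.mean(ibi))

# Example on one synthetic 10-second window
rng = np.random.default_rng(0)
gsr = np.cumsum(rng.normal(0, 0.01, FS * 10)) + 5.0
bvp = np.sin(2 * np.pi * 1.2 * np.arange(FS * 10) / FS) + rng.normal(0, 0.1, FS * 10)
print({**gsr_features(gsr), "pulse_rate_bpm": bvp_pulse_rate(bvp)})
```

Features like these, computed per stimulus window, are what a subsequent classifier would compare against the subjects' self-reported emotional states.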

A More Complete Picture of Emotion Using Electrocardiogram and Electrodermal Activity to Complement Cognitive Data

Lecture Notes in Computer Science, 2016

We describe a method of achieving emotion classification using ECG and EDA data. There have been many studies on the use of heart rate and EDA data to quantify the arousal level of a user [1-3]. Researchers have identified a connection between a person's ECG data and the positivity or negativity of their emotional state [4]. The goal of this work is to extend this idea to the human-computer interaction domain. We explore whether the valence/arousal level of a subject's response to computer-based stimuli is predictable using ECG and EDA, and whether that information can complement recordings of participants' cognitive data to form a more accurate depiction of emotional state. The experiment consists of presenting three types of stimuli, both interactive and non-interactive, to 9 subjects and recording their physiological responses via ECG and EDA as well as an fNIRS device. The stimuli were selected from validated methods of inducing emotion, including the DEAP dataset [5], the Multi-Attribute Task Battery [6], and the Tetris video game [7]. The participants' responses were captured using Self-Assessment Manikin [8] surveys, which were used as the ground-truth labels. The resulting data were analyzed using machine learning. The results provide new avenues of research in combining physiological data to classify emotion.
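The abstract says the data were analyzed with machine learning but does not fix a specific classifier. The scikit-learn sketch below shows the general shape of such a pipeline; the feature columns, the SVM choice, and the high/low binarization of SAM ratings at 5 are all assumptions for illustration.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: one row per stimulus window, columns = physiological features
# (e.g., mean heart rate, HRV, EDA level, EDA peak count) -- placeholders here
# y: SAM arousal ratings (1-9 scale) binarized into low/high
rng = np.random.default_rng(1)
X = rng.normal(size=(90, 4))                # 9 subjects x 10 trials, 4 features
sam_arousal = rng.integers(1, 10, size=90)  # placeholder SAM ratings
y = (sam_arousal >= 5).astype(int)          # threshold 5 is an assumption

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

In a real replication the placeholder arrays would be replaced by features extracted from the recorded ECG/EDA windows and the actual SAM survey labels.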

Emotion assessment tool for human-machine interfaces—using EEG data and multimedia stimuli towards emotion classification

International Conference on Signal Processing and Multimedia Applications (SIGMAP 2008), 2008

The identification and assessment of human emotional states is one of the primordial objectives of scientific research in disparate areas such as artificial intelligence, medicine, and psychology. The main objective of this project is the automatic assessment of a subject's basic emotional states using electroencephalography as the source of biometric data. This evaluation is based on predefined mechanisms of emotional induction, as well as specific methods and tools capable of data analysis and processing. From the experimental results obtained across several sessions and through the support tools developed, the most pertinent conclusion of this work is that automatic classification of the subject's predominant emotional state can be performed effectively. The emotional conditions were induced through the presentation of specific visual multimedia content. The success rate of this tool, compared against the self-assessment interviews carried out immediately after each experimental session, was approximately 75%. It was also experimentally concluded that female subjects are emotionally more demonstrative than male subjects.
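The abstract does not detail the feature set used for its EEG-based classification. A common front end in this line of work is spectral band power, so the sketch below computes alpha- and beta-band power with Welch's method purely as an illustration; the sampling rate, band limits, and window length are assumptions, not the authors' settings.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # assumed EEG sampling rate (Hz)
BANDS = {"alpha": (8, 13), "beta": (13, 30)}  # conventional band limits

def band_powers(eeg_channel: np.ndarray) -> dict:
    """Average spectral power per band for one EEG channel."""
    freqs, psd = welch(eeg_channel, fs=FS, nperseg=FS * 2)
    out = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        out[name] = float(np.mean(psd[mask]))
    return out

# Example on a synthetic 10 s channel
rng = np.random.default_rng(2)
signal = rng.normal(size=FS * 10)
print(band_powers(signal))
```

Per-channel band powers computed this way, one vector per stimulus, would then be fed to whatever classifier produces the reported 75% agreement with the self-assessment interviews.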

Physiological Signals and Their Use in Augmenting Emotion Recognition for Human–Machine Interaction

Cognitive Technologies, 2010

In this chapter we introduce the concept of using physiological signals as an indicator of emotional state. We review the ambulatory techniques for physiological measurement of the autonomic and central nervous system as they might be used in human-machine interaction. A brief history of using human physiology in HCI leads to a discussion of the state of the art of multimodal pattern recognition of physiological signals. The overarching question of whether results obtained in a laboratory can be applied to ecological HCI remains unanswered.

Emotion Recognition in Conversations Using Brain and Physiological Signals

27th International Conference on Intelligent User Interfaces, 2022

Emotions are complicated psycho-physiological processes that are related to numerous external and internal changes in the body. They play an essential role in human-human interaction and can be important for human-machine interfaces. Automatically recognizing emotions in conversation could be applied in many domains, such as healthcare, education, social interaction, and entertainment. Facial expressions, speech, and body gestures are primary cues that have been widely used for recognizing emotions in conversation. However, these cues can be ineffective, as they cannot reveal underlying emotions when people involuntarily or deliberately conceal them. Researchers have shown that analyzing brain activity and physiological signals can lead to more reliable emotion recognition, since these signals generally cannot be controlled. However, such body responses in emotional situations have rarely been explored in interactive tasks like conversations. This paper explores and discusses the performance and challenges of using brain activity and other physiological signals in recognizing emotions in a face-to-face conversation. We present an experimental setup for stimulating spontaneous emotions through a face-to-face conversation and for creating a dataset of the brain and physiological signals recorded during such conversations.

A Review, Current Challenges, and Future Possibilities on Emotion Recognition using Machine Learning and Physiological Signals

IEEE Access

The seminal work on Affective Computing by Picard in 1995 set the base for computing that relates to, arises from, or influences emotions. Affective computing is a multidisciplinary field of research spanning computer science, psychology, and cognitive science. Potential applications include automated driver assistance, healthcare, human-computer interaction, entertainment, marketing, teaching, and many others. The field thus quickly attracted wide interest, with enormous growth in the number of papers published on the topic since its inception. This paper aims to (1) present an introduction to the field of affective computing through the description of key theoretical concepts; (2) describe the current state of the art of emotion recognition, tracing the developments that helped foster the growth of the field; and (3) point out the literature's take-home messages and conclusions, evidencing the main challenges and future opportunities that lie ahead, in particular for the development of novel machine learning (ML) algorithms in the context of emotion recognition using physiological signals.

Emotion Charting Using Real-time Monitoring of Physiological Signals

Emotions are fundamental to humans. They affect perception and everyday activities such as communication, learning, and decision making. Various emotion recognition devices have been developed that incorporate facial expressions, body postures, and speech as means of recognition. The accuracy of most existing devices depends on the expressiveness of the user and can be fairly easily manipulated. We propose a physiological-signal-based solution that provides reliable emotion classification without being subject to manipulation or dependent on user expressiveness. Electrocardiogram (ECG) and Galvanic Skin Response (GSR) signals are acquired using Shimmer sensors and are used to recognize seven basic human emotions (happiness, fear, sadness, anger, neutral, disgust, and surprise). Classification of emotions is performed using a convolutional neural network. Using the AlexNet architecture and ECG signals, emotion classification accuracies of 91.5% on the AMIGOS dataset and 64.2% on a real-time dataset are achieved. Similarly, accuracies of 92.7% on the AMIGOS dataset and 68% on a real-time dataset are achieved using GSR signals. By combining both ECG and GSR signals, the accuracies on the AMIGOS and real-time datasets improve to 93% and 68.5%, respectively.
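The abstract names AlexNet but not how 1-D ECG or GSR windows are fed into an image-oriented network. A common workaround, shown below as a hedged PyTorch sketch rather than the authors' pipeline, is to convert each signal window into a spectrogram image and retrain a torchvision AlexNet head for the seven classes; the STFT parameters, window length, and resizing choices are assumptions.

```python
import torch
import torch.nn as nn
from torchvision.models import alexnet

NUM_CLASSES = 7  # happiness, fear, sadness, anger, neutral, disgust, surprise

def ecg_to_image(ecg: torch.Tensor) -> torch.Tensor:
    """Turn a 1-D ECG window into a 3x224x224 'image' via an STFT
    magnitude spectrogram (parameters here are illustrative)."""
    spec = torch.stft(ecg, n_fft=256, hop_length=16,
                      window=torch.hann_window(256),
                      return_complex=True).abs()
    spec = spec.log1p().unsqueeze(0).unsqueeze(0)        # shape 1x1xFxT
    spec = nn.functional.interpolate(spec, size=(224, 224),
                                     mode="bilinear", align_corners=False)
    return spec.repeat(1, 3, 1, 1)                       # replicate to 3 channels

model = alexnet(weights=None)                            # or pretrained weights
model.classifier[6] = nn.Linear(4096, NUM_CLASSES)       # swap the final layer

ecg_window = torch.randn(4096)                           # placeholder ECG segment
logits = model(ecg_to_image(ecg_window))
print(logits.shape)                                      # torch.Size([1, 7])
```

The reported ECG+GSR fusion could then be obtained, for instance, by concatenating or averaging the per-modality logits before the softmax, though the abstract does not say which fusion scheme was used.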

Emotion assessment for affective-computing based on brain and peripheral signals

University of Geneva, 2009

Abstract: Current human-machine interfaces lack "emotional intelligence": they are not capable of identifying human emotions and taking this information into account when choosing which actions to execute. The goal of affective computing is to fill this gap by detecting the emotional cues that occur during interaction with the machine and by synthesizing appropriate emotional responses. In recent years, most studies have ...