Towards a robust real-time emotion detection system for intelligent buildings
Related papers
A user-independent real-time emotion recognition system for software agents in domestic environments
Engineering Applications of Artificial Intelligence, 2007
The mystery surrounding emotions, how they work and how they affect our lives, has not yet been unravelled. Scientists still debate the real nature of emotions; evolutionary, physiological and cognitive accounts are just a few of the different approaches used to explain affective states. Regardless of the various emotional paradigms, neurologists have made progress in demonstrating that emotion is as important as, or more important than, reason in the process of making decisions and deciding actions. The significance of these findings should not be overlooked in a world that is increasingly reliant on computers to accommodate user needs. In this paper, a novel approach for recognizing and classifying positive and negative emotional changes in real time using physiological signals is presented. Based on sequential analysis and autoassociative networks, the emotion detection system outlined here is potentially capable of operating on any individual regardless of their physical state and emotional intensity, without requiring an arduous adaptation or pre-analysis phase. Results from applying this methodology to real-time data collected from a single subject demonstrated a recognition level of 71.4%, which is comparable to the best results achieved by others through off-line analysis. It is suggested that the detection mechanism outlined in this paper has all the characteristics needed to perform emotion recognition in pervasive computing.
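The abstract above does not give the authors' autoassociative-network design, but the underlying idea, training a bottlenecked network to reconstruct its input and flagging inputs it reconstructs poorly as a change of state, can be sketched in pure Python. The network shape, learning rate, and data below are assumptions for illustration, not the paper's implementation.

```python
import random

def train_autoassociator(data, hidden=1, lr=0.01, epochs=2000, seed=0):
    """Train a bottlenecked linear autoassociative network x -> W2 @ (W1 @ x)
    by plain gradient descent on the squared reconstruction error."""
    rng = random.Random(seed)
    d = len(data[0])
    W1 = [[rng.uniform(-0.1, 0.1) for _ in range(d)] for _ in range(hidden)]
    W2 = [[rng.uniform(-0.1, 0.1) for _ in range(hidden)] for _ in range(d)]
    for _ in range(epochs):
        for x in data:
            h = [sum(W1[i][j] * x[j] for j in range(d)) for i in range(hidden)]
            y = [sum(W2[i][j] * h[j] for j in range(hidden)) for i in range(d)]
            e = [y[i] - x[i] for i in range(d)]
            # backpropagated error at the hidden layer (computed before W2 changes)
            g = [sum(W2[k][i] * e[k] for k in range(d)) for i in range(hidden)]
            for i in range(d):
                for j in range(hidden):
                    W2[i][j] -= lr * e[i] * h[j]
            for i in range(hidden):
                for j in range(d):
                    W1[i][j] -= lr * g[i] * x[j]
    return W1, W2

def reconstruction_error(W1, W2, x):
    """Squared reconstruction error; a high value signals an unfamiliar state."""
    d, hidden = len(x), len(W1)
    h = [sum(W1[i][j] * x[j] for j in range(d)) for i in range(hidden)]
    y = [sum(W2[i][j] * h[j] for j in range(hidden)) for i in range(d)]
    return sum((y[i] - x[i]) ** 2 for i in range(d))
```

Trained on signals from one affective state, the network reconstructs similar signals well; a jump in reconstruction error then marks a candidate emotional change, which is the novelty-detection role autoassociative networks play in this line of work.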
Real-time detection of emotional changes for inhabited environments
Computers & Graphics, 2004
The utilisation of emotional information in computer systems that interact with humans has become more prevalent during the last few years. The various channels through which emotions are expressed provide valuable information about the way humans think and behave and have been successfully employed to assist the inference mechanism of interactive computer applications. In this paper a novel approach to detect changes in the emotional status of a subject is presented. It is argued that the proposed methodology will be able to detect emotional changes in real time utilising physiological measures and a combination of Artificial Neural Networks (ANNs) and statistical mechanisms. Clustering analysis is used to show that the myogram signal was the most suitable attribute to distinguish between two emotional states. Results show that the suggested mechanism is able to accurately distinguish changes from neutral to non-neutral emotional states. Emotional information could be employed to improve user interaction in inhabited environments.
Computer scientists have been slow to become aware of the influence of emotion on human decisions and actions. Recently, however, a considerable amount of research has focused on the utilisation of affective information with the intention of improving both human-machine interaction and artificial human-like inference models. It has been argued that valuable information could be obtained by analysing the way affective states and environment interact and affect human behaviour. A method to improve pattern recognition among four bodily parameters employed for emotion recognition is presented. The utilisation of Autoassociative Neural Networks has proved to be a valuable mechanism to increase inter-cluster separation related to emotional polarity (positive or negative). It is suggested that the proposed methodology could improve performance in pattern recognition tasks involving physiological signals. Also, by way of grounding the immediate aims of our research, and providing an insight into the direction of our work, we provide a brief overview of an intelligent-dormitory test bed in which affective computing methods will be applied and compared to non-affective agents.
Real-time Physiological Emotion Detection Mechanisms: Effects of Exercise and Affect Intensity
The development of systems capable of recognizing and categorising emotions is of interest to researchers in various scientific areas including artificial intelligence. The traditional notion that emotions and rationality are two separate realms has gradually been challenged. The work of neurologists has shown the strong relationship between emotional episodes and the way humans think and act. Furthermore, emotions not only regulate human decisions but could also contribute to a more satisfactory response to the environment, i.e., faster and more precise actions. In this paper an analysis of physiological signals employed in real-time emotion detection is presented in the context of Intelligent Inhabited Environments (IIE). Two studies were performed to investigate whether physical exertion has a significant effect on bodily signals stemming from emotional episodes with subjects having various degrees of affect intensity: 1) a statistical analysis using the Wilcoxon Test, and 2) a cluster analysis using the Davies-Bouldin Index. Preliminary results demonstrated that the heart rate and skin resistance consistently showed similar changes regardless of the physical stimuli while blood volume pressure did not show a significant change. It was also found that neither physical stress nor affect intensity played a role in the separation of neutral and non-neutral emotional states.
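The paired Wilcoxon test named in the abstract can be sketched as follows. This is a minimal two-sided signed-rank test using the large-sample normal approximation without tie correction, an illustrative simplification rather than the authors' exact procedure; the sample data in use would be paired physiological measurements (e.g. before and after exertion).

```python
import math

def wilcoxon_signed_rank(before, after):
    """Two-sided Wilcoxon signed-rank test for paired samples.
    Normal approximation, no tie correction. Returns (W_plus, p_value)."""
    d = [a - b for a, b in zip(after, before) if a != b]  # drop zero differences
    n = len(d)
    # rank |d| in ascending order, averaging ranks over ties
    order = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for r, di in zip(ranks, d) if di > 0)
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return w_plus, p
```

A consistent one-directional shift between the paired conditions drives W+ away from its null mean and the p-value below the usual 0.05 threshold, which is the kind of evidence the first study relies on.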
Optimised Attribute Selection for Emotion Classification Using Physiological Signals
Researchers in medicine and psychology have studied emotions and the way they influence human thinking and behaviour for decades. Recently computer scientists have realised the importance of emotions in human interactions with the environment and a considerable amount of research has been directed towards the identification and utilisation of affective information. Particular interest exists in the detection of emotional states with the intention of improving both human-machine interaction and artificial human-like inference models. Emotion detection has also been employed to explore applications that relate emotional states, habits and ambient conditions inside inhabited environments. Valuable information can be obtained by analysing the way affective states that influence behaviour are altered by environmental changes. In this paper an analysis of the properties of four physiological signals employed in emotion recognition is presented. Class separation analysis was used for determining the best physiological parameters (among those from a list chosen a priori) to use for recognizing emotional states. Results showed that the masseter electromyogram was the best attribute when distinguishing between neutral and non-neutral emotional states. Using Autoassociative Neural Networks to improve cluster separation, the gradient of the skin conductance provided the best results when discriminating between positive and negative emotions.
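Attribute selection by class separation, as described above, can be sketched with a two-cluster Davies-Bouldin index computed on each attribute in isolation (lower means better separated, so attributes are ranked ascending). The choice of this particular index, the attribute names, and the data are assumptions for illustration; the paper's actual analysis may differ.

```python
def davies_bouldin_two_class(class_a, class_b):
    """Davies-Bouldin index for two 1-D clusters: average within-cluster
    scatter divided by centroid distance. Lower = better separated."""
    ca = sum(class_a) / len(class_a)
    cb = sum(class_b) / len(class_b)
    sa = sum(abs(x - ca) for x in class_a) / len(class_a)
    sb = sum(abs(x - cb) for x in class_b) / len(class_b)
    return (sa + sb) / abs(ca - cb)

def rank_attributes(samples_a, samples_b):
    """samples_*: dict attribute_name -> list of values for one class.
    Return attribute names ranked from best to worst class separation."""
    scores = {attr: davies_bouldin_two_class(samples_a[attr], samples_b[attr])
              for attr in samples_a}
    return sorted(scores, key=scores.get)
```

Ranking each physiological attribute this way is what lets a result such as "the masseter electromyogram was the best attribute" be stated quantitatively rather than by inspection.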
Emotion assessment for affective computing based on physiological responses
2012 IEEE International Conference on Fuzzy Systems, 2012
In order to obtain such an "emotional machine" it is necessary to include emotions in the human-machine communication loop (see Figure 1.1). This is what is defined as "affective computing". According to Picard [2], affective computing "proposes to give computers the ability to recognize [and] express […] emotions". Synthetic expression of emotions can be achieved by enabling avatars or simpler agents to have facial expressions, different tones of voice, and empathic behaviors [3, 4]. Detection of human emotions can be realized by monitoring and interpreting the different cues that are given in both verbal and non-verbal communication.
However, a system that is "emotionally intelligent" should not only detect and express emotions but should also take the proper action in response to the detected emotion. Expressing the adequate emotion is thus one of the outputs of this decision making. The proper action to take depends on the application, but examples of reactions could be to provide help when the user feels helpless or to lower task demand when he or she is highly stressed. Many applications are detailed in chapter 1.2.2. Accurately assessing emotions is thus a critical step toward affective computing, since this will determine the reaction of the system. The present work will focus on this first step of affective computing by trying to reliably assess emotion from several emotional cues.
Figure 1.1. Including emotions in the human-machine loop
1.2 Emotion assessment
1.2.1 Multimodal expression of emotion
A modality is defined as a path used to carry information for the purpose of interaction. In HMI, there are two possible approaches to define a modality: from the machine point of view and from the user point of view. On the machine side, a modality refers to the processes that generate information to the physical world and interpret information from it. It is thus possible to distinguish between input and output modalities and to associate them with the corresponding communication device.
A keyboard, a mouse and a pad with the associated information processes are typical input modalities, while a text, an image and music presented on screens and speakers are typical output modalities. Games are also interesting from a HMI point of view because they are an ideal ground for the design of new ways to communicate with the machine. One of the main goals of games is to provide emotional experiences such as fun and excitement, which generally occur when the player is strongly involved in the course of the game action. However, a loss of involvement can occur if the game does not correspond to the player's expectations and competences. Measuring various physiological signals requires the use of several sensors that are sometimes quite obtrusive, since they can monopolize the use of one hand and are not comfortable. The price of those sensors should also be taken into account. For these reasons the issue of determining the most useful sensors is of importance. Finally, there is also variability in physiological signals, from person to person but also from day to day, that yields difficulty in designing a system.
A Predictive Model for Emotion Recognition Based on Individual Characteristics and Autonomic Changes
Basic and Clinical Neuroscience Journal, 2021
Introduction: The importance of individual differences in the problem of emotion recognition has been repeatedly stated in the studies. The major concentration of this study was the prediction of heart rate variability (HRV) changes due to affective stimuli from the subject characteristics. These features were age (A), gender (G), linguality (L), and sleep (S) information. In addition, the most potent combination of individual variables (like gender and age (GA) or age, linguality, and sleep (ALS)) in the estimation of emotional HRV was explored. Methods: To this end, HRV indices of 47 college students exposed to images with four emotional categories, including happy, sad, afraid, and relaxed were analyzed. Then, a novel predictive model was introduced based on the regression equation. Results: The results showed distinctive emotional situations provoke the importance of different individual variable combinations. The best variable arrangements to predict HRV changes due to emotional provocations were LS, GL, GA, ALS, and GALS, although these combinations changed from subject to subject. Conclusion: The suggested simple model effectively offers new insight into emotion studies regarding subject characteristics and autonomic parameters.
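The regression equation behind the predictive model is not given in the abstract, but a least-squares fit of an HRV index on individual characteristics can be sketched as follows. The feature set (intercept, age, gender), coefficients, and data below are hypothetical illustrations, not values from the study.

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X^T X) beta = X^T y,
    solved with Gaussian elimination and partial pivoting.
    Each row of X starts with a 1.0 for the intercept term."""
    n, p = len(X), len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)]
         for i in range(p)]
    b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * p
    for i in range(p - 1, -1, -1):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, p))) / A[i][i]
    return beta
```

With features coded numerically (e.g. gender as 0/1), the fitted coefficients quantify how much each individual characteristic contributes to the predicted HRV change, which is the kind of per-variable comparison the study reports.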
Towards Truly Affective AAL Systems
Enhanced Living Environments, 2019
Affective computing is a growing field of artificial intelligence. It focuses on models and strategies for detecting, obtaining, and expressing various affective states, including emotions, moods, and personality-related attributes. The techniques and models developed in affective computing are applicable to various affective contexts, including Ambient Assisted Living. One of the hypotheses for the origin of emotion is that its primary purpose was to regulate social interactions. Since one of the crucial characteristics of Ambient Assisted Living systems is supporting social contact, it is unthinkable to build such systems without considering emotions. Moreover, the emotional capacity needed for Ambient Assisted Living systems goes beyond simply detecting the user's emotions and displaying emotion expressions of the system; emotion generation and the mapping of emotion onto the rational thinking and behaviour of a system should also be considered. The chapter discusses the need and requirements for these processes in the context of various application domains of Ambient Assisted Living, i.e., healthcare, mobility, education, and social interaction.