Individual risk profiling for portable devices using a neural network to process the recording of 30 successive pairs of cognitive reaction and emotional response to a multivariate situational risk assessment
Related papers
ArXiv, 2021
In this paper, we present a novel method and system for neuropsychological performance testing that can establish a link between cognition and emotion. It comprises a portable device that interacts with a cloud service; the cloud service stores user information under a username, and the user logs into it through the portable device. The user information is captured directly through the device and processed by an artificial neural network, and this tridimensional information comprises the user’s cognitive reactions, emotional responses and chronometrics. The multivariate situational risk assessment evaluates the subject’s performance by capturing the three dimensions of each reaction to a series of 30 dichotomous questions describing various situations of daily life and challenging the user’s knowledge, values, ethics, and principles. In industrial application, the timing of this assessment will depend on the user’s need to obtain a service from a provider such as openin...
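The paper does not publish its network architecture, so the following is only a minimal sketch, assuming PyTorch and an arbitrary small feed-forward topology, of how 30 recorded triplets (cognitive reaction, emotional response, chronometric) might be mapped to a single risk score; the layer sizes, sigmoid output and feature encoding are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' implementation): a small feed-forward
# network that maps the 30 recorded triplets -- cognitive reaction (0/1),
# emotional response score, and response time in seconds -- to a risk score.
import torch
import torch.nn as nn

N_QUESTIONS = 30          # dichotomous situational questions
FEATURES_PER_ANSWER = 3   # cognitive reaction, emotional response, chronometric

model = nn.Sequential(
    nn.Linear(N_QUESTIONS * FEATURES_PER_ANSWER, 32),
    nn.ReLU(),
    nn.Linear(32, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
    nn.Sigmoid(),          # risk score in [0, 1] (assumed output convention)
)

# One synthetic assessment: 30 x (answer, emotion score, reaction time)
answers = torch.rand(1, N_QUESTIONS * FEATURES_PER_ANSWER)
risk_score = model(answers)
print(f"estimated risk score: {risk_score.item():.3f}")
```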
Human Factors Analysis Using Wearable Sensors in the Context of Cognitive and Emotional Arousal
Procedia Manufacturing, 2015
Quantitative investigations of stress conditions in evaluation scenarios have so far mainly been conducted in static settings, such as desktop studies. Studies involving the mobility of participants are rather rare, in particular ones offering a substantial comparison between different stress conditions. Recently, data glasses and eye-tracking glasses have shifted attention to future classes of applications that would require wearable devices delivering multisensory data, including psychophysiological data, in various everyday contexts, and that need information about the psychological status of the user. Such applications call for investigating stress conditions in different mobile settings and asking which parameters provide discriminative features for stress indication. A study with 20 participants was conducted in a shopping and a navigation context, in which participants, equipped with portable psychophysiological sensors and eye-tracking glasses, performed memory and orientation tasks, respectively, to induce cognitive and emotional arousal. From the results we conclude that the specific context and the cause of arousal lead to different reactions of the psychophysiological system as well as to different eye-movement behavior. Depending on the context and the stress condition under investigation, different arousal and, consequently, stress classifiers, as well as attention models, should be applied.
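As a rough illustration of the conclusion that different contexts may call for different stress classifiers, the hypothetical sketch below derives simple features from heart-rate and skin-conductance traces and switches decision rules by context; the feature set, thresholds and context names are assumptions, not the study's actual models.

```python
# Illustrative only: crude stress features plus a context-specific decision rule,
# reflecting the idea that shopping (memory task) and navigation (orientation
# task) contexts may need different stress models. All thresholds are assumed.
import numpy as np

def stress_features(heart_rate_bpm: np.ndarray, skin_conductance_us: np.ndarray) -> dict:
    return {
        "hr_mean": float(np.mean(heart_rate_bpm)),
        "hr_std": float(np.std(heart_rate_bpm)),                          # crude HRV proxy
        "scl_mean": float(np.mean(skin_conductance_us)),
        "scr_rate": float(np.sum(np.diff(skin_conductance_us) > 0.05)),   # rough SCR count
    }

def classify_stress(features: dict, context: str) -> bool:
    if context == "shopping":    # memory task: hypothetical rule
        return features["hr_std"] > 8.0 or features["scr_rate"] > 5
    else:                        # navigation task: hypothetical rule
        return features["hr_mean"] > 95.0 and features["scl_mean"] > 6.0

hr = 80 + 10 * np.random.rand(600)   # 10 min of 1 Hz heart-rate samples (synthetic)
scl = 5 + np.random.rand(600)        # skin conductance level in microsiemens (synthetic)
print(classify_stress(stress_features(hr, scl), context="shopping"))
```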
Detecting Human Mood from Physiological Signal and Data Usage
Scientific Research, 2019
As the days go by, new technologies are being introduced every day, whether it is a tiny music player such as the iPod nano or a robot such as "Asimo" that runs at 6 kilometers per hour. These technologies entertain, facilitate tasks and make the day easier for human beings. It is no longer arguable that people need these technologies and smart systems to lead their regular lives smoothly. The smarter the system is, the more people like to use it. One major part of this smartness depends on how well the system can interact with the person using it.
Interacting with Computers, 2006
Emotions powerfully influence our physiology, behavior, and experience. A comprehensive assessment of affective states in health and disease would include responses from each of these domains in real life. Since no single physiologic parameter can index emotional states unambiguously, a broad assessment of physiologic responses is desirable. We present a recently developed system, the LifeShirt, which allows reliable ambulatory monitoring of a wide variety of cardiovascular, respiratory, metabolic, motor-behavioral, and experiential responses. The system consists of a garment with embedded inductive plethysmography and other sensors for physiologic data recording and a handheld computer for input of experiential data via touch screen. Parameters are extracted offline using sophisticated analysis and display software. The device is currently used in clinical studies and to monitor effects of physical and emotional stress in naturalistic settings. Further development of signal processing and pattern recognition algorithms will enhance computerized identification of type and extent of physical and emotional activation.
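The LifeShirt's analysis software is proprietary, so the snippet below is only a generic sketch of the kind of offline parameter extraction described, estimating breathing rate from a respiration-band waveform with SciPy; the sampling rate and peak-detection settings are assumptions.

```python
# Minimal sketch, assuming a respiration waveform from an inductive
# plethysmography band; illustrates offline extraction of one parameter
# (breathing rate), not the LifeShirt's actual algorithms.
import numpy as np
from scipy.signal import find_peaks

FS = 25  # sampling rate in Hz (assumption)

def breathing_rate_bpm(respiration: np.ndarray, fs: int = FS) -> float:
    # Each detected peak counts as one inspiration; 'distance' enforces a
    # plausible minimum breath duration of ~1.5 s.
    peaks, _ = find_peaks(respiration, distance=int(1.5 * fs))
    duration_min = len(respiration) / fs / 60.0
    return len(peaks) / duration_min

t = np.arange(0, 60, 1 / FS)                                             # one minute of data
signal = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.randn(t.size)   # ~15 breaths/min
print(f"{breathing_rate_bpm(signal):.1f} breaths per minute")
```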
A user-independent real-time emotion recognition system for software agents in domestic environments
Engineering Applications of Artificial Intelligence, 2007
The mystery surrounding emotions, how they work and how they affect our lives, has not yet been unravelled. Scientists still debate the real nature of emotions; evolutionary, physiological and cognitive accounts are just a few of the different approaches used to explain affective states. Regardless of the various emotional paradigms, neurologists have made progress in demonstrating that emotion is as important as, or more important than, reason in the process of making decisions and deciding actions. The significance of these findings should not be overlooked in a world that is increasingly reliant on computers to accommodate user needs. In this paper, a novel approach for recognizing and classifying positive and negative emotional changes in real time using physiological signals is presented. Based on sequential analysis and autoassociative networks, the emotion detection system outlined here is potentially capable of operating on any individual regardless of their physical state and emotional intensity without requiring an arduous adaptation or pre-analysis phase. Results from applying this methodology to real-time data collected from a single subject demonstrated a recognition level of 71.4%, which is comparable to the best results achieved by others through off-line analysis. It is suggested that the detection mechanism outlined in this paper has all the characteristics needed to perform emotion recognition in pervasive computing.
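The paper's exact sequential-analysis procedure is not reproduced here; the following is only a hedged sketch of the general autoassociative-network idea it builds on, in which an auto-encoder trained on baseline physiological features flags an emotional change when the reconstruction error of new samples rises. The architecture, feature count and threshold are illustrative assumptions.

```python
# Sketch of change detection with an autoassociative (auto-encoder) network:
# train to reconstruct baseline feature vectors, then flag samples whose
# reconstruction error exceeds a threshold. Not the paper's exact method.
import torch
import torch.nn as nn

N_FEATURES = 6  # e.g. heart-rate, skin-conductance, respiration features (assumed)

autoencoder = nn.Sequential(
    nn.Linear(N_FEATURES, 3), nn.Tanh(),   # compress
    nn.Linear(3, N_FEATURES),              # reconstruct
)
optimizer = torch.optim.Adam(autoencoder.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

baseline = torch.randn(200, N_FEATURES)    # synthetic neutral-state feature vectors
for _ in range(200):
    optimizer.zero_grad()
    loss = loss_fn(autoencoder(baseline), baseline)
    loss.backward()
    optimizer.step()

def is_emotional_change(sample: torch.Tensor, threshold: float = 1.5) -> bool:
    with torch.no_grad():
        error = loss_fn(autoencoder(sample), sample).item()
    return error > threshold

print(is_emotional_change(torch.randn(1, N_FEATURES) + 3.0))  # shifted sample
```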
Current Psychiatry Reports, 2014
With the rapid and ubiquitous acceptance of new technologies, algorithms will be used to estimate new measures of mental state and behavior based on digital data. The algorithms will analyze data collected from sensors in smartphones and wearable technology, and data collected from Internet and smartphone usage and activities. In the future, new medical measures that assist with the screening, diagnosis, and monitoring of psychiatric disorders will be available despite unresolved reliability, usability, and privacy issues. At the same time, similar non-medical commercial measures of mental state are being developed primarily for targeted advertising. There are societal and ethical implications related to the use of these measures of mental state and behavior for both medical and non-medical purposes.
Emotion assessment for affective computing based on physiological responses
2012 IEEE International Conference on Fuzzy Systems, 2012
In order to obtain such an "emotional machine" it is necessary to include emotions in the human-machine communication loop (see Figure 1.1). This is what is defined as "affective computing". According to Picard [2], affective computing "proposes to give computers the ability to recognize [and] express […] emotions". Synthetic expression of emotions can be achieved by enabling avatars or simpler agents to have facial expressions, different tones of voice, and empathic behaviors [3, 4]. Detection of human emotions can be realized by monitoring and interpreting the different cues that are given in both verbal and non-verbal communication. However, a system that is "emotionally intelligent" should not only detect and express emotions but should also take the proper action in response to the detected emotion. Expressing the adequate emotion is thus one of the outputs of this decision making. The proper action to take depends on the application; examples of reactions could be to provide help when the user feels helpless, or to lower task demand when he or she is highly stressed. Many applications are detailed in chapter 1.2.2. Accurately assessing emotions is thus a critical step toward affective computing, since it determines the reaction of the system. The present work focuses on this first step of affective computing by trying to reliably assess emotion from several emotional cues.
A modality is defined as a path used to carry information for the purpose of interaction. In HMI, there are two possible approaches to define a modality: from the machine point of view and from the user point of view. On the machine side, a modality refers to the processes that generate information to the physical world and interpret information from it. It is thus possible to distinguish between input and output modalities and to associate them with the corresponding communication devices. A keyboard, a mouse and a pad with the associated information processes are typical input modalities, while text, images and music presented on screens and speakers are typical output modalities.
Games are also interesting from an HMI point of view because they are an ideal ground for the design of new ways to communicate with the machine. One of the main goals of games is to provide emotional experiences such as fun and excitement, which generally occur when the player is strongly involved in the course of the game action. However, a loss of involvement can occur if the game does not correspond to the player's expectations and competences. Measuring various physiological signals requires the use of several sensors that are sometimes quite obtrusive, since they can monopolize the use of one hand and are not comfortable. The price of those sensors should also be taken into account. For these reasons, the issue of determining the most useful sensors is of importance. Finally, there is also variability in physiological signals, from person to person but also from day to day, which makes it difficult to design a system that works reliably across users and days.
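As a toy illustration of the assess-decide-react loop described above, the following sketch wires the three stages together; the emotion representation, decision rules and reactions are placeholders rather than the thesis' actual components.

```python
# Toy sketch of the affective-computing loop: assess the user's emotion from
# available cues, decide on a reaction, and act. All labels and rules are
# illustrative placeholders.
from dataclasses import dataclass

@dataclass
class EmotionalState:
    valence: float   # negative .. positive, in [-1, 1]
    arousal: float   # calm .. excited, in [0, 1]

def assess_emotion(cues: dict) -> EmotionalState:
    # A real system would fuse physiological and behavioral cues here.
    return EmotionalState(valence=cues.get("valence", 0.0),
                          arousal=cues.get("arousal", 0.5))

def decide_reaction(state: EmotionalState) -> str:
    if state.arousal > 0.8 and state.valence < 0.0:
        return "lower task demand"   # user appears highly stressed
    if state.valence < -0.5:
        return "offer help"          # user appears helpless or frustrated
    return "continue normally"

print(decide_reaction(assess_emotion({"valence": -0.3, "arousal": 0.9})))
```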
Artificial Neural Networks to Assess Emotional States from Brain-Computer Interface
Electronics, 2018
Estimation of human emotions plays an important role in the development of modern brain-computer interface devices like the Emotiv EPOC+ headset. In this paper, we present an experiment to assess the classification accuracy of the emotional states provided by the headset’s application programming interface (API). In this experiment, several sets of images selected from the International Affective Picture System (IAPS) dataset are shown to sixteen participants wearing the headset. First, the participants’ responses to the elicited emotions, given in the form of a Self-Assessment Manikin questionnaire, are compared with the validated IAPS predefined valence, arousal and dominance values. After statistically demonstrating that the responses are highly correlated with the IAPS values, several artificial neural networks (ANNs) based on the multilayer perceptron architecture are tested to calculate the classification accuracy of the Emotiv EPOC+ API emotional outcomes. The best result is obtained f...
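To make the evaluation pipeline concrete, here is a hedged sketch of a multilayer-perceptron classifier of the sort the paper tests, operating on valence, arousal and dominance features; the synthetic data, labels and layer sizes are assumptions standing in for the real Emotiv EPOC+ / IAPS recordings.

```python
# Sketch only: an MLP classifier on (valence, arousal, dominance) features with
# synthetic data; layer sizes and labels are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((160, 3))              # valence, arousal, dominance per trial (synthetic)
y = (X[:, 0] > 0.5).astype(int)       # toy labels: high vs. low valence

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
mlp = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)
print(f"classification accuracy: {mlp.score(X_test, y_test):.2f}")
```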