Next generation of tools for industry to evaluate the user emotional perception: the biometric-based multimethod tools
Related papers
Assessing User Experience via Biometric Sensor Affect Detection
End-User Considerations in Educational Technology Design, 2018
Traditional user experience assessments rely on self-report, human-system performance, and observational data that incompletely capture users' psychological demands, processing, or affect. Specifically, self-report measures require users to identify and articulate subjective responses to product features, yet users may not possess accurate awareness or may be unwilling or unable to express themselves. Similarly, human-system performance and observational measures require analysts to make inferences about hidden psychological states based on observed external patterns. This chapter discusses how biometric sensor-based affect detection technologies (e.g., eye tracking and EEG) may supplement traditional methods. By measuring biometric indicators of psychological states, researchers can gain potentially richer and more accurate insights into user experience. These technologies are gaining traction in educational technology development and functionality, and thus the extension of th...
This paper presents the results of the usability testing of an experimental component of the Virtual Collaboration Arena (VirCA) developed by the Cognitive Informatics Group of the Computer and Automation Research Institute of the Hungarian Academy of Sciences. This component is a semi-intelligent agent called the Emotional Display Object. We applied a Wizard-of-Oz-type high-fidelity early prototype evaluation technique to test the concept. The research focused on basic usability problems and, in general, the perceptibility of the object as uncovered by eye-tracking and interview data; we analyzed and interpreted the results in correlation with the individual differences identified by a demographic questionnaire and psychological tests: the Myers-Briggs Type Indicator (MBTI), the Spatial-Visual Ability Paper Folding Test, and the Reading the Mind in the Eyes Test (RMET). However, the main goal of this paper reaches beyond the particular issues found and the development of an agent: it presents a case study on how complex concepts in Virtual Reality (VR) can be tested at a very early stage of development.
Biometric emotion assessment and feedback in an immersive digital environment
International Journal of Social …, 2009
Affective computing has increased in significance both in terms of academic and industry attention and investment. In parallel, immersive digital environments have become established as a reliable domain, with progressively inexpensive hardware solutions. With this in mind, the authors envisioned automatic real-time extraction of user emotion through biometric readings in an immersive digital environment. In the running example, the environment consisted of an aeronautical simulation, and the biometric readings were based mainly on galvanic skin response, respiration rate and amplitude, and phalanx temperature. The assessed emotional states were also used to modify some simulation context variables, such as flight path, weather conditions and maneuver smoothness level. The results were consistent with the emotional states stated by the users, achieving a success rate of 77% for single emotions and 86% for a quadrant-based analysis.
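The quadrant-based analysis mentioned above groups emotional states by the sign of their valence and arousal scores. The paper does not publish its classification code; the following is a minimal sketch, assuming hypothetical valence/arousal scores already inferred from biometric readings and invented quadrant labels, of how single-emotion labels can be collapsed into quadrants for the kind of agreement check the authors report.

```python
# Minimal sketch of a quadrant-based valence/arousal analysis.
# Scores, quadrant labels, and example data are hypothetical, not from the paper.

def to_quadrant(valence: float, arousal: float) -> str:
    """Map signed valence/arousal scores (-1..1) to one of four quadrants."""
    if arousal >= 0:
        return "excited/happy" if valence >= 0 else "angry/stressed"
    return "calm/relaxed" if valence >= 0 else "sad/bored"

def agreement_rate(predicted, self_reported) -> float:
    """Fraction of samples where predicted and self-reported labels match."""
    matches = sum(p == s for p, s in zip(predicted, self_reported))
    return matches / len(predicted)

if __name__ == "__main__":
    # Toy example: (valence, arousal) pairs inferred from biometric readings
    # versus the quadrant of the emotion each user reported.
    inferred = [(0.6, 0.8), (-0.4, 0.7), (0.2, -0.5), (-0.7, -0.3)]
    reported = ["excited/happy", "angry/stressed", "calm/relaxed", "sad/bored"]
    predicted = [to_quadrant(v, a) for v, a in inferred]
    print(f"quadrant agreement: {agreement_rate(predicted, reported):.0%}")
```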
A Perspective to Understand Emotional Design - Extending of Design Methods with Inherent Knowledge
2019
Most products contain inherent images as well as intended images. While the influence of intended shapes on product evaluation has been well studied, the influence of inherent images remains unexplored. Thus, this study investigated the influence of inherent images on products. Twenty subjects took part in two consecutive experimental sessions: 1) evaluation item screening, and 2) affective evaluation. In the first session, the subjects made decisions within a limited time regarding their preference for the stimuli. In the second session, the subjects evaluated the stimuli, selected according to their preferences, with the SAM (Self-Assessment Manikin), an efficient measure of emotional responses along three dimensions: valence, arousal, and dominance. In the study, car front images were used as stimuli because of their similar disposition to human facial expressions. Results showed that: 1) products contain inherent images that trigger affective differences; 2) arousal was the most a...
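SAM collects discrete ratings on the three dimensions named above, typically on 9-point pictorial scales. The sketch below is purely illustrative (the scale length, stimulus names, and aggregation are assumptions, not details from the study) and shows how per-stimulus SAM responses could be aggregated to compare affective differences between stimuli.

```python
# Illustrative aggregation of SAM (Self-Assessment Manikin) ratings.
# The 9-point scale, stimulus names, and example data are assumptions.
from statistics import mean

# ratings[stimulus] = list of (valence, arousal, dominance) tuples on a 1..9 scale
ratings = {
    "car_front_A": [(7, 6, 5), (8, 7, 6), (6, 5, 5)],
    "car_front_B": [(4, 3, 4), (5, 4, 5), (3, 3, 4)],
}

for stimulus, responses in ratings.items():
    # zip(*responses) regroups the tuples into one sequence per dimension.
    valence, arousal, dominance = (mean(dim) for dim in zip(*responses))
    print(f"{stimulus}: valence={valence:.1f} arousal={arousal:.1f} "
          f"dominance={dominance:.1f}")
```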
Face to interface: facial affect in (hu)man and machine
Proceedings of the …, 2000
Facial expression of emotion (or "facial affect") is rapidly becoming an area of intense interest in the computer science and interaction design communities. Ironically, this interest comes at a time when the classic findings on perception of human facial affect are being challenged in the psychological research literature, largely on methodological grounds. This paper presents two studies on perception of facial affect. Experiment 1 provides new data on the recognition of human facial expressions, using experimental methods and analyses designed to systematically address the criticisms and help resolve this controversy. Experiment 2 is a user study on affect in a prototype robot face; the results are compared to the human data of Experiment 1. Together they provide a demonstration of how basic and more applied research can mutually contribute to this rapidly developing field.
Assessing emotion in human–product interaction: an overview of available methods and a new approach
This paper reviews some of the tools available for the measurement of emotion in product design. These tools are organized in a general framework based on current theories of emotion. It distinguishes five components of emotion: feelings, expression, behavior, physiological activation and appraisal. Several problems specific to the measurement of user experience during interaction with products are also described and a coherent approach to address them is sketched. This approach includes several components: self-confrontation (video-supported self-report) to collect data about users' feelings; the use of tangible interaction principles and approach/avoidance tendencies to map the quality of the experience onto the device used to report it; and psychophysiological and facial expression recording to provide continuous measures of emotional arousal and valence. Finally, questioning users about appraisal processes underlying their affective responses is recommended as a useful inspirational tool at the beginning of the design process.
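The five-component framework lends itself to a simple data structure in which each measurement method is tagged with the emotion component it targets. The sketch below is not from the paper; the component names and methods follow the abstract, but the record layout and the continuous/discrete flag are illustrative assumptions.

```python
# Sketch of organizing multimethod emotion measures by component.
# Components and methods follow the abstract; the record layout is an assumption.
from dataclasses import dataclass

@dataclass
class Measure:
    component: str    # feelings, expression, behavior,
                      # physiological activation, or appraisal
    method: str       # how the component is measured
    continuous: bool  # True if it yields a continuous signal over time

protocol = [
    Measure("feelings", "self-confrontation (video-supported self-report)", False),
    Measure("expression", "facial expression recording", True),
    Measure("behavior", "approach/avoidance tendencies on the report device", True),
    Measure("physiological activation", "psychophysiological recording", True),
    Measure("appraisal", "post-hoc questioning about appraisal processes", False),
]

continuous = [m.method for m in protocol if m.continuous]
print("continuous measures:", ", ".join(continuous))
```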
The influence of facial interface design on dynamic emotional recognition
Journal on Multimodal User Interfaces, 2012
The use of facial interfaces in distant communications highlights the relevance of emotional recognition. However, research on emotional facial expression (EFE) recognition is mainly based on static and posed stimuli, and its results are not readily transferable to daily interactions. The purpose of the present study is to compare emotional recognition of authentic EFEs with 11 different interface designs. A widget allowing participants both to recognize an emotion and to assess it on-line was used. Divided-face and compound-face interfaces are compared with a common full frontal interface. Analytic and descriptive on-line results reveal that some interfaces facilitate emotional recognition whereas others would decrease it. This study suggests that relevant interfaces could improve emotional recognition and thus facilitate distant communications.
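At its simplest, comparing emotional recognition across interface designs reduces to a per-condition recognition rate. The sketch below is a hypothetical illustration of that comparison (the interface labels and trial data are invented), not the analysis reported in the paper.

```python
# Hypothetical per-interface recognition-rate comparison.
# Interface labels and trial data are invented for illustration.
from collections import defaultdict

# Each trial: (interface, expressed_emotion, recognized_emotion)
trials = [
    ("full_frontal", "joy", "joy"),
    ("full_frontal", "anger", "anger"),
    ("divided_face", "joy", "surprise"),
    ("divided_face", "anger", "anger"),
    ("compound_face", "joy", "joy"),
    ("compound_face", "anger", "fear"),
]

hits = defaultdict(int)
totals = defaultdict(int)
for interface, expressed, recognized in trials:
    totals[interface] += 1
    hits[interface] += expressed == recognized  # bool counts as 0 or 1

for interface in totals:
    print(f"{interface}: {hits[interface] / totals[interface]:.0%} recognized")
```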
Multi-modal emotional processing for SCOUT: Beyond the HCI Psychometrics methods
2010
Human-Computer Interaction (HCI) research groups have recently been drawn to the issue of emotion, or affect, especially in the examination of interaction and design. With recent technological advances, human users are able to interact with computers in ways that were previously almost impossible. New modalities for computer interaction with human emotion, such as skin conductivity, heart rate, brain signals and other physiological signals, are emerging. This shows that emotion plays an important role in human communication and interaction; these modalities therefore allow people to express emotion beyond the verbal domain.
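The modalities listed above (skin conductivity, heart rate, and other physiological signals) are typically reduced to simple features before being fed to any emotion model. A minimal sketch follows, assuming raw samples are already available as Python lists; the feature names, units, and example data are assumptions for illustration only.

```python
# Minimal feature extraction from two of the modalities mentioned:
# skin conductance and heart rate. Units and example data are assumptions.
from statistics import mean, stdev

def skin_conductance_features(samples_microsiemens):
    """Mean level and variability of a skin-conductance segment."""
    return {"scl_mean": mean(samples_microsiemens),
            "scl_std": stdev(samples_microsiemens)}

def heart_rate_bpm(rr_intervals_ms):
    """Average heart rate (beats per minute) from RR intervals in milliseconds."""
    return 60_000 / mean(rr_intervals_ms)

if __name__ == "__main__":
    sc = [2.1, 2.3, 2.2, 2.8, 3.0, 2.9]   # microsiemens
    rr = [820, 790, 805, 770, 760, 785]   # milliseconds between heartbeats
    print(skin_conductance_features(sc))
    print(f"heart rate: {heart_rate_bpm(rr):.1f} bpm")
```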
Assessing emotion in interaction: Some problems and a new approach
2009
This paper reviews some of the tools available for the measurement of emotion in product design. These tools are organized using the five components of emotion: feelings, expression, behavior, physiological activation and appraisal. Several problems specific to the measurement of user experience during interaction with products are also described and a coherent approach to address them is sketched. This approach ...
Proceedings of the 21st British HCI …, 2007
Emotional appeal is a key dimension in user experience that often goes unmeasured in most user-centered design projects. This paper presents preliminary work for developing a set of guidelines for efficiently, easily and cost-effectively assessing the users' affective state by evaluating their expressive reactions during an interface evaluation process. The evaluation of this dimension complements the analysis of the objective and quantitative data gathered through usability tests and the subjective feedback provided through post-test questionnaires.