Head movement and facial expressions as game input
Related papers
2017 23rd International Conference on Virtual System & Multimedia (VSMM), 2017
With the recent development of Head Mounted Displays (HMDs) for Virtual Reality (VR) that can track and recognize a user's Facial Expressions (FEs) in real time, we investigated the impact that using FEs as an action-trigger input mechanism (e.g., an FE mapped to a single action) has on the user's emotional state, as well as its workload and usability compared with a controller button. We developed an Augmented Reality (AR)-based memory card game in which users select virtual cards with a wand and flip them using either an FE (smiling; frowning) or an Xbox controller button. The users were split into three groups: (1) flipping the card with a smile (n = 10); (2) flipping the card with a frown (n = 8); and (3) flipping the card with the Xbox controller button (n = 11). We found no significant differences between the groups in (i) the participants' positive and negative affect or (ii) the reported workload and usability, suggesting that FEs could be used inside an HMD in the same way as a controller button.
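The action-trigger idea in this abstract — one facial expression standing in for one button — can be sketched as a small debouncing state machine. This is an illustrative sketch, not the paper's implementation; the class, its parameters, and the `hold_frames` threshold are all hypothetical names introduced here.

```python
# Hypothetical sketch: a recognized facial expression, held for a few
# consecutive frames, fires a single game action (e.g. flipping a card),
# interchangeable with a controller button press. All names illustrative.

class ExpressionTrigger:
    def __init__(self, expression, action, hold_frames=5):
        self.expression = expression    # e.g. "smile" or "frown"
        self.action = action            # callback to run, e.g. flip_card
        self.hold_frames = hold_frames  # frames needed to confirm the FE
        self._count = 0
        self._fired = False

    def update(self, detected_expression):
        """Call once per frame with the classifier's current label."""
        if detected_expression == self.expression:
            self._count += 1
            if self._count >= self.hold_frames and not self._fired:
                self._fired = True      # fire exactly once per hold
                self.action()
        else:
            self._count = 0
            self._fired = False         # re-arm once the FE is released


flips = []
trigger = ExpressionTrigger("smile", lambda: flips.append("flip"))
for label in ["neutral"] * 3 + ["smile"] * 8 + ["neutral", "smile"]:
    trigger.update(label)
print(flips)  # one "flip": a sustained smile triggers the action once
```

The hold-and-re-arm logic matters because an expression persists across many camera frames, unlike a momentary button press; without it, a single smile would flip every card in reach.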
A feasibility study in using facial expressions analysis to evaluate player experiences
Proceedings of The 8th Australasian Conference on Interactive Entertainment Playing the System - IE '12, 2012
Current quantitative methods of measuring player experience in games are mostly intrusive to play and poorly suited to natural, non-laboratory play environments. This paper presents an initial study to validate the feasibility of using facial expression analysis for evaluating player experiences. It builds on a prior position that video-based computer vision techniques can provide a less intrusive and more versatile solution for automatic evaluation of game user experiences. A user study was performed with an initial group of participants in a first-person puzzle shooter (Portal 2) and a social drawing trivia game (Draw My Thing), and the results support our position.
Use of eye movements for video game control
Proceedings of the 2006 ACM SIGCHI international conference on Advances in computer entertainment technology - ACE '06, 2006
We present a study that explores the use of a commercially available eye tracker as a control device for video games. We examine its use across multiple gaming genres and present games that utilize the eye tracker in a variety of ways. First, we describe a first-person shooter that uses the eyes to control orientation. Second, we study the use of eye movements for more natural interaction with characters in a role-playing game. Lastly, we examine the use of eye tracking as a means to control a modified version of the classic action/arcade game Missile Command. Our results indicate that the use of an eye tracker can increase the immersion of a video game and can significantly alter the gameplay experience.
The Impact of Emotions on the Speed and Score of Computer Games
Pure science and Technology Applications (SCUG-PSTA-2022)
This research examined how a video game's input method affects player performance. The Unity tool was used to create an interactive game with several ball objects, each awarding additional points to the participant. Different methods were used to strike the balls: the mouse, the keyboard, a combination of both, and facial emotion. Image-processing techniques were used to match the player's facial expression with the emoji on the balls they hit. The game's performance was affected by facial expressions. The study demonstrates how widely used and highly engaging face-emotion techniques are in video games. Visual observation of traditional input control versus emotion input was examined, and the differences are demonstrated. Our results show that facial emotions enrich video-game technology; in our game, we tested three facial emotions: sad, happy, and angry.
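The matching mechanic this abstract describes — a hit counting only when the player's detected emotion agrees with the emoji shown on the ball — reduces to a simple comparison in the scoring loop. The function name, emoji mapping, and point value below are illustrative assumptions, not the paper's code.

```python
# Illustrative sketch of emotion/emoji match scoring: a struck ball awards
# points only when the classifier's emotion label matches the ball's emoji.
# Mapping and point values are hypothetical.

BALL_EMOJI = {"happy": "🙂", "sad": "🙁", "angry": "😠"}

def score_hit(detected_emotion, ball_emotion, points=10):
    """Award points only on an emotion match; otherwise score nothing."""
    return points if detected_emotion == ball_emotion else 0

total = 0
hits = [("happy", "happy"), ("sad", "angry"), ("angry", "angry")]
for detected, ball in hits:
    total += score_hit(detected, ball)
print(total)  # 20: two of the three hits matched the ball's emotion
```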
Facial expressions as game input with different emotional feedback conditions
Proceedings of the 2008 International Conference in Advances on Computer Entertainment Technology - ACE '08, 2008
We propose a game design approach that utilizes facial expressions as an input method under different emotional feedback configurations. A study was conducted in a shopping centre to assess our game "EmoFlowers", focusing on user experience and user effectiveness. The study revealed that interaction with a game via facial expression is perceived as natural, is easy to learn, and provides a positive user experience.
Advances in Intelligent Systems and Computing, 2018
Nowadays, face-reading technology could be a good solution for evaluating digital games or for integration into a user-centred design process. The big advantages of this technology are that it provides direct measures in a non-intrusive way, does not require a special computer, and uses an ordinary camera. In the literature, most studies use this technology in the context of neuromarketing; few use it in the context of gaming. In this context, this paper presents an exploratory study to analyse the potential of a face-reading tool to measure emotional reactions while the user plays a digital game. The aspects evaluated were the sensitivity of the tool at different moments of the game and the identification of usability problems. Our results show that the tool was effective in evaluating these aspects, allowing us to identify the moments of the game where the emotional response did or did not need to be intensified and to plan further studies to address the usability problems.
2016
In many affective computing paradigms a user's internal state is used as an implicit control signal in an interaction. In the work presented here, we explore the use of two measurement techniques commonly used to assess a user's affective state as an explicit control signal in a navigation task in a virtual environment. Concretely, we investigate the feasibility of combining a real-time emotional biometric sensing system and a computer vision system for human emotional characterization and for controlling a computer game. A user's "happiness" and "sadness" levels are assessed by combining information from a camera-based computer vision system and electromyogram (EMG) signals from the facial corrugator muscle. Using a purpose-designed 3D flight simulation game, users control their simulated up-down motions with their facial expressions. To assess whether combining visual and EMG data improves facial tracking performance, we conduct a user study where users are navigating t...
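The sensor combination this abstract describes — a camera-based happiness estimate plus corrugator EMG — can be sketched as a simple weighted late fusion into one up/down control value. The function, weights, and signal ranges below are assumptions made for illustration, not the study's actual fusion method.

```python
# Hedged sketch of vision+EMG late fusion for flight control. The vision
# system reports valence in [-1, 1] (+1 = strong smile); the corrugator
# EMG reports frown activation in [0, 1]. Weights are illustrative.

def fuse_affect(vision_happiness, emg_corrugator, w_vision=0.6, w_emg=0.4):
    """Return a control value in [-1, 1]: positive climbs, negative dives."""
    # Corrugator activity indicates frowning, so it pulls the value down.
    value = w_vision * vision_happiness - w_emg * emg_corrugator
    return max(-1.0, min(1.0, value))

print(fuse_affect(0.8, 0.1))   # smiling, little frowning -> climb (positive)
print(fuse_affect(-0.5, 0.9))  # frowning in both channels -> dive (negative)
```

Combining the two channels this way lets a noisy drop-out in one sensor (e.g. the camera losing the face) be partially compensated by the other, which is the motivation the abstract gives for fusing them.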
A comparison between tilt-input and facial tracking as input methods for mobile games
2014 IEEE Games Media Entertainment, 2014
A user study was performed to compare two non-touch input methods for mobile gaming: tilt-input and facial tracking. User performance was measured on a mobile game called StarJelly installed on a Google Nexus 7 HD tablet. The tilt-input method yielded significantly better performance. The mean game score attained using tilt-input was 665.8, 7× higher than the mean of 95.1 for facial tracking. Additionally, participants were more precise with tilt-input, with a mean star count of 19.7 compared to a mean of 1.9 using facial tracking. Although tilt-input was superior, participants praised facial tracking as challenging and innovative.