A Platform for Gaze-Contingent Virtual Environments

Eye gaze in virtual environments: evaluating the need and initial work on implementation

2009

For efficient collaboration between participants, eye gaze is seen as critical for interaction. Video conferencing either does not attempt to support eye gaze (e.g. AccessGrid) or only approximates it in round-table conditions (e.g. life-size telepresence). Immersive collaborative virtual environments represent remote participants through avatars that follow their tracked movements. By additionally tracking people's eyes and representing their movement on their avatars, the line of gaze can be faithfully reproduced rather than approximated. This paper presents the results of initial work testing whether the focus of gaze could be gauged more accurately when tracked eye movement was added to the head movement of an avatar observed in an immersive VE. An experiment was conducted to assess the difference between users' abilities to judge which objects an avatar is looking at when only head movements are displayed, with the eyes remaining static, and when both eye gaze and head movement are displayed. The results show that eye gaze is of vital importance to subjects correctly identifying what a person is looking at in an immersive virtual environment. This is followed by a description of the work now being undertaken following the positive results of the experiment. We discuss the integration of an eye tracker more suitable for immersive mobile use, and the software and techniques developed to turn the user's real-world eye movements into calibrated eye gaze in an immersive virtual world. This will be used to create an immersive collaborative virtual environment supporting eye gaze and in its ongoing experiments.
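The abstract does not spell out how tracked eye movements are combined with the avatar's head pose; as a rough sketch of the general technique, the Python snippet below composes a tracked head pose with a calibrated eye-in-head direction to obtain a world-space gaze ray that could drive an avatar's eyes. The coordinate conventions, function names, and angle parameterisation are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def eye_in_head_direction(yaw, pitch):
    """Unit gaze direction in head coordinates from calibrated
    eye-tracker angles in radians. Assumed convention: +x right,
    +y up, -z straight ahead; yaw positive to the left, pitch up."""
    return np.array([
        -np.sin(yaw) * np.cos(pitch),
        np.sin(pitch),
        -np.cos(yaw) * np.cos(pitch),
    ])

def world_gaze_ray(head_pos, head_rot, yaw, pitch):
    """Compose the tracked head pose (position and 3x3 rotation
    matrix) with the eye-in-head direction to get the world-space
    gaze ray used to orient the avatar's eyes."""
    direction = head_rot @ eye_in_head_direction(yaw, pitch)
    return np.asarray(head_pos), direction

# Example: head yawed 90 degrees to the left, eyes pitched 10 degrees up.
R_left = np.array([[0.0, 0.0, 1.0],
                   [0.0, 1.0, 0.0],
                   [-1.0, 0.0, 0.0]])
origin, direction = world_gaze_ray([0.0, 1.7, 0.0], R_left,
                                   yaw=0.0, pitch=np.radians(10.0))
```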

Gaze-Based Interaction for Virtual Environments

Journal of Universal Computer Science, 2008

We present an alternative interface that allows users to perceive new sensations in virtual environments. Gaze-based interaction in virtual environments creates the feeling of controlling objects with the mind, arguably translating into a more intense immersion sensation. ...

LAIF: A Logging and Interaction Framework for Gaze-Based Interfaces In Virtual Entertainment Environments

Entertainment …, 2010

Eye tracking is a fascinating technology that is starting to be used both for evaluating and for interacting in virtual environments. Digital games in particular can benefit from an integrated (i.e., evaluation and interaction) approach that harnesses eye-tracking technology for analysis and interaction. Such benefits include faster development of innovative games that can be evaluated automatically in an iterative fashion. For this purpose, we present a framework that enables rapid game development and gameplay analysis within an experimental research environment. The framework presented here is extensible to different kinds of logging (e.g., psychophysiological and in-game behavioral data) and facilitates studies using eye-tracking technology in digital entertainment environments. An experimental study using gaze-only interaction in a digital game is also presented and highlights the framework's capacity to create and evaluate novel entertainment interfaces.
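LAIF's actual API is not given in the abstract; as a minimal sketch of the kind of extensible, timestamped logging core such a framework needs, one might separate data sources into channels that are aligned afterwards by timestamp. All class and method names below are hypothetical.

```python
import csv
import time

class LogChannel:
    """One extensible logging channel, e.g. gaze samples, in-game
    events, or psychophysiological data."""
    def __init__(self, fields):
        self.fields = fields
        self.rows = []

    def record(self, **values):
        # Timestamp every sample so channels can be aligned later.
        self.rows.append({"t": time.monotonic(), **values})

    def dump(self, path):
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["t"] + self.fields)
            writer.writeheader()
            writer.writerows(self.rows)

# Usage: separate channels for gaze and gameplay, merged by timestamp.
gaze = LogChannel(["x", "y"])
events = LogChannel(["kind", "target"])
gaze.record(x=0.42, y=0.58)
events.record(kind="selection", target="enemy_3")
gaze.dump("gaze.csv")
events.dump("events.csv")
```

Keeping each data source in its own channel is what makes such a scheme extensible: adding, say, heart-rate logging becomes a new channel rather than a schema change.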

Designing gaze-based user interfaces for steering in virtual environments

2012

Since eye gaze may serve as an efficient and natural input for steering in virtual 3D scenes, we investigate the design of eye-gaze steering user interfaces (UIs) in this paper. We discuss design considerations and propose design alternatives based on two selected steering approaches that differ in input condition (discrete vs. continuous) and velocity selection (constant vs. gradient-based). The proposed UIs were iteratively refined through two user studies with twelve participants each. In particular, the combination of continuous and gradient-based input shows high potential, because it allows the moving speed and direction to change gradually depending on the user's point-of-regard. This has the advantage of reducing overshooting problems and dwell-time activations. We also investigate discrete constant input, in which virtual buttons are toggled by gaze dwelling. As an alternative, we propose the Sticky Gaze Pointer as a more flexible form of discrete input.
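The abstract's "continuous and gradient-based" combination suggests a steering velocity that grows with the gaze point's distance from a neutral region; the sketch below is one plausible reading of that idea, with a central dead zone to avoid drift. Parameter names and thresholds are illustrative, not taken from the paper.

```python
import numpy as np

def gradient_steering_velocity(gaze, center=(0.5, 0.5),
                               dead_zone=0.1, max_speed=2.0):
    """Continuous, gradient-based steering: the farther the
    point-of-regard lies from the screen centre, the faster the
    viewpoint moves in that direction. Coordinates are normalized
    screen positions in [0, 1]^2; speed is in scene units/s."""
    offset = np.asarray(gaze) - np.asarray(center)
    distance = np.linalg.norm(offset)
    if distance < dead_zone:
        # Looking near the centre keeps the viewpoint still, which
        # avoids both drift and unwanted dwell-time activations.
        return np.zeros(2)
    # Speed ramps smoothly from 0 at the dead-zone edge to max_speed
    # at the screen border (distance 0.5 from the centre).
    speed = max_speed * (distance - dead_zone) / (0.5 - dead_zone)
    return (offset / distance) * min(speed, max_speed)

# Gazing above and right of centre yields a slow diagonal motion.
v = gradient_steering_velocity(gaze=(0.65, 0.35))
```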

Investigations of the Role of Gaze in Mixed-Reality Personal Computing

2011

This short paper constitutes our first investigation of how eye tracking and gaze estimation can help create better mixed-reality personal computing systems involving both physical (real-world) and virtual (digital) objects. The role of gaze is discussed in light of the situative space model (SSM), which determines the set of objects a given human agent can perceive, and act on, at any given moment in time.
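The SSM is only named here; as a loose, hypothetical illustration of what it means for a model to "determine the set of objects an agent can perceive and act on", one could filter a scene by view cone and reach, as below (the actual model distinguishes more spaces than these two).

```python
import numpy as np

def situative_sets(agent_pos, agent_forward, objects,
                   fov_deg=100.0, reach=1.0):
    """Rough stand-in for an SSM query: objects inside the view cone
    count as perceivable; those also within arm's reach count as
    actionable. Thresholds are illustrative, not from the paper."""
    perceivable, actionable = [], []
    fwd = np.asarray(agent_forward) / np.linalg.norm(agent_forward)
    for name, pos in objects.items():
        to_obj = np.asarray(pos) - np.asarray(agent_pos)
        dist = np.linalg.norm(to_obj)
        cos_a = np.clip(np.dot(fwd, to_obj / dist), -1.0, 1.0)
        if np.degrees(np.arccos(cos_a)) <= fov_deg / 2:
            perceivable.append(name)
            if dist <= reach:
                actionable.append(name)
    return perceivable, actionable
```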

Combined Head-Eye Tracking for Immersive Virtual Reality

Real-time gaze tracking is a promising interaction technique for virtual environments. Immersive projection-based virtual reality systems such as the CAVE allow users a wide range of natural movements. Unfortunately, most head and eye movement measurement techniques are of limited use during free head and body motion. An improved head-eye tracking system is proposed and developed for use in immersive applications with free head motion. The system is based upon a head-mounted video-based eye tracking system and a ...
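The abstract cuts off before describing the rest of the system. A standard ingredient of head-mounted video-based eye trackers in general (not necessarily this one) is a calibration that maps raw pupil-image coordinates to gaze angles, often fitted as a low-order polynomial over a grid of fixation targets; a generic sketch:

```python
import numpy as np

def fit_calibration(pupil_xy, target_angles):
    """Fit a 2nd-order polynomial mapping pupil-image coordinates
    (N x 2 array) to known gaze angles (N x 2, e.g. yaw/pitch in
    degrees) recorded while the user fixates calibration targets.
    Needs at least six well-spread targets."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    # Design matrix with terms 1, x, y, xy, x^2, y^2 per sample.
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, target_angles, rcond=None)
    return coeffs  # shape (6, 2)

def pupil_to_gaze(coeffs, px, py):
    """Apply the fitted mapping to a new pupil sample."""
    features = np.array([1.0, px, py, px * py, px**2, py**2])
    return features @ coeffs  # (yaw, pitch)
```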

Real-time recording and classification of eye movements in an immersive virtual environment

Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial startup and provide algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools that temporally synchronize gaze data and the visual stimulus and enable real-time assembly of a video-based record of the experiment using the QuickTime MOV format, available at http://sourceforge.net/p/utdvrlibraries/. This record contains the visual stimulus, the gaze cursor, and associated numerical data and can be used for data exportation, visual inspection, and validation of calculated gaze movements.
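The paper's own algorithms are in the linked library; as a generic illustration of two of the computations the abstract names, the sketch below measures the angular distance between the gaze and the direction to a virtual object, and applies a simple velocity-threshold (I-VT) rule to separate fixations from saccades. The threshold value and function names are illustrative, not the authors', and pursuit detection, which the paper also covers, would need an additional velocity band.

```python
import numpy as np

def angular_distance_deg(gaze_dir, eye_pos, object_pos):
    """Angle in degrees between the gaze direction and the
    direction from the eye to a virtual object."""
    to_obj = np.asarray(object_pos) - np.asarray(eye_pos)
    cos_a = np.dot(gaze_dir, to_obj) / (
        np.linalg.norm(gaze_dir) * np.linalg.norm(to_obj))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

def classify_ivt(gaze_angles_deg, timestamps, threshold=100.0):
    """Velocity-threshold labelling: samples whose angular velocity
    (deg/s) exceeds the threshold are saccades, the rest fixations."""
    velocity = np.abs(np.gradient(gaze_angles_deg, timestamps))
    return np.where(velocity > threshold, "saccade", "fixation")
```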