Using eye- and gaze-tracking to interact with a visual display
A gaze-based interactive system to explore artwork imagery
Journal on Multimodal User Interfaces
Interactive and immersive technologies can significantly enhance the experience of museums and exhibits. Several studies have shown that multimedia installations can attract visitors, presenting cultural and scientific information in an appealing way. In this article, we present our workflow for achieving gaze-based interaction with artwork imagery. We designed both a tool for creating interactive “gaze-aware” images and an eye-tracking application conceived to interact with those images through gaze. Users can display different pictures, perform pan and zoom operations, and search for regions of interest with associated multimedia content (text, image, audio, or video). Besides being an assistive technology for motor-impaired people (like most gaze-based interaction applications), our solution can also be a valid alternative to the touch-screen panels common in museums, in accordance with the new safety guidelines imposed by the COVID-19 pandemic. Experiments carried out ...
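The pan, zoom, and region-of-interest behavior described in this abstract can be illustrated with a small sketch. This is not the authors' implementation; the `View` state, ROI tuple layout, and function names below are assumptions chosen for the example. The key step is mapping a gaze point from screen coordinates into image coordinates under the current pan/zoom before hit-testing the "gaze-aware" regions:

```python
from dataclasses import dataclass

@dataclass
class View:
    """Current pan/zoom state of the displayed image (hypothetical)."""
    pan_x: float   # image coordinate shown at the left edge of the screen
    pan_y: float   # image coordinate shown at the top edge of the screen
    zoom: float    # screen pixels per image pixel

def screen_to_image(gx, gy, view):
    """Map a gaze point from screen space into image space."""
    return (view.pan_x + gx / view.zoom,
            view.pan_y + gy / view.zoom)

def hit_roi(gx, gy, view, rois):
    """Return the first region of interest containing the gaze point.

    Each ROI is (x, y, w, h, payload) in image coordinates; the payload
    could reference the associated text, audio, or video content.
    """
    ix, iy = screen_to_image(gx, gy, view)
    for (x, y, w, h, payload) in rois:
        if x <= ix <= x + w and y <= iy <= y + h:
            return payload
    return None
```

Because the ROIs live in image coordinates, the same annotations keep working no matter how far the user has panned or zoomed.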
Eye-tracking technology: Tracking gaze (Introduction)
Eye-tracking technology allows researchers to record and analyse a range of information about what people visually attend to and how they process visual information. For example, eye-tracking technology can be used to document the order in which people attend to different features of a visual image, whether they gaze at (i.e. fixate on) particular elements of an image (or completely avoid them), and, if so, the frequency and duration of these gazes. It can also be used to track more basic processing information, such as pupil dilation (see Chapter 2 in this volume) and gaze 'directionality' (i.e. whether people's eyes tend to gaze in particular directions first or most dominantly). There are a variety of ways that researchers can track people's eye movements and gaze direction. For example, there are free-standing systems that are typically placed in front of the person (and thus require that the person remain still in one location, typically while viewing visual stimuli on a screen), as well as systems that can be secured to a person's head and are thus more mobile, able to move with the person and track eye movement and gaze more organically, during motion (see suggested readings for reviews).
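The fixations this passage describes are usually not reported directly by the hardware; they are segmented from a raw gaze-sample stream. A common approach is dispersion-threshold identification (I-DT): a window of samples counts as a fixation if it stays spatially compact for long enough. The sketch below is a minimal illustration of that idea, with thresholds chosen arbitrarily for the example:

```python
def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """Dispersion-threshold (I-DT style) fixation detection.

    samples: list of (t, x, y) gaze samples (seconds, pixels).
    A fixation is a window whose dispersion (width + height of its
    bounding box) stays below max_dispersion for at least
    min_duration seconds. Returns (start_t, end_t, cx, cy) tuples.
    """
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window while it stays spatially compact.
        while j + 1 < len(samples):
            xs = [s[1] for s in samples[i:j + 2]]
            ys = [s[2] for s in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        if samples[j][0] - samples[i][0] >= min_duration:
            window = samples[i:j + 1]
            cx = sum(s[1] for s in window) / len(window)
            cy = sum(s[2] for s in window) / len(window)
            fixations.append((samples[i][0], samples[j][0], cx, cy))
            i = j + 1
        else:
            i += 1
    return fixations
```

From the resulting fixation list one can then derive the measures mentioned above: order of attention, whether an element was fixated at all, and the frequency and duration of those fixations.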
Computer display control and interaction using eye‐gaze
2012
Abstract—Innovative systems for user-computer interaction based on the user's eye-gaze behavior have important implications for various applications. Examples include user navigation in large images, typical of astronomy or medicine, and user selection and viewing of multiple video streams. Typically, a web environment is used for these applications. System latency must be negligible, while system obtrusiveness must be small. This paper describes the implementation and initial experimentation on such an innovative system.
The More You Look the More You Get: Intention-based Interface using Gaze-tracking
Only a decade ago, eye- and gaze-tracking technologies using cumbersome and expensive equipment were confined to university research labs. However, rapid technological advancements (increased processor speed, advanced digital video processing) and mass production have both lowered the cost and dramatically increased the efficacy of eye- and gaze-tracking equipment. This opens up a whole new area of interaction mechanisms with museum content. In this paper I will describe a conceptual framework for an interface, designed for use in museums and galleries, which is based on non-invasive tracking of a viewer's gaze direction. Following the simple premise that prolonged visual fixation is an indication of a viewer's interest, I dubbed this approach the intention-based interface.
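The premise that prolonged fixation signals interest is typically operationalized as dwell-time selection. The following sketch (not taken from the paper; target layout and timings are assumptions for illustration) fires an action once the gaze has dwelled inside a target long enough, and only once per visit:

```python
class DwellSelector:
    """Trigger an action when gaze dwells inside a target long enough.

    Each target is (name, x, y, w, h) in screen pixels; dwelling
    inside one for dwell_time seconds fires a selection exactly once.
    """
    def __init__(self, targets, dwell_time=1.0):
        self.targets = targets
        self.dwell_time = dwell_time
        self.current = None      # name of the target under gaze
        self.entered_at = None   # time the gaze entered it
        self.fired = False

    def update(self, t, gx, gy):
        """Feed one timestamped gaze sample; return a selected name or None."""
        hit = None
        for (name, x, y, w, h) in self.targets:
            if x <= gx <= x + w and y <= gy <= y + h:
                hit = name
                break
        if hit != self.current:
            # Gaze moved to a different target (or none): restart the clock.
            self.current, self.entered_at, self.fired = hit, t, False
        elif hit is not None and not self.fired and t - self.entered_at >= self.dwell_time:
            self.fired = True
            return hit
        return None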
Gaze Tracking System for Gaze-Based Human-Computer Interaction
2003
We introduce a novel gaze-tracking system called FreeGaze, which is designed to support gaze-based human-computer interaction (HCI). Existing gaze-tracking systems require complicated and burdensome calibration, which prevents gaze from being used to operate computers. To simplify the calibration process, FreeGaze corrects for the refraction at the surface of the cornea. Unlike existing systems, this calibration procedure requires each user to look at only two points on the display. After the initial calibration, our system needs no further calibration for later measurement sessions. A user study shows that its gaze detection accuracy is about 1.06° (view angle), which is sufficient for gaze-based HCI.
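To see why two points can suffice in principle, consider the simplest possible calibration model: an independent linear map per axis. This toy sketch is not FreeGaze's actual cornea-refraction model, only an illustration of how two known screen points pin down a gain and offset for each axis:

```python
def two_point_calibration(raw_pts, screen_pts):
    """Fit an independent linear map per axis from two calibration points.

    raw_pts: two raw gaze estimates recorded while the user looked at
    the two known screen_pts. Solves screen = gain * raw + offset per
    axis and returns a function mapping raw estimates to screen space.
    """
    (r1, r2), (s1, s2) = raw_pts, screen_pts
    maps = []
    for axis in (0, 1):
        gain = (s2[axis] - s1[axis]) / (r2[axis] - r1[axis])
        offset = s1[axis] - gain * r1[axis]
        maps.append((gain, offset))

    def apply(raw):
        return tuple(g * raw[a] + o for a, (g, o) in enumerate(maps))
    return apply
```

A purely linear fit like this cannot capture the nonlinearities introduced by corneal refraction, which is exactly the part FreeGaze handles with an explicit physical correction so that two points become enough in practice.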
Using eye tracking for interaction
2011
The development of cheaper eye trackers and open-source software for eye tracking and gaze interaction brings the possibility of integrating eye tracking into everyday devices as well as highly specialized equipment. Apart from providing means for analyzing eye movements, eye tracking also offers the possibility of a natural user interaction modality. Gaze control interfaces are already used within assistive applications for disabled users. However, this novel user interaction possibility comes with its own set of limitations and challenges. The aim of this SIG is to provide a forum for designers, researchers, and usability professionals to discuss the role of eye tracking as a user interaction method in the future, as well as the technical and user interaction challenges that using eye tracking as an interaction method brings.
Gaze guided object recognition using a head-mounted eye tracker
Proceedings of the Symposium on Eye Tracking Research and Applications - ETRA '12, 2012
Wearable eye trackers open up a large number of opportunities to cater for the information needs of users in today's dynamic society. Users no longer have to sit in front of a traditional desk-mounted eye tracker to benefit from the direct feedback given by the eye tracker about users' interest. Instead, eye tracking can be used as a ubiquitous interface in a real-world environment to provide users with supporting information that they need. This paper presents a novel application of intelligent interaction with the environment by combining eye-tracking technology with real-time object recognition. In this context we present i) algorithms for guiding object recognition by using fixation points, ii) algorithms for generating evidence of users' gaze on particular objects, and iii) a next-generation museum guide called Museum Guide 2.0 as a prototype application of gaze-based information provision in a real-world environment. We performed several experiments to evaluate our gaze-based object recognition methods. Furthermore, we conducted a user study in the context of Museum Guide 2.0 to evaluate the usability of the new gaze-based interface for information provision. These results show that enormous potential exists for using a wearable eye tracker as a human-environment interface.
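The notion of "generating evidence of users' gaze on particular objects" can be sketched very simply: count how many fixations land inside each recognized object's bounding box and report the objects that cross a threshold. This is a stand-in for the paper's method, not a reproduction of it; the data layout and `min_hits` threshold are assumptions for the example:

```python
from collections import Counter

def gaze_evidence(fixations, detections, min_hits=3):
    """Accumulate evidence that the user is attending to an object.

    fixations: (x, y) fixation centroids; detections: (label, x, y, w, h)
    bounding boxes from an object recognizer. An object counts as
    'attended' once min_hits fixations fall inside its box.
    """
    hits = Counter()
    for (fx, fy) in fixations:
        for (label, x, y, w, h) in detections:
            if x <= fx <= x + w and y <= fy <= y + h:
                hits[label] += 1
    return [label for label, n in hits.items() if n >= min_hits]
```

In a museum-guide setting, the returned labels would then drive the information shown to the visitor about the exhibit they are actually attending to.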
Design and implementation of an augmented reality system using gaze interaction
Multimedia Tools and Applications, 2011
In this paper, we discuss an interactive optical see-through head-mounted device (HMD) which makes use of a user's gaze for an augmented reality (AR) interface. In particular, we propose a method to employ a user's half-blink information for more efficient interaction. Since the interaction is achieved using a user's gaze and half-blinks, the proposed system can create a more efficient computing environment. In addition, the proposed system can be quite helpful to those who have difficulties in using their hands for conventional interaction methods. The experimental results present the robustness and efficiency of the proposed system.
The Embodied Gaze: Exploring Applications for Mobile Eye Tracking in the Art Museum
Visitor Studies, 2020
Recent advances in Mobile Eye Tracking (MET) technology facilitate the investigation of visitors' embodied visual behaviors as they move through exhibition spaces. Our MET-based pilot study of visitor behaviors in an art museum demonstrates the value of MET for identifying 'hotspots' of attention. The study also confirms the occurrence of the movement patterns identified by Eghbal-Azar in non-art museums and demonstrates how two patterns, 'the Long Gaze' and 'Reading', can be usefully described in more detail. To illustrate this, we report on one visitor's experience with a single painting, noting the complex embodied visual behaviors associated with gazing and reading. Our findings allow us to reflect on the potential benefits of eye tracking not only for mapping visitor engagement but also for promoting it. In contrast to art museum installations that use static eye tracking as a form of visitor engagement, we argue that MET applications enable visitors to observe, reflect on, and potentially modify their personal viewing practices.