Using eye- and gaze-tracking to interact with a visual display
Related papers
A gaze-based interactive system to explore artwork imagery
Journal on Multimodal User Interfaces
Interactive and immersive technologies can significantly enhance the experience of museums and exhibits. Several studies have shown that multimedia installations can attract visitors by presenting cultural and scientific information in an appealing way. In this article, we present our workflow for achieving gaze-based interaction with artwork imagery. We designed both a tool for creating interactive “gaze-aware” images and an eye-tracking application for interacting with those images by gaze. Users can display different pictures, perform pan and zoom operations, and search for regions of interest with associated multimedia content (text, image, audio, or video). Besides being an assistive technology for motor-impaired people (like most gaze-based interaction applications), our solution can also be a valid alternative to the touch-screen panels common in museums, in accordance with the new safety guidelines imposed by the COVID-19 pandemic. Experiments carried out ...
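As a rough illustration of the idea described in this abstract, the sketch below shows how a "gaze-aware" image might associate rectangular regions of interest with multimedia content and trigger that content once the gaze dwells on a region. The ROI layout, dwell threshold, and content payloads are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a "gaze-aware" image: rectangular regions of interest (ROIs)
# with attached multimedia content, activated when the gaze dwells inside them.
# ROI coordinates, dwell threshold, and content names are illustrative only.
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    x: int
    y: int
    w: int
    h: int                               # ROI rectangle in image coordinates
    content: str                         # e.g. path to text/audio/video to show

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

class GazeAwareImage:
    def __init__(self, regions, dwell_ms=800):
        self.regions = regions
        self.dwell_ms = dwell_ms         # how long the gaze must stay on an ROI
        self._current = None
        self._enter_t = 0.0

    def update(self, gx, gy, t_ms):
        """Feed one gaze sample; return a region's content once dwell is reached."""
        hit = next((r for r in self.regions if r.contains(gx, gy)), None)
        if hit is not self._current:
            self._current, self._enter_t = hit, t_ms
            return None
        if hit is not None and t_ms - self._enter_t >= self.dwell_ms:
            self._enter_t = t_ms         # rearm so content is not retriggered every sample
            return hit.content
        return None

# Example: two hypothetical ROIs on an artwork image
image = GazeAwareImage([Region("face", 120, 80, 60, 60, "face_commentary.mp3"),
                        Region("signature", 400, 300, 80, 30, "signature_note.txt")])
```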
Introduction
Eye-tracking technology: Tracking gaze
Eye-tracking technology allows researchers to record and analyse a range of information about what people visually attend to and how they process visual information. For example, eye-tracking technology can be used to document the order in which people attend to different features of a visual image, whether they gaze at (i.e. fixate on) particular elements of an image (or completely avoid them), and, if so, the frequency and duration of these gazes. It can also be used to track more basic processing information, such as pupil dilation (see Chapter 2 in this volume) and gaze 'directionality' (i.e. whether people's eyes tend to gaze in particular directions first or most dominantly). There are a variety of ways that researchers can track people's eye movements and gaze direction. For example, there are free-standing systems that are typically placed in front of the person (and thus require that the person remain still in one location, typically while viewing visual stimuli on a screen), as well as systems that can be secured to a person's head and are therefore more mobile, able to move with the person and track eye movement and gaze more organically, during motion (see suggested readings for reviews).
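The fixation frequencies and durations mentioned above are typically derived from raw gaze samples by grouping spatially stable samples. Below is a minimal sketch, assuming samples arrive as (time in seconds, x, y) tuples in screen pixels and using an I-DT-style dispersion threshold; the threshold values are illustrative and would normally be tuned to the tracker's sampling rate and noise.

```python
# Sketch of how fixation count and duration can be derived from raw gaze samples,
# using a simple dispersion-threshold (I-DT-style) grouping. Thresholds are
# illustrative, not taken from the chapter.
def detect_fixations(samples, max_dispersion=25.0, min_duration=0.100):
    """samples: list of (t_seconds, x_px, y_px); returns (start, end, cx, cy) tuples."""
    fixations, window = [], []
    for s in samples:
        window.append(s)
        xs = [p[1] for p in window]
        ys = [p[2] for p in window]
        # dispersion = (max x - min x) + (max y - min y) over the current window
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            if len(window) > 1 and window[-2][0] - window[0][0] >= min_duration:
                done = window[:-1]
                fixations.append((done[0][0], done[-1][0],
                                  sum(p[1] for p in done) / len(done),
                                  sum(p[2] for p in done) / len(done)))
            window = [s]                 # start a new candidate fixation window
    if window and window[-1][0] - window[0][0] >= min_duration:
        fixations.append((window[0][0], window[-1][0],
                          sum(p[1] for p in window) / len(window),
                          sum(p[2] for p in window) / len(window)))
    return fixations
```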
Computer display control and interaction using eye-gaze
2012
Innovative systems for user-computer interaction based on the user's eye-gaze behavior have important implications for various applications. Examples include user navigation in large images, typical of astronomy or medicine, and user selection and viewing of multiple video streams. Typically, a web environment is used for these applications. System latency must be negligible, while system obtrusiveness must be small. This paper describes the implementation of such a system and initial experimentation with it.
Gaze-enhanced user interface design
2007
The eyes are a rich source of information for gathering context in our everyday lives. A user's gaze is postulated to be the best proxy for attention or intention. Using gaze information as a form of input can enable a computer system to gain more contextual information about the user's task, which in turn can be leveraged to design interfaces which are more intuitive and intelligent. Eye gaze tracking as a form of input was primarily developed for users who are unable to make normal use of a keyboard and pointing device. However, with the increasing accuracy and decreasing cost of eye gaze tracking systems it will soon be practical for able-bodied users to use gaze as a form of input in addition to keyboard and mouse.
The More You Look the More You Get: Intention-based Interface using Gaze-tracking
Only a decade ago, eye- and gaze-tracking technologies using cumbersome and expensive equipment were confined to university research labs. However, rapid technological advancements (increased processor speed, advanced digital video processing) and mass production have both lowered the cost and dramatically increased the efficacy of eye- and gaze-tracking equipment. This opens up a whole new area of interaction mechanisms with museum content. In this paper I will describe a conceptual framework for an interface, designed for use in museums and galleries, which is based on non-invasive tracking of a viewer's gaze direction. Following the simple premise that prolonged visual fixation is an indication of a viewer's interest, I dubbed this approach the intention-based interface.
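The "more you look, the more you get" premise can be illustrated with a simple interest accumulator: a score that grows while the visitor looks at an exhibit area and decays otherwise, unlocking progressively richer content. The gain, decay rate, and level thresholds below are illustrative assumptions, not values from the paper.

```python
# Sketch of the "prolonged fixation signals interest" premise: an interest score
# per exhibit area grows while the visitor looks at it and decays otherwise,
# unlocking progressively richer content. All numbers here are illustrative.
class InterestAccumulator:
    def __init__(self, gain=1.0, decay=0.3, levels=(2.0, 5.0, 10.0)):
        self.gain, self.decay = gain, decay
        self.levels = levels                 # seconds of accumulated attention per level
        self.score = 0.0

    def update(self, looking: bool, dt: float) -> int:
        """dt: elapsed seconds since last update; returns the content level to show."""
        self.score += self.gain * dt if looking else -self.decay * dt
        self.score = max(self.score, 0.0)
        return sum(self.score >= lvl for lvl in self.levels)  # 0 = label, 1 = caption, ...
```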
Gaze Tracking System for Gaze-Based Human-Computer Interaction
2003
We introduce a novel gaze-tracking system called FreeGaze, which is designed to support gaze-based human-computer interaction (HCI). Existing gaze-tracking systems require complicated and burdensome calibration, which prevents gaze from being used to operate computers. To simplify the calibration process, FreeGaze corrects for refraction at the surface of the cornea. Unlike existing systems, its calibration procedure requires each user to look at only two points on the display. After the initial calibration, the system needs no further calibration for later measurement sessions. A user study shows that its gaze-detection accuracy is about 1.06° (visual angle), which is sufficient for gaze-based HCI.
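For intuition about why two calibration points can suffice, the sketch below fits a per-axis gain and offset from two known targets. It is a generic linear calibration under assumed example values, not FreeGaze's corneal-refraction correction, which is what lets that system keep the procedure down to two points.

```python
# A generic two-point calibration sketch: with two known on-screen targets, a gain
# and offset can be fitted per axis to map raw gaze estimates to screen coordinates.
# This illustrates why two points suffice for a linear model; it does not reproduce
# FreeGaze's refraction correction. All numbers are made-up examples.
def two_point_calibration(raw, screen):
    """raw, screen: two (x, y) pairs each; returns a mapping function raw -> screen."""
    (rx0, ry0), (rx1, ry1) = raw
    (sx0, sy0), (sx1, sy1) = screen
    ax = (sx1 - sx0) / (rx1 - rx0)           # gain and offset for the x axis
    bx = sx0 - ax * rx0
    ay = (sy1 - sy0) / (ry1 - ry0)           # gain and offset for the y axis
    by = sy0 - ay * ry0
    return lambda x, y: (ax * x + bx, ay * y + by)

# Usage: look at two calibration targets, record the raw gaze estimates, then map.
to_screen = two_point_calibration(raw=[(0.21, 0.35), (0.74, 0.81)],
                                  screen=[(100, 100), (1820, 980)])
```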
Eye Gaze as a Human-computer Interface
Procedia Technology, 2014
This work describes an eye-tracking system for a natural user interface, based only on non-intrusive devices such as a simple webcam. Through image processing, the system converts the user's focus of attention to the corresponding point on the screen. Experimental tests were performed by displaying a set of known points on the screen to the users; these tests show that the application yields promising results.
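A webcam-only pipeline of the kind this abstract describes typically starts by locating the eye region and estimating the pupil position from the image alone. The sketch below uses an OpenCV Haar cascade and takes the darkest point of the blurred eye region as a crude pupil estimate; it only illustrates the style of image processing involved, not the paper's actual method.

```python
# Rough sketch of webcam-only eye-tracking input: detect an eye region with an
# OpenCV Haar cascade, then take the darkest point of the blurred region as a
# crude pupil estimate. Illustrative only; not the paper's pipeline.
import cv2

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
cap = cv2.VideoCapture(0)                      # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        roi = cv2.GaussianBlur(gray[y:y + h, x:x + w], (9, 9), 0)
        _, _, dark_pt, _ = cv2.minMaxLoc(roi)  # darkest point ~ pupil in many lighting setups
        cv2.circle(frame, (x + dark_pt[0], y + dark_pt[1]), 3, (0, 0, 255), -1)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 1)
    cv2.imshow("eye tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:            # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```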
On the Design of a Low Cost Gaze Tracker for Interaction
Procedia Technology, 2012
The human gaze is a basic means of non-verbal interaction. However, in several situations, especially in the context of upper-limb motor impairment, gaze also represents an alternative means for human interaction with the environment (real or virtual). This interaction can be mastered through specific tools and newly learned skills; the technological tool is therefore key to new interaction models. This paper presents a tool for gaze interaction: a new gaze tracker. The system specifications and the status of the gaze tracker design are presented; the dedicated algorithm for eye detection and tracking, as well as an improvement of Zelinsky's model for eye-movement prediction during the search for a predefined object in an image, are outlined. Results of the first evaluation of the pre-prototype with end users are summarized.
Gaze Controlled Human-Computer Interface
2012
The goal of the Gaze Controlled Human Computer Interface project is to design and construct a non-invasive gaze-tracking system that determines where a user is looking on a computer screen in real time. To accomplish this, a fixed illumination source consisting of infrared (IR) light-emitting diodes (LEDs) is used to produce corneal reflections on the user's eyes. These reflections are captured with a video camera and compared to the relative location of the user's pupils. From this comparison, a correlation matrix can be created and the approximate location on the screen that the user is looking at can be determined. The final objective is to allow the user to manipulate a cursor on the computer screen simply by looking at different boxes in a grid on the monitor. The project includes design of the hardware setup to provide a suitable environment for glint detection, image processing of the user's eyes to determine pupil location, the implementation of a probabilistic algorithm...
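The glint-to-pupil comparison described above is commonly turned into a screen mapping by fitting a regression over a grid of calibration targets. The sketch below uses an affine least-squares fit and a hypothetical 3x3 target grid; the vectors shown are made-up example measurements, and the paper's own correlation-matrix formulation may differ.

```python
# Sketch of the calibration idea: the vector from the corneal glint to the pupil
# centre is mapped to screen coordinates with a least-squares fit over a grid of
# calibration targets. The 3x3 grid and the affine model are illustrative choices.
import numpy as np

def fit_gaze_mapping(glint_pupil_vecs, screen_pts):
    """glint_pupil_vecs: (N, 2) pupil-minus-glint vectors; screen_pts: (N, 2) targets."""
    V = np.asarray(glint_pupil_vecs, dtype=float)
    S = np.asarray(screen_pts, dtype=float)
    A = np.column_stack([V[:, 0], V[:, 1], np.ones(len(V))])   # [vx, vy, 1]
    coeffs, *_ = np.linalg.lstsq(A, S, rcond=None)             # (3, 2) coefficient matrix
    return lambda vx, vy: tuple(np.array([vx, vy, 1.0]) @ coeffs)

# Usage: show a 3x3 grid of targets, record one glint-pupil vector per target, then fit.
targets = [(x, y) for y in (100, 540, 980) for x in (160, 960, 1760)]
# made-up example measurements, one per target
vectors = [(-0.8, -0.5), (0.0, -0.5), (0.8, -0.5),
           (-0.8, 0.0), (0.0, 0.0), (0.8, 0.0),
           (-0.8, 0.5), (0.0, 0.5), (0.8, 0.5)]
gaze_to_screen = fit_gaze_mapping(vectors, targets)
print(gaze_to_screen(0.4, -0.25))   # approximate screen point for a new sample
```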
A history of eye gaze tracking
2007
Applications that use eye and gaze tracking have recently become increasingly popular in the domain of human-computer interfaces. In this paper, we provide a historical review of eye-tracker systems and applications. We address the use of gaze tracking to adapt entertainment and edutainment applications according to user behaviour.
Figure 1.1: The visible white area of the human eye (sclera) makes it easier to interpret gaze direction.