Interacting with the Computer Using Gaze Gestures
Related papers
Eye Movements in Gaze Interaction
Gaze, as a sole input modality, must support complex navigation and selection tasks. Gaze interaction combines specific eye movements with graphic display objects (GDOs). This paper suggests a unifying taxonomy of gaze interaction principles. The taxonomy covers three types of eye movements (fixations, saccades, and smooth pursuits) and three types of GDOs (static, dynamic, and absent). This taxonomy is qualified through related research and is the first main contribution of this paper. The second part of the paper offers an experimental exploration of single stroke gaze gestures (SSGG). The main findings suggest (1) that SSGG of different lengths can be used for interaction, (2) that GDOs are not necessary for successful completion, and (3) that SSGG are comparable to dwell time selection.
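A single stroke gaze gesture can be detected by classifying the direction of one long saccade. The sketch below illustrates the idea under stated assumptions: the minimum stroke length, the eight-direction gesture alphabet, and the function name are illustrative choices, not values from the paper.

```python
import math

# Hypothetical minimum stroke length (pixels) and an eight-direction
# gesture alphabet; both are illustrative, not taken from the paper.
MIN_STROKE_PX = 150
DIRECTIONS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def classify_stroke(start, end, min_len=MIN_STROKE_PX):
    """Map one gaze stroke (start/end points in screen pixels) to a
    compass direction, or None if the stroke is too short to count."""
    dx, dy = end[0] - start[0], start[1] - end[1]  # screen y grows downward
    if math.hypot(dx, dy) < min_len:
        return None
    angle = math.degrees(math.atan2(dy, dx)) % 360
    return DIRECTIONS[int((angle + 22.5) // 45) % 8]
```

Because only the stroke's endpoints and length matter, no on-screen target (GDO) is needed for recognition, which is consistent with finding (2) above.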
EyePoint: practical pointing and selection using gaze and keyboard
Proceedings of the SIGCHI …, 2007
We present a practical technique for pointing and selection using a combination of eye gaze and keyboard triggers. EyePoint uses a two-step progressive refinement process fluidly stitched together in a look-press-look-release action, which makes it possible to compensate for the accuracy limitations of the current state-of-the-art eye gaze trackers. While research in gaze-based pointing has traditionally focused on disabled users, EyePoint makes gaze-based pointing effective and simple enough for even able-bodied users to use for their everyday computing tasks. As the cost of eye gaze tracking devices decreases, it will become possible for such gaze-based techniques to be used as a viable alternative for users who choose not to use a mouse depending on their abilities, tasks and preferences.
Eye-gaze interaction for mobile phones
Proceedings of the 4th international conference on mobile technology, applications, and systems and the 1st international symposium on Computer human interaction in mobile technology - Mobility '07, 2007
In this paper, we discuss the use of eye-gaze tracking technology for mobile phones. In particular, we investigate how gaze interaction can be used to control applications on handheld devices. In contrast to eye-tracking systems for desktop computers, mobile devices pose several problems, such as light intensity in outdoor use and calibration issues. We therefore compared two different approaches for controlling mobile phones with the eyes: standard eye-gaze interaction based on the dwell-time method and gaze gestures. Gaze gestures are a new concept, which we think has the potential to overcome many of these problems. We conducted a user study to see whether people are able to interact with applications using these approaches. The results confirm that eye-gaze interaction for mobile phones is attractive to users and that gaze gestures are a viable alternative method for eye-gaze based interaction.
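The dwell-time method used as the baseline above selects a target once gaze has rested on it for a fixed duration. A minimal sketch, assuming a hypothetical 0.5 s threshold and a `DwellSelector` class of my own naming:

```python
import time

# Illustrative dwell threshold; real systems tune this value,
# and the paper does not fix one here.
DWELL_SECONDS = 0.5

class DwellSelector:
    """Trigger a selection when gaze stays on one target long enough."""

    def __init__(self, threshold=DWELL_SECONDS):
        self.threshold = threshold
        self.current_target = None
        self.enter_time = None

    def update(self, target, now=None):
        """Feed the target currently under the gaze point (or None).
        Returns the target once per completed dwell, else None."""
        now = time.monotonic() if now is None else now
        if target != self.current_target:
            self.current_target, self.enter_time = target, now
            return None
        if target is not None and now - self.enter_time >= self.threshold:
            self.enter_time = float("inf")  # fire once, wait for re-entry
            return target
        return None
    ```

The firing-once design matters in practice: without it, a user staring at a key would select it repeatedly, one of the "Midas touch" problems that motivates gaze gestures as an alternative.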
Gaze-enhanced user interface design
2007
The eyes are a rich source of information for gathering context in our everyday lives. A user's gaze is postulated to be the best proxy for attention or intention. Using gaze information as a form of input can enable a computer system to gain more contextual information about the user's task, which in turn can be leveraged to design interfaces which are more intuitive and intelligent. Eye gaze tracking as a form of input was primarily developed for users who are unable to make normal use of a keyboard and pointing device. However, with the increasing accuracy and decreasing cost of eye gaze tracking systems it will soon be practical for able-bodied users to use gaze as a form of input in addition to keyboard and mouse.
Improving the accuracy of gaze input for interaction
Proceedings of the …, 2008
Using gaze information as a form of input poses challenges based on the nature of eye movements and how we humans use our eyes in conjunction with other motor actions. In this paper, we present three techniques for improving the use of gaze as a form of input. We first present a saccade detection and smoothing algorithm that works on real-time streaming gaze information. We then present a study which explores some of the timing issues of using gaze in conjunction with a trigger (key press or other motor action) and propose a solution for resolving these issues. Finally, we present the concept of Focus Points, which makes it easier for users to focus their gaze when using gaze-based interaction techniques. Though these techniques were developed for improving the performance of gaze-based pointing, their use is applicable in general to using gaze as a practical form of input.
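A common way to realize the first of these techniques is to threshold gaze velocity on the streaming samples: smooth during fixations, and reset the filter when a saccade is detected. The sketch below is an assumption-laden illustration, not the paper's algorithm; the velocity threshold and window size are invented for the example.

```python
from collections import deque
import math

# Velocity threshold (pixels per second) separating saccades from
# fixations; this value and the window size are illustrative only.
SACCADE_VEL_PX_S = 1000.0
WINDOW = 5

class GazeSmoother:
    """Smooth streaming gaze samples during fixations; reset on saccades."""

    def __init__(self, vel_threshold=SACCADE_VEL_PX_S, window=WINDOW):
        self.vel_threshold = vel_threshold
        self.samples = deque(maxlen=window)
        self.last = None  # (x, y, t)

    def feed(self, x, y, t):
        """Return a smoothed (x, y) for this sample."""
        if self.last is not None:
            lx, ly, lt = self.last
            dt = t - lt
            vel = math.hypot(x - lx, y - ly) / dt if dt > 0 else 0.0
            if vel > self.vel_threshold:
                self.samples.clear()  # saccade: drop stale fixation samples
        self.last = (x, y, t)
        self.samples.append((x, y))
        n = len(self.samples)
        return (sum(p[0] for p in self.samples) / n,
                sum(p[1] for p in self.samples) / n)
```

Clearing the window on a saccade is what keeps the smoothed cursor from lagging behind the eye when it jumps to a new target.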
Eye-Gaze Interaction On Computer Screen Evaluation
2005
Using the human eye as an input device for computer systems yields only low-resolution estimates of gaze position on the screen. We propose a method that determines gaze positions on the screen by using two-eye displacements as the mapping information, applying a perspective projection to map the displacements to a position on the computer screen. The experiments were performed on 20 persons using a 17-inch monitor at a screen resolution of 1024x768 pixels. Gaze detection error was 3.18 cm (RMS), with the screen divided into 5x8 and 7x10 positions; the results showed 100% and 96% correct detection, respectively.
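A perspective projection from measured displacements to screen coordinates can be fitted as a 3x3 homography from calibration pairs. The sketch below shows the standard direct linear transform approach; the function names, the use of four calibration points, and the least-squares-via-SVD solver are my assumptions for illustration, not details from the paper.

```python
import numpy as np

def fit_homography(eye_pts, screen_pts):
    """Fit a 3x3 perspective (homography) matrix mapping eye-displacement
    measurements to screen coordinates, from >= 4 calibration pairs."""
    A = []
    for (x, y), (u, v) in zip(eye_pts, screen_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The smallest right singular vector of A gives the homography entries.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)

def map_to_screen(H, x, y):
    """Apply the fitted mapping to one eye-displacement sample."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```

Because the mapping is projective rather than affine, it can absorb the keystone-like distortion between eye displacement and the monitor plane.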
The Costs and Benefits of Combining Gaze and Hand Gestures for Remote Interaction
Lecture Notes in Computer Science, 2015
Gaze has been proposed as an ideal modality for supporting remote target selection. We explored the potential of integrating gaze with hand gestures for remote interaction on a large display in terms of user experience and preference. We conducted a lab study to compare interaction in a photo-sorting task using gesture only, or the combination of gaze plus gesture. Results from the study show that a combination of gaze and gesture input can lead to significantly faster selection, reduced hand fatigue and increased ease of use compared to using only hand input. People largely preferred the combination of gaze for target selection and hand gestures for manipulation. However, gaze can cause particular kinds of errors and can induce a cost due to switching modalities.
Bringing gaze-based interaction back to basics
2001
This paper argues for a joint development of an eye gaze-based, on-line communication aid running on a standard PC with a web-camera. Tracking software is to be provided as open source to allow for improvements and individual integrations with other aids. The interface design shall be defined by the achieved resolution of the tracking system. The design of a type-to-talk system with 12 large on-screen keys is described in the paper. In order for gaze tracking systems to become widely used, the pursuit of mouse-pointer precision should be replaced by a focus on the broad potential of low-resolution gaze-based interactive systems.
Computer display control and interaction using eye-gaze
2012
Innovative systems for user-computer interaction based on the user's eye-gaze behavior have important implications for various applications. Examples include user navigation in large images, typical of astronomy or medicine, and user selection and viewing of multiple video streams. Typically, a web environment is used for these applications. System latency must be negligible, while system obtrusiveness must be small. This paper describes the implementation and initial experimentation on such an innovative system.