Gaze typing compared with input by head and hand

Beyond Direct Gaze Typing: a Predictive Graphic User Interface for Writing and Communicating by Gaze

This paper introduces a new gaze-based Graphic User Interface (GUI) for Augmentative and Alternative Communication (AAC). In the current state of the art, prediction methods that accelerate the production of textual, iconic, and pictorial communication by gaze control alone are still needed. The proposed GUI translates gaze inputs into words, phrases, or symbols through the following methods and techniques: (i) a gaze-based information visualization technique, (ii) a prediction technique combining concurrent and retrospective methods, and (iii) an alternative prediction method based on either the recognition or the morphing of spatial features. The system is designed to extend the communication function of individuals with severe motor disabilities, with the aim of allowing end users to hold a conversation independently, without needing a human interpreter.
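The abstract does not spell out how gaze input and prediction interact, so the following is only a minimal sketch of the two generic ingredients it combines: dwell-based gaze selection and word prediction. The dwell time, the frequency-ranked vocabulary, and the `DwellSelector`/`predict` names are illustrative assumptions, not the paper's concurrent/retrospective methods.

```python
# Illustrative sketch: dwell-time gaze selection plus simple prefix-based
# word prediction. Values and vocabulary are assumed, not from the paper.
import time

DWELL_SECONDS = 0.8  # how long the gaze must rest on a key to select it (assumed)

# Hypothetical frequency-ranked vocabulary used for prefix completion.
VOCAB = {"hello": 120, "help": 95, "here": 80, "water": 60, "want": 150}

def predict(prefix: str, k: int = 3) -> list[str]:
    """Return the k most frequent vocabulary words starting with `prefix`."""
    hits = [w for w in VOCAB if w.startswith(prefix)]
    return sorted(hits, key=lambda w: -VOCAB[w])[:k]

class DwellSelector:
    """Fires a key selection once the gaze has stayed on the same key long enough."""
    def __init__(self):
        self.current_key = None
        self.since = None

    def update(self, gazed_key: str) -> str | None:
        now = time.monotonic()
        if gazed_key != self.current_key:
            self.current_key, self.since = gazed_key, now
            return None
        if now - self.since >= DWELL_SECONDS:
            self.since = now  # re-arm so holding the gaze repeats the key
            return gazed_key
        return None

# Example: after the user has dwelled on 'w' and 'a', offer completions.
print(predict("wa"))  # ['want', 'water']
```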

Writing with the Eyes: The Effect of Age on Eye-Tracking Performance in Non-Disabled Adults and a Comparison with Bimanual Typing

Computational Intelligence and Neuroscience

Eye-tracking technology is advancing rapidly, becoming cheaper, easier to use, and more robust. This has fueled an increase in its implementation for Augmentative and Alternative Communication (AAC). Nowadays, Eye-Tracking Communication Devices (ETCDs) can be an effective aid for people with disabilities and communication problems. However, it is not clear what level of performance is attainable with these devices or how to optimize them for AAC use. The objective of this observational study was to provide data on non-disabled adults' performance with ETCDs regarding (a) the range of eye-typing ability, in terms of speed and errors, across age groups; (b) the relationship between ETCD performance and bimanual typing on a conventional PC keyboard; and (c) a suggested method for correctly implementing ETCDs for AAC. Sixty-seven healthy adult volunteers (aged 20–79 years) were asked to type a sample sentence using, first, a commercial ETCD and then a standard PC keyboard; we re...
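The abstract reports eye-typing performance in terms of speed and errors without giving formulas; the sketch below shows the conventional text-entry metrics typically used for such comparisons (words per minute with the 5-characters-per-word convention, and minimum-string-distance error rate). These are standard measures, not necessarily the paper's exact definitions.

```python
# Conventional text-entry metrics for comparing eye typing with keyboard typing.
def words_per_minute(transcribed: str, seconds: float) -> float:
    """Text-entry convention: one 'word' = 5 characters, including spaces."""
    return (len(transcribed) / 5.0) / (seconds / 60.0)

def msd_error_rate(presented: str, transcribed: str) -> float:
    """Minimum-string-distance (Levenshtein) errors per character of the longer string."""
    m, n = len(presented), len(transcribed)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if presented[i - 1] == transcribed[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[m][n] / max(m, n)

# Example: a 30-character sentence typed in 90 s, plus one substitution error.
print(words_per_minute("the quick brown fox jumps over", 90.0))      # 4.0 wpm
print(msd_error_rate("the quick brown fox", "the quick brown fax"))  # ~0.053
```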

"HUMAN GAZE CONTROL KEYBOARD" School of Engineering & Technology SAGE University, Bhopal (M.P.) Towards Partial Fulfillment of The Requirements for the Award of the Degree Bachelor of Technology & Engineering In Computer Science & Engineering

Shivam Raghuwasnhi, 2024

The Human Gaze Control Keyboard, an emerging technology, uses gaze-tracking algorithms for applications such as security and typing. This paper focuses on its application for physically disabled individuals, presenting the development of a virtual keyboard driven by eye-gaze and blink detection. The system captures video from a PC camera and applies a simplified rule for accurate face detection. Using 68 facial landmark points, eye gaze selects a keyboard region and eye blinking confirms the chosen key. The system's goal is hands-free typing, which is particularly beneficial for people without limb control. Flowcharts detail each stage, and the study achieves its aim, providing an efficient text-input method for differently-abled individuals. Using the dlib library, OpenCV, and convolutional neural networks (CNNs), the study analyzes eye movement and blinking and employs neural networks for accurate prediction. In the era of digital communication, the study underscores the need for inclusivity, making technology accessible to all. As a milestone in assistive technology, this study paves the way for future advancements, contributing to the ongoing pursuit of accessibility in technological innovation.
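The abstract names dlib's 68-point landmarks, OpenCV video capture, and blink detection as building blocks. A minimal sketch of that pipeline is given below, using the widely used eye-aspect-ratio test for blinks; the predictor filename, the EAR threshold, and the selection logic are illustrative assumptions, not details taken from the thesis.

```python
# Sketch of landmark-based blink detection on a PC camera feed, assuming
# dlib's 68-point shape predictor file is available locally.
import cv2
import dlib
from scipy.spatial import distance as dist

PREDICTOR_PATH = "shape_predictor_68_face_landmarks.dat"  # assumed local file
EAR_THRESHOLD = 0.21  # below this the eye is treated as closed (tunable)

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor(PREDICTOR_PATH)

LEFT_EYE = list(range(36, 42))   # landmark indices of the left eye
RIGHT_EYE = list(range(42, 48))  # landmark indices of the right eye

def eye_aspect_ratio(points):
    """Ratio of vertical to horizontal eye opening; drops sharply during a blink."""
    a = dist.euclidean(points[1], points[5])
    b = dist.euclidean(points[2], points[4])
    c = dist.euclidean(points[0], points[3])
    return (a + b) / (2.0 * c)

cap = cv2.VideoCapture(0)  # PC camera, as in the described system
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        shape = predictor(gray, face)
        pts = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
        ear = (eye_aspect_ratio([pts[i] for i in LEFT_EYE]) +
               eye_aspect_ratio([pts[i] for i in RIGHT_EYE])) / 2.0
        if ear < EAR_THRESHOLD:
            print("blink detected -> confirm currently gazed key")
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```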