A gaze interactive textual smartwatch interface