Advancing In-vehicle Gesture Interactions with Adaptive Hand-Recognition and Auditory Displays
13th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
Competition for drivers' visual attention has increased with the integration of touch-based interfaces in vehicles, leading to elevated crash risk. To mitigate this visual distraction, we designed an in-vehicle gesture-based menu system with different auditory feedback types and hand-recognition systems. We are conducting a driving-simulator experiment in which participants perform a secondary task of selecting a menu item. Three auditory feedback types are tested in addition to a no-audio baseline: auditory icons, earcons, and spearcons. For each auditory display type, two hand-recognition systems are tested: fixed and adaptive. We expect the system to reduce the driver's secondary-task workload while minimizing off-road glances for safety. Our experiment would contribute to the existing literature on multimodal signal processing, providing evidence for Multiple Resource Theory, and would also offer practical design guidelines for auditory feedback in gesture-based in-vehicle interactions.

CCS CONCEPTS
• Human-centered computing → Auditory feedback; Gestural input.