Playing Charades With Your Car – The Potential of Free-form and Contact-based Gestural Interfaces for Human Vehicle Interaction
Related papers
Gestural interaction on the steering wheel
Proceedings of the 2011 annual conference on Human factors in computing systems - CHI '11, 2011
Cars offer an increasing number of infotainment systems as well as comfort functions that can be controlled by the driver. In our research, we investigate new interaction techniques that aim to make it easier to interact with these systems while driving. We suggest utilizing the steering wheel as an additional interaction surface. In this paper, we present two user studies conducted with a working prototype of a multi-touch steering wheel. In the first, we developed a user-defined steering wheel gesture set; in the second, we applied the identified gestures and compared them to conventional user interaction with infotainment systems in terms of driver distraction. The main outcome was that drivers' visual demand is reduced significantly by gestural interaction on the multi-touch steering wheel.
Gestural Interaction in Vehicular Applications
Computer, 2012
In-vehicle gestural interfaces are easy to use and increase safety by reducing visual demand on the driver. Prototype capacitive proximity sensing and depth-camera-based systems demonstrate that current technologies can recognize finger and hand gestures of varying complexity.
2007
The increasing quantity and complexity of in-vehicle systems creates a demand for user interfaces which are suited to driving. The steering wheel is a common location for the placement of buttons to control navigation, entertainment, and environmental systems, but what about a small touchpad? To investigate this question, we embedded a Synaptics StampPad in a computer game steering wheel and evaluated seven methods for selecting from a list of over 3000 street names.
Developing novel multimodal interaction techniques for touchscreen in-vehicle infotainment systems
2014 International Conference on Open Source Systems & Technologies, 2014
Methods of information presentation in the automotive space have been evolving continuously in recent years. As technology pushes the boundaries of what is possible, automobile manufacturers are trying to keep up with current trends. Traditionally, the often long development and quality-control cycles of the automotive sector ensured slow yet steady progress. However, the exponential advancement in mobile and hand-held computing over the last 10 years has put immense pressure on automobile manufacturers to catch up. For this reason, we now see manufacturers exploring new techniques for in-vehicle interaction (IVI) that were ignored in the past. Recent attempts, however, have either simply extended the interaction model already used in mobile or handheld computing devices, or increased visual-only presentation of information with limited expansion to other modalities (i.e. audio or haptics). This is also true for system interaction, which generally happens within complex driving environments, making the driver's primary task (driving) even more challenging. Essentially, there is an inherent need to design and research IVI systems that complement and natively support a multimodal interaction approach, providing all the necessary information without increasing the driver's cognitive load, or at a bare minimum his or her visual load. In this research we focus on a key element of IVI systems, touchscreen interaction, by developing prototype devices that can complement the conventional visual and auditory modalities in a simple and natural manner. Instead of adding primitive touch feedback cues that increase redundancy or complexity, we approach the issue by looking at the current requirements of interaction and complementing the existing system with natural and intuitive input and output methods, which are less affected by environmental noise than traditional multimodal systems.
Natural, intuitive finger based input as substitution for traditional vehicle control
Both the amount and the dynamicity of content to be displayed in a car are increasing steadily, forcing manufacturers to switch to customizable screens integrated into the dashboard and center console instead of dozens to hundreds of individual controls. In addition, new requirements such as Internet access in the car or web services accessible while driving invalidate rudimentary display formats. Traditional forms of interaction such as buttons or knobs are unsuitable for responding to dynamic content shown on digital screens, calling for new mechanisms for distraction-free yet effective user (driver) input. We address this problem by introducing a novel sensing device that allows natural, contactless, and eyes-free operation by relating finger movements in the area of the gearshift to screen coordinates. To assess the quality of this interface, two hypotheses were formulated: (i) that such a device allows natural, intuitive mouse-pointer control in a manner similar to traditional forms of input, and (ii) that the interface is insusceptible to varying workload conditions of the driver. Results from experimentation revealed that, with respect to the first hypothesis, proximity sensing in a two-dimensional plane is a viable approach to directly controlling a mouse cursor on a screen integrated into the dashboard. A generally accepted conclusion on the assumption that the index of performance of the interface does not change with varying workload (hypothesis ii) cannot be drawn. To simulate different workload conditions, a dual-task signal-response setting was used.
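The "index of performance" referred to in this abstract is the Fitts' law throughput measure commonly used to evaluate pointing devices. As a minimal sketch of how such a value is computed (the distance, target width, and movement time below are hypothetical, not taken from the paper):

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def index_of_performance(distance: float, width: float, movement_time: float) -> float:
    """Throughput in bits per second: ID divided by observed movement time."""
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical pointing trial: cursor travels 210 px to a 30 px-wide
# target in 0.8 s.
id_bits = index_of_difficulty(210, 30)   # log2(8) = 3.0 bits
ip = index_of_performance(210, 30, 0.8)  # 3.0 / 0.8 = 3.75 bits/s
```

Comparing this throughput across workload conditions is what hypothesis (ii) amounts to: if the interface is robust to workload, the index of performance should stay roughly constant.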
Gesturing on the Steering Wheel
Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2014
"Eyes on the road, hands on the wheel" is a crucial principle to take into account when designing interactions for current in-vehicle interfaces. Gesture interaction is a promising modality that can be implemented following this principle in order to reduce driver distraction and increase safety. We present the results of a user elicitation for gestures performed on the surface of the steering wheel. We asked 40 participants to elicit 6 gestures each, for a total of 240 gestures. Based on the results of this experiment, we derived a taxonomy of gestures performed on the steering wheel. The analysis of the results offers useful suggestions for the design of in-vehicle gestural interfaces based on this approach.
IEEE Transactions on Intelligent Transportation Systems, 2022
Recently, the driving environment has become excessively congested because of the presence of modern gadgets inside the car and increased traffic on the roads, which has resulted in higher demands on drivers' visual and cognitive senses. This has prompted the need to reduce these demands to make the driving experience safer and more comfortable. Consequently, a novel steering wheel design for in-car controls is presented in this paper. The new design introduces dual ubiquitous touch panels embedded in the steering wheel for interaction with in-car controls, with haptic feedback as positive reinforcement upon successful execution of an in-car control. Eight different functionalities can be controlled using the embedded touch panels. The proposed system's efficacy is compared with a standard car using the NASA Task Load Index (NASA-TLX) evaluation technique. The results showed that the proposed system significantly reduced drivers' visual, cognitive, and manual workload.
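The NASA-TLX evaluation mentioned above yields an overall workload score as a weighted mean of six subscale ratings, with weights obtained from 15 pairwise comparisons between the dimensions. A minimal sketch of that scoring step (the ratings and weights below are hypothetical, not results from the paper):

```python
# NASA-TLX overall workload: weighted mean of six subscale ratings (0-100).
# Each dimension's weight is how many of the 15 pairwise comparisons it won,
# so the six weights always sum to 15.
def nasa_tlx(ratings: dict, weights: dict) -> float:
    assert sum(weights.values()) == 15, "pairwise-comparison weights must sum to 15"
    return sum(ratings[d] * weights[d] for d in ratings) / 15

# Hypothetical ratings and weights for one driver in one condition.
ratings = {"mental": 40, "physical": 25, "temporal": 30,
           "performance": 20, "effort": 35, "frustration": 15}
weights = {"mental": 4, "physical": 2, "temporal": 3,
           "performance": 1, "effort": 3, "frustration": 2}
score = nasa_tlx(ratings, weights)  # weighted mean, here about 30.3
```

Comparing such scores between the touch-panel steering wheel and a standard car is what the reported workload reduction rests on.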
Advancing In-vehicle Gesture Interactions with Adaptive Hand-Recognition and Auditory Displays
13th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
Competition for visual attention in vehicles has increased with the integration of touch-based interfaces, which has led to an increased crash risk. To mitigate this visual distraction, we designed an in-vehicle gesture-based menu system with different auditory feedback types and hand-recognition systems. We are conducting an experiment using a driving simulator where the participant performs a secondary task of selecting a menu item. Three auditory feedback types are tested in addition to the baseline condition (no audio): auditory icons, earcons, and spearcons. For each type of auditory display, two hand-recognition systems are tested: fixed and adaptive. We expect to reduce the driver's secondary-task workload while minimizing off-road glances for safety. Our experiment would contribute to the existing literature in multimodal signal processing, confirming Multiple Resource Theory. It would also present practical design guidelines for auditory feedback in gesture-based in-vehicle interactions.
SN Applied Sciences
Gesturing on the steering wheel, a comparison with speech and touch interaction modalities
2015
This paper compares an emergent interaction modality for the In-Vehicle Infotainment System (IVIS), i.e., gesturing on the steering wheel, with two more popular modalities in modern cars: touch and speech. We conducted a between-subjects experiment with 20 participants per modality to assess interaction performance with the IVIS and the impact on driving performance. Moreover, we compared the three modalities in terms of usability, subjective workload, and emotional response. The results showed no statistically significant differences between the three interaction modalities regarding the various indicators of driving-task performance, while significant differences were found in measures of IVIS interaction performance: users performed fewer interactions to complete the secondary tasks with the speech modality, while, on average, a lower task completion time was registered with the touch modality. The three interfaces were comparable in all the subjective metrics.