Exploring the Use of Gesture in Collaborative Tasks
Related papers
Untying the knot between gestures and speech
Proceedings of the eighth …, 2009
Do people speak differently when they cannot use their hands? This study looks at the influence of gestures on speech by having participants take part in an instructional task, half of which had to be performed while sitting on their hands. Other factors that influence the ease of ...
Gesture, 2009
Is communication the primary functional role of gesticulation? We conducted a study in which participants narrated to a presumed computer system, a presumed addressee in another room (via web cam), or an addressee in the same room, who could either see them or not. Participants produced significantly fewer gestures when they thought they were talking to a computer system. Our results show that people narrate differently to a computer system than to a human addressee. Considering the difference in gesticulation, it seems plausible that, in a narrative task, most gestures are produced with a communicative intent.
NEALT PROCEEDINGS SERIES VOL. 6, 2009
This paper contains an analysis of features of gesture types that are produced before or simultaneously with speech (mainly nouns and verbs) and in relation to own communication management (choice and change). The types of gestures discussed are arm-hand gestures, head movements, and gaze. The analysis is then discussed in relation to two selected social activities, where virtual agents (ECAs) are or can be used. Gesture types and features with different functions are briefly suggested for each of the two activities and also ...
Gesture and speech multimodal conversational interaction
VISLab Report: VISLab- …, 2001
Gesture and speech combine to form a rich basis for human conversational interaction. To exploit these modalities in HCI, we need to understand the interplay between them and the way in which they support communication. We propose a framework for the gesture research done to date, and present our work on the cross-modal cues for discourse segmentation in free-form gesticulation accompanying speech in natural conversation as a new paradigm for such multimodal interaction.
Gesture and speech in interaction: An overview
Speech Communication, 2014
Gestures and speech interact. They are linked in language production and perception, with their interaction contributing to felicitous communication. The multifaceted nature of these interactions has attracted considerable attention from the speech and gesture community. This article provides an overview of our current understanding of manual and head gesture form and function, and of the principal functional interactions between gesture and speech in aiding communication, transporting meaning, and producing speech. Furthermore, we present an overview of research on temporal speech-gesture synchrony, including the special role of prosody in speech-gesture alignment. In addition, we provide a summary of tools and data available for gesture analysis, and describe speech-gesture interaction models and simulations in technical systems. This overview also serves as an introduction to a Special Issue covering a wide range of articles on these topics. We provide links to the Special Issue throughout this paper.
Usage of gestures along with other interaction modes in collaborative design
Currently, many computer-aided multi-modal interaction tools are under development, and some have demonstrated their applications in design. To avoid a disruptive transition from current design tools to multi-modal designing, several descriptive studies are needed to understand the interaction modes commonly used in design. To understand how gestures are integrated into collaborative design while using current design tools, a set of laboratory experiments was conducted with pairs of designers working together to solve a design problem. This paper addresses two questions: 1. Which interaction mode, among verbal, gestural, textual, graphical, and combinations of these, dominates in collaborative designing? 2. How do these interaction modes change across design stages (requirement identification, development of preliminary concepts, concept elaboration, evaluation, and detailing of chosen concepts)? The results aim to provide directions for developing new design tools that are aligned with designers' current interaction patterns as observed in using conventional CAD design tools.
Multimodal human discourse: gesture and speech
ACM Transactions on Computer-Human Interaction, 2002
Gesture and speech combine to form a rich basis for human conversational interaction. To exploit these modalities in HCI, we need to understand the interplay between them and the way in which they support communication. We propose a framework for the gesture research done to date, and present our work on the cross-modal cues for discourse segmentation in free-form gesticulation accompanying speech in natural conversation as a new paradigm for such multimodal interaction. The basis for this integration is the psycholinguistic concept of the coequal generation of gesture and speech from the same semantic intent. We present a detailed case study of a gesture and speech elicitation experiment in which a subject describes her living space to an interlocutor.
How we gesture towards machines
CHI '13 Extended Abstracts on Human Factors in Computing Systems on - CHI EA '13, 2013
This paper explores whether people perceive and perform touchless gestures differently when communicating with technology versus with humans. Qualitative reports from a lab study of 10 participants revealed that people perceive differences in the speed of performing gestures, sense of enjoyment, and feedback from the communication target. Preliminary analysis of 1200 gesture trials of motion capture data showed that hand shapes were less taut when communicating with technology. These differences provide implications for the design of gestural user interfaces that use symbolic gestures borrowed from human multimodal communication.
Creating a communication system from scratch: gesture beats vocalization hands down
2014
How does modality affect people's ability to create a communication system from scratch? The present study experimentally tests this question by having pairs of participants communicate a range of pre-specified items (emotions, actions, objects) over a series of trials to a partner using either non-linguistic vocalization, gesture or a combination of the two. Gesture-alone outperformed vocalization-alone, both in terms of successful communication and in terms of the creation of an inventory of sign-meaning mappings shared within a dyad (i.e., sign alignment). Combining vocalization with gesture did not improve performance beyond gesture-alone. In fact, for action items, gesture-alone was a more successful means of communication than the combined modalities. When people do not share a system for communication they can quickly create one, and gesture is the best means of doing so.
A Study of Gestures in a Video-Mediated Collaborative Assembly Task
Advances in Human-Computer Interaction, 2011
This paper presents the results of an experimental investigation of two gesture representations (overlaying hands and cursor pointer) in a video-mediated scenario: remote collaboration on a physical task. Our study assessed the relative value of the two gesture representations with respect to their effectiveness in task performance, user satisfaction, and perceived quality of collaboration in terms of coordination and interaction with the remote partner. Our results show no clear difference between the two gesture representations in effectiveness or user satisfaction. However, when considering the perceived quality of collaboration, the overlaying-hands condition was rated statistically significantly higher than the pointer-cursor condition. Our results seem to suggest that the value of a more expressive gesture representation is not so much a gain in performance but rather a gain in user experience, more specifically in the user's perceived quality of collaboration.