Eye gaze patterns in conversations
Related papers
Eye Gaze Patterns in Conversations: There is More to Conversational Agents Than Meets the Eyes
2000
In multi-agent, multi-user environments, users as well as agents should have a means of establishing who is talking to whom. In this paper, we present an experiment aimed at evaluating whether gaze directional cues of users could be used for this purpose. Using an eye tracker, we measured subject gaze at the faces of conversational partners during four-person conversations. Results indicate that when someone is listening or speaking to individuals, there is indeed a high probability that the person looked at is the person being listened to (p=88%) or spoken to (p=77%). We conclude that gaze is an excellent predictor of conversational attention in multiparty conversations. As such, it may form a reliable source of input for conversational systems that need to establish whom the user is speaking or listening to. We implemented our findings in FRED, a multi-agent conversational system that uses eye input to gauge which agent the user is listening or speaking to.
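The reported probabilities suggest a simple decision rule for a FRED-style system: treat the partner who receives the largest share of gaze during an utterance as the addressee. A minimal sketch in Python, assuming gaze has already been resolved to partner identities; the function name and sample data are illustrative, not taken from the paper:

```python
from collections import Counter

def infer_addressee(gaze_samples):
    """Pick the most frequently gazed-at partner as the likely addressee.

    gaze_samples: list of partner ids, one per gaze sample during an utterance.
    Returns (partner_id, share of samples), or (None, 0.0) if there are no samples.
    """
    if not gaze_samples:
        return None, 0.0
    partner, n = Counter(gaze_samples).most_common(1)[0]
    return partner, n / len(gaze_samples)

# Example: a four-person conversation in which the speaker mostly looks at "bob".
samples = ["bob"] * 14 + ["carol"] * 4 + ["dave"] * 2
addressee, share = infer_addressee(samples)
print(f"Likely addressee: {addressee} ({share:.0%} of gaze samples)")
```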
Eye-gaze experiments for conversation monitoring
ACM International Conference Proceeding Series, 2009
Eye-tracking technology has matured to the point that unobtrusive, natural user experiments have become easier to conduct. Simultaneously, human-computer interactions have become more conversational in style, and more challenging in that they require various human conversational strategies, such as giving feedback and managing turn-taking. In this paper, we focus on eye gaze in order to investigate turn-taking signals and conversation monitoring in naturally occurring dialogues. We seek to build models that capture which interlocutor the speaker is talking to and what kind of turn-taking signals the partners elicit, and we report the first results of our eye-tracking experiments.
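Models of this kind build on a classic observation: speakers tend to gaze at the intended next speaker toward the end of a turn. A heuristic sketch of such a turn-yield detector, with made-up data structures rather than the authors' model:

```python
from dataclasses import dataclass

@dataclass
class GazeEvent:
    target: str      # who or what the speaker is looking at ("away" = no partner)
    t_start: float   # seconds
    t_end: float

def turn_yielded_to(events, utterance_end, window=1.0):
    """Heuristic: a speaker gazing at a listener within the final `window`
    seconds of an utterance is read as yielding the turn to that listener."""
    for ev in reversed(events):
        if ev.t_end >= utterance_end - window and ev.target != "away":
            return ev.target
    return None

events = [GazeEvent("away", 0.0, 2.1), GazeEvent("anna", 2.1, 3.4)]
print(turn_yielded_to(events, utterance_end=3.4))  # -> anna
```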
Gaze, conversational agents and face-to-face communication
Speech Communication, 2010
In this paper, we describe two series of experiments that examine audiovisual face-to-face interaction between naive human viewers and either a human interlocutor or a virtual conversational agent. The main objective is to analyze the interplay between speech activity and mutual gaze patterns during mediated face-to-face interactions. We first quantify the impact of the deictic gaze patterns of our agent. We then aim to refine our experimental knowledge of mutual gaze patterns during human face-to-face interaction by using new technological devices such as non-invasive eye trackers and pinhole cameras, and to quantify the impact of a selection of cognitive states and communicative functions on the recorded gaze patterns.
In this paper, we investigate the user's eye gaze behavior during conversation with an interactive storytelling application. We present an interactive eye gaze model for embodied conversational agents that aims to improve the experience of users participating in Interactive Storytelling. The narrative in which the approach was tested is based on a classic nineteenth-century psychological novel: Madame Bovary, by Flaubert. At various stages of the narrative, the user can address the main character or respond to her using free-style spoken natural language input, impersonating her lover. An eye tracker was connected so that the interactive gaze model could respond to the user's current gaze (i.e., whether or not the user is looking into the virtual character's eyes). We conducted a study with 19 students in which we compared our interactive eye gaze model with a non-interactive model that was informed by studies of human gaze behavior but had no information on where the user was looking. The interactive model achieved higher user ratings than the non-interactive model. In addition, we analyzed the users' gaze behavior during the conversation with the virtual character.
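The interactive model described here amounts to a gaze-contingent policy: the agent's eye behavior branches on whether the eye tracker reports the user looking into its eyes. A minimal sketch; the probabilities are placeholders, not values fitted in the study:

```python
import random

def agent_gaze_action(user_looking_at_agent, p_return=0.8, p_initiate=0.3):
    """Gaze-contingent rule: mostly return the user's gaze, occasionally
    avert; now and then initiate eye contact when not being looked at.
    The probabilities are illustrative assumptions."""
    if user_looking_at_agent:
        return "look_at_user" if random.random() < p_return else "avert"
    return "look_at_user" if random.random() < p_initiate else "idle"

# One decision tick, driven by a (hypothetical) eye-tracker flag.
print(agent_gaze_action(user_looking_at_agent=True))
```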
Conversing with the user based on eye-gaze patterns
Proceedings of the SIGCHI conference on Human factors in computing systems - CHI '05, 2005
Motivated by and grounded in observations of eye-gaze patterns in human-human dialogue, this study explores the use of eye-gaze patterns in managing human-computer dialogue. We developed an interactive system for city trip planning, iTourist, which encapsulates knowledge of eye-gaze patterns gained from studies of human-human collaboration. User study results show that it was possible to sense users' interest based on eye-gaze patterns and to manage the computer's information output accordingly. Study participants could successfully plan their trips with iTourist and rated their experience of using it positively. We demonstrate that eye gaze could play an important role in managing future multimodal human-computer dialogues.
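Sensing interest from gaze, as iTourist does, implies some accumulator of gaze evidence per on-screen object. A minimal leaky-integrator sketch; the gain, decay, and threshold are illustrative assumptions, not iTourist's published parameters:

```python
def update_interest(score, fixated, dt, gain=0.6, decay=0.3):
    """Leaky integrator per on-screen object: the score rises while the
    object is fixated and decays otherwise, clamped to [0, 1].
    Constants are illustrative, not the published parameters."""
    score += gain * dt if fixated else -decay * dt
    return max(0.0, min(1.0, score))

score, threshold = 0.0, 0.5
for step in range(12):           # gaze samples at roughly 10 Hz
    fixated = step < 10          # the user dwells on a place, then looks away
    score = update_interest(score, fixated, dt=0.1)
    if score >= threshold:
        print(f"step {step}: trigger spoken output about this place")
        break
```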
(Simulated) listener gaze in real-time spoken interaction
Computer Animation and Virtual Worlds, 2018
Gaze is an important aspect of social communication. Previous research has concentrated mainly on the roles of speaker gaze and listener gaze in isolation, neglecting the effect of the listener's gaze behavior on the speaker's behavior. This paper presents an exploratory eye-tracking study involving an interactive human-like agent that follows participants' gaze. The study demonstrates that a rather simple gaze-following mechanism convincingly simulates active listening behavior that engages the speaker. It also highlights how speakers rely on their interlocutors' gaze when establishing common references.
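The gaze-following mechanism can be stated in a few lines: after a short latency, re-fixate whatever the speaker is looking at. A sketch assuming frame-based updates; the delay and target names are illustrative:

```python
def follow_gaze(agent_target, user_target, lag, delay=6):
    """Listener gaze-following: re-fixate whatever the speaker looks at,
    after `delay` frames (~200 ms at 30 fps), to signal active listening.
    Returns (new agent target, updated lag counter)."""
    if user_target != agent_target:
        lag += 1
        if lag >= delay:
            return user_target, 0
    else:
        lag = 0
    return agent_target, lag

agent, lag = "speaker_face", 0
for user in ["speaker_face"] + ["map"] * 7:
    agent, lag = follow_gaze(agent, user, lag)
print(agent)  # -> map: the agent has followed the speaker's gaze
```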
Evaluating look-to-talk: a gaze-aware interface in a collaborative environment
CHI'02 extended …, 2002
We present "look-to-talk", a gaze-aware interface for directing a spoken utterance to a software agent in a multiuser collaborative environment. Through a prototype and a Wizard-of-Oz (WOz) experiment, we show that "look-totalk" is indeed a natural alternative to speech and other paradigms.
Modelling gaze behavior for conversational agents
2003
In this paper we propose an eye gaze model for an embodied conversational agent that embeds information on communicative functions as well as statistical information on gaze patterns. The latter information was derived from analytic studies of an annotated video corpus of conversational dyads. We aim to generate different gaze behaviors so as to simulate several personalized gaze habits of an embodied conversational agent.
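A common minimal realization of such a statistically informed model alternates look-at and look-away states with dwell times drawn from corpus-derived distributions. A sketch with exponential dwell times; the per-function means are invented placeholders, not the authors' corpus statistics:

```python
import random

# Invented per-function mean dwell times (seconds); the paper derives such
# statistics from an annotated video corpus instead.
MEAN_DWELL = {"speaking":  {"at_partner": 1.8, "away": 2.4},
              "listening": {"at_partner": 3.2, "away": 1.1}}

def generate_gaze(function, total=10.0, seed=1):
    """Alternate look-at/look-away states with exponentially distributed
    dwell times, a common minimal model of dyadic gaze behavior."""
    rng = random.Random(seed)
    t, state, script = 0.0, "at_partner", []
    while t < total:
        dwell = rng.expovariate(1.0 / MEAN_DWELL[function][state])
        script.append((round(t, 2), state, round(dwell, 2)))
        t += dwell
        state = "away" if state == "at_partner" else "at_partner"
    return script

for entry in generate_gaze("listening"):
    print(entry)
```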
Visual Attention and Eye Gaze during Multiparty Conversations with Distractions
Our objective is to develop a computational model to predict visual attention behavior for an embodied conversational agent. During interpersonal interaction, gaze provides feedback signals and directs conversation flow. Simultaneously, in a dynamic environment, gaze is also drawn to peripheral movements. An embodied conversational agent should therefore employ social gaze not only for interpersonal interaction but should also possess human attention attributes, so that its eyes and facial expression convey appropriate distraction and engagement behaviors.
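One way to realize such a model is priority-based arbitration: the partner's salience, weighted by engagement, competes with peripheral motion events for the agent's gaze. A sketch with illustrative weights:

```python
def attention_target(partner_salience, distractors, engagement=0.7):
    """Priority arbitration: the partner's salience, scaled by engagement,
    competes with peripheral motion events for the agent's gaze.
    Weights and event names are illustrative assumptions."""
    best, score = "partner", engagement * partner_salience
    for name, motion in distractors.items():
        if motion > score:
            best, score = name, motion
    return best

# A door slams (strong peripheral motion) during the conversation.
print(attention_target(1.0, {"door": 0.9, "window": 0.2}))  # -> door
print(attention_target(1.0, {"door": 0.3, "window": 0.2}))  # -> partner
```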
Controlling the Gaze of Conversational Agents
2005
We report on a pilot experiment that investigated the effects of different eye gaze behaviours of a cartoon-like talking face on the quality of human-agent dialogues. We compared a version of the talking face that roughly implements some patterns of human-like gaze behaviour with two other versions: one in which shifts in gaze were kept minimal, and one in which shifts occurred randomly. The talking face has a number of restrictions; in particular, there is no speech recognition, so questions and replies have to be typed in by the user. Despite this restriction, we found that participants who conversed with the agent behaving according to the human-like patterns appreciated it more than participants who conversed with the other agents. Conversations with this version also proceeded more efficiently: participants needed less time to complete their task.
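The three compared conditions can be pictured as three gaze-shift schedulers. A sketch with invented intervals, not the experiment's actual settings:

```python
import random

def next_gaze_shift(condition, t, rng=random.Random(0)):
    """Schedule the next gaze shift under the three compared conditions:
    human-like shifts every few seconds, minimal shifts almost never,
    random shifts at arbitrary moments. Intervals are illustrative."""
    if condition == "human-like":
        return t + rng.uniform(1.5, 4.0)
    if condition == "minimal":
        return t + 60.0              # effectively static gaze
    return t + rng.uniform(0.2, 8.0)

for cond in ("human-like", "minimal", "random"):
    print(cond, round(next_gaze_shift(cond, t=0.0), 2))
```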