Exploring spoken dialog interaction in human-robot teams
Related papers
Human-Robot Dialogue and Collaboration in Search and Navigation
2018
Collaboration with a remotely located robot in tasks such as disaster relief and search and rescue can be facilitated by grounding natural language task instructions into actions executable by the robot in its current physical context. The corpus we describe here provides insight into the translation and interpretation a natural language instruction undergoes starting from verbal human intent, to understanding and processing, and ultimately, to robot execution. We use a ‘Wizard-of-Oz’ methodology to elicit the corpus data in which a participant speaks freely to instruct a robot on what to do and where to move through a remote environment to accomplish collaborative search and navigation tasks. This data offers the potential for exploring and evaluating action models by connecting natural language instructions to execution by a physical robot (controlled by a human ‘wizard’). In this paper, a description of the corpus (soon to be openly available) and examples of actions in the dialo...
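To make the instruction-to-execution pipeline concrete, here is a minimal, hypothetical sketch (not taken from the corpus or the paper) of how a free-form spoken instruction might be grounded into discrete robot commands; the command names and the keyword-matching approach are purely illustrative.

```python
# Illustrative sketch only: a toy mapping from a free-form instruction to
# discrete robot commands, roughly the kind of "instruction -> executable
# action" link the corpus is meant to support. Command names and the
# keyword matching are hypothetical, not taken from the paper.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class RobotCommand:
    action: str              # e.g. "move", "turn", "take_photo"
    argument: Optional[str]  # e.g. a direction, or None

def ground_instruction(utterance: str) -> List[RobotCommand]:
    """Very rough grounding of one spoken instruction into robot commands."""
    text = utterance.lower()
    commands = []
    if "forward" in text:
        commands.append(RobotCommand("move", "forward"))
    if "turn left" in text:
        commands.append(RobotCommand("turn", "left"))
    if "picture" in text or "photo" in text:
        commands.append(RobotCommand("take_photo", None))
    return commands

print(ground_instruction("Move forward a little and take a picture"))
# -> [RobotCommand(action='move', argument='forward'),
#     RobotCommand(action='take_photo', argument=None)]
```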
Human-Robot Conversational Interaction (HRCI)
Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction
Conversation is one of the primary methods of interaction between humans and robots. It provides a natural way of communicating with the robot and reduces the barriers posed by other interfaces (e.g., text or touch) that may cause difficulties for certain populations, such as the elderly or those with disabilities, thereby promoting inclusivity in Human-Robot Interaction (HRI). Work in HRI has contributed significantly to the design, understanding and evaluation of human-robot conversational interactions. Concurrently, the Conversational User Interfaces (CUI) community has developed with similar aims, though with a wider focus on conversational interactions across a range of devices and platforms. This workshop aims to bring together the CUI and HRI communities to outline key shared opportunities and challenges in developing conversational interactions with robots, resulting in collaborative publications targeted at the CUI 2023 provocations track.
2012
Developing interactive robots is an extremely challenging task which requires a broad range of expertise across diverse disciplines, including robotic planning, spoken language understanding, belief tracking and action management. While there has been a boom in recent years in the development of reusable components for robotic systems within common architectures, such as the Robot Operating System (ROS), little emphasis has been placed on developing components for Human-Robot Interaction.
How Do People Talk with a Robot? An Analysis of Human-Robot Dialogues in the Real World
This paper reports the preliminary results of a human-robot dialogue analysis in the real world with the goal of understanding users’ interaction patterns. We analyzed the dialogue log data of Roboceptionist, a robotic receptionist located in a high-traffic area in an academic building [2][3]. The results show that (i) the occupation and background (persona) of the robot help people establish common ground with the robot, and (ii) there is great variability in the extent to which users follow the social norms of human-human dialogue in human-robot dialogues. Based on these results, we describe implications for designing the dialogue of a social robot.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting
Industry, military, and academia are showing increasing interest in collaborative human-robot teaming in a variety of task contexts. Designing effective user interfaces for human-robot interaction is an ongoing challenge, and a variety of single- and multiple-modality interfaces have been explored. Our work develops a bi-directional natural language interface for remote human-robot collaboration in physically situated tasks. When combined with a visual interface and audio cueing, we intend for the natural language interface to provide a naturalistic user experience that requires little training. Building the language portion of this interface requires first understanding how potential users would speak to the robot. In this paper, we describe our elicitation of minimally-constrained robot-directed language, observations about the users' language behavior, and future directions for constructing an automated robotic system that can accommodate these language needs.
The Curious Robot as a Case-Study for Comparing Dialog Systems
2011
Modeling interaction with robots raises new and different challenges for dialog modeling than traditional dialog modeling with less embodied machines. We present four case studies of implementing a typical human-robot interaction scenario with different state ...
2018
The Niki and Julie corpus contains more than 600 dialogues between human participants and a human-controlled robot or virtual agent, engaged in a series of collaborative item-ranking tasks designed to measure influence. Some of the dialogues contain deliberate conversational errors by the robot, designed to simulate the kinds of conversational breakdown that are typical of present-day automated agents. Data collected include audio and video recordings, the results of the ranking tasks, and questionnaire responses; some of the recordings have been transcribed and annotated for verbal and nonverbal feedback. The corpus has been used to study influence and grounding in dialogue. All the dialogues are in American English.
Crowdsourcing Real World Human-Robot Dialog and Teamwork through Online Multiplayer Games
AI Magazine, 2011
We present an innovative approach for large-scale data collection in human-robot interaction research through the use of online multi-player games. By casting a robotic task as a collaborative game, we gather thousands of examples of human-human interactions online, and then leverage this corpus of action and dialog data to create contextually relevant, social and task-oriented behaviors for human-robot interaction in the real world. We demonstrate our work in a collaborative search and retrieval task requiring dialog, action synchronization and action sequencing between the human and robot partners. A user study performed at the Boston Museum of Science shows that the autonomous robot exhibits many of the same patterns of behavior that were observed in the online dataset and survey results rate the robot similarly to human partners in several critical measures.
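As a rough illustration of leveraging such a corpus, the sketch below is hypothetical; the context keys, corpus format, and behavior names are not from the paper. It simply picks the most frequent human response observed in a matching game context.

```python
# Hypothetical sketch: selecting a robot behavior from a corpus of logged
# human-human game interactions by choosing the most frequent response seen
# in a matching context. The context key and corpus format are illustrative,
# not the representation used in the paper.
from collections import Counter

# corpus: (context, response) pairs logged from the online game
corpus = [
    ("partner_requested_object:wrench", "hand_over_object"),
    ("partner_requested_object:wrench", "say:Here you go!"),
    ("partner_requested_object:wrench", "hand_over_object"),
]

def select_behavior(context: str) -> str:
    """Return the most common corpus response for this context."""
    matches = [resp for ctx, resp in corpus if ctx == context]
    if not matches:
        return "say:Sorry, I'm not sure what to do."
    return Counter(matches).most_common(1)[0][0]

print(select_behavior("partner_requested_object:wrench"))  # hand_over_object
```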
Human-robot interaction through spoken language dialogue
2000
The development of robots that are able to accept instructions, via a friendly interface, in terms of concepts that are familiar to a human user remains a challenge. It is argued that designing and building such intelligent robots can be seen as the problem of integrating four main dimensions: human-robot communication, sensory-motor skills and perception, decision-making capabilities, and learning.
Balancing Efficiency and Coverage in Human-Robot Dialogue Collection
ArXiv, 2018
We describe a multi-phased Wizard-of-Oz approach to collecting human-robot dialogue in a collaborative search and navigation task. The data is being used to train an initial automated robot dialogue system to support collaborative exploration tasks. In the first phase, a wizard freely typed robot utterances to human participants. For the second phase, this data was used to design a GUI that includes buttons for the most common communications, and templates for communications with varying parameters. Comparison of the data gathered in these phases shows that the GUI enabled a faster pace of dialogue while still maintaining high coverage of suitable responses, allowing more efficient, targeted data collection and improving natural language understanding trained on GUI-collected data. As a promising first step towards interactive learning, this work shows that our approach enables the collection of useful training data for navigation-based HRI tasks.
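The following sketch illustrates, under assumed template strings and slot names that are not taken from the paper, the kind of fixed-button plus parameterized-template messaging the GUI phase describes.

```python
# Illustrative sketch of parameterized wizard message templates: fixed buttons
# for common utterances plus templates with fill-in slots. Template strings
# and field names are hypothetical, not from the published GUI.
COMMON_BUTTONS = [
    "I see a door on my left.",
    "I am unable to comply.",
    "Done.",
]

TEMPLATES = {
    "move": "Moving {distance} feet {direction}.",
    "photo": "Sending a photo of the {object}.",
}

def render(template_id: str, **slots: str) -> str:
    """Fill a wizard template's slots to produce a robot utterance."""
    return TEMPLATES[template_id].format(**slots)

print(render("move", distance="10", direction="forward"))
# -> "Moving 10 feet forward."
```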