Adding language capabilities to a small robot

Towards a Personal Robot with Spoken Language Interface

The development of robots capable of accepting instructions in terms of concepts familiar to the user is still a challenge. For these robots to emerge, the development of natural language interfaces is essential, since this is regarded as the only acceptable interface for a machine expected to have a high level of interactivity with humans. Our group has been involved for several years in the development of a mobile intelligent robot, named Carl, designed with tasks in mind such as serving food at a reception or acting as a host in an organization. The approach followed in the design of Carl is based on an explicit concern with the integration of the major dimensions of intelligence, namely Communication, Action, Reasoning and Learning. This paper focuses on the multi-modal human-robot language communication capabilities of Carl, since these have been significantly improved during the last year.

Advances in natural language interaction in mobile robots used for practice education

World Automation Congress, 2004

This paper presents a proposal for laboratory assignments on the design and implementation of advanced interfaces for mobile robots using speech recognition. The main objective of these assignments is to analyze the possibilities of using speech interfaces as a complement to other interaction components of a mobile robot, such as artificial vision. The paper also …

Mobile robot programming using natural language

Robotics and Autonomous Systems, 2002

How will naive users program domestic robots? This paper describes the design of a practical system that uses natural language to teach a vision-based robot how to navigate in a miniature town. To enable unconstrained speech, the robot is provided with a set of primitive procedures derived from a corpus of route instructions. When the user refers to a route that is not known to the robot, the system will learn it by combining primitives as instructed by the user. This paper describes the components of the Instruction-Based Learning architecture and discusses issues of knowledge representation, the selection of primitives and the conversion of natural language into robot-understandable procedures.
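
As a rough illustration of the instruction-based learning idea summarised in this abstract, the sketch below shows how an unknown route might be learned as a composition of primitive procedures taught by the user. It is a minimal Python sketch under assumed names (PRIMITIVES, RouteLearner, the instruct callback), not the authors' implementation.

    # Hypothetical sketch: a route the robot does not know is taught by the
    # user as a sequence of known primitive procedures and stored for reuse.

    PRIMITIVES = {"turn_left", "turn_right", "go_forward", "stop"}

    class RouteLearner:
        def __init__(self):
            self.routes = {}                      # route name -> list of primitives

        def execute(self, route_name, instruct=None):
            if route_name not in self.routes:
                # Unknown route: fall back on the user's step-by-step instruction.
                steps = instruct(route_name) if instruct else []
                self.routes[route_name] = [s for s in steps if s in PRIMITIVES]
            return list(self.routes[route_name])  # actions the controller would run

    learner = RouteLearner()
    plan = learner.execute("to the post office",
                           instruct=lambda name: ["go_forward", "turn_left", "go_forward"])
    print(plan)                                   # ['go_forward', 'turn_left', 'go_forward']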

An Hybrid Approach for Spoken Natural Language Understanding Applied to a Mobile Intelligent Robot

2004

Abstract. The greater sophistication and complexity of machines increase the necessity to equip them with human-friendly interfaces. Since voice is the main medium of human-human communication, it is desirable to interact with machines, namely robots, using voice as well. In this paper we present the recent evolution of the Natural Language Understanding capabilities of Carl, our mobile intelligent robot capable of interacting with humans using spoken natural language.

Towards a personal robot with language interface

2003

Abstract. The development of robots capable of accepting instructions in terms of concepts familiar to the user is still a challenge. For these robots to emerge, the development of natural language interfaces is essential, since this is regarded as the only acceptable interface for a machine expected to have a high level of interactivity with humans.

Natural semantics for a mobile robot

1999

Functionalism is the view that a system (or system component) grasps the meaning of its inputs to the extent that it produces the right outputs. If a system retrieves all and only relevant documents in response to a query, we say it understands the query. If a robot avoids bumping into walls, we say it understands its sensors and its environment. If a chess program beats the world champion, we say it understands the game. One kind of functionalism, conventional functionalism, is currently very popular and productive in artificial intelligence and the other cognitive sciences, but it requires humans to specify the meanings of assertions. A second kind of functionalism, natural semantics, requires computers to learn these meanings. This paper discusses the limitations of conventional functionalism and describes some robotics work from our laboratory on natural semantics.

Towards Robot Learning from Spoken Language

Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction

The paper proposes a robot learning framework that empowers a robot to automatically generate a sequence of actions from unstructured spoken language. The framework was able to distinguish between instructions and unrelated conversation. Data were collected from 25 participants, who were asked to instruct the robot to perform a collaborative cooking task while being interrupted and distracted. The system was able to identify the sequence of instructed actions for a cooking task with an accuracy of 92.85 ± 3.87%.
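
To make the instruction-filtering step concrete, the following Python sketch keeps utterances that contain known cooking action verbs, discards unrelated conversation, and returns the ordered action sequence. All names (ACTION_VERBS, extract_actions) are assumptions for illustration; the paper's learned models are not reproduced here.

    # Hypothetical, heavily simplified stand-in for the instruction filter:
    # utterances with a known action verb are treated as instructions, the
    # rest as unrelated conversation.

    ACTION_VERBS = {"chop": "CHOP", "stir": "STIR", "pour": "POUR", "heat": "HEAT"}

    def extract_actions(utterances):
        actions = []
        for text in utterances:
            verbs = [ACTION_VERBS[w] for w in text.lower().split() if w in ACTION_VERBS]
            if verbs:                  # looks like an instruction
                actions.extend(verbs)
        return actions

    dialogue = ["Please chop the onions", "nice weather today", "now stir the sauce"]
    print(extract_actions(dialogue))   # ['CHOP', 'STIR']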

Automatic language acquisition by an autonomous robot

2003

There is no such thing as a disembodied mind. We posit that cognitive development can only occur through interaction with the physical world. To this end, we are developing a robotic platform for the purpose of studying cognition. We suggest that the central component of cognition is a memory which is primarily associative, one where learning occurs as the correlation of events from diverse inputs. We also believe that human-like cognition requires a well-integrated sensory-motor system to provide these diverse inputs. As implemented in our robot, this system includes binaural hearing, stereo vision, tactile sense, and basic proprioceptive control. On top of these abilities, we are implementing and studying various models of processing, learning and decision making. Our goal is to produce a robot that will learn to carry out simple tasks in response to natural language requests. The robot's understanding of language will be learned concurrently with its other cognitive abilities. We have already developed a robust system and conducted a number of experiments on the way to this goal, some details of which appear in this paper. This is a progress report of what we believe will be a long-term project with significant implications.
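
The associative-memory idea described in this abstract can be caricatured in a few lines: cross-modal events that occur together are counted, and the strongest association answers later queries. The sketch below is a minimal illustration under assumed names (AssociativeMemory, observe, best_object_for), not the project's actual architecture.

    # Hypothetical sketch: learning as co-occurrence of events from different
    # sensory channels (words heard vs. objects seen at the same moment).

    from collections import Counter
    from itertools import product

    class AssociativeMemory:
        def __init__(self):
            self.cooccur = Counter()

        def observe(self, heard_words, seen_objects):
            # Strengthen every (word, object) pair present in the same moment.
            for pair in product(heard_words, seen_objects):
                self.cooccur[pair] += 1

        def best_object_for(self, word):
            scores = {obj: n for (w, obj), n in self.cooccur.items() if w == word}
            return max(scores, key=scores.get) if scores else None

    memory = AssociativeMemory()
    memory.observe(["red", "ball"], ["ball", "table"])
    memory.observe(["ball"], ["ball"])
    print(memory.best_object_for("ball"))   # 'ball'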

Speaking autonomous intelligent devices

2002

The development of speech tools suitable for use in real-world environments requires collaboration between computational linguistics and new implementation fields, e.g. robotics, and the incorporation of new AI techniques to improve overall system performance. In this paper we present the core development concepts of SAID (Speaking Autonomous Intelligent Devices).

Utilizing spatial relations for natural language access to an autonomous mobile robot

Lecture Notes in Computer Science, 1994

Natural language, a primary communication medium for humans, facilitates better human-machine interaction and could be an efficient means of using intelligent robots in a more flexible manner. In this paper, we report on our joint efforts at providing natural language access to the autonomous mobile two-arm robot KAMRO. The robot is able to perform complex assembly tasks. To achieve autonomous behaviour, several camera systems are used for perception of the environment during task execution. Since natural language utterances must be interpreted with respect to the robot's current environment, the processing must be based on a referential semantics that is perceptually anchored. Considering localization expressions, we demonstrate how verbal descriptions, on the one hand, and knowledge about the physical environment, i.e. visual and geometric information, on the other, can be connected to each other.
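
As a minimal illustration of perceptually anchored interpretation of a localization expression, the Python sketch below scores a "left of" relation against geometric scene data and picks the best-matching object. Scene contents, coordinates and function names are invented for the example; they do not describe the KAMRO system itself.

    # Hypothetical sketch: resolving "the nut left of the bolt" against
    # 2-D object positions by scoring a simple spatial relation.

    SCENE = {"nut_1": (0.10, 0.40), "nut_2": (0.60, 0.42), "bolt_1": (0.35, 0.40)}

    def left_of_score(candidate, reference):
        # Degree to which the candidate lies to the left of the reference,
        # penalising vertical offset.
        dx = reference[0] - candidate[0]
        dy = abs(reference[1] - candidate[1])
        return max(0.0, dx) / (1.0 + dy)

    def resolve_left_of(reference_name):
        ref = SCENE[reference_name]
        scores = {name: left_of_score(pos, ref)
                  for name, pos in SCENE.items() if name != reference_name}
        return max(scores, key=scores.get)

    print(resolve_left_of("bolt_1"))   # 'nut_1'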