Brain-Actuated Interaction

Non-invasive brain-machine interaction

International Journal of Pattern Recognition and Artificial Intelligence, 2008

The promise of Brain-Computer Interfaces (BCI) technology is to augment human capabilities by enabling interaction with computers through a conscious and spontaneous modulation of the brainwaves after a short training period. Indeed, by analyzing brain electrical activity online, several groups have designed brain-actuated devices that provide alternative channels for communication, entertainment and control. Thus, a person can write messages using a virtual keyboard on a computer screen and also browse the internet. Alternatively, subjects can operate simple computer games, or brain games, and interact with educational software. Work with humans has shown that it is possible for them to move a cursor and even to drive a wheelchair. This paper briefly reviews the field of BCI, with a focus on non-invasive systems based on electroencephalogram (EEG) signals. It also describes three brain-actuated devices we have developed: a virtual keyboard, a brain game, and a wheelchair. Finally, it briefly discusses current research directions we are pursuing in order to improve the performance and robustness of our BCI system, especially for real-time control of brain-actuated robots.

Human machine interaction via brain activity monitoring

2013 6th International Conference on Human System Interactions (HSI), 2013

Abstract. Brain-Computer Interfaces (BCI) are becoming increasingly studied as methods for users to interact with computers, because recent technological developments have led to low-priced, high-precision BCI devices aimed at the mass market. This paper investigates the feasibility of using such a device in real-world applications, as well as the limitations of such applications. The device tested in this paper is the Emotiv EPOC headset, an electroencephalography (EEG) device that measures brain activity using 14 strategically placed sensors. This paper presents: 1) a BCI framework driven completely by thought patterns, aimed at real-world applications; 2) a quantitative analysis of the performance of the implemented system. The Emotiv EPOC headset based BCI framework presented in this paper was tested on the problem of controlling a simple differential wheeled robot by identifying four thought patterns in the user: "neutral", "move forward", "turn left", and "turn right". The developed approach was tested on 6 individuals, and the results show that while BCI control of a mobile robot is possible, the precise movement required to guide a robot along a set path is difficult with the current setup. Furthermore, intense concentration is required from users to control the robot accurately.
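The control scheme described in this abstract can be sketched in a few lines: each of the four classified thought patterns maps onto a pair of wheel velocities for a differential wheeled robot. The function name, labels, and speed values below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch: translating the four classified thought patterns
# ("neutral", "move forward", "turn left", "turn right") into wheel
# velocities for a differential wheeled robot. Speeds are illustrative.

def command_to_wheel_speeds(command, forward_speed=0.3, turn_speed=0.2):
    """Return (left_wheel, right_wheel) velocities in m/s."""
    if command == "move forward":
        return (forward_speed, forward_speed)
    if command == "turn left":
        # Driving the left wheel backward and the right wheel forward
        # rotates the robot in place to the left.
        return (-turn_speed, turn_speed)
    if command == "turn right":
        return (turn_speed, -turn_speed)
    return (0.0, 0.0)  # "neutral": stop

print(command_to_wheel_speeds("turn left"))  # (-0.2, 0.2)
```

With opposite-signed wheel speeds the robot turns in place, which matches the coarse, discrete control the paper reports rather than smooth path following.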

Non-invasive brain-actuated interaction

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2007

The promise of Brain-Computer Interfaces (BCI) technology is to augment human capabilities by enabling interaction with computers through a conscious and spontaneous modulation of the brainwaves after a short training period. Indeed, by analyzing brain electrical activity online, several groups have designed brain-actuated devices that provide alternative channels for communication, entertainment and control. Thus, a person can write messages using a virtual keyboard on a computer screen and also browse the internet. Alternatively, subjects can operate simple computer games, or brain games, and interact with educational software. Work with humans has shown that it is possible for them to move a cursor and even to drive a wheelchair. This paper briefly reviews the field of BCI, with a focus on non-invasive systems based on electroencephalogram (EEG) signals. It also describes three brain-actuated devices we have developed: a virtual keyboard, a brain game, and a wheelchair. Finally, it briefly discusses current research directions we are pursuing in order to improve the performance and robustness of our BCI system, especially for real-time control of brain-actuated robots.

Non-invasive brain-actuated control of a mobile robot

International Joint Conference on Artificial Intelligence, 2003

Recent experiments have indicated the possibility of using brain electrical activity to directly control the movement of robotic or prosthetic devices. In this paper we report results with a portable non-invasive brain-computer interface that makes possible the continuous control of a mobile robot in a house-like environment. The interface uses 8 surface electrodes to measure electroencephalogram (EEG) signals, from which a statistical classifier recognizes 3 different mental states. Until now, brain-actuated control of robots has relied on invasive approaches, requiring surgical implantation of electrodes, since EEG-based systems have been considered too slow for controlling rapid and complex sequences of movements. Here we show that, after a few days of training, two human subjects successfully moved a robot between several rooms by mental control only. Furthermore, mental control was only marginally worse than manual control on the same task.
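The asynchronous protocol this abstract alludes to, where a statistical classifier recognizes 3 mental states from continuous EEG, typically issues a command only when the classifier is confident, and otherwise takes no action. The sketch below illustrates that decision rule; the state labels and threshold are assumptions for illustration, not the authors' actual values.

```python
# Illustrative sketch (not the authors' code): turning per-sample
# classifier posteriors over 3 mental states into robot commands.
# A command is issued only when one state's probability exceeds a
# confidence threshold; uncertain samples produce no action, which is
# how many asynchronous EEG protocols reject ambiguous input.

def decide(posteriors, threshold=0.85):
    """posteriors: dict mapping mental-state label -> probability."""
    state = max(posteriors, key=posteriors.get)
    if posteriors[state] >= threshold:
        return state
    return None  # uncertain sample: keep the robot's current behaviour

print(decide({"relax": 0.05, "left": 0.90, "right": 0.05}))  # left
print(decide({"relax": 0.40, "left": 0.35, "right": 0.25}))  # None
```

Rejecting low-confidence samples trades responsiveness for fewer wrong commands, a sensible bias when a misclassification steers a physical robot.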

Noninvasive Brain-Actuated Control of a Mobile Robot by Human EEG

IEEE Transactions on Biomedical Engineering, 2004

Brain activity recorded non-invasively is sufficient to control a mobile robot if advanced robotics is used in combination with asynchronous EEG analysis and machine learning techniques. Until now, brain-actuated control has mainly relied on implanted electrodes, since EEG-based systems have been considered too slow for controlling rapid and complex sequences of movements. We show that two human subjects successfully moved a robot between several rooms by mental control only, using an EEG-based brain-machine interface that recognized three mental states. Mental control was comparable to manual control on the same task, with a performance ratio of 0.74.

Adaptive brain interfaces for communication and control

2003

This paper describes our work on a portable non-invasive brain-computer interface (BCI), called Adaptive Brain Interfaces (ABI), that analyzes the user's spontaneous electroencephalogram (EEG) signals online, from which a neural classifier recognizes 3 different mental states. The outputs of the classifier are used as mental commands to operate communication and control devices. Although still at a research stage, BCIs offer the possibility to augment human capabilities in a natural way and are particularly relevant as an aid for paralyzed humans.

Brain-computer interface research at Katholieke Universiteit Leuven

We present an overview of our Brain-computer interface (BCI) research, invasive as well as non-invasive, during the past four years. The invasive BCIs are based on local field and action potentials recorded with microelectrode arrays implanted in the visual cortex of the macaque monkey. The non-invasive BCIs are based on electroencephalogram (EEG) recorded from a human subject's scalp. Several EEG paradigms were used to enable the subject to type text or to select icons on a computer screen, without having to rely on one's fingers, gestures, or any other form of motor activity: the P300 event-related potential, the steady-state visual evoked potential, and the error-related potential. We report on the status of our EEG BCI tests on healthy subjects as well as patients with severe communication disabilities, and our demonstrations to a broad audience to raise public awareness of BCI.
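The P300 paradigm mentioned in this abstract rests on a simple signal-processing idea: average the EEG epochs time-locked to each stimulus, and the stimulus (e.g. a row or column of the speller matrix) whose average shows the largest post-stimulus deflection is taken as the one the subject attended. The toy data and the peak-amplitude criterion below are simplified illustrations, not the group's actual detector.

```python
# Hedged sketch of the core of a P300 speller: averaging time-locked
# epochs suppresses background EEG while the event-related potential
# survives, so the attended stimulus stands out in the average.

def average_epochs(epochs):
    """Element-wise mean of equal-length epochs (lists of samples)."""
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

def pick_target(epochs_by_stimulus):
    """Choose the stimulus whose averaged epoch peaks highest."""
    return max(epochs_by_stimulus,
               key=lambda s: max(average_epochs(epochs_by_stimulus[s])))

epochs = {
    "row3": [[0.1, 0.9, 0.2], [0.0, 1.1, 0.1]],  # P300-like peak present
    "row1": [[0.1, 0.2, 0.1], [0.0, 0.1, 0.2]],
}
print(pick_target(epochs))  # row3
```

Real spellers score full row/column averages with a trained classifier rather than a raw peak, but the averaging step shown here is the common foundation.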

Brain Computer Interfaces, a Review

A brain-computer interface (BCI) is a hardware and software communications system that permits cerebral activity alone to control computers or external devices. The immediate goal of BCI research is to provide communication capabilities to severely disabled people who are totally paralyzed or 'locked in' by neurological or neuromuscular disorders, such as amyotrophic lateral sclerosis, brain stem stroke, or spinal cord injury. Here, we review the state of the art of BCIs, looking at the different steps that form a standard BCI: signal acquisition, preprocessing or signal enhancement, feature extraction, classification, and the control interface. We discuss their advantages, drawbacks, and latest advances, and we survey the numerous technologies reported in the scientific literature to design each step of a BCI. First, the review examines the neuroimaging modalities used in the signal acquisition step, each of which monitors a different functional brain activity such as electrical, magnetic or metabolic activity. Second, the review discusses the different electrophysiological control signals that determine user intentions, which can be detected in brain activity. Third, the review includes some techniques used in the signal enhancement step to deal with artifacts in the control signals and improve performance. Fourth, the review studies some mathematical algorithms used in the feature extraction and classification steps, which translate the information in the control signals into commands that operate a computer or other device. Finally, the review provides an overview of various BCI applications that control a range of devices.
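The pipeline this review enumerates (signal acquisition, preprocessing, feature extraction, classification, control interface) can be sketched end to end in a few lines. Every function body below is a toy stand-in chosen for illustration, not a real implementation of any stage.

```python
# Minimal end-to-end sketch of the standard BCI pipeline stages:
# acquisition -> preprocessing -> feature extraction -> classification
# -> control interface. Each stage is a deliberately simplified toy.

def preprocess(raw):                 # signal enhancement: remove DC offset
    mean = sum(raw) / len(raw)
    return [x - mean for x in raw]

def extract_features(signal):        # e.g. signal power as a single feature
    return [sum(x * x for x in signal) / len(signal)]

def classify(features, threshold=0.5):   # toy two-class decision rule
    return "command_A" if features[0] > threshold else "command_B"

def control_interface(command):      # map the decision to a device action
    return {"command_A": "select", "command_B": "idle"}[command]

raw_epoch = [0.2, 1.4, -0.8, 1.9, -0.5]  # stand-in for acquired EEG samples
print(control_interface(classify(extract_features(preprocess(raw_epoch)))))
```

In a real system each stage is far richer (spatial filtering, band-power or ERP features, trained classifiers), but the data flow between stages is exactly this chain.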

Brain-Computer Interface for Controlling a Mobile Robot Utilizing an Electroencephalogram Signals

A brain-computer interface (BCI) system provides a communication channel between the human brain and external devices. The system processes and translates thought into control signals, thus enabling a user to navigate a robot from one place to another. In this context, we developed a system that enables a user to guide a robot by brain waves. The system consists of an Emotiv Epoc headset, a personal computer, and a mobile robot. The Emotiv Epoc headset is attached to the user's head and is used to collect electroencephalogram (EEG) signals. The headset picks up brain activity from 14 locations on the scalp and sends it to the computer for processing. This brain activity tells the system what the person intends to do in the virtual environment. Then, using a novel application designed for this purpose, the cognitive suite supplied by Emotiv generates the control actions needed to make the robot execute three different commands: turn right, turn left, and move forward. In this paper, hardware and software architectures were designed and implemented. Experimental results indicate that the robot can be successfully controlled in real time based on the subject's physiological changes.

Keywords: brain-computer interface (BCI), electroencephalogram (EEG), Emotiv Epoc neuroheadset.