A Brain-Computer Interface for robotic arm control

Lenhardt A (2011)
Bielefeld: Universität Bielefeld.

Bielefeld Dissertation | English
Supervisor
Ritter, Helge
Abstract
Brain-Computer Interfaces (BCIs) are tools that open a new channel of communication between humans and machines. The majority of human input devices for computers require proper functioning of our primary sensory and motor functions, such as grasping, moving, and visual perception. In the case of severe motor disabilities, like amyotrophic lateral sclerosis (ALS) or spinal cord injury (SCI), these functions are impaired or lost, so an alternative communication channel is needed. The most common method of measuring brain activity suitable for BCI is electroencephalography (EEG), owing to its relative cost effectiveness and ease of use. Alternative ways to record brain signals exist, but they either require invasive procedures, i.e. opening the skull, or are very costly and bulky (MEG, fMRI), which renders them unsuitable for home use. One of the most popular brain-controlled input methods is the P300 speller paradigm, which gives the user control over a virtual keyboard to enter text. The term P300 refers to a specific EEG component that can be measured whenever a rare task-relevant stimulus is interspersed with many non-relevant stimuli. This method requires the ability to control the visual presentation of stimuli and therefore also requires some sort of computer-controlled display. The recognition rates for this type of BCI, while already quite high at roughly 80-90% accuracy, are still prone to errors and may not be suitable for critical applications like issuing movement commands to a wheelchair in a highly populated environment: commands to stop the wheelchair might be recognized too late. Furthermore, with the standard stimulus matrix it is impossible to react to external influences like obstacles or to select physical objects in a scene, which prevents the user from interacting with a dynamic environment. This work aims to fuse state-of-the-art BCI techniques into one single system to control an artificial actuator, such as a robot arm, and use it to manipulate the physical environment.
To achieve this goal, techniques originating from different fields of research, such as augmented reality, computer vision, psychology, machine learning, and data mining, have to be combined to form a robust, intuitive-to-use input device.
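The row/column selection principle behind the P300 speller mentioned in the abstract can be illustrated with a short simulation. This is a minimal sketch under assumed parameters (synthetic Gaussian noise, a hand-crafted P300-like deflection, a 6x6 stimulus matrix, and hypothetical names throughout), not the system developed in this thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

# 6x6 stimulus matrix: stimuli 0-5 are rows, 6-11 are columns.
# Assumed target character sits at row 2, column 4.
TARGET_ROW, TARGET_COL = 2, 4
N_REPS = 10        # each row/column flashes this many times
N_SAMPLES = 50     # samples per post-stimulus EEG epoch

def simulate_epoch(is_target):
    """One epoch: Gaussian noise, plus a P300-like positive bump
    around sample 30 (~300 ms) when the flashed stimulus contains
    the attended character."""
    epoch = rng.normal(0.0, 1.0, N_SAMPLES)
    if is_target:
        bump = np.exp(-0.5 * ((np.arange(N_SAMPLES) - 30) / 5.0) ** 2)
        epoch += 2.0 * bump
    return epoch

# Average the epochs of each stimulus over repetitions, then score
# each stimulus by its mean amplitude inside the P300 window.
scores = np.zeros(12)
for stim in range(12):
    is_target = stim == TARGET_ROW or stim == TARGET_COL + 6
    epochs = [simulate_epoch(is_target) for _ in range(N_REPS)]
    avg = np.mean(epochs, axis=0)
    scores[stim] = avg[25:40].mean()

# The selected character is the intersection of the strongest row
# and the strongest column.
best_row = int(np.argmax(scores[:6]))
best_col = int(np.argmax(scores[6:]))
print(best_row, best_col)
```

Averaging over repetitions is the key step: single-trial EEG is far too noisy to reveal the P300, but the noise shrinks with the square root of the number of averaged epochs while the event-related component stays put, which is also why more repetitions trade speed for accuracy in a real speller.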


