A Brain-Computer Interface for robotic arm control

Lenhardt A (2011)
Bielefeld: Universität Bielefeld.

Bielefeld E-Dissertation | English
 
Reviewer / Supervisor
Ritter, Helge
Abstract / Remarks
Brain-Computer Interfaces (BCI) are tools that open a new channel of communication between humans and machines. The majority of human input devices for computers require properly functioning primary sensory and motor capabilities, such as grasping, moving, and visual perception. In the case of severe motor disabilities, such as amyotrophic lateral sclerosis (ALS) or spinal cord injury (SCI), these channels may no longer be available. The most common method to measure brain activity suitable for BCI is electroencephalography (EEG), due to its relative cost effectiveness and ease of use. Alternative ways to extract brain signals exist, but they either require invasive procedures, i.e. opening the skull, or are very costly and bulky (MEG, fMRI), which renders them unsuitable for home use.

One of the most popular brain-controlled input methods is the P300 speller paradigm, which gives the user control over a virtual keyboard to enter text. The term P300 refers to a specific EEG component that can be measured whenever a rare task-relevant stimulus is interspersed with many non-relevant stimuli. This method requires the ability to control the visual presentation of stimuli and therefore also requires some form of computer-controlled display. The recognition rates for this type of BCI, though already quite high at roughly 80-90% accuracy, are still prone to errors and may not be suitable for critical applications such as issuing movement commands to a wheelchair in a highly populated environment; commands to stop the wheelchair might be recognized too late. Furthermore, the standard stimulus matrix cannot react to external influences such as obstacles, nor can it be used to select physical objects in a scene, which prevents the user from interacting with a dynamic environment.

This work aims to fuse state-of-the-art BCI techniques into one single system to control an artificial actuator, such as a robot arm, and use it to manipulate the physical environment. To achieve this goal, multiple techniques originating from different fields of research, such as augmented reality, computer vision, psychology, machine learning, and data mining, have to be combined to form a robust, intuitive-to-use input device.
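As a rough illustration of the P300 detection principle described in the abstract, the sketch below simulates an oddball experiment: rare target stimuli elicit a P300-like deflection around 300 ms after stimulus onset, frequent non-target stimuli do not, and a linear discriminant classifier separates the two classes of epochs. LDA is a common choice for P300 BCIs, but the signal parameters, channel count, synthetic data, and classifier settings here are illustrative assumptions, not the method or data of the dissertation.

```python
# Minimal sketch of P300 oddball classification (illustrative assumptions only):
# synthetic single-channel EEG epochs, a Gaussian "P300" bump ~300 ms after
# target stimuli, and an LDA classifier as a stand-in for the BCI's detector.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 250                       # sampling rate in Hz (assumed)
t = np.arange(0, 0.8, 1 / fs)  # 800 ms epoch after each stimulus

def make_epoch(is_target: bool) -> np.ndarray:
    """One synthetic post-stimulus epoch; targets get a P300-like bump."""
    noise = rng.normal(0.0, 1.0, t.size)
    if is_target:
        p300 = 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # peak near 300 ms
        return noise + p300
    return noise

# Labelled data set: rare targets interspersed with frequent non-targets.
labels = rng.random(600) < 1 / 6          # ~1 in 6 stimuli is task relevant
epochs = np.stack([make_epoch(y) for y in labels])

# Classify each epoch from its raw samples (real systems average repetitions
# and combine several electrodes; this is deliberately minimal).
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, epochs, labels, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

In an actual P300 speller, rows and columns of the stimulus matrix are flashed in sequence and the intended symbol is inferred from which flashes elicit the P300 response; the epoch-level detector sketched above is only the innermost building block of that loop.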
Year
2011
Page(s)
173
Page URI
https://pub.uni-bielefeld.de/record/2529157

Cite

Lenhardt A. A Brain-Computer Interface for robotic arm control. Bielefeld: Universität Bielefeld; 2011.
All files available under the following license(s):
Copyright Statement:
This object is protected by copyright and/or related rights. [...]
Full text(s)
Access Level
Open Access
Last Uploaded
2019-09-06T09:18:06Z
MD5 Checksum
e21c147946d868eb7d1c35790308d8fc

