Learning to integrate arbitrary signals from vision and touch

Ernst MO (2007)
Journal of Vision 7(5): 7.

No full text has been uploaded; publication record only.
Journal article | Published | English
Abstract / Note
When different perceptual signals of the same physical property are integrated, for example an object's size, which can be seen and felt, they form a more reliable sensory estimate (e.g., M. O. Ernst & M. S. Banks, 2002). This, however, implies that the sensory system already knows which signals belong together and how they relate. In other words, the system has to know the mapping between the signals. In a Bayesian model of cue integration, this prior knowledge can be made explicit. Here, we ask whether such a mapping between two arbitrary sensory signals from vision and touch can be learned from their statistical co-occurrence such that they become integrated. In the Bayesian framework, this means changing the belief about the distribution of the stimuli. To this end, we trained subjects with stimuli that are usually unrelated in the world: the luminance of an object (visual signal) and its stiffness (haptic signal). In the training phase, we presented subjects with combinations of these two signals that were artificially correlated, thus introducing a new mapping between them; for example, the stiffer the object, the brighter it was. We measured the influence of learning by comparing discrimination performance before and after training. The prediction is that integration makes discrimination worse for stimuli that are incongruent with the newly learned mapping, because integration would cause this incongruency to disappear perceptually. The more certain subjects are about the new mapping, the stronger the influence on discrimination performance should be. Thus, learning in this context is about acquiring beliefs. We found a significant change in discrimination performance from before to after training when comparing trials with congruent and incongruent stimuli. After training, discrimination thresholds for the incongruent stimuli were increased relative to thresholds for congruent stimuli, suggesting that subjects effectively learned to integrate the two formerly unrelated signals.
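The integration model the abstract builds on (Ernst & Banks, 2002) fuses two cues by weighting each with its reliability (inverse variance), which yields a combined estimate whose variance is lower than either cue's alone. A minimal sketch of that computation follows; the function name and the numeric values are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of reliability-weighted (maximum-likelihood) cue
# integration, as in Ernst & Banks (2002). Numbers are illustrative.

def integrate_cues(est_v, var_v, est_h, var_h):
    """Fuse a visual and a haptic estimate of the same property.

    Each cue is weighted by its reliability (1/variance); the fused
    estimate has lower variance than either single cue.
    """
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)  # visual weight
    w_h = 1 - w_v                                # haptic weight
    fused = w_v * est_v + w_h * est_h
    fused_var = (var_v * var_h) / (var_v + var_h)
    return fused, fused_var

# Example: vision estimates 10 cm (variance 1), touch 12 cm (variance 4).
est, var = integrate_cues(10.0, 1.0, 12.0, 4.0)
# The more reliable visual cue dominates: est = 10.4, var = 0.8,
# and 0.8 is smaller than either single-cue variance.
```

The key property, which the paper's incongruent-stimulus prediction relies on, is that once two signals are treated as redundant estimates of one property, a conflict between them is averaged away in the fused percept.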
Year
2007
Journal title
Journal of Vision
Volume
7
Issue
5
Page
7

Cite

Ernst, M. O. (2007). Learning to integrate arbitrary signals from vision and touch. Journal of Vision, 7(5), 7. doi:10.1167/7.5.7

55 citations in Europe PMC

Data provided by Europe PubMed Central.

Constructive perception of self-motion.
Holly JE, McCollum G., J Vestib Res 18(5-6), 2008
PMID: 19542599
Recalibration of perceived time across sensory modalities.
Hanson JV, Heron J, Whitaker D., Exp Brain Res 185(2), 2008
PMID: 18236035
Benefits of multisensory learning.
Shams L, Seitz AR., Trends Cogn Sci 12(11), 2008
PMID: 18805039


Sources

PMID: 18217847
PubMed | Europe PMC
