Vision and touch are automatically integrated for the perception of sequences of events

Bresciani J-P, Dammeier F, Ernst MO (2006)
Journal of Vision 6(5).

Journal Article | Published | English


Abstract
The purpose of the present experiment was to investigate the integration of sequences of visual and tactile events. Subjects were presented with sequences of visual flashes and tactile taps simultaneously and instructed to count either the flashes (Session 1) or the taps (Session 2). The number of flashes could differ from the number of taps by +/-1. For both sessions, the perceived number of events was significantly influenced by the number of events presented in the task-irrelevant modality. Touch had a stronger influence on vision than vision on touch. Interestingly, touch was the more reliable of the two modalities, yielding less variable estimates when presented alone. For both sessions, the perceptual estimates were less variable when stimuli were presented in both modalities than when the task-relevant modality was presented alone. These results indicate that even when one signal is explicitly task irrelevant, sensory information tends to be automatically integrated across modalities. They also suggest that the relative weight of each sensory channel in the integration process depends on its relative reliability. The results are described using a Bayesian probabilistic model for multimodal integration that accounts for the coupling between the sensory estimates.
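The reliability-dependent weighting described in the abstract corresponds to the standard maximum-likelihood cue-combination scheme: each modality's estimate is weighted by its inverse variance, and the fused estimate is never more variable than the better single cue. A minimal sketch of that scheme (illustrative values only; the function name and numbers are not taken from the paper):

```python
def combine_estimates(mean_v, var_v, mean_t, var_t):
    """Fuse visual and tactile estimates, weighting each by its
    reliability (inverse variance), as in standard ML cue combination."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_t)
    w_t = 1 - w_v
    fused_mean = w_v * mean_v + w_t * mean_t
    # Fused variance is always <= min(var_v, var_t): bimodal estimates
    # are less variable than either unimodal estimate alone.
    fused_var = (var_v * var_t) / (var_v + var_t)
    return fused_mean, fused_var

# Example: touch is more reliable (lower variance), so it dominates
# the fused estimate, mirroring the paper's finding that touch
# influenced vision more than vice versa.
mean, var = combine_estimates(mean_v=4.0, var_v=1.0, mean_t=3.0, var_t=0.25)
# → (3.2, 0.2): fused count sits closer to the tactile estimate,
#   with lower variance than either cue alone.
```

Note this sketch captures only the forced-fusion weighting; the coupling prior of the paper's full Bayesian model (which allows partial integration) is not modeled here.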


23 Citations in Europe PMC

Data provided by Europe PubMed Central.

Decentralized Multisensory Information Integration in Neural Systems.
Zhang WH, Chen A, Rasch MJ, Wu S., J. Neurosci. 36(2), 2016
PMID: 26758843
A spatially collocated sound thrusts a flash into awareness.
Aller M, Giani A, Conrad V, Watanabe M, Noppeney U., Front Integr Neurosci 9(), 2015
PMID: 25774126
Allocentric coding: spatial range and combination rules.
Camors D, Jouffrais C, Cottereau BR, Durand JB., Vision Res. 109(Pt A), 2015
PMID: 25749676
Observers can reliably identify illusory flashes in the illusory flash paradigm.
van Erp JB, Philippi TG, Werkhoven P., Exp Brain Res 226(1), 2013
PMID: 23354667
Computational characterization of visually induced auditory spatial adaptation.
Wozny DR, Shams L., Front Integr Neurosci 5(), 2011
PMID: 22069383
A bayesian foundation for individual learning under uncertainty.
Mathys C, Daunizeau J, Friston KJ, Stephan KE., Front Hum Neurosci 5(), 2011
PMID: 21629826
Irrelevant visual faces influence haptic identification of facial expressions of emotion.
Klatzky RL, Abramowicz A, Hamilton C, Lederman SJ., Atten Percept Psychophys 73(2), 2011
PMID: 21264726
Within- and cross-modal distance information disambiguate visual size-change perception.
Battaglia PW, Di Luca M, Ernst MO, Schrater PR, Machulla T, Kersten D., PLoS Comput. Biol. 6(3), 2010
PMID: 20221263
Learning bimodal structure in audio-visual data.
Monaci G, Vandergheynst P, Sommer FT., IEEE Trans Neural Netw 20(12), 2009
PMID: 19963447
Multisensory integration in the superior colliculus requires synergy among corticocollicular inputs.
Alvarado JC, Stanford TR, Rowland BA, Vaughan JW, Stein BE., J. Neurosci. 29(20), 2009
PMID: 19458228
Multisensory oddity detection as bayesian inference.
Hospedales T, Vijayakumar S., PLoS ONE 4(1), 2009
PMID: 19145254
Multisensory integration: a late bloomer.
Ernst MO., Curr. Biol. 18(12), 2008
PMID: 18579094
Tri-modal integration of visual, tactile and auditory signals for the perception of sequences of events.
Bresciani JP, Dammeier F, Ernst MO., Brain Res. Bull. 75(6), 2008
PMID: 18394521
Distortions of subjective time perception within and across senses.
van Wassenhove V, Buonomano DV, Shimojo S, Shams L., PLoS ONE 3(1), 2008
PMID: 18197248
Temporal recalibration to tactile-visual asynchronous stimuli.
Keetels M, Vroomen J., Neurosci. Lett. 430(2), 2008
PMID: 18055112
Causal inference in multisensory perception.
Kording KP, Beierholm U, Ma WJ, Quartz S, Tenenbaum JB, Shams L., PLoS ONE 2(9), 2007
PMID: 17895984


Sources

PMID: 16881788
PubMed | Europe PMC
