Tri-modal integration of visual, tactile and auditory signals for the perception of sequences of events

Bresciani J-P, Dammeier F, Ernst MO (2008)
Brain Research Bulletin 75(6): 753-760.

Journal Article | Published | English


Author
Jean-Pierre Bresciani; Franziska Dammeier; Marc O. Ernst
Abstract
We investigated the interactions between visual, tactile and auditory sensory signals for the perception of sequences of events. Sequences of flashes, taps and beeps were presented simultaneously. For each session, subjects were instructed to count the number of events presented in one modality (Target) and to ignore the stimuli presented in the other modalities (Background). The number of events presented in the background sequence could differ from the number of events in the target sequence. For each session, we quantified the Background-evoked bias by comparing subjects' responses with and without Background (Target presented alone). Nine combinations of vision, touch and audition were tested. In all but two sessions, the Background significantly biased the Target. Vision was the most susceptible to Background-evoked bias and the least efficient in biasing the other two modalities. By contrast, audition was the least susceptible to Background-evoked bias and the most efficient in biasing the other two modalities. These differences were strongly correlated with the relative reliability of each modality. In line with this, the evoked biases were larger when the Background consisted of two instead of only one modality. These results show that for the perception of sequences of events: (1) vision, touch and audition are automatically integrated; (2) the respective contributions of the three modalities to the integrated percept differ; (3) the relative contribution of each modality depends on its relative reliability (1/variability); (4) task-irrelevant stimuli have more weight when presented in two rather than only one modality. © 2008 Elsevier Inc. All rights reserved.
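The reliability-weighted integration summarized in points (3) and (4) corresponds to the standard inverse-variance (maximum-likelihood) cue-combination rule, in which each modality's weight is proportional to its reliability (1/variance). The Python sketch below illustrates that general rule only; it is not the authors' model, and the event counts and variance values are purely hypothetical.

import numpy as np

def fuse(estimates, variances):
    # Weight each unimodal estimate by its reliability (1 / variance),
    # then normalize the weights so they sum to 1 (inverse-variance weighting).
    reliabilities = 1.0 / np.asarray(variances, dtype=float)
    weights = reliabilities / reliabilities.sum()
    fused_estimate = float(np.dot(weights, np.asarray(estimates, dtype=float)))
    fused_variance = 1.0 / reliabilities.sum()  # fused estimate is less variable than any single cue
    return fused_estimate, weights, fused_variance

# Hypothetical numbers: audition most reliable, vision least reliable,
# so audition dominates the fused count of events.
counts = [4.0, 4.0, 5.0]      # perceived event counts: audio, touch, vision (hypothetical)
variances = [0.2, 0.4, 1.0]   # hypothetical unimodal variances: audio, touch, vision
estimate, weights, var = fuse(counts, variances)
print(weights)    # approx. [0.59, 0.29, 0.12]: audition carries most of the weight
print(estimate)   # approx. 4.12: pulled only slightly toward the visual count

Under the same rule, a Background carried by two modalities has a lower combined variance (higher reliability) than either modality alone, which is consistent with the larger biases reported for bimodal Backgrounds in point (4).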
Publishing Year
2008

Cite this

Bresciani J-P, Dammeier F, Ernst MO. Tri-modal integration of visual, tactile and auditory signals for the perception of sequences of events. Brain Research Bulletin. 2008;75(6):753-760.

13 Citations in Europe PMC

Data provided by Europe PubMed Central.

Integration of auditory and tactile inputs in musical meter perception.
Huang J, Gamble D, Sarnlertsophon K, Wang X, Hsiao S., Adv. Exp. Med. Biol. 787, 2013
PMID: 23716252
Observers can reliably identify illusory flashes in the illusory flash paradigm.
van Erp JB, Philippi TG, Werkhoven P., Exp Brain Res 226(1), 2013
PMID: 23354667
Dynamic characteristics of multisensory facilitation and inhibition.
Wang WY, Hu L, Valentini E, Xie XB, Cui HY, Hu Y., Cogn Neurodyn 6(5), 2012
PMID: 24082962
Feeling music: integration of auditory and tactile inputs in musical meter perception.
Huang J, Gamble D, Sarnlertsophon K, Wang X, Hsiao S., PLoS ONE 7(10), 2012
PMID: 23119038
A "unity assumption" does not promote intersensory integration.
Misceo GF, Taylor NJ., Exp Psychol 58(5), 2011
PMID: 21592944
Crossmodal congruency effects based on stimulus identity.
Frings C, Spence C., Brain Res. 1354, 2010
PMID: 20674555
The time-course of auditory and visual distraction effects in a new crossmodal paradigm.
Bendixen A, Grimm S, Deouell LY, Wetzel N, Mädebach A, Schröger E., Neuropsychologia 48(7), 2010
PMID: 20385149
Visual stimulus locking of EEG is modulated by temporal congruency of auditory stimuli.
Schall S, Quigley C, Onat S, König P., Exp Brain Res 198(2-3), 2009
PMID: 19526359
Sensory dominance in combinations of audio, visual and haptic stimuli.
Hecht D, Reiner M., Exp Brain Res 193(2), 2009
PMID: 18985327



Sources

PMID: 18394521
PubMed | Europe PMC
