Depth information in natural environments derived from optic flow by insect motion detection system: a model analysis

Schwegmann A, Lindemann JP, Egelhaaf M (2014)
Frontiers in Computational Neuroscience 8: 83.

Open Access | Journal Article | Original Article | Published | English
Abstract
Knowing the depth structure of the environment is crucial for moving animals in many behavioral contexts, such as collision avoidance, targeting objects, or spatial navigation. An important source of depth information is motion parallax. This powerful cue is generated on the eyes during translatory self-motion with the retinal images of nearby objects moving faster than those of distant ones. To investigate how the visual motion pathway represents motion-based depth information we analyzed its responses to image sequences recorded in natural cluttered environments with a wide range of depth structures. The analysis was done on the basis of an experimentally validated model of the visual motion pathway of insects, with its core elements being correlation-type elementary motion detectors (EMDs). It is the key result of our analysis that the absolute EMD responses, i.e., the motion energy profile, represent the contrast-weighted nearness of environmental structures during translatory self-motion at a roughly constant velocity. In other words, the output of the EMD array highlights contours of nearby objects. This conclusion is largely independent of the scale over which EMDs are spatially pooled and was corroborated by scrutinizing the motion energy profile after eliminating the depth structure from the natural image sequences. Hence, the well-established dependence of correlation-type EMDs on both velocity and textural properties of motion stimuli appears to be advantageous for representing behaviorally relevant information about the environment in a computationally parsimonious way.
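The core element of the model analyzed here is the correlation-type elementary motion detector (EMD). As a rough illustration of how an array of such detectors turns motion parallax into a nearness-dependent signal, the sketch below implements a minimal discrete-time Hassenstein-Reichardt detector: each unit correlates a low-pass-filtered (delayed) photoreceptor signal with the undelayed signal of its neighbour and subtracts the mirror-symmetric half-detector. The array shapes, the filter time constant, and the drifting-grating stimulus are illustrative assumptions, not the parameters of the paper's experimentally validated model.

```python
import numpy as np

def emd_array(luminance, dt=0.001, tau=0.01):
    """Minimal correlation-type (Hassenstein-Reichardt) EMD array.

    luminance: 2-D array of shape (time, receptors), photoreceptor signals.
    Each EMD multiplies one receptor's low-pass-filtered (delayed) signal
    with its neighbour's undelayed signal, in both mirror-symmetric
    directions, and subtracts the two half-detector outputs.
    """
    alpha = dt / (tau + dt)  # coefficient of a first-order low-pass ("delay") filter
    delayed = np.empty_like(luminance)
    delayed[0] = luminance[0]
    for t in range(1, luminance.shape[0]):
        delayed[t] = delayed[t - 1] + alpha * (luminance[t] - delayed[t - 1])
    # preferred-direction arm minus null-direction arm for each receptor pair
    return delayed[:, :-1] * luminance[:, 1:] - luminance[:, :-1] * delayed[:, 1:]

# A sine grating drifting rightwards across 20 receptors: the detector's
# mean response is positive for this (preferred) direction, and the
# absolute response -- the "motion energy" -- grows with retinal image
# velocity up to the detector's temporal-frequency optimum, which is why
# nearby (faster-moving) structures dominate the motion energy profile.
t = np.arange(0, 0.5, 0.001)[:, None]        # 0.5 s at 1 kHz
x = np.arange(20)[None, :]                   # receptor positions
stimulus = np.sin(2 * np.pi * (0.2 * x - 4.0 * t))  # rightward drift
response = emd_array(stimulus)               # shape (time, receptors - 1)
```

In the paper's analysis, the absolute value of such responses, pooled over the EMD array, is what represents the contrast-weighted nearness of environmental structures.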
Financial disclosure
Article Processing Charge funded by the Deutsche Forschungsgemeinschaft and the Open Access Publication Fund of Bielefeld University.

Cite this

Schwegmann, A., Lindemann, J. P., & Egelhaaf, M. (2014). Depth information in natural environments derived from optic flow by insect motion detection system: a model analysis. Frontiers in Computational Neuroscience, 8, 83. doi:10.3389/fncom.2014.00083
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Main File(s)
Access Level: Open Access
Last Uploaded: 2014-10-02 10:39:08


8 Citations in Europe PMC

Data provided by Europe PubMed Central.

Head orientation of walking blowflies is controlled by visual and mechanical cues.
Monteagudo J, Lindemann JP, Egelhaaf M. J Exp Biol 220(Pt 24), 2017. PMID: 29097591
Local motion adaptation enhances the representation of spatial structure at EMD arrays.
Li J, Lindemann JP, Egelhaaf M. PLoS Comput Biol 13(12), 2017. PMID: 29281631
Peripheral Processing Facilitates Optic Flow-Based Depth Perception.
Li J, Lindemann JP, Egelhaaf M. Front Comput Neurosci 10, 2016. PMID: 27818631
Visual motion-sensitive neurons in the bumblebee brain convey information about landmarks during a navigational task.
Mertes M, Dittmar L, Egelhaaf M, Boeddeker N. Front Behav Neurosci 8, 2014. PMID: 25309374



Sources

PMID: 25136314
PubMed | Europe PMC
