Automatic Analysis of 3D Gaze Coordinates on Scene Objects Using Data From Eye-Tracking and Motion Capture Systems
Essig K, Prinzhorn D, Maycock J, Dornbusch D, Ritter H, Schack T (2012)
Presented at the Proceedings of the 2012 Symposium on Eye-Tracking Research and Applications, ETRA 2012, Santa Barbara, California, USA.
Conference Paper
| Published | English
Download
No files have been uploaded. Publication record only!
Author
Essig, Kai;
Prinzhorn, Daniel;
Maycock, Jonathan;
Dornbusch, Daniel;
Ritter, Helge;
Schack, Thomas
Editor
Morimoto, Carlos H.;
Istance, Howell O.;
Spencer, Stephen N.;
Mulligan, Jeffrey B.;
Qvarfordt, Pernilla
Abstract / Notes
We present a method that removes the need for manual annotation of eye-movement data. Our software produces object- and subject-specific results for various eye-tracking parameters in complex 3D scenes. We synchronized a monocular mobile eye-tracking system with a VICON motion-capture system. By combining the data of both systems, we calculate and visualize a 3D gaze vector within the VICON coordinate frame of reference. By placing markers on objects and subjects in the scene, we can automatically compute how many times and where fixations occurred.
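The record contains no code, so purely as an illustration of the idea described above, the following minimal Python sketch (using NumPy, with hypothetical poses, marker positions, and a 3-degree angular threshold that are not taken from the paper) shows how a gaze ray given in head coordinates might be transformed into the motion-capture world frame and assigned to the scene object whose marker lies closest to the ray.

import numpy as np

def gaze_ray_in_mocap_frame(head_R, head_t, eye_offset, gaze_dir_head):
    # Transform a gaze ray from head coordinates into the motion-capture
    # (e.g., VICON) world frame. head_R/head_t is the head pose estimated
    # from the head markers; eye_offset is the (calibrated) eyeball-center
    # position in head coordinates; gaze_dir_head is the gaze direction
    # reported by the monocular eye tracker in head coordinates.
    origin = head_R @ eye_offset + head_t      # eyeball center in world frame
    direction = head_R @ gaze_dir_head         # gaze direction in world frame
    return origin, direction / np.linalg.norm(direction)

def closest_object(origin, direction, markers, max_angle_deg=3.0):
    # Assign a gaze sample to the object whose marker deviates least, in
    # angle, from the gaze ray; return None if no marker is within the
    # (hypothetical) angular threshold.
    best, best_angle = None, np.deg2rad(max_angle_deg)
    for name, pos in markers.items():
        to_marker = pos - origin
        to_marker /= np.linalg.norm(to_marker)
        angle = np.arccos(np.clip(direction @ to_marker, -1.0, 1.0))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

# Tiny synthetic example (made-up numbers, not data from the paper):
# the subject looks straight ahead; the "cup" marker lies almost on the
# gaze ray, while the "sphere" marker is far off to the side.
head_R, head_t = np.eye(3), np.zeros(3)
eye_offset = np.zeros(3)
gaze_dir_head = np.array([0.0, 1.0, 0.0])
markers = {"cup": np.array([0.02, 1.0, 0.0]),
           "sphere": np.array([0.5, 1.0, 0.5])}
o, d = gaze_ray_in_mocap_frame(head_R, head_t, eye_offset, gaze_dir_head)
print(closest_object(o, d, markers))  # -> cup

Per-object fixation counts and cumulative fixation durations would then follow by running such an assignment over the fixation-classified samples of the synchronized recordings.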
We evaluated our approach by comparing its output for a calibration task and a grasping task (with three objects: cup, stapler, sphere) against the average results of manual annotation. Preliminary data reveal that the program differs from the average manual annotation by only about 3 percent for the calibration procedure, in which the gaze is directed successively at five different markers on a board, without jumps between them. For the more complicated grasping videos the results depend on object size: for bigger objects (i.e., the sphere) the differences in the number of fixations are very small and the cumulative fixation duration deviates by less than 16 percent (or 950 ms). For smaller objects, where there are more saccades towards object boundaries, the differences are bigger. On the one hand, manual annotation inevitably becomes more subjective; on the other hand, the two methods analyze the 3D scene from slightly different perspectives (i.e., the center of the eyeball versus the position of the scene camera). Even then, the automatic results come close to those of manual annotation (the average differences are 984 ms and 399 ms for the object and the hand, respectively) and reflect the fixation distribution during interaction with objects in 3D scenes. Thus, eye-hand coordination experiments with various objects in complex 3D scenes, especially with bigger and moving objects, can now be realized quickly and effectively.
Our approach allows the recording of eye, head, and grasping movements while subjects interact with objects or systems. This enables us to study the relation between gaze and hand movements when people grasp and manipulate objects, as well as free movements during natural gaze behavior. The automatic analysis of gaze and movement data in complex 3D scenes can be applied to a variety of research domains, e.g., human-computer interaction, virtual reality, or grasping and gesture research.
Year of Publication
2012
Page(s)
37-44
Conference
Proceedings of the 2012 Symposium on Eye-Tracking Research and Applications, ETRA 2012
Conference Location
Santa Barbara, California, USA
Conference Date
2012-03-28
Page URI
https://pub.uni-bielefeld.de/record/2406143
Cite
Essig K, Prinzhorn D, Maycock J, Dornbusch D, Ritter H, Schack T. Automatic Analysis of 3D Gaze Coordinates on Scene Objects Using Data From Eye-Tracking and Motion Capture Systems. Presented at the Proceedings of the 2012 Symposium on Eye-Tracking Research and Applications, ETRA 2012, Santa Barbara, California, USA.
Essig, K., Prinzhorn, D., Maycock, J., Dornbusch, D., Ritter, H., & Schack, T. (2012). Automatic Analysis of 3D Gaze Coordinates on Scene Objects Using Data From Eye-Tracking and Motion Capture Systems. Presented at the Proceedings of the 2012 Symposium on Eye-Tracking Research and Applications, ETRA 2012, Santa Barbara, California, USA. https://doi.org/10.1145/2168556.2168561
Essig, Kai, Prinzhorn, Daniel, Maycock, Jonathan, Dornbusch, Daniel, Ritter, Helge, and Schack, Thomas. 2012. “Automatic Analysis of 3D Gaze Coordinates on Scene Objects Using Data From Eye-Tracking and Motion Capture Systems”. Presented at the Proceedings of the 2012 Symposium on Eye-Tracking Research and Applications, ETRA 2012, Santa Barbara, California, USA, ed. Carlos H. Morimoto, Howell O. Istance, Stephen N. Spencer, Jeffrey B. Mulligan, and Pernilla Qvarfordt, 37-44. ACM.
Essig, K., Prinzhorn, D., Maycock, J., Dornbusch, D., Ritter, H., and Schack, T. (2012). “Automatic Analysis of 3D Gaze Coordinates on Scene Objects Using Data From Eye-Tracking and Motion Capture Systems”. Presented at the Proceedings of the 2012 Symposium on Eye-Tracking Research and Applications, ETRA 2012, Santa Barbara, California, USA.
Essig, K., et al., 2012. Automatic Analysis of 3D Gaze Coordinates on Scene Objects Using Data From Eye-Tracking and Motion Capture Systems. Presented at the Proceedings of the 2012 Symposium on Eye-Tracking Research and Applications, ETRA 2012, Santa Barbara, California, USA.
K. Essig, et al., “Automatic Analysis of 3D Gaze Coordinates on Scene Objects Using Data From Eye-Tracking and Motion Capture Systems”, Presented at the Proceedings of the 2012 Symposium on Eye-Tracking Research and Applications, ETRA 2012, Santa Barbara, California, USA, ACM, 2012.
Essig, K., Prinzhorn, D., Maycock, J., Dornbusch, D., Ritter, H., Schack, T.: Automatic Analysis of 3D Gaze Coordinates on Scene Objects Using Data From Eye-Tracking and Motion Capture Systems. Presented at the Proceedings of the 2012 Symposium on Eye-Tracking Research and Applications, ETRA 2012, Santa Barbara, California, USA (2012).
Essig, Kai, Prinzhorn, Daniel, Maycock, Jonathan, Dornbusch, Daniel, Ritter, Helge, and Schack, Thomas. “Automatic Analysis of 3D Gaze Coordinates on Scene Objects Using Data From Eye-Tracking and Motion Capture Systems”. Presented at the Proceedings of the 2012 Symposium on Eye-Tracking Research and Applications, ETRA 2012, Santa Barbara, California, USA, ACM, 2012.