Estimating 3D spatiotemporal point of regard: a device evaluation


Abstract

This paper presents and evaluates a system and method that record spatiotemporal scene information and the location of the center of visual attention, i.e., the spatiotemporal point of regard (PoR), in ecological environments. A primary research application of the proposed system and method is enhancing current 2D visual attention models. Current eye-tracking approaches collapse a scene's depth structure to a 2D image, omitting visual cues that trigger important functions of the human visual system (e.g., accommodation and vergence). We combined head-mounted eye tracking with a miniature time-of-flight camera to produce a system that can estimate the spatiotemporal location of the PoR (the point of highest visual attention) within 3D scene layouts. Maintaining calibration accuracy is a primary challenge for gaze mapping; hence, we measured accuracy repeatedly by matching the PoR to fixated targets arranged across a range of working distances in depth. Accuracy was estimated as the deviation of the estimated PoR from the known locations of the scene targets. We found that estimates of the 3D PoR had an overall accuracy of approximately 2° omnidirectional mean average error (OMAE), with variation over a 1 h recording remaining within 3.6° OMAE. This method can be used to determine the accommodation and vergence cues of the human visual system continuously within habitual environments, including everyday applications (e.g., use of hand-held devices).
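The accuracy metric and the vergence cue mentioned above can be made concrete with a short sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: the paper's exact OMAE definition, the coordinate frames, the array shapes, and the 63 mm interpupillary distance used below are assumptions for illustration only.

```python
import numpy as np

def omnidirectional_mean_angular_error(por_xyz, target_xyz, origin_xyz):
    """Mean angular deviation (degrees) between estimated PoR positions
    and known target positions, both taken as directions from a common
    origin (e.g., the midpoint between the eyes). Inputs are (N, 3)."""
    v_est = por_xyz - origin_xyz
    v_true = target_xyz - origin_xyz
    v_est = v_est / np.linalg.norm(v_est, axis=1, keepdims=True)
    v_true = v_true / np.linalg.norm(v_true, axis=1, keepdims=True)
    cosines = np.clip(np.sum(v_est * v_true, axis=1), -1.0, 1.0)
    return float(np.degrees(np.arccos(cosines)).mean())

def vergence_angle_deg(fixation_distance_m, ipd_m=0.063):
    """Vergence angle implied by a symmetric binocular fixation at a
    given distance: theta = 2 * arctan(IPD / (2 * d))."""
    return np.degrees(2.0 * np.arctan2(ipd_m / 2.0, fixation_distance_m))
```

For example, vergence_angle_deg(0.4) gives roughly 9°, the convergence demand of a hand-held device viewed at 40 cm.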

© 2022 Optica Publishing Group

More Like This
Calibration method for 3D gaze tracking systems

J.-N. Chi, Y.-Y. Xing, L.-N. Liu, W.-W. Gou, and G.-S. Zhang
Appl. Opt. 56(5) 1536-1541 (2017)

Cyclopean geometry of binocular vision

Miles Hansard and Radu Horaud
J. Opt. Soc. Am. A 25(9) 2357-2369 (2008)

Cross-talk elimination for lenslet array near eye display based on eye-gaze tracking

Bi Ye, Yuichiro Fujimoto, Yuta Uchimine, Taishi Sawabe, Masayuki Kanbara, and Hirokazu Kato
Opt. Express 30(10) 16196-16216 (2022)

Supplementary Material (1)

Supplement 1: Gaze mapping quality estimates.

Data availability

All code used for the analyses, with detailed descriptions, is available on GitHub [35]. The open-source Pico Flexx depth plugin for Pupil 1.6 is available on GitHub [29]. Raw data are available for ethical research on request. Please bear in mind that the entire raw dataset is 300 GB, and the Pico Flexx point cloud data are accessible only via PMD's Royale software.

35. P. Wagner, “Pupil Labs Pico Flexx Validation,” GitHub (2022), https://github.com/peteratBHVI/pupil_labs_pico_flexx_validation.

29. D. McFarlane and P. Prietz, “Pupil Labs, PMD Pico Flexx,” GitHub (2019), https://github.com/peteratBHVI/pupil_labs_pico_flexx_validation.
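As a rough illustration of how a 2D gaze estimate and a time-of-flight depth frame combine into a 3D PoR, the sketch below back-projects a fixated pixel through a pinhole camera model. The function name and the intrinsics (fx, fy, cx, cy) are hypothetical, and the published pipeline (see [35]) may differ in detail.

```python
import numpy as np

def gaze_pixel_to_por(u, v, depth_m, fx, fy, cx, cy):
    """Back-project the fixated pixel (u, v) onto a depth frame to get a
    3D point of regard in the depth camera's coordinate frame. depth_m
    is an (H, W) array of metric depth values (e.g., from the Pico
    Flexx); fx, fy, cx, cy are pinhole intrinsics (hypothetical)."""
    z = float(depth_m[int(round(v)), int(round(u))])  # depth at the gaze pixel
    x = (u - cx) * z / fx  # pinhole back-projection, horizontal
    y = (v - cy) * z / fy  # pinhole back-projection, vertical
    return np.array([x, y, z])
```

In practice the gaze point is estimated in the eye tracker's scene camera and must first be mapped into the depth camera's frame via an extrinsic calibration; that step is omitted here.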


The full article includes 4 figures, 2 tables, and 3 equations (available to subscribers).
