A self-calibrating, camera-based eye tracker for the recording of rodent eye movement / Zoccolan, Davide Franco; Graham, J. B.; Cox, D. D. - In: FRONTIERS IN NEUROSCIENCE. - ISSN 1662-453X. - 4 (Nov. 2010), art. 193, pp. 1-12. [10.3389/fnins.2010.00193]

A self-calibrating, camera-based eye tracker for the recording of rodent eye movement

Zoccolan, Davide Franco; Graham, J. B.; Cox, D. D.
2010

Abstract

Much of neurophysiology and vision science relies on careful measurement of a human or animal subject's gaze direction. Video-based eye trackers have emerged as an especially popular option for gaze tracking, because they are easy to use and are completely non-invasive. However, video eye trackers typically require a calibration procedure in which the subject must look at a series of points at known gaze angles. While it is possible to rely on innate orienting behaviors for calibration in some non-human species, other species, such as rodents, do not reliably saccade to visual targets, making this form of calibration impossible. To overcome this problem, we developed a fully automated infrared video eye-tracking system that is able to quickly and accurately calibrate itself without requiring cooperation from the subject. This technique relies on the optical geometry of the cornea and uses computer-controlled motorized stages to rapidly estimate the geometry of the eye relative to the camera. The accuracy and precision of our system were carefully measured using an artificial eye, and its ability to monitor the gaze of rodents was verified by tracking spontaneous saccades and evoked oculomotor reflexes in head-fixed rats (in both cases, we obtained measurements consistent with those found in the literature). Overall, given its fully automated nature and its intrinsic robustness against operator errors, we believe that our eye-tracking system enhances the utility of existing approaches to gaze tracking in rodents and represents a valid tool for rodent vision studies.
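The abstract describes the calibration principle only at a high level; the sketch below illustrates the kind of corneal geometry such a system can exploit. This is a minimal illustrative model, not the published implementation: the function names, the synthetic data, and the least-squares fit are all assumptions. The underlying idea (a Stahl-style swinging-camera calibration) is that when the camera rotates by a known angle about an axis through the eye's center of corneal curvature, the corneal reflection stays fixed in the image while the pupil appears to shift by approximately Rp·sin(angle), where Rp is the effective distance from the pupil to the center of corneal curvature. Once Rp is known, a pupil-minus-corneal-reflection displacement Δ converts to a gaze angle via θ = asin(Δ/Rp).

```python
import numpy as np

def estimate_rp(pupil_x, camera_angles_deg):
    """Estimate Rp (pixels), the effective pupil-to-corneal-curvature-center
    distance, from the apparent pupil displacement observed while the camera
    is swung through known angles about the corneal curvature center.
    Model: pupil_x ~= x0 + Rp * sin(angle); Rp is fit by least squares.
    (Hypothetical helper, not the published algorithm.)"""
    angles = np.radians(np.asarray(camera_angles_deg, dtype=float))
    x = np.asarray(pupil_x, dtype=float)
    A = np.column_stack([np.sin(angles), np.ones_like(angles)])
    rp, _x0 = np.linalg.lstsq(A, x, rcond=None)[0]
    return rp

def gaze_angle_deg(pupil_x, cr_x, rp):
    """Convert a pupil-minus-corneal-reflection displacement (pixels) into a
    horizontal gaze angle (degrees): theta = asin((pupil - cr) / Rp)."""
    return np.degrees(np.arcsin((pupil_x - cr_x) / rp))

# Toy calibration sweep: the stage swings the camera from -10 to +10 degrees
# while an (artificial) eye holds a fixed gaze; pupil positions are simulated.
true_rp = 120.0                                       # made-up value, pixels
sweep = np.linspace(-10.0, 10.0, 9)                   # known stage angles, deg
pupil = 300.0 + true_rp * np.sin(np.radians(sweep))   # simulated pupil x

rp_hat = estimate_rp(pupil, sweep)
print(f"estimated Rp: {rp_hat:.1f} px")               # ~120.0 px
print(f"gaze angle: {gaze_angle_deg(330.0, 300.0, rp_hat):.1f} deg")  # ~14.5 deg
```

On a real rig, the sweep angles would come from the motorized stages and the pupil and corneal-reflection coordinates from the video segmentation; fitting Rp over several stage angles, as above, is one simple way to make the estimate robust to measurement noise.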
Year: 2010
Volume: 4
Issue: NOV
Pages: 1-12
Article number: 193
DOI: 10.3389/fnins.2010.00193
Authors: Zoccolan, Davide Franco; Graham, J. B.; Cox, D. D.
Files in this record:
  • File: Front. Neurosci. 2010 Zoccolan.pdf
    Access: open access
    Description: DOAJ Open Access
    Type: Published version (PDF)
    License: Not specified
    Size: 3.4 MB
    Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11767/13651
Citations
  • PMC: 23
  • Scopus: 39
  • Web of Science: 35