A Psychophysics-inspired Model of Gaze Selection Performance

Abstract

Eye gaze promises to be a fast and intuitive way of interacting with technology. Importantly, the performance of a gaze selection paradigm depends on the eye tracker used: higher tracking accuracy allows for the selection of smaller targets, and higher precision and sampling rate allow for faster and more robust interaction. Here we present a novel approach to predicting the minimal eye tracker specifications required for gaze-based selection. We quantified selection performance for targets of different sizes while recording high-fidelity gaze data. Selection performance across target sizes was well modeled by a sigmoid similar to a psychometric function. We then simulated lower tracker fidelity by adding noise or a constant spatial bias to the recorded data, or by temporally sub-sampling it, re-fitting the model each time. Our approach can inform design by predicting performance for a given interface element and tracker fidelity, or the minimal element size required for a specific performance level.
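
The following Python sketch illustrates the two steps described above: fitting a sigmoid, psychometric-style curve to selection success across target sizes, and degrading recorded gaze data with additive noise, a constant spatial offset, and temporal sub-sampling. It is a minimal illustration under assumed conventions, not the authors' implementation; all function names, parameters, and data values are hypothetical.

import numpy as np
from scipy.optimize import curve_fit


def psychometric(size, threshold, slope, lapse):
    # Sigmoid mapping target size (e.g. degrees of visual angle) to success rate.
    return lapse + (1.0 - 2.0 * lapse) / (1.0 + np.exp(-slope * (size - threshold)))


def fit_selection_performance(target_sizes, success_rates):
    # Fit the sigmoid to per-size selection success; returns (threshold, slope, lapse).
    p0 = [np.median(target_sizes), 1.0, 0.02]
    params, _ = curve_fit(psychometric, target_sizes, success_rates, p0=p0,
                          bounds=([0.0, 0.0, 0.0], [np.inf, np.inf, 0.5]))
    return params


def degrade_gaze(gaze_xy, timestamps, noise_sd=0.0, bias=(0.0, 0.0), keep_every=1):
    # Simulate a lower-fidelity tracker: Gaussian noise (precision),
    # a constant spatial offset (accuracy), and temporal sub-sampling (sampling rate).
    rng = np.random.default_rng(0)
    degraded = gaze_xy + rng.normal(0.0, noise_sd, gaze_xy.shape) + np.asarray(bias)
    return degraded[::keep_every], timestamps[::keep_every]


if __name__ == "__main__":
    # Illustrative (made-up) success rates for five target sizes, in degrees.
    sizes = np.array([0.5, 1.0, 1.5, 2.0, 3.0])
    rates = np.array([0.15, 0.55, 0.80, 0.92, 0.98])
    threshold, slope, lapse = fit_selection_performance(sizes, rates)
    print(f"threshold={threshold:.2f} deg, slope={slope:.2f}, lapse={lapse:.3f}")

In this sketch, re-fitting the sigmoid after each call to degrade_gaze (with different noise, bias, or sub-sampling settings) yields the performance prediction for that simulated tracker fidelity.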

Publication
Symposium on Eye Tracking Research and Applications