I am a researcher in experimental psychology at Justus-Liebig University Giessen, Germany. My research focuses on how we as humans sense, perceive, and act on our environment.
Even during a simple everyday task, such as taking a sip of coffee while reading this sentence, our brain continuously interprets and reacts to the world in a highly dynamic sensorimotor loop. To unravel this loop, I use a variety of methods, such as eye tracking, hand and body tracking, psychophysics, and computational modeling. Most of my research makes use of virtual reality to create well-controlled yet highly naturalistic environments. On the applied side, I am interested in how sensorimotor principles can inform and improve our interactions with technology and lead to more human-centric computing. Finally, I also support other research projects through code, statistical analysis, and virtual experience design.
I graduated with a degree in psychology from Philipps University Marburg. In 2015, I successfully defended my PhD (Dr. rer. nat.) in experimental psychology in the Perception & Action lab of Prof. Dr. Katja Fiehler at Giessen University. My postdoctoral research experience since then includes helping to establish the Physics of Cognition lab at Chemnitz University of Technology (Prof. Dr. Wolfgang Einhäuser-Treyer) and a two-year industry postdoctoral assignment at Facebook Reality Labs (Redmond, WA, USA).
Note: This is a selection of recent academic papers. Click here to view the full publication list.
Eye gaze promises to be a fast and intuitive way of interacting with technology. Importantly, the performance of a gaze selection paradigm depends on the eye tracker used: higher tracking accuracy allows for the selection of smaller targets, while higher precision and a higher sampling rate allow for faster and more robust interaction. Here we present a novel approach to predicting the minimal eye tracker specifications required for gaze-based selection. We quantified selection performance for targets of different sizes while recording high-fidelity gaze data. Selection performance across target sizes was well modeled by a sigmoid similar to a psychometric function. We then simulated lower tracker fidelity by adding noise, a constant spatial bias, or temporal sub-sampling to the recorded data, re-fitting the model each time. Our approach can inform design by predicting performance for a given interface element and tracker fidelity, or the minimal element size required for a specific performance level.
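The core idea of the approach can be sketched in a few lines of code. The snippet below is a minimal toy illustration, not the study's actual pipeline: it generates synthetic gaze samples around a target, emulates a lower-fidelity tracker by adding noise, a constant bias, or temporal sub-sampling, and fits a simple sigmoid of selection success over target size. All numeric values and function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_gaze(n=500, tracker_sd=0.5):
    # synthetic 2D gaze samples (deg) around a fixated target center,
    # with Gaussian tracker noise of standard deviation tracker_sd
    return rng.normal(0.0, tracker_sd, size=(n, 2))

def degrade(samples, extra_noise=0.0, bias=(0.0, 0.0), keep_every=1):
    # emulate a lower-fidelity tracker: temporal sub-sampling,
    # a constant spatial bias, and additional Gaussian noise
    out = samples[::keep_every] + np.asarray(bias)
    return out + rng.normal(0.0, extra_noise, size=out.shape)

def success_rate(samples, target_radius):
    # fraction of samples landing inside a circular target
    return float(np.mean(np.linalg.norm(samples, axis=1) < target_radius))

def fit_sigmoid(sizes, rates):
    # crude grid-search fit of p(size) = 1 / (1 + exp(-(size - mu) / s));
    # mu is the target size yielding 50% selection success
    best = None
    for mu in np.linspace(0.1, 5.0, 50):
        for s in np.linspace(0.05, 2.0, 40):
            pred = 1.0 / (1.0 + np.exp(-(sizes - mu) / s))
            err = np.sum((pred - rates) ** 2)
            if best is None or err < best[0]:
                best = (err, mu, s)
    return best[1], best[2]

sizes = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0])  # target radii in deg
clean = simulate_gaze()
noisy = degrade(clean, extra_noise=1.0)

rates_clean = np.array([success_rate(clean, r) for r in sizes])
rates_noisy = np.array([success_rate(noisy, r) for r in sizes])
mu_clean, _ = fit_sigmoid(sizes, rates_clean)
mu_noisy, _ = fit_sigmoid(sizes, rates_noisy)
print(mu_clean, mu_noisy)
```

Refitting after each simulated degradation shifts the sigmoid's midpoint toward larger targets, which is exactly the quantity a designer would read off to pick a minimal interface element size for a given tracker.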
Eye gaze as an input method has been studied since the 1990s, with varied results: some studies found gaze to be more efficient than traditional input methods such as the mouse, while others found it to lag far behind. These comparisons are often backed up by Fitts' Law without explicitly acknowledging the ballistic nature of saccadic eye movements. Using a vision-science-inspired model, we show here that a Fitts'-like distribution of movement times can arise from the execution of secondary saccades, especially when targets are small. Study participants selected circular targets using gaze; seven target sizes and two saccade distances were used. We then determined performance across target sizes for different sampling windows ("dwell times") and predicted an optimal dwell time range. Best performance was achieved for large targets reachable by a single saccade. Our findings highlight that Fitts' Law, while a suitable approximation in some cases, is an incomplete description of gaze interaction dynamics.
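The contrast between the two accounts can be illustrated with a toy simulation. The snippet below is a hedged sketch under assumed parameters, not the paper's model: it compares classic Fitts' Law movement times with a ballistic model in which a primary saccade has amplitude-proportional landing scatter and a corrective secondary saccade is executed only on a miss. All latencies, scatter fractions, and costs are made-up illustrative values.

```python
import numpy as np

rng = np.random.default_rng(1)

def fitts_mt(distance, width, a=0.2, b=0.1):
    # classic Fitts' Law: MT = a + b * log2(D / W + 1)
    return a + b * np.log2(distance / width + 1)

def saccade_mt(distance, radius, n=10_000,
               latency=0.2, landing_sd_frac=0.1, secondary_cost=0.25):
    # ballistic account: one primary saccade with Gaussian landing error
    # whose scatter grows with amplitude; a secondary corrective saccade
    # (adding secondary_cost seconds) occurs only when the eye misses
    landing_error = rng.normal(0.0, landing_sd_frac * distance, size=n)
    needs_secondary = np.abs(landing_error) > radius
    return latency + secondary_cost * float(np.mean(needs_secondary))

# mean movement time for an 8-deg saccade to targets of varying radius
for radius in (0.25, 0.5, 1.0, 2.0):
    print(radius,
          round(fitts_mt(8.0, 2 * radius), 3),
          round(saccade_mt(8.0, radius), 3))
```

For small targets the miss probability, and hence the chance of a secondary saccade, is high, so mean movement time rises in a way that superficially resembles Fitts' Law; for large targets reachable by a single saccade, the ballistic model flattens out, which is where the two accounts diverge.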