Attentive interfaces for users with disabilities: eye gaze for intention and uncertainty estimation

Helmut Prendinger, Aulikki Hyrskykari, Minoru Nakayama, Howell Istance, Nikolaus Bee and Yosiyuki Takahasi

Published 2009 in Universal Access in the Information Society, Volume 8, Issue 4, pp. 339–354

Publisher: Springer



Attentive user interfaces (AUIs) capitalize on the rich information that can be obtained from users’ gaze behavior in order to infer relevant aspects of their cognitive state. Eye gaze is an excellent cue not only to interest and intention, but also to preference and confidence in comprehension. AUIs are built with the aim of adapting the interface to the user’s current information need, and thus reducing the workload of interaction. Given these characteristics, AUIs are believed to offer particular benefits to users with severe disabilities, for whom operating a physical device (such as a mouse) may be very strenuous or infeasible. This paper presents three studies that attempt to gauge uncertainty and intention on the part of the user from gaze data, and compares the success of each approach. The paper discusses how applying the approaches adopted in each study to user interfaces can support users with severe disabilities.
