Exploring Eye-Tracking-Driven Sonification for the Visually Impaired


Published 26.02.2016 in: Proceedings of the 7th Augmented Human International Conference (AH) 2016, Geneva

Publisher: ACM

DOI: 10.1145/2875194.2875208


Most existing sonification approaches for the visually impaired restrict the user to the perception of static scenes by performing sequential scans and transformations of visual information into acoustic signals. This takes away the user's freedom to explore the environment and to decide which information is relevant at a given point in time. As a solution, we propose an eye-tracking system that allows the user to choose which elements of the field of view should be sonified. More specifically, we enhance sonification approaches for color, text, and facial expressions with eye-tracking mechanisms. To find out how visually impaired people might react to such a system, we applied a user-centered design approach. Finally, we explored the effectiveness of our concept in a user study with seven visually impaired persons. The results show that eye tracking is a very promising input method for controlling the sonification, but the large variety of visual impairment conditions restricts the applicability of the technology.
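
The abstract describes a gaze-contingent pipeline: sample the visual content at the user's point of regard and turn it into sound. The paper does not publish code, so the following Python sketch only illustrates one plausible reading of the color-sonification case; `get_gaze_point()` and `color_at()` are hypothetical stand-ins for an eye-tracker and scene-camera API, and the hue-to-pitch mapping is an assumption for illustration, not the paper's actual sonification design.

```python
import colorsys
import math
import struct
import wave

SAMPLE_RATE = 44100


def get_gaze_point():
    """Hypothetical stand-in for an eye-tracker API: returns the current
    gaze position as normalized (x, y) field-of-view coordinates."""
    return (0.5, 0.5)


def color_at(scene_image, x, y):
    """Hypothetical scene-camera lookup: returns the RGB color (0..1) at
    the gazed-at position. Fixed to a reddish pixel for illustration."""
    return (0.9, 0.2, 0.1)


def hue_to_frequency(rgb, f_min=220.0, f_max=880.0):
    """One plausible color-to-sound mapping (an assumption, not the
    paper's): project the hue of the gazed-at color onto a pitch range."""
    hue, _, _ = colorsys.rgb_to_hls(*rgb)
    return f_min + hue * (f_max - f_min)


def render_tone(frequency, duration=0.5, path="gaze_tone.wav"):
    """Synthesize a sine tone for the sonified color and save it as a
    16-bit mono WAV file (keeps the sketch dependency-free)."""
    n_samples = int(SAMPLE_RATE * duration)
    frames = b"".join(
        struct.pack(
            "<h",
            int(32767 * 0.5 * math.sin(2 * math.pi * frequency * i / SAMPLE_RATE)),
        )
        for i in range(n_samples)
    )
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(frames)


if __name__ == "__main__":
    gaze = get_gaze_point()
    rgb = color_at(None, *gaze)
    freq = hue_to_frequency(rgb)
    render_tone(freq)
    print(f"Gaze {gaze} -> RGB {rgb} -> {freq:.0f} Hz tone")
```

In a real gaze-driven system the tone would be streamed continuously and updated as the gaze moves; writing a one-shot WAV file here simply keeps the sketch self-contained and runnable with the standard library alone.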