Dominik Osinski

Objective: Blindness deprives a person of a significant part of sensory information, resulting in limited perceptual abilities and decreased quality of life. Although some aspects of visual information, such as the shape of an object, can be conveyed by other senses, there is no easy way to perceive color without using sight. To address this issue, we developed the Colorophone sensory substitution device. Method: In a way analogous to pixels in visual displays, we introduce auxels (auditory pixels) that serve as the basic building blocks of a generic auditory display. The developed auxel-based system performs real-time, spatial color-to-sound conversion. We created a dedicated software suite that enables the independent introduction of individual system features to ensure effective training. Four blind participants assessed the prototype's usability. The evaluation methods included auditory color recognition, object identification, and virtual sound source localization tasks, as well as two self-report instruments: the System Usability Scale and the NASA Task Load Index. Results: The developed wearable system generates spatially calibrated colorful soundscapes. It enables auditory color recognition and object identification significantly above chance; however, analyzing complex natural scenes remains challenging. Users rated the system's usability from good to best imaginable. The identified usability issues are discussed together with proposed solutions. Conclusion: The Colorophone device shows promise as a basis for a useful visual rehabilitation device; however, further work is needed to eliminate the existing usability issues. Significance: The presented work contributes to the development of a universal, affordable, and user-friendly visual rehabilitation device.
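To make the auxel idea concrete, the following is a minimal illustrative sketch of a single auxel that turns one color sample into a short audio waveform. It is not the Colorophone's actual mapping: the choice of assigning red, green, and blue intensities to the amplitudes of three fixed tones, and the frequencies themselves, are assumptions made purely for illustration.

```python
import math

# Hypothetical tone assignment for one auxel; the real system's
# color-to-sound mapping is not specified here.
AUXEL_FREQS_HZ = {"red": 440.0, "green": 550.0, "blue": 660.0}

def auxel_waveform(rgb, duration_s=0.05, sample_rate=8000):
    """Render one auxel: RGB intensities (0-255) weight three fixed sine tones."""
    r, g, b = (c / 255.0 for c in rgb)
    weights = {"red": r, "green": g, "blue": b}
    n = int(duration_s * sample_rate)
    samples = []
    for i in range(n):
        t = i / sample_rate
        s = sum(w * math.sin(2 * math.pi * AUXEL_FREQS_HZ[name] * t)
                for name, w in weights.items())
        samples.append(s / 3.0)  # keep the mixed signal within [-1, 1]
    return samples

# A saturated red color sounds as a single tone in this toy mapping.
wave = auxel_waveform((255, 0, 0))
```

A spatial auditory display along these lines would render one such waveform per auxel and position each in the soundscape (e.g., via stereo panning or head-related filtering), analogous to laying out pixels on a screen.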