Investigating target search with audio, visual, or audio-visual information in virtual reality (in person)
* Presenting author
Abstract:
The auditory system is thought to guide visual localization. In this study, we investigated whether audio-visual localization can indeed be explained by a combination of auditory guidance and visual localization, and how this is affected by auditory distractors. To study audio-visual localization behavior and how it changes with an increasing number of auditory distractors, seven normal-hearing listeners participated in an auditory, visual, and audio-visual search task with a varying number of auditory distractors (0, 1, 2, 3, 5, 7, or 11). Participants were presented with an audio-only, visual-only, or audio-visual target, which was then moved to an unknown position somewhere around them. Their task was to find this target as quickly as possible. Behavioral features, such as eye gaze and head rotation, were tracked and analyzed. The results revealed an interaction between the auditory and visual systems during localization that is more complex than the traditional view of auditory guidance of visual localization suggests. Furthermore, this interaction was influenced by the number of auditory distractors present.