In the absence of visual and tactile information, the auditory system becomes the primary sense guiding human awareness within the boundaries of an enclosing space. For example, blind humans can extract information about a surrounding room by echolocation, i.e., by interpreting the effects that surfaces and objects have on sounds, generating early reflections and late reverberation. Here we assessed acoustically guided awareness in interactive virtual environments, combining a real-time acoustic simulation [based on Wendt et al., JAES, 62, 11 (2014)] with an experimental framework realized within the Unreal Engine 4 (for visual reference conditions). Subjects were seated on a rotating chair in the center of a three-dimensional 86-channel loudspeaker system. Virtual sound sources were placed dynamically at the collision point between the room boundaries and a virtual ray emitted from the position of a hand-held controller. First, navigation through different simple room shapes (I, U, or Z maze) was investigated. Second, listeners were asked to identify the shape of the virtual environment from a fixed position in the room, choosing among several presented floor plans (e.g., I or T maze). The available acoustic information was varied between trials by limiting the simulated early reflections to the n-th order.
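The dynamic source placement described above amounts to a ray cast from the controller against the room geometry. A minimal sketch, assuming a rectangular (shoebox) room and using the slab method for ray–box intersection; the function name, room dimensions, and use of NumPy are illustrative assumptions, not details of the study's implementation:

```python
import numpy as np

def first_wall_hit(origin, direction, room_min, room_max):
    """Return the point where a ray starting inside an axis-aligned
    (shoebox) room first hits a room boundary (slab method).

    origin, direction: 3-vectors (controller position and pointing direction).
    room_min, room_max: opposite corners of the room box, in meters.
    """
    origin = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)  # normalize pointing direction

    t_exit = np.inf
    for axis in range(3):
        if d[axis] > 0:
            # Ray moves toward the far wall on this axis
            t_exit = min(t_exit, (room_max[axis] - origin[axis]) / d[axis])
        elif d[axis] < 0:
            # Ray moves toward the near wall on this axis
            t_exit = min(t_exit, (room_min[axis] - origin[axis]) / d[axis])
        # d[axis] == 0: parallel to this wall pair, never exits through it
    return origin + t_exit * d

# Controller at the center of a 6 x 4 x 3 m room, pointing toward the +x wall:
hit = first_wall_hit([3.0, 2.0, 1.5], [1.0, 0.0, 0.0], [0, 0, 0], [6, 4, 3])
# The virtual sound source would then be placed at `hit` on that wall.
```

In an actual Unreal Engine 4 implementation this would instead be done with the engine's built-in line-trace (ray-cast) facilities against the level geometry, which also handles non-rectangular floor plans such as the U and Z mazes.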