Virtual reality modeling for listening tests (online)
Abstract:
Music and sound perception is a multi-sensory process. Therefore, it is crucial for listening tests to simulate as many aspects of human perception as possible. This study presents a plausible workflow for the creation of a photorealistic, “almost real-life” virtual experience (performance and music venue) to be used as a stimulus in an audiovisual listening test. The main focus is to provide the researcher with the ability to reproduce exactly the same musical performance under adjustable conditions. These conditions include, but are not limited to: multiple viewing/listening positions, different lighting plans, colour and surface texture, adjustable seats, stage size and other props (music instruments, panels, etc.). A transparent head-up display menu, which allows the audience to interact through a controller and rate attributes or give answers while attending the performance, is also included. The study analyses the four parts of the creation process: greenscreen recordings in an anechoic chamber (solo instruments), visual replication of a concert hall (Konzerthaus Berlin) in a graphics engine, auralization (measured BRIRs and HRTFs) and the experimental setup in virtual reality. The functionality of the model is demonstrated through a series of case studies, and possible uses and upgrades are also presented.
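The auralization step described above (rendering the anechoic solo recordings through measured BRIRs) amounts to a convolution of the dry signal with a left/right impulse-response pair. The sketch below illustrates this principle only; the function name, the synthetic decaying-noise BRIR, and the peak normalisation are illustrative assumptions, not the toolchain used in the study.

```python
import numpy as np
from scipy.signal import fftconvolve

def auralize(anechoic, brir_left, brir_right):
    """Render a binaural stimulus by convolving a mono anechoic
    recording with a measured left/right BRIR pair (sketch only)."""
    left = fftconvolve(anechoic, brir_left)
    right = fftconvolve(anechoic, brir_right)
    out = np.stack([left, right], axis=-1)
    # Normalise to the peak so the stimulus does not clip on playback.
    peak = np.max(np.abs(out))
    return out / peak if peak > 0 else out

# Toy example: 1 s of noise through a 0.5 s synthetic decaying "BRIR".
fs = 48000
rng = np.random.default_rng(0)
dry = rng.standard_normal(fs)
t = np.arange(fs // 2) / fs
brir = rng.standard_normal(fs // 2) * np.exp(-t / 0.1)
stimulus = auralize(dry, brir, brir)
print(stimulus.shape)  # (71999, 2): length fs + fs//2 - 1, two channels
```

In a real setup the BRIR pair would be selected per listening position (and per head orientation, if head tracking is used), so that the rendered stimulus matches the viewpoint chosen in the virtual venue.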