Audio-visual perception of
acoustical environments

From 2015 to 2018, the SIM conducted the second phase of the project "Audio-visual perception of acoustical environments" (MA 4343/1-2) within the framework of the DFG research unit Simulation and Evaluation of Acoustical Environments (SEACEN). The project investigated the effects of acoustic and optical information on various auditory (e.g. loudness), visual (e.g. brightness), and audiovisual (e.g. geometric, aesthetic) features relating to the perception of and in rooms. To this end, participants were presented with music and speech performances in six rooms varying in size and material (and hence acoustic absorption) and were asked to rate them.

Given the patchy and inconsistent state of research, a topic-specific research strategy was developed first. Because, for methodological reasons, the artistic renditions and performance spaces had to be brought into the lab, the required research infrastructure, the Virtual Concert Hall, was created. This 3D virtual environment with rich cue conditions allowed the acoustic and optical components, as well as the artistic renditions and the spaces in which they were staged, to be varied independently of one another, and made it possible to investigate the effects of both the presence and the properties of the acoustic and optical domains. The simulation system was perceptually validated by experimentally comparing a real performance space with its virtual counterpart. It was then used to present virtual music and speech performances in six spaces varying in size and material, which the test participants rated.

In the first project phase, generally neither crossmodal effects nor appreciable interactions between acoustics and optics were observed, indicating that for most of the investigated features, acoustic and optical information are processed largely independently of each other. Regarding the audiovisual features, the geometric estimates relied predominantly on seeing, the aesthetic judgments predominantly on hearing.

The second phase focused on topics such as distance perception, intermodal capabilities, the effects of room acoustic parameters, and the perceptual evaluation of loudspeakers, and investigated potential subject- and stimulus-related moderator variables. To this end, seven experiments were performed, 433 participants were individually tested, and more than 200,000 data points were collected.

The findings are still in the process of being published. They contribute to the understanding of the principles of multimodal perception and broaden the basis for an empirically founded model of audiovisual perception in and of rooms; they further indicate the extent to which results of experiments conducted under purely acoustic stimulus conditions may be transferred to optoacoustic stimulus conditions; and they may be useful for the perceptually oriented advancement of optoacoustic virtual environments and related content.

Several findings and the Virtual Concert Hall are currently being consolidated at the level of knowledge transfer by the follow-up project Sound & Vision Experience Lab (SV_XL).