Working with sound spatialization is now common practice: many software tools, plugins, and exchange formats are available. While these tools are becoming more intuitive thanks to the object-based formalism, the question remains of how to manipulate and synthesize sounds that cannot be represented by an extended sound object, such as ambient noise or textures. At the same time, composing an auditory space solely by positioning sound objects is rarely perceptually satisfactory, and our everyday experience as immersive content creators shows that something more is needed to improve plausibility and the feeling of immersion. Starting from this observation, we want to investigate the influence of an auditory object's context on the perceived sound environment and, more specifically, the possible correlations between acoustic parameters and auditory percepts.
The main goal of this work is to initiate a reflection on an audio tool for manipulating the spatial aspects of auditory scenes through perceptual parameters. To illustrate this investigation, we focus on the perceived acoustical horizon and on the spatial features correlated with it, as described by the World Soundscape Project.
In this talk, we will present an experimental protocol designed to address these questions. We will first focus on the design of the stimuli, i.e., four virtual soundscapes created within the object-based formalism in collaboration with a sound engineer. We will then present the experiment itself, conducted in the 42-speaker geodesic sphere at the PRISM laboratory, and the first results corroborating our hypotheses. We will conclude with perspectives on the future creation of a perception-based spatial audio tool. (10 minutes)