Virtual reality (VR) technologies make it possible to design virtual environments and carry out experiments with higher ecological validity and in greater variety than conventional laboratory settings allow. Before they can be used in this way, however, the validity of the generated virtual reality, as well as of theoretical findings established in real environments, must be verified. Within the framework of the DFG Priority Program "Audictive", the purpose of the present project is to develop and validate a VR-based audio-visual test environment for the investigation of spatial perception and attention in three-dimensional space. Towards this goal, an experimental scenario will be designed in which subjects interact with simulated robots in a real-world setting. Special emphasis is placed on the perception of distance between subjects and robots, an aspect that has received comparatively little attention so far. The VR environment will be generated using VR goggles and an auditory virtual environment (AVE).

The validation of the AVE will follow two routes. On the one hand, we will record electrophysiological signals (EEG) in real and virtual scenarios in order to test theoretical assumptions about auditory spatial perception and selective attention. The spatial position of the robots, the modality of the stimuli, the presence of competing stimulation, the type of interaction, and the age of the subjects will be varied as experimental factors. On the other hand, we will train data-driven algorithms for binaural source localization in virtual environments and test these algorithms in real scenarios. We aim to improve source localization by deploying deep neural networks and new training procedures that exploit large amounts of simulated data.

Besides these algorithms, which will be useful for hearing devices and augmented-reality applications, the project will deliver new insights into the neurophysiological correlates of complex audio-visual scenes, as well as a thoroughly validated auditory virtual environment. The project is interdisciplinary and unites expertise from acoustics, VR technology, cognitive psychology, audio-visual signal processing, and machine learning. It thus contributes to the goals of the DFG Priority Program "Audictive" in a comprehensive fashion.
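To illustrate the data-driven localization route, the following is a minimal sketch of a neural network that maps binaural cues to a discrete azimuth estimate and is trained on simulated data. It is not the project's actual model: the feature layout (interaural level and phase cues per frequency band), the network size, the 5-degree azimuth grid, and the placeholder random tensors standing in for simulated binaural features are all assumptions made for illustration.

```python
# Hypothetical sketch of a DNN-based binaural localizer trained on simulated data.
import torch
import torch.nn as nn

N_BANDS = 32             # number of frequency bands (assumed feature layout)
N_FEATURES = 2           # e.g., ILD and IPD per band (assumption)
N_AZIMUTH_CLASSES = 72   # 5-degree grid over 360 degrees (assumption)

class BinauralLocalizer(nn.Module):
    """Small feed-forward localizer operating on per-frame binaural cues."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(N_BANDS * N_FEATURES, 256),
            nn.ReLU(),
            nn.Linear(256, 256),
            nn.ReLU(),
            nn.Linear(256, N_AZIMUTH_CLASSES),
        )

    def forward(self, x):
        # x: (batch, N_BANDS, N_FEATURES) -> logits over azimuth classes
        return self.net(x)

def train_step(model, optimizer, features, labels):
    """One supervised training step on (simulated) feature/label pairs."""
    optimizer.zero_grad()
    logits = model(features)
    loss = nn.functional.cross_entropy(logits, labels)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = BinauralLocalizer()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Placeholder batch standing in for features derived from simulated
    # binaural signals (e.g., rendered with measured or modeled HRTFs).
    features = torch.randn(16, N_BANDS, N_FEATURES)
    labels = torch.randint(0, N_AZIMUTH_CLASSES, (16,))
    print("loss:", train_step(model, optimizer, features, labels))
```

In the project, such a model would be trained on large amounts of simulated virtual-environment data and then evaluated on recordings from real scenarios, which is the generalization step the validation targets.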


People:

Rainer Martin, Ruhr-Universität Bochum (Project Leader)
Stephan Getzmann, TU Dortmund (Project Leader)
Benjamin Stodt, TU Dortmund (PostDoc)
Daniel Neudek, Ruhr-Universität Bochum (PhD Student)