Case study: Applying emerging Virtual Reality technologies to the study of spatial audio
School of Physics, Engineering & Technology (PET)
Michael McLoughlin (PET) and Joe Ree-Jones (XR-Stories)
The "Sound Interactions in the Metaverse" module, part of the MSc Audio and Music Technology programme, explores the application of emerging Virtual Reality (VR) technologies in the study of spatial audio. Led by Michael McLoughlin (School of PET) and Joe Ree-Jones (XR-Stories), and developed through the Sound Interactions in the Metaverse (SIM) Centre for Doctoral Training, the module focuses on psychoacoustics, signal processing, and interaction. Students learn how sound perception is influenced by head movements and spatial positioning, aligning with trends in virtual acoustics and video game audio. The primary objective is to equip students with the skills to develop applications integrating spatial audio and head tracking within virtual environments.
To support these learning outcomes, the module uses Pico 4 VR headsets, chosen for their ease of development and compatibility with game engines such as Unity. In the second semester, students first gain foundational programming skills in Matlab and iOS app development before moving on to tutorials covering Unity and spatial audio design. Student progress is closely monitored through weekly tutorials and milestone assessments, and the final evaluation includes a presentation and demonstration of their applications.
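As a purely illustrative sketch (not code from the module), a minimal Unity C# component in the spirit of these tutorials might mark a sound source as fully 3D so that the headset's head tracking drives its spatialisation. It assumes Unity's built-in AudioSource/AudioListener pipeline and an AudioClip assigned in the Inspector:

    using UnityEngine;

    // Illustrative only: makes the attached AudioSource fully 3D so its
    // perceived direction and distance respond to the listener's head pose.
    [RequireComponent(typeof(AudioSource))]
    public class SpatialCue : MonoBehaviour
    {
        void Start()
        {
            AudioSource source = GetComponent<AudioSource>();
            source.spatialBlend = 1.0f;  // 0 = 2D, 1 = fully spatialised
            source.rolloffMode = AudioRolloffMode.Logarithmic;  // natural distance attenuation
            source.loop = true;          // keep the cue playing for testing
            source.Play();               // assumes an AudioClip is assigned in the Inspector
        }
    }

In a typical Unity XR setup the AudioListener sits on the head-tracked camera, so on a Pico 4 the interaural cues update with head rotation without any further code.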
Because the module integrates immersive technology throughout, students gain hands-on experience, creating projects such as maze navigation games and virtual exhibitions. This approach strengthens their technical proficiency and prepares them for careers in Virtual and Extended Reality (VR/XR) development.
Read the full case study on the following webpage: Applying emerging VR technologies to the study of spatial audio