Spatial Audio with Head Tracking
PLEASE BRING YOUR OWN COMPUTER AND HEADPHONES
The workshop will begin with a quick overview of the physiological and psychoacoustic theory underpinning spatial audio. I will also review some basic programming and mathematical concepts that are important for developing an intuition for spatial audio design.
These details were covered in more depth during the first Spatial Audio with Max/MSP workshop at Spektrum. If you would like to review this material, I would be happy to forward the slides from that presentation.
NOTE: The focus of this workshop is getting spatial audio and head tracking working together in a functional and interesting patch, which you can take home as a jumping-off point for other projects.
The second part of the workshop will focus on integrating head tracking data into spatial audio processing. Head tracking enables realtime rotational navigation of a virtual sound field from a fixed listening position (i.e. not walking through the sound field, but looking around within it, with the sound of the space adjusting as if you were actually there). I will provide a .zip file of HRTFs (head-related transfer functions, stored as short impulse-response audio files) to participants before the workshop for spatialization processing.
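The core idea we will build in Max/MSP can be sketched in a few lines of pseudo-patch logic: pick the HRTF pair whose azimuth matches the source direction relative to the listener's head yaw, then convolve the mono source with that pair. The sketch below uses Python/numpy purely for illustration; the function names, the dictionary-style HRTF bank, and the measurement azimuths are assumptions for the example, not part of the workshop materials.

```python
import numpy as np

def pick_hrtf(hrtf_bank, head_yaw_deg, source_az_deg):
    """Head tracking step: the effective source direction is the source
    azimuth measured relative to where the head is pointing."""
    relative_az = (source_az_deg - head_yaw_deg) % 360
    azimuths = np.array(sorted(hrtf_bank.keys()))
    # snap to the nearest measured azimuth, wrapping differences to [-180, 180)
    diffs = (azimuths - relative_az + 180) % 360 - 180
    nearest = azimuths[np.argmin(np.abs(diffs))]
    return hrtf_bank[nearest]

def spatialize(mono, hrtf_left, hrtf_right):
    """Spatialization step: convolve the mono signal with the
    left/right HRTF impulse responses to produce a binaural pair."""
    left = np.convolve(mono, hrtf_left)
    right = np.convolve(mono, hrtf_right)
    return np.stack([left, right], axis=0)

# Toy HRTF bank: single-sample impulses standing in for real measurements,
# keyed by azimuth in degrees (0 = front, 90 = hard right).
bank = {
    0:  (np.array([1.0]), np.array([1.0])),
    90: (np.array([0.3]), np.array([1.0])),
}

# Source fixed at 170 degrees; head turned to 80 degrees -> relative azimuth 90.
l, r = pick_hrtf(bank, head_yaw_deg=80, source_az_deg=170)
binaural = spatialize(np.ones(4), l, r)
```

In the workshop we do the same thing with Max/MSP objects: the phone's gyroscope supplies `head_yaw_deg` in realtime, and the convolution runs on the provided HRTF files instead of toy impulses.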
I will also provide a couple of low-level Max/MSP objects so we can get started building an interactive spatial audio patch together. Participants are STRONGLY(!) encouraged to bring headphones, since spatial audio is primarily experienced over headphones. Please also bring a cell phone with a gyroscope in order to track your head!
About the workshop leader
Jordan Juras is a musician and DSP engineer. He graduated with an M.Mus. in Music Technology from NYU in New York City, where he conducted research at M.A.R.L. focused on improving VR experiences by increasing the perceived realism of the virtual sound field.