Sound Design

Spatial Audio VR Smog for Rustlers

Having moved from my hometown of Blackpool in my late teens, I thought I’d left the holy trinity of microwaveable burgers, poor visibility and attention-grabbing sounds behind… How wrong I was.

Step forward Droga5 and their bold vision (no pun intended…) for a 360 virtual reality walking tour of London, through the great smog of 1952.

Working with the latest audio spatialisation software (thanks Audioease and Noisemakers!) we created an ambisonic mix – optimised for playback on YouTube’s new 360 video engine.

YouTube uses the gyroscope in your VR headset (or phone) to work out which way you’re facing. The audio is decoded live during playback to a binaural mix, which tracks your head movements, changing direction and perspective as you look around. A car approaching from the front on your right will swing around to come from behind on your left if you spin around.
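The trick that makes this work is that an ambisonic sound field can be rotated to counter the listener's head movement before it is decoded to binaural. A minimal sketch of the yaw-only case, assuming first-order B-format channels (W, X, Y, Z, with X pointing forward and Y to the left) and a made-up function name:

```python
import numpy as np

def rotate_bformat_yaw(w, x, y, z, yaw_rad):
    """Counter-rotate a first-order B-format frame by the listener's yaw.

    w, x, y, z: ambisonic channel samples (scalars or arrays).
    yaw_rad: head yaw in radians (positive = turning left).
    Rotating the field by -yaw keeps sources fixed in the world
    as the head turns; W (omni) and Z (height) are unchanged by yaw.
    """
    c, s = np.cos(-yaw_rad), np.sin(-yaw_rad)
    x_rot = c * x - s * y
    y_rot = s * x + c * y
    return w, x_rot, y_rot, z
```

So a source dead ahead (X = 1, Y = 0) ends up on your right (Y = -1) after you turn your head 90° to the left. Real renderers do this with a full 3-axis rotation (yaw, pitch, roll), but the principle is the same.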

In plain speak, binaural is how your ears hear in the real world: sounds are above you, behind you and all around you.

It’s literally magic.

At the time of writing, the implementation of spatial audio in VR films is a fairly new technology. There are no hard and fast rules for how to mix VR audio yet, so I wanted to share a few thoughts to add to the discussion.

From a creative point of view, working on a VR film with no tangible visuals, freed us from having to spot sounds in vision, allowing us to place things wherever they sounded good. In a 3D mix, there’s so much more space to use and I found you could separate sounds in similar frequency areas pretty easily. Initially we got a bit carried away, throwing loads of sounds into the mix that didn’t need to be there, before paring things back again. In everyday life your brain filters out a lot of background sounds, so beyond the initial layers of ambience, we only needed to include sounds that we wanted people to pick up on.

When you watch the film (link at the bottom), make sure you have a listen to the time-machine bit right at the end. For the most part, the speed of sound movement in the film is fairly gentle, but I was chuffed to be able to chuck things around in a way that would probably induce vomiting for the part where you are transported back to the year 2017.

To create our 360-degree audio-soup we split everything into three distinct layers:

1) Ambisonic field recordings

Kaspar got up at the crack of dawn (to avoid rush-hour) armed with a Sennheiser Ambeo microphone and walked around the route of the tour.

This mic utilises four capsules to capture first-order (four-channel) ambisonic atmospheres, which can then be decoded to binaural, 5.1, Atmos or whatever you want.

It looks like something from Star Trek, right? As he discovered, it’s not the kind of thing that helps you blend into the background whilst standing on a bridge at 6am…
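For the curious, the conversion from the four raw capsule signals (A-format) to the four ambisonic channels (B-format) is essentially sums and differences. A sketch assuming the textbook tetrahedral capsule layout (front-left-up, front-right-down, back-left-down, back-right-up) — real converters such as the mic's own plug-in also apply frequency-dependent corrections, so treat this as an illustration only:

```python
import numpy as np

def a_to_b_format(flu, frd, bld, bru):
    """Textbook tetrahedral A-format to B-format conversion (sketch only).

    flu, frd, bld, bru: capsule signals for the assumed layout
    front-left-up, front-right-down, back-left-down, back-right-up.
    """
    w = flu + frd + bld + bru  # omnidirectional pressure
    x = flu + frd - bld - bru  # front-back figure-of-eight
    y = flu - frd + bld - bru  # left-right figure-of-eight
    z = flu - frd - bld + bru  # up-down figure-of-eight
    return 0.5 * np.array([w, x, y, z])
```

Feed it identical signal on all four capsules and everything cancels except W, which is what you'd hope: a perfectly diffuse, direction-less sound.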

These recordings would form our base layer, giving the listener a believable “air” to ground them in reality. The next step was to add some less diffuse objects to the mix.

2) Stereo sounds

We used stereo sound sources for larger objects that we wanted to represent as a “cloud” of sound. Width controls in our 360 software allowed us to control the size of crowds, a group of pigeons and so on as they moved towards and past the listener.
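The width controls in the 360 plug-ins are their own thing, but in plain stereo terms a width control is commonly implemented as mid-side scaling — a generic sketch of the idea, not the actual tool we used:

```python
import numpy as np

def stereo_width(left, right, width):
    """Scale stereo width via mid-side processing (generic sketch).

    width = 0 collapses the image to mono, 1 leaves it unchanged,
    and values above 1 widen it.
    """
    mid = 0.5 * (left + right)          # what both channels share
    side = 0.5 * (left - right) * width  # what differs between them
    return mid + side, mid - side
```

Shrinking the side signal pulls the cloud of sound into a point; boosting it spreads the flock of pigeons out around you.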

3) Mono sounds

Finally, we added passing cars, people speaking, footsteps and anything else we could approximate to a single point in space; these were treated as mono sources.

We did the final mix on headphones, using our tools to monitor through the same HRTF impulse responses that YouTube uses. (The HRTF IRs are an approximation of how an average person, with a symmetrical head, will perceive a sound coming from a point in space.) Monitoring this way allowed us to hear it exactly as it would play back on YouTube. You could, in theory, mix in the middle of a four-speaker setup, but that’s not how 99.9% of people will hear it.
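Under the hood, binaural monitoring of this kind boils down to decoding the mix to a set of virtual speakers and convolving each speaker feed with a left-ear and right-ear head-related impulse response (HRIR). A rough sketch — the function name and data layout are made up, and YouTube's own renderer and HRTF set will differ:

```python
import numpy as np

def binauralize(speaker_feeds, hrirs_left, hrirs_right):
    """Sum HRIR-convolved virtual-speaker feeds into a binaural pair.

    speaker_feeds: list of 1-D sample arrays, one per virtual speaker.
    hrirs_left / hrirs_right: matching lists of head-related impulse
    responses (hypothetical data for illustration).
    """
    n = max(len(f) + max(len(hl), len(hr)) - 1
            for f, hl, hr in zip(speaker_feeds, hrirs_left, hrirs_right))
    left, right = np.zeros(n), np.zeros(n)
    for feed, hl, hr in zip(speaker_feeds, hrirs_left, hrirs_right):
        l = np.convolve(feed, hl)  # each ear hears the feed through
        r = np.convolve(feed, hr)  # its own impulse response
        left[:len(l)] += l
        right[:len(r)] += r
    return left, right
```

The HRIRs encode the tiny timing, level and filtering differences between your two ears for each direction, which is what lets a plain pair of headphones fake a sound field all around you.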

For the full experience, get yourself a VR headset (Google Cardboard is one of the cheapest), grab a gyroscope-enabled Android phone (anything above entry level should have this), stick on some nice headphones, head to YouTube and see what you think. You’ll need to click the little goggles icon to get it into dual-eye mode.

Unbelievably, at the time of writing, Apple have still not implemented spatial audio support for YouTube – c’mon guys, pull your finger out! If you don’t have an Android phone you can watch it on YouTube in a Chrome browser, but make sure you use headphones. You can grab the picture and move the audio field around.

It would seem not everyone gets the gag straight away…

…but as David Kolbusz, CCO at Droga5 London, so succinctly says:

“VR is the future. Rustlers are the present. London’s Great Smog is the past. We’ve taken past, present, and future and fused them together to bring you one of the most technologically advanced experiences you’re likely to have all afternoon.”

Mike Bamford, String and Tins


360 Mix: Mike Bamford @ String and Tins
Sound Design: Mike Bamford, Nigel Manington @ String and Tins
Ambisonic field recordings: Kaspar Broyd @ String and Tins
Creative director: David Kolbusz, Rick Dodds, Steve Howell
Copywriter: Teddy Souter, Dan Morris
Art director: Frazer Price, Charlene Chandrasekaran
Director/VFX: Christos Mavridis
Agency producer: Chris Watling
Account director: Alex Dousie