Mike and I can safely say this project was a labour of love, working with some of our best friends and getting to collaborate with esteemed house producer Riton – this work was the biggest technical challenge that we have been involved with for a while. It also recently won at the International MASAwards for ‘best online re-record / adaptation’, so I figured it was time to take note of what was done for the Desperados House Party:
The goal was to turn an apartment into a musical instrument – stairs that played a synth part, lights that could be struck as drums, taps that could modulate parameters on an effects unit, a kitchen hob that played a video sampler… and a few other cheeky ideas that didn’t make the cut (damn that human step sequencer – ahead of its time perhaps). Once the apartment was rigged up, it could be played live – in the final film we see a group of performers playing various elements of a track, all in camera and for real:
The film was directed by long-time collaborator Chris Cairns @ is this good? for the agency We Are Pi; we assisted from early treatment through to final mix, alongside a team of technologists and set designers from the is this good? stable, including the multi-talented Will Gallia, David Cranmer, Maz Staruch and Tim Warren.
Fellow soundies P&S had been researching commercial music options for the agency and had come up with a great shortlist for the project. Riton’s Rinse & Repeat stood out because of the sampled content, so it was good news to hear that the agency went with it. From there we went to town with Chris on testing edits that would work for the main film’s duration and content, working through the storyboards and concepts and discussing which elements of the track could be assigned to which mechanisms in the apartment.
We were introduced to Riton and discussed the potential for the project plus some of the underlying concepts with him; he sent over the stems and agreed to help if and when needed. We rebuilt his track from the elements provided, breaking all the parts down into sampler instruments and/or re-creating them with soft synths. Using an Ableton session of our software instruments we began testing their playability on the various pieces of incredible gear Dave Cranmer and the guys were prototyping. See this (sorry, it’s a bit rough) video of Pat demonstrating for the director Chris how an element of the sound check in the film might work – we used an Ototo system here to roughly work up the idea:
Once all the pre-production and testing was complete, we spent time in rehearsals working with the performers so they could learn their parts and get to grips with the gear. Here’s some embarrassing footage of me tweaking some oven knobs and squealing with excitement at the moment we first hooked up the oven/hob controller to a synth patch:
Finally the shoot… we had a day of setup and then a two-day shoot – Tom Belton and I represented String and Tins, covering MIDI setup, Ableton Live playback and recording, and music direction and supervision. We had Andy Hewitson on sound recordist duties, picking up live sound and some wildtrack FX etc. There was a ton of hectic re-editing on set, re-writing drum parts, re-configuring the (custom made by Will G) video sampler, very late nights, moments where the neighbours were trying to shut us down, circuits frying, fire extinguishers firing… and disappointingly not a drop of Desperados was drunk until we had packed down at 2 a.m. after the shoot wrapped. Tequila-laced beer never tasted so good!
Here’s a quick behind the scenes of a take from the intro sequence, showing the tech village team checking everything is ticking over… note Tom assisting on Ableton duties like a boss:
Post production – there’s a little bit of trickery and foley to augment the film but TBH it’s all fundamentally done live 🙂
If you got this far in my post (sorry for the ramblings), We Are Pi asked me a couple of questions on the back of the project earlier this year – I include my answers here for a bit more detail on the geek side of things:
‘The film’s execution is deceptively simple. Is there a complex tangle of back-end underneath it all?’
Behind the scenes, each element of the house was sending MIDI data individually to the audio system – the stair steps and railings were sending MIDI notes, as were the pendant light drums, the pads of the hob and the Desperados bottles. This MIDI was routed to sampler instruments that I built from Riton’s stems – so for instance I took the acid synth part, re-sampled and chopped it up, and spread it across a range of MIDI notes corresponding to the notes being sent from the stair railings and the bottles in the kitchen (in simple terms: hitting a railing sends a MIDI note that triggers a sample). This approach was used for all the stems.

Other elements in the house sent MIDI CC (control change) data – such as the instrument switch on the stairs (the thermostat), the taps in the bathroom and kitchen, the umbrella and the hob drawers. I set up various effects – beat repeat, ring modulation, low pass filter and delay plugins – so that we could process elements of my arrangement of Riton’s tune by controlling these effects via the incoming MIDI CC data. For instance, the destruction of the tune at the end in the kitchen was a direct result of the kitchen taps sending MIDI CC data to parameters on bit crusher, filter and beat repeat plugins simultaneously.

Unlike a standard MIDI controller a DJ might use, where the number of individual MIDI inputs is minimal, we were dealing with 16 MIDI ports at once – we maxed out two MOTU 8-channel MIDI interfaces to get everything running at the same time. There were spreadsheets (download the master location modulation destination document HERE if you want to see some routing etc), and lots of scrawled notes to keep track of what we were setting up… e.g. ‘Hob Knobs’:
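As a rough illustration of those two MIDI flavours – notes triggering samples, a single CC fanning out to several effect parameters – here’s a minimal Python sketch. All the note numbers, sample names and parameter ranges are made up for the example, not the values we actually used on the shoot:

```python
# Note-triggered samplers: an incoming MIDI note looks up a sample slice.
# (Illustrative mapping - in reality this lived inside Ableton samplers.)
RAILING_SAMPLES = {
    60: "acid_synth_slice_01.wav",
    61: "acid_synth_slice_02.wav",
    62: "acid_synth_slice_03.wav",
}

def on_note(note, velocity):
    """Hitting a railing sends a MIDI note; return the sample it fires."""
    sample = RAILING_SAMPLES.get(note)
    if sample and velocity > 0:
        return sample  # in Ableton this is the sampler instrument firing
    return None

# CC-modulated effects: one tap turn drives several parameters at once.
def scale_cc(value, lo, hi):
    """Map a 0-127 CC value onto an arbitrary effect parameter range."""
    return lo + (value / 127.0) * (hi - lo)

def on_tap_cc(value):
    """Kitchen tap CC hits bit crusher, filter and beat repeat together."""
    return {
        "bitcrush_depth": scale_cc(value, 16, 2),        # inverted polarity
        "filter_cutoff_hz": scale_cc(value, 20000, 400),  # closes the filter
        "beat_repeat_grid": scale_cc(value, 1.0, 0.0625),
    }
```

Turning the tap fully on sweeps all three parameters to their extremes at once, which is essentially what destroyed the tune at the end of the film.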
‘What software was being used to rig the sounds to the physical objects in the house?’
All the instruments in the house were triggering sampler instruments within Ableton, or controlling parameters within plugin effects in Ableton. For certain setups we used the Max/MSP programming language to scale and map various elements of the house before they reached Ableton. For instance, when we were setting up the hob synth, Will Gallia had set the hob Arduino to output certain MIDI note numbers (say 54 through to 57). I had a sampler in Ableton that contained each word of the Rinse & Repeat vocal (courtesy of the awesome Kah-Lo!) spread across the whole keyboard. In order to quickly re-map different words onto different hob ring pads to test different arrangements – rather than change the Arduino programming or the sampler layout – we made a quick Max patch to do the re-mapping for us. There were various other Max/MSP patches for mapping controller numbers and re-routing MIDI to the DMX lighting desk as well.

Actually, the flexibility of the MIDI controller setup in Ableton is amazing – it’s very quick to change and scale incoming MIDI CC data to multiple parameters at once. For instance, the bathroom taps were finely tuned so that the tap turn would control beat repeat on/off, grid size and ring modulator frequency all at once, with a different amount and polarity per parameter. Ableton can also run Max/MSP plugins internally – we were trying to work out a simple way of turning Ableton effect parameters and notes into OSC information that Will Gallia could use for the lighting software he had written for the project, and we ended up using various Max for Live devices within Ableton to do this.
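For anyone curious what that re-mapping patch boiled down to, here’s the same idea as a tiny Python sketch (the real thing was of course a Max patch). The hob note numbers 54–57 follow the example above; the sampler keys and word layout are assumptions for illustration:

```python
# Sampler layout: one word of the "Rinse & Repeat" vocal per key.
# (Hypothetical key numbers and word order.)
VOCAL_SAMPLER = {36: "rinse", 37: "and", 38: "repeat", 39: "baby"}

# Editable routing table: fixed hob ring pad note -> sampler key.
hob_map = {54: 36, 55: 37, 56: 38, 57: 39}

def remap(note):
    """Translate a fixed hob note into whichever sampler key we want."""
    return hob_map.get(note, note)  # pass through anything unmapped

def word_for_pad(note):
    """Which vocal word a given hob pad currently triggers, if any."""
    return VOCAL_SAMPLER.get(remap(note))

# Trying a different arrangement is one table edit - no re-flashing the
# Arduino, no re-laying-out the sampler:
hob_map[54] = 38  # front-left ring now says "repeat"
```

The whole point was that the table in the middle could be changed in seconds between takes, while both ends of the chain stayed fixed.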
I was speaking to a music producer friend of mine, Max Cooper – he does a lot of really exciting live shows involving animation and graphics tightly synced to his music – and he suggested looking for a Max for Live device that could get parameter information (Ableton envelopes) out of Ableton as OSC. After some hunting I found the most excellent ‘livegrabber’ group of devices. Using the paramgrabber device allowed us to convert certain effect parameters in Ableton into OSC that was passed across a network to Will Gallia’s machine, which was controlling various aspects of the lighting – for instance, the position of a tap would determine the position of a ring modulator effect in Ableton, which would in turn determine a light intensity control in the lighting system via OSC.
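Modelled very roughly in Python – the real chain was paramgrabber sending OSC over the network, and the address, parameter ranges and light names here are invented for illustration – the tap-to-light flow looked something like this:

```python
# Stage 1: physical tap position (0.0-1.0) mapped onto a ring modulator
# frequency in Ableton. The 30 Hz - 4 kHz range is an assumed example.
def tap_to_ringmod_hz(tap_position):
    return 30.0 + tap_position * (4000.0 - 30.0)

# Stage 2: the effect parameter is normalised and wrapped as an
# (address, value) pair - the shape of message paramgrabber sends out
# over the network as OSC. The address is hypothetical.
def ringmod_to_osc(freq_hz):
    intensity = (freq_hz - 30.0) / (4000.0 - 30.0)  # back to 0.0-1.0
    return ("/light/pendant/intensity", round(intensity, 4))

# Full chain: tap fully open -> ring mod at max -> light at full brightness.
address, value = ringmod_to_osc(tap_to_ringmod_hz(1.0))
```

The nice property is that the lighting machine never needs to know anything about taps or ring modulators – it just receives a normalised value on an OSC address.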
‘You’ve worked on projects that map visuals to sound. Was mapping sound to physical triggers a completely different beast?’
Part of the challenge of this project was making sure that the arrangement of Riton’s tune was going to be playable by the performers in the house. As many of the instruments weren’t ready to test while we were deciding on the arrangement, we needed to use readily available tools to demonstrate certain aspects of it – for instance we set up a very basic stairs instrument using an ‘Ototo’ so we could see whether our plan for the stairs was making sense (see ‘testing the stairs soundcheck.mov’). Mapping sound to triggers is extremely common – since the early days of sampling, people have been taking recordings and mapping them to pads and keys within a MIDI environment. But the challenge in this situation was quite different – the feedback and response from the parts of the house that were mapped to samplers or effect parameters needed a lot of fine-tuning, fixing when it broke, even re-thinking entirely on occasion.

Take an electronic drum kit: the pads are engineered to have the perfect response, and the way the trigger within the pad is mounted means there is a great deal of isolation and dynamic range. In contrast, the pans in the kitchen were extremely temperamental – they were super loud and resonant when hit, which meant they had a tendency to trigger not just themselves but the neighbouring pans that weren’t being hit – so we spent some time messing with the levels and thresholds of the outputs to get them to behave. We also had to change how they worked – initially we were using tiny electret mics that sent a voltage to an Arduino, which then converted that to MIDI, but we switched to piezo transducers feeding the trigger inputs on a drum brain that in turn converted the triggers to MIDI – this was the most stable setup for the pan drums.
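The kind of gating we ended up tuning on the drum brain can be sketched like this – a hedged Python illustration, with a made-up velocity threshold and a made-up 50 ms lockout window rather than the real settings:

```python
# Per-pan gating: a genuine strike must be loud enough to clear the
# pan's threshold, and must not fall inside the lockout window while
# the pan is still ringing from the previous hit. This is the logic a
# drum brain's threshold/retrigger settings implement in hardware.
LOCKOUT_MS = 50  # assumed retrigger-suppression window

class PanTrigger:
    def __init__(self, threshold):
        self.threshold = threshold       # per-pan sensitivity (0-127)
        self.last_hit_ms = -LOCKOUT_MS   # allow an immediate first hit

    def strike(self, velocity, now_ms):
        """Return True only for a genuine hit on this pan."""
        if velocity < self.threshold:
            return False  # sympathetic rattle from a neighbouring pan
        if now_ms - self.last_hit_ms < LOCKOUT_MS:
            return False  # still ringing from the last strike
        self.last_hit_ms = now_ms
        return True
```

Raising the threshold stops a loud neighbour from firing the wrong pan; the lockout stops a single resonant hit from machine-gunning into several MIDI notes.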
This kind of testing and revising situation with the pans was mirrored all over the house – because it was all custom stuff, we needed to spend a lot of time tweaking it to get it to work how we wanted it to.
‘How was it to collaborate with Nervous Squirrel and combine programming with physical engineering?’
In the weeks before the shoot, when we were working on the arrangement of Riton’s track, it was awesome to see the physical elements being developed by Nervous Squirrel in parallel with what we were doing – we would get sent a video of Dave swinging a kitchen blender around with a laser projecting from it, or a weird video of a massively elaborate kitchen tap and wooden spoon assembly… suddenly an ominous glove would come into shot and bring the thing to life – it was exciting to see these physical items being created, and to be able to put into practice what I had imagined they could be controlling in Ableton.
On the shoot itself it was inspiring to be able to work with such a multi-talented bunch as Chris Cairns, Will Gallia, Tim Warren, Maz Staruch, David Cranmer, Tom Belton and the rest of the guys in the production. Although we had our core roles, we all helped each other out when needed to solve problems that seemed insurmountable on our own – when I couldn’t work out how to do something from the Ableton side of things, I could turn to Will Gallia, who always had a coding solution; when the set building guys setting up the pans needed help tweaking the settings on the drum brain, either myself or my colleague Tom helped out; when we suddenly needed some new parts soldering, David was able to help.
Ok I think that’s enough eh… thanks so much to We Are Pi, Partizan and Is This Good for getting us involved in such a fun job… and thanks to Riton for making such a banging track and trusting us to mangle his stems!
p.s. here’s a clip of the laser blender Chris and Dave made that got ditched… there’s an excellent Reaktor ‘Lazerbass’ patch we were going to hook up to match the oscillations and intensity of the blender… watch out for Dave Cranmer’s huge modular synth on the wall at the end of the clip: