
Rüfüs du Atmos

Mixing Rüfüs du Sol’s Live From Joshua Tree for Dolby Atmos Music.


10 October 2022

Talk about scope creep. Rüfüs du Sol’s ‘Live from Joshua Tree’ started life as a Mixmag online special. The concept was to take Rüfüs’ live show and perform it to the coyotes. It was recorded live and shot as dusk turned into night, with lighting fixtures illuminating the dramatic desert landscape of Joshua Tree National Park. It’s seriously worth a look/listen, even now, two years on from when it hit YouTube.

Actually, it was such a hit that budget became available to remix the production for a theatrical release. Rüfüs du Sol’s go-to audio engineer, Cam Trewin, booked himself into a Hollywood dub stage to remix the audio in surround. The studio manager suggested he ‘may as well’ mix it in Atmos, as the room was equipped for it and wouldn’t attract an additional fee. Cam embarked on a crash course in Atmos mixing. The movie premiere was happening down the road, and Cam was able to test his Atmos mix in that theatre. He wasn’t convinced by the results (“the music sounded disconnected from the visuals”), so they resolved to spend the rest of the post-production process in 5.1 and put the Atmos mix to one side. Where it stayed for over a year, until Apple made the snap decision to put its full heft behind Dolby Atmos for Music. Cam Trewin got a ‘drop everything’ phone call from the band, which was getting pressure from its management, which was getting pressure from its record label, which was getting pressure from Apple. “We want Rüfüs to be a featured artist when the announcement goes public!”

The music industry was in full scramble mode, and as it transpired, the only Australian band to be part of Apple’s Dolby Atmos Music launch was Rüfüs with Live from Joshua Tree.

This is the story of how it was done, told by the blokes who did it.

APPLE: SHOCKED TO THE CORE 

Cam Trewin: “You might think it should have been easy, given we had the theatrical Atmos mix from L.A. in our back pockets. But that was only the beginning. This was an Atmos Music mix, without the aid of any visuals. That was the first question we asked Apple: ‘Can we release individual spatial audio mixes with companion video from the movie?’ No, was the answer.

“In 2021, like the rest of the world, I was in lockdown at home in Melbourne. I was looking for a place to mix the Atmos version of this release for Apple… which all sounds kinda relaxed and methodical. The truth is, I was frantic. Little did I know, I was about a 45-minute drive from the only Dolby Atmos Music certified studio in Australia at the time, run by Angus Davidson — Red Road Immersive.”

Angus Davidson: “My journey with this studio started when I set out to build a space for live sound engineers to demo spatial audio. Working for KV2 Audio, I wanted to show that you could achieve spatial audio without a huge budget — I had KV2 Audio loudspeakers, an Avid S6L mixing console and a copy of Spat Revolution from Flux.

“Then when COVID struck I was looking for ways to get some work through the studio. I was aware that Netflix was investing heavily in local productions, so I worked on getting my demo room transformed into an Atmos immersive mix studio.

“It was around this time that I was coincidentally talking with Cam Trewin, primarily to talk him into giving KV2 Audio a go in a spatial audio concert setting.”

Cam Trewin: “My first thought was I wanted to get hold of that Atmos mix from L.A. and hear it in Angus’ studio. So we got the ADM file sent through.”

Angus Davidson: “It was nerve-wracking because it was early days for my Atmos studio. I’d been working on a Netflix movie and early signs suggested that things were translating well — but that was movie content… dialogue, FX, foley etc. The truth was: I hadn’t done a proper music mix in the studio when Cam arrived.”

WIDE HORIZONS

Cam Trewin: “The Atmos mix canvas is big and, given the possibilities, it can be daunting. I suspect that if you go into a mix like Live From Joshua Tree with a totally clean slate, then you’ll get into trouble. Instead, I already had a stereo master that had been approved by the band, so my philosophy was: ‘this is a great place to start — let’s work outwards from there’.

“Prior to that realisation I was getting lost down a rabbit hole. I was chasing a mix in the room, moving the faders of the stems — drums, synths, vocals et al. It didn’t make sense and it wasn’t translating.

“Angus could tell I was losing the plot: ‘I think you’re digging yourself into a hole by starting to do automation moves. Instead, think about just placing it more in the room.’ And that was that. After that point, things fell into place, because the mix is mixed, the intent is already there.

“After all, all the fader rides have already been written within those stems. My job was to take a completed 2D mix and place it into a 3D space.”

Angus Davidson: “Typically, a mix engineer is thinking about a left and a right speaker and everything goes into those two speakers. You’re using compression, EQ, reverbs, delays and myriad tools of the trade to place elements in that stereo image in an effort to create a pleasant sound stage that showcases all the instruments and vocals. We’ve now got 14 speakers… not two. Suddenly, the two-speaker restrictions disappear. The question then is: what do we put in those additional speakers? How much do we move things around?”

STAYING GLUED

Cam Trewin: “We’ve noticed that other engineers are still very much mixing with a two-track focus and then taking key elements, making them object based, and putting them in the room. That was my starting point: putting the stem faders back to unity and then identifying what could be turned into objects and moved around. So rather than a fader move, I can accentuate parts by literally moving them in the space. That was the defining point for me. It helped me understand what Dolby’s done with object-based mixing in the X, Y, Z plane.

“But my advice to anyone embarking on an Atmos mix is: don’t lose the glue that traditionally holds a two-track mix together.”

Angus Davidson: “I’d echo that. I’m listening to other engineers talking about mixing in Dolby Atmos and all the advice is to keep that glue — ensure the object-based moves you’re making aren’t ungluing the integrity of the mix. Or, in other words, don’t just put an element of the mix somewhere for the hell of it and lose the big-picture relevance of that element as a result. It’s tweaky and it’s subjective but it’s important.”

Cam Trewin: “Rüfüs’ music has a lot of sound design in it, which a lot of electronic artists are doing these days. Those elements are obvious candidates to become objects, moved dynamically in the mix. For example, one track from the album has Blade Runner-style sound design in it — something like the sound of Deckard’s vehicle. It’s a very cinematic moment that allowed me to push that from the back of the room forward. I did a lot of experimenting with that before just going for it. I think it actually really works. There are many moments like that, but those objects aren’t messing with the overall intent of the mix.”

My job was to take a completed 2D mix and place it into a 3D space

ROOM CONFIG

Angus Davidson: “Dolby Atmos Production Suite is designed to operate on the same computer as Pro Tools, utilising a utility called the Dolby Audio Bridge, which is a 128-channel internal audio driver that sends audio from Pro Tools to the renderer and back. If you want to run an external renderer, you have to run DAMS, the Dolby Atmos Mastering Suite. That’s the route I’ve gone down, as I’m running them on two computers. So I’ve got a Pro Tools Mac, a Dolby Atmos Mac and a third computer running video.

“From the Dolby Atmos Production Suite you can view your room setup, charting where you have your speakers. Currently I run 9.1.4. So that’s three speakers across the front, two at the back, two either side, with four height speakers. 9.1.6 is the gold standard. Beyond that you’re looking at speaker arrays, which the Production Suite and DAMS can wrangle in larger rooms.

“Production Suite helps you optimise your setup. Everything’s delayed back to your sweet spot or MLA (main listening area).
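To make that time-alignment step concrete, here’s a minimal sketch of the arithmetic in Python: delay every closer speaker so its sound arrives at the sweet spot at the same moment as the farthest one. The speaker names, distances and sample rate are illustrative assumptions, not measurements from Red Road Immersive, and in practice the Production Suite handles this for you.

```python
# Hypothetical distances (metres) from each speaker of a 9.1.4 layout to the
# main listening area (MLA). These numbers are made up for illustration.
distances = {
    "L": 2.4, "C": 2.3, "R": 2.4,
    "Lw": 2.2, "Rw": 2.2,
    "Lss": 1.9, "Rss": 1.9,
    "Lrs": 2.1, "Rrs": 2.1,
    "Ltf": 2.6, "Rtf": 2.6, "Ltr": 2.5, "Rtr": 2.5,
}

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 degrees C
SAMPLE_RATE = 48000     # Hz

farthest = max(distances.values())
for name, dist in sorted(distances.items()):
    # Closer speakers get extra delay so every arrival lines up with the farthest one.
    delay_s = (farthest - dist) / SPEED_OF_SOUND
    print(f"{name:>4}: {delay_s * 1000:5.2f} ms ({round(delay_s * SAMPLE_RATE)} samples)")
```

Level trims to a common reference level would normally sit alongside these delays, but that part is measurement-driven rather than simple arithmetic.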

“I’m running Avid gear here. I’ve got an Avid MTRX Studio with a bunch of cards in it, which is around an A$30,000 investment. I’ve got the Dolby Atmos Renderer and I’m running Pro Tools Ultimate. I’ve got an Avid Dock here, but I’ve also got a 40-channel D-Command. The speakers are probably worth around A$40,000. And then there’s the room itself, which took about four months to build from scratch. I’ve got a background in acoustic design, so it’s properly designed, measured and tuned.

“The KV2 Audio LCR speakers are the EX10s (self-powered 10-inch + horn), while the surrounds and ceiling speakers are EX6s (six-inch + horn). There’s an EX1.8 sub, which is an 18-inch speaker in a manifold enclosure. This system has proven to be extraordinarily accurate, from low volume all the way through to ear bleed, and everyone who’s worked here has been blown away by how accurately it translates to every format.”

BINAURAL & QC

Angus Davidson: “You can listen to your Atmos mix on a pair of headphones using what’s called a binaural mixdown. Binaural is an interpretation of the immersive mix that’s been manipulated in such a way that it gives you the time-based and frequency-based information to recreate surround sound on a pair of stereo headphones. It’s quite extraordinary when it’s done really well.”
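To give a flavour of the ‘time-based and frequency-based information’ Angus is describing, here’s a deliberately crude Python sketch that applies only the two simplest binaural cues, an interaural time difference and a level difference, to a mono signal. It is nothing like the Dolby Renderer’s binaural mixdown, which uses full HRTF processing; the head radius, the Woodworth-style delay formula and the simple level curve are assumptions made purely for illustration.

```python
import numpy as np

def crude_binaural_pan(mono: np.ndarray, sample_rate: int, azimuth_deg: float) -> np.ndarray:
    """Pan a mono signal to stereo using only interaural time and level differences.

    azimuth_deg: 0 is straight ahead, +90 is hard right, -90 is hard left.
    Toy model only; real binaural rendering convolves with HRTFs.
    """
    az = np.radians(abs(azimuth_deg))
    head_radius = 0.0875                       # metres, assumed average head
    speed_of_sound = 343.0                     # m/s
    itd = (head_radius / speed_of_sound) * (az + np.sin(az))  # Woodworth approximation
    delay = int(round(itd * sample_rate))      # samples the far ear lags behind
    ild_db = 6.0 * np.sin(az)                  # crude level difference toward the near ear
    atten = 10 ** (-ild_db / 20)

    near = mono
    far = np.concatenate([np.zeros(delay), mono])[: len(mono)] * atten
    left, right = (far, near) if azimuth_deg >= 0 else (near, far)
    return np.stack([left, right], axis=-1)

# Example: a 1kHz tone placed 60 degrees to the right.
sr = 48000
t = np.arange(sr) / sr
stereo = crude_binaural_pan(0.2 * np.sin(2 * np.pi * 1000 * t), sr, 60.0)
```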

Cam Trewin: “You get there by exporting an MP4 out of the Dolby Renderer, which you can save to a thumb drive or export straight to your phone. If you put it in the Files app on your iPhone and pop on any pair of headphones with Dolby Atmos turned on, you can hear that binaural rendering in your headphones.”

Angus Davidson: “Dolby is keen for audio people to experience this, with a free 90-day download of the Dolby Renderer. In my opinion, to achieve high quality Dolby Atmos surround mixing, you’ve got to listen to it in an Atmos room. Sure, the vast majority of consumers will be listening to the mix on headphones but in the same way you currently monitor via the best possible stereo studio monitors you can afford, you’d do the same for an Atmos mix.

“Having said that, I’ve just recently bought a Sonos Arc Soundbar, which is also quite good. I bought it primarily as a QC tool — to check how my mixes are translating.”

Cam Trewin: “I have a domestic Atmos setup with a Marantz AVR and some PMC monitors. That was the only way I was able to QC my mix at home, and it went through my Xbox — Microsoft has supported Dolby Atmos plugins and players across Windows PCs and Xbox.”

Angus Davidson: “If you’re mixing for Dolby Atmos, there’s a set of standards you have to meet to be able to deliver to Apple Music. The minimum acceptable room is 7.1.4. While you can mix using binaural, to actually get it onto Apple Music it has to be mastered in an Atmos-certified studio.

“Mixing is, as you’d expect, very different in an Atmos environment. In the Dolby Atmos Renderer you have your 7.1.2 mix bed, then you have up to 118 objects that can be placed anywhere in the 3D virtual space — you’re not placing sounds into speakers, you’re placing them virtually into space, which allows your mix to translate to much larger systems.
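In code terms, an object is just audio plus an (x, y, z) position in a normalised room, and the renderer turns that position into per-speaker gains for whatever layout is present. The sketch below uses a simple distance-based, constant-power panning law and made-up speaker coordinates purely to illustrate the idea; it is not Dolby’s actual panning algorithm.

```python
import numpy as np

# Illustrative speaker positions in a normalised room: x 0..1 left to right,
# y 0..1 front to back, z 0..1 floor to ceiling. Assumed, not a Dolby spec.
SPEAKERS = {
    "L":   (0.0, 0.0, 0.0), "C":   (0.5, 0.0, 0.0), "R":   (1.0, 0.0, 0.0),
    "Lss": (0.0, 0.5, 0.0), "Rss": (1.0, 0.5, 0.0),
    "Lrs": (0.0, 1.0, 0.0), "Rrs": (1.0, 1.0, 0.0),
    "Ltf": (0.25, 0.25, 1.0), "Rtf": (0.75, 0.25, 1.0),
    "Ltr": (0.25, 0.75, 1.0), "Rtr": (0.75, 0.75, 1.0),
}

def object_gains(position, speakers=SPEAKERS, exponent=4.0):
    """Map one object position to a gain per speaker (toy distance-based panning)."""
    pos = np.array(position, dtype=float)
    raw = {name: 1.0 / (np.linalg.norm(pos - np.array(xyz)) + 1e-3) ** exponent
           for name, xyz in speakers.items()}
    norm = np.sqrt(sum(g * g for g in raw.values()))  # constant total power
    return {name: g / norm for name, g in raw.items()}

# A synth object placed front-left, partway up the wall.
for name, gain in object_gains((0.2, 0.1, 0.4)).items():
    print(f"{name:>4}: {gain:.3f}")
```

The same object metadata can drive a 7.1.4 room, a cinema array or the binaural fold-down, which is the translation Angus is talking about.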

“The tools and the processing are naturally more processor-intensive. A 7.1.2 reverb, for example, chews through the cycles. Nugen produces lots of good tools in the space — the Paragon reverb is amazing, for example.”

A LOT TO PROCESS

Angus Davidson: “Traditionally, getting a mix complete requires the usual back and forth with the client, mix notes and revisions — attenuate the hats a bit here, boost the vocals by half a dB there. It’s about how hard you’re hitting the bus compressor, what you’re doing with the master EQ, or how much harmonic distortion you’re choosing to use on something… these are the kinds of thought processes you have with a two-track mix. With Atmos, everything changes.

“In fact, if you make good musical decisions about the way you build your mix, there’s an appreciable decrease in the need for processing. Parts you would perhaps EQ the crap out of to make space for other elements no longer need it. The same with compression. That was illuminating.”

Cam Trewin: “It’s more about placement than about crafting. I find I’m asking myself ‘How does this section make me feel?’ Maybe it feels like the lead synth wants to go somewhere in the room. Perhaps that percussion part wants to do a circular motion — deciding where certain elements belong in the room… that was the creative process.”
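That kind of circular move comes down to writing x/y position automation for the object. Here’s a minimal Python sketch that generates one orbit per bar as breakpoints you could draw into a DAW; the tempo, breakpoint rate and radius are hypothetical.

```python
import numpy as np

BPM = 124.0             # hypothetical tempo
POINTS_PER_SECOND = 50  # automation breakpoint density (assumed)
RADIUS = 0.4            # orbit radius in the renderer's normalised x/y square

bar_seconds = 4 * 60.0 / BPM                        # one 4/4 bar
t = np.arange(0.0, bar_seconds, 1.0 / POINTS_PER_SECOND)
phase = 2 * np.pi * t / bar_seconds                 # one full revolution per bar

x = 0.5 + RADIUS * np.cos(phase)                    # circle around the room centre
y = 0.5 + RADIUS * np.sin(phase)

breakpoints = [(round(ti, 3), round(xi, 3), round(yi, 3)) for ti, xi, yi in zip(t, x, y)]
print(breakpoints[:4])                              # (time in seconds, x, y)
```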

my advice to anyone embarking on an Atmos mix is: don’t lose the glue that traditionally holds a two-track mix together

LOUDNESS WARS: A TRUCE?

Angus Davidson: “Since streaming video and streaming audio, with the likes of Tidal and Spotify, became huge, certain standards have been implemented to help guarantee the quality of the delivered content. Loudness is measured to the ITU-R BS.1770-4 standard, and Atmos Music has to be delivered at -18LKFS with a true peak of -1dB. A standard stereo offering, on the other hand, would normally be mastered somewhere between -7 and -10: really loud is -7, and your average is -9 or -10. At -18LKFS, that’s around 10dB quieter than what we’re used to. Suddenly, we’ve just won the loudness wars by default; dynamic range has a chance to come back into music. Everything used to be hitting zero and just pounding; limited to within an inch of its life.

“That to me is one of the most incredible results of this whole exercise — we’ve redefined how audio is going to be delivered to the public. A lot of people are bitching about it because they say it’s too quiet. In my view, that’s what the volume knob is for — turn it up.”
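For anyone who wants to sanity-check those numbers on their own material, here’s a minimal Python sketch using the open-source pyloudnorm meter (an implementation of ITU-R BS.1770) plus a 4x-oversampled peak as a rough stand-in for true peak. The filename is hypothetical, this measures a plain stereo file rather than an Atmos re-render, and Dolby’s own tools remain the reference for actual deliverables.

```python
import numpy as np
import soundfile as sf
import pyloudnorm as pyln              # pip install pyloudnorm
from scipy.signal import resample_poly

data, rate = sf.read("stereo_master.wav")   # hypothetical file, shape (samples, channels)

# Integrated loudness, K-weighted and gated per ITU-R BS.1770.
meter = pyln.Meter(rate)
loudness_lkfs = meter.integrated_loudness(data)

# Rough true-peak estimate: oversample 4x, then take the sample peak.
oversampled = resample_poly(data, 4, 1, axis=0)
true_peak_dbtp = 20 * np.log10(np.max(np.abs(oversampled)))

print(f"Integrated loudness: {loudness_lkfs:.1f} LKFS (Atmos Music delivery target: -18)")
print(f"Approx. true peak:   {true_peak_dbtp:.1f} dBTP (must not exceed -1)")
```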
