Taking the Plunge: Immersive Mix Techniques — AudioTechnology
What’s it like mixing in Immersive? Is it a brave new world of ‘anything goes’? Experienced live sound engineer Dennie Miller breaks it down.
Stereo has been with us for a long time. Over the decades, audio engineers have come to master a whole range of techniques in order to make all the elements of a composition fit within the limited sonic space afforded by stereo systems. We are now at a point where we can rewrite some of those rules.
If you’ve not experienced an immersive live show, then as an engineer or concertgoer, you owe it to yourself to find an opportunity. It’s an entirely different sonic experience. Everybody’s favourite sound engineer, Dave Rat, once said ‘nowhere in nature does the exact same sound radiate from two different points in space [simultaneously]’. Which made me think: stereo has served us pretty well, but ultimately it has only properly served the narrow band of the audience in the sweet spot.
Immersive breaks that limitation. The energy is all around you. It transforms the concert experience from a battle of compromises into a visceral process of creativity. It’s the next evolutionary step in mainstream concert production.
At this point, I’ve used a number of immersive audio platforms. Most recently I’ve spent time at Meyer Sound HQ working with its new SpaceMap Live platform — using it and offering my feedback. It’s been amazing fun, and they are really onto something with their take on immersive tools.
This issue, I’d like to share some insights with you. Immersive has the power to transform the way you mix.
The last 20 years have seen the rise and rise of live sound fidelity. Line arrays and in-ear monitoring have been two key drivers, but what hasn’t changed much (until the emergence of immersive audio) is sound source localisation. Stereo record production allows the virtual placement of instruments across the sound stage. Stereo live mixing doesn’t. Not really. As I mentioned in my Dave Rat ruminations, stereo in a concert setting only adequately serves a small portion of the audience. The rest of the audience do not enjoy anything like precise sound source localisation.
So how important is localisation? Let me paint an intentionally outlandish sonic picture for you:
Imagine how bizarre it would be to see your favourite artist in concert, singing on stage, while you stand near the back of the audience hearing their voice coming, unmistakably, from a speaker just behind you. That would be a serious psychoacoustic disconnect. Our mind wants to hear the sound radiate from where we see it originating.
Immersive audio goes a long way to joining the psychoacoustic dots. When you hear an Immersive system with five, seven (or more) sources across the sound stage, you’ll be amazed. You’re not relying on the phantom placement of sound sources any longer. Instruments emanate from their position on stage because they’re being reproduced by those corresponding loudspeakers. It means so much more of the audience enjoys sound localisation — not just the lucky sweet-spot few.
Here’s how I like to think about what’s going on in an immersive PA setup: more speakers are each doing less work, which yields more sonic space in which to preserve the frequency content and dynamics of our independent sources.
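To put rough numbers on that idea, here’s a minimal sketch (the SPL figures and channel counts are invented for illustration, not measurements). Incoherent sources sum in power, so stacking fewer mix elements into each loudspeaker leaves each box working several dB more gently:

```python
import math

def summed_level_db(per_source_db, n_sources):
    """Level of n equal, incoherent sources summed into one loudspeaker.
    Incoherent signals add in power: +10*log10(n) dB over a single source."""
    return per_source_db + 10 * math.log10(n_sources)

# Hypothetical example: 12 mix elements, each arriving at 94 dB SPL.
one_box = summed_level_db(94, 12)   # all 12 elements through one stereo side
per_box = summed_level_db(94, 3)    # spread 3 per box across 4 frontal sources

print(f"{one_box:.1f} dB SPL from a single box")
print(f"{per_box:.1f} dB SPL from each of four boxes")
```

Each of the four boxes ends up roughly 6dB quieter than the single overworked one, which is where the extra headroom for dynamics comes from.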
In the past we’ve been compelled to hack and squash our sound sources to help create the illusion of space. With Immersive, you don’t need to brutalise your sound sources… unless you want to for creative reasons. This is a profound difference for two reasons. No. 1: It means you don’t necessarily need to spend so much time and energy working to make elements in the mix sit correctly with each other. No. 2: Since the sounds are less adulterated, perhaps even familiar, it’s easier to localise them in your immersive system — it’s easier to pinpoint a sound’s location because it retains more of its original character.
It’s an exciting time to work in the touring industry. The days of loud stages and tube amplifiers are slowly coming to an end, and loudspeakers have come so far that we can now use the loudspeakers to judge the quality of our music… not just the other way around! All of these are factors very much to the benefit of the concert experience.
There’s another sonic improvement that’s been a hot topic in touring as of late, and it’s certainly one of my favourites to discuss. Amp modellers. Not so long ago, touring musicians would rather die than part with their fastidiously curated combination of amps, cabs and effects. Now, the amp modellers are so good (and I’m thinking of the likes of Fractal and Kemper) that to move from loud and temperamental hardware to the virtual/digital world of guitar tone is no longer anathema. This is great news for us, the sound engineers. Traditional amplifiers’ tone fluctuates with ambient temperature, humidity, altitude, A/C line voltage, and physical placement. Consistency was impossible. Then there’s also the less-discussed point that the microphones we use are imperfect. There’s simply no microphone in existence that can capture 100 percent of the nuanced sound of a real amplifier with zero distortion. The amp modellers have negated these factors. They’re digital (perfect consistency night to night), they don’t need miking up, and they greatly lower stage volume thanks to the absence of a speaker cabinet.
It feels like a happy coincidence that a world of increased consistency and sonic fidelity can now be reproduced in the format of an immersive audio system with such ease.
And now — through the combination of Immersive and superior amp modelling — we can localise the guitarist properly in our PA. There’s no backline sound to compete with. We can simply place the guitar where we want with precision and finesse.
RIPPING UP THE RULE BOOK
Some engineers will find immersive intimidating. It’s certainly a change in our way of thinking. But you don’t have to throw away everything you know. Take your drum sound as an example. Whether you have three mics or 30 on your kit, you can achieve a superior result with the wider canvas of Immersive. You can use the same creative parallel bus compression you’re familiar with to create the dynamics and energy you desire, but rather than panning your two-channel drum mix hard left and right, you can place it in front of you and slightly left or right (depending on where the drums sit on stage). There’s no need to unlearn everything you’ve refined over the years.
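That kind of placement on a frontal array can be sketched with simple pairwise amplitude panning (a VBAP-style, constant-power crossfade between the two nearest boxes). This is an illustrative toy, not how any particular immersive platform computes its gains; the function name and the 5-box array are my own assumptions:

```python
import math

def place_across_array(position, n_speakers):
    """Constant-power placement of a source on a frontal speaker array.
    position: 0.0 = far-left speaker, 1.0 = far-right speaker.
    Energy is crossfaded between the two adjacent boxes so the total
    radiated power stays constant wherever the source sits."""
    x = position * (n_speakers - 1)     # fractional speaker index
    lo = min(int(x), n_speakers - 2)    # left speaker of the active pair
    frac = x - lo                       # how far toward the right speaker
    gains = [0.0] * n_speakers
    gains[lo] = math.cos(frac * math.pi / 2)      # equal-power pan law
    gains[lo + 1] = math.sin(frac * math.pi / 2)
    return gains

# A drum bus placed just left of centre on a hypothetical 5-box frontal array
print([round(g, 3) for g in place_across_array(0.4, 5)])
```

Only the two boxes nearest the drums carry the source, and the squared gains always sum to one, so the bus doesn’t get louder or quieter as you move it.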
OUT OF MY WAY
There’s always been a very distinct demarcation between creative and corrective compression. Guitars provide another great example of that when it comes to corrective compression. I might find that the piano is treading on the toes of the guitar. So when using a stereo system, I’ll likely reach for a sidechain cross-compression approach to dynamically get those two elements to happily coexist in stereo.
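In the stereo world, that cross-compression idea looks something like the toy sketch below. It’s deliberately simplified (a real sidechain compressor works on dB thresholds with attack and release ballistics; here the guitar’s envelope statically keys gain reduction on the piano, and all names and values are hypothetical):

```python
def sidechain_duck(key_env, target_env, threshold=0.3, ratio=4.0):
    """Toy sidechain cross-compression: the key signal's envelope
    (e.g. guitar) ducks the target (e.g. piano) whenever the key
    rises above threshold. Envelopes are linear 0..1 amplitudes."""
    out = []
    for k, t in zip(key_env, target_env):
        if k > threshold:
            over = k - threshold
            # gain reduction grows with how far the key exceeds threshold
            gain = 1.0 / (1.0 + over * (ratio - 1.0))
            out.append(t * gain)
        else:
            out.append(t)          # key below threshold: target untouched
    return out

# The piano is ducked only while the guitar is playing
guitar = [0.0, 0.1, 0.8, 0.9, 0.2]
piano  = [0.5, 0.5, 0.5, 0.5, 0.5]
print([round(x, 3) for x in sidechain_duck(guitar, piano)])
```

It works, but the piano audibly pumps every time the guitar digs in, which is exactly the compromise that spatial separation lets you avoid.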
Now, with Immersive, I can simply move the guitar a little and suddenly I’ve got a whole world of sonic separation. All because I have more places to go than just left and right. Now a 130-channel pop show can be as clear and defined as a guitar/bass/drums rock show like AC/DC.
BREAKING THE RULES
I’ve mentioned how our brains find it pleasing to hear and see a sound source emanating from the same, matching point in space. But for elements that don’t have a physical presence on stage, like special effects and certain playback tracks, you’re not constrained.
I’m talking about percussive elements, synths, or even background vocals. They all have one thing in common: when they’re a playback source, you can’t see anyone performing these sounds. This is where the fun begins: you can pan things around the rear of your venue, or have real-time movements happening that really grab the attention of the audience. Of course, to mess with an audience’s head you do need surround speakers:
BEST BANG FOR BUCK
My view is you absolutely need some surround speakers to really bring the wow factor into an immersive setup. You might have seen deployments with seven or more sources across the stage, and there’s no doubt that will give you a highly detailed sound stage. But it’s amazing how people respond to hearing sound move behind and above them. Pink Floyd figured this out years ago. Admittedly, rigging those surround speakers will be hard in some venues, but it’s becoming easier with lighter, more compact, yet powerful loudspeakers. There will always be plenty of design work to be done in advance, but those special-effect channels are what get people excited. Audiences aren’t hearing that at other concerts, and it may very well be the justification for the ticket price necessary to carry the required loudspeakers.
With the rise of Immersive, the sound engineer potentially takes on a new level of creative responsibility. In the past, creative decisions were primarily the concern of the artist and/or the producer and it was the FOH engineer’s job to simply replicate the band’s recorded (stereo) sound. That’s changing. In the same way that the lighting designer currently works with the tour’s creative director, the sound engineer will need to be involved in the audio aesthetic of the show. It’s a new avenue of possibilities for artists, creative directors, and engineers alike. Personally, I’m excited for the future as artists begin creating content with immersive performances in mind.
About Dennie Miller: Dennie is a Nashville-based FOH Mixer and Systems Engineer. His company, Miller Audio Industries, has been bringing music to the masses with a wide range of clients such as Miguel, Volbeat, The 1975, Godsmack, the legendary Bob Dylan and more.