
Sci-Fi Reality

Elysium director Neil Blomkamp uses audio to help audiences suspend their disbelief, while two new surround formats give movie theatres another dimension.

9 September 2013

Reality and sci-fi don’t normally mix. As soon as Star Wars audiences, back in the day, experienced the sub-rattling rumble of a giant spacecraft cruising overhead they were hooked. The pedants who pointed out that there is no sound in the airless expanse of space were shouted down. George Lucas and his right-hand audio henchman, Ben Burtt, had created a totally fresh world, one ‘far, far away’ from the banalities of Earth, humanoids, and, often, the natural laws of physics. ‘Reality’ didn’t come into it.

Neil Blomkamp isn’t George Lucas. When District 9 was unleashed on an unsuspecting world, people simply couldn’t believe how believable the movie felt. I’m sure many curtains twitched in Johannesburg at the time — half expecting a huge spaceship to be parked over its outskirts.

Believability is tough to attain. Most commentators will focus on the fact that Neil Blomkamp started life as a special FX creator and has a talent for composing richly detailed digital worlds. True. But it’s also been recognised that Elysium (not to mention District 9) was built with nowhere near the budget of an effects-heavy Michael Bay movie. In other words, believability is far more than simply having an army of FX techs and acres of feverishly-rendering server farms. It’s a philosophy.

REALITY IS AN ATTITUDE

So when Neil Blomkamp talks to his sound guys, he’s reinforcing an attitude: it’s all about locking the audience into a world of his creation, and not doing anything to shake them awake from their reverie.

Only, Elysium is about two worlds, and they could hardly be more different.

Dave Whitehead: Earth is a mess: polluted, overpopulated, depleted, and a police state. The technology is old-school. Conversely, Elysium [the orbiting world created by and for the rich] is a paradise: a huge country club where sickness has been eliminated and technology is seamless and largely unseen. Knowing this was a solid foundation for me to begin to build two very different audio worlds.

Dave Whitehead is Elysium’s sound designer. He’s based at Park Road Post in Wellington and has most recently been working on the latest Hobbit. Dave spent 10 weeks recording, collecting and preparing sounds, and another 10 weeks with the movie preparing his sound design for the final mix. I started out by asking Dave how he made the Earth feel so grittily authentic.

Dave Whitehead: It all starts with Neil as the director. I recall having an interesting conversation with Neil about how he likes the idea that even though this film is a sci-fi, the source sounds come from the real world. By which I mean, what you hear as effects and ambiences is composed of source sounds we as humans hear daily. So in that sense it’s far easier for our brains to accept the sound. Neil really shies away from synthesised sounds — I couldn’t really bust out the Moog for this movie, because he doesn’t like over-processed sounds. It’s definitely not Transformers-ish. Not to say I don’t like the Transformers movies, I love them, but Neil would really shy away from that… it would be too processed.

So if you take the Matt Damon character’s exo-suit, we recorded every servo we could, every printer we could; we went to Weta [the creative/modelling force behind Lord of the Rings etc] and recorded all their robots. We recorded every gadget in our house and friends’ houses.

GOOD VIBES: FLYING WITH THE RAVEN

If there’s a cooler ‘bad guy’ spaceship than the Bounty Hunter’s ‘Slave 1’ in Star Wars, then it’d have to be the Raven. 

Dave Whitehead: I liked the idea that the Elysium spacecraft and the underlying technology worked on ‘vibration’ for its means of propulsion. I took that concept quite literally and bought some vibrators, shoved them into a Dobro guitar and miked up the results, moving it around, using it with a slide, etc. I also attached a vibrator to a colander — stuck it to the bottom and swung it around with a rope.

The Raven is a military spacecraft and I love the sound of a Huey helicopter. So I thought about how I could evoke the feeling of an approaching Huey, with rippling sounds coming over the hills. We were given permission to record on the tarmac of Wellington airport, including standing near the engines as they started them up, which was awesome. And it’s the real-world sound of the jets that you’re accustomed to hearing that helps you accept that there’s this menacing spacecraft approaching in the distance, while it’s the vibrators providing the X factor.

MIXING AS YOU GO

Craig Berkey and Chris Scarabosio mixed the movie. Chris was in charge of the music and dialogue, while Craig mixed all the effects, Foley, ambiences etc. Craig is based in Vancouver along with Neil Blomkamp and worked closely with the director throughout. I asked Craig about his approach to mixing a movie.

Craig Berkey: I work with a filmmaker from day one. We work in a linear fashion from beginning to end. Whatever tool we use doesn’t matter to me — I build a soundtrack and they’re involved the whole way. Every time I add a new sound, I can pan it, EQ it, and mix it. That’s the beauty of this approach. By ‘building as you go’, when you get to the final mix you have more creative freedom – you’re thinking about the big picture rather than scrambling to ensure you’ve got everything in place. When I get to the final mix stage I can hit Play on my ProTools session and every track plays with its automation… but every element also remains separate. I’ve not committed to stems that can’t be pulled apart.

AT: So you’re not under quite so much crushing pressure in the couple of weeks you have to produce the final mix.

CB: Right. On the first day of the final mix on this film, Neil Blomkamp had already heard the mix. Which means that rather than focussing on ‘why is that footstep so loud?’, we have the luxury of stepping back a bit and thinking: ‘do we really need that music cue?’ etc. We can look at the big picture rather than troubleshooting.

AT: And I guess it means you’re less likely to be thrown any curve balls from the director at the 11th hour if he’s already comfortable with the mix?

CB: Exactly. He likes what we have. It’s not like he’s hearing anything for the first time. If you hear a director say ‘What’s that?!’ in a final mix, and I have to explain to him what something is, we’re in trouble!

AT: Given all your pan automation is within your ProTools session, you must have had a few misgivings about having to do it all again for an Atmos and Auro mix?

CB: The kind man from Dolby copied my pan information from the ProTools panner to their plug-in. Even though the two panners don’t match, you can Copy and ‘Force’ Paste in ProTools. The upshot is that all my panning automation showed up on their Dolby panner. And what it meant was that I could make a track an Atmos Object and my original panning would dynamically pan in the Atmos space, rather than simply showing up in the surrounds.
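To picture why that copied automation translates so neatly, here’s a minimal sketch (the value ranges, names and mapping are illustrative assumptions, not Avid or Dolby tools): a channel panner’s left/right and front/back automation already describes a position in the room, so the same breakpoints can be read as normalised object coordinates.

```python
# Conceptual sketch only: re-reading channel-panner automation as object coordinates.
# The ranges and names below are assumptions for illustration, not Avid/Dolby APIs.

def pan_to_object_xy(left_right: float, front_back: float) -> tuple[float, float]:
    """Convert a channel-style pan position to a normalised object position.

    left_right: -1.0 (hard left) .. +1.0 (hard right)
    front_back:  0.0 (screen)    .. 1.0  (back wall)
    Returns (x, y): x = 0 left wall, 1 right wall; y = 0 screen, 1 back wall.
    """
    x = (left_right + 1.0) / 2.0
    y = front_back
    return x, y

# A few automation breakpoints: (time in seconds, left_right, front_back)
automation = [(0.0, -1.0, 0.0), (1.5, -1.0, 0.5), (3.0, -0.5, 1.0)]

for t, lr, fb in automation:
    x, y = pan_to_object_xy(lr, fb)
    print(f"t={t}s  object position x={x:.2f}, y={y:.2f}")
```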

Ah, yes, mixing for Auro and Atmos. That’s almost a story in itself…

The final mix in full swing. Dave Whitehead

ATMOS & AURO: MIXING ON A LARGER CANVAS

Yes, new cinema formats! Auro has been developed by Barco, the world’s biggest supplier of high-performance cinema projectors, while Atmos comes from Dolby. Both promise a more immersive listening experience for theatregoers thanks to more channels. Both promise a superior listening experience thanks to full-frequency speakers throughout the theatre, rather than narrow-band surrounds run 3dB down on those at the front. Both promise the ability to mix in the vertical plane, not just the horizontal. But that’s about where the similarities end.

Auro is an 11.1 format, with the extra channels occupied by a second tier of speakers above the conventional layer, as well as yet another layer overhead (three tiers in all). Dolby Atmos is a 9.1 format, with the two extra channels being overhead (left/right). The kicker with Atmos is you can assign tracks to be ‘Objects’ in the mix, which allows you to place that sound discretely in any individual speaker (not just an entire surround array) and pan it accordingly.

even though this film is a sci-fi, the source sounds come from the real world … I couldn’t really bust out the Moog for this movie, because he doesn’t like over-processed sounds

WORKING IN NEW FORMATS

Elysium wasn’t originally mixed in the new surround formats: Dolby Atmos and Barco Auro. Rather, about a month after the ‘final’ mix, Craig Berkey and Chris Scarabosio got the call-up to head to Skywalker Ranch, where they would ‘remix’ the movie on a Neve DFC console. That said, Chris was quick to point out that their intention wasn’t to ‘remix’ the movie but simply to take advantage of the extra surround dimensions afforded by the new formats.

As outlined in the ‘Larger Canvas’ box item, both new formats provide extra surround channels for a greater degree of immersion. Auro tackles this more conventionally, with an 11.1 panner that provides a ‘height’ dimension to push sounds vertically. It takes a little more time to understand Atmos.

To come to grips with Atmos you have to understand that there are two ways of addressing the speakers within the new format: Bed tracks or Objects. Bed tracks are panned in the room like a regular 7.1 setup, only you also have two extra channels overhead, so they call it 9.1. Objects on the other hand can address individual speakers.
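A rough way to picture the distinction is sketched below. This is a toy model, not Dolby’s actual data format or renderer; the names, layout and nearest-speaker logic are assumptions for illustration. A bed track is audio committed to a fixed channel that exists in every theatre, while an object carries its own position metadata and is only mapped onto the installed speakers at playback.

```python
# Illustrative sketch of the bed-vs-object idea; channel names, coordinates and the
# nearest-speaker rendering are assumptions, not the real Dolby Atmos model.
from dataclasses import dataclass

BED_CHANNELS = ["L", "C", "R", "Lss", "Rss", "Lsr", "Rsr", "LFE", "Lts", "Rts"]  # a 9.1 bed

@dataclass
class BedTrack:
    name: str
    channel: str            # fixed destination, identical in every theatre

@dataclass
class ObjectTrack:
    name: str
    x: float                 # 0 = left wall, 1 = right wall
    y: float                 # 0 = screen,    1 = back wall
    z: float = 0.0           # 0 = ear level, 1 = ceiling

def render_object(obj: ObjectTrack, speakers: dict[str, tuple[float, float, float]]) -> str:
    """Pick the installed speaker closest to the object's position (a crude stand-in
    for the real renderer, which spreads the signal across neighbouring speakers)."""
    def dist(pos):
        return sum((a - b) ** 2 for a, b in zip(pos, (obj.x, obj.y, obj.z)))
    return min(speakers, key=lambda name: dist(speakers[name]))

dialogue = BedTrack("dialogue stem", channel="C")          # always plays from the centre
shuttle = ObjectTrack("shuttle fly-by", x=0.0, y=0.65)      # placed in the room, not a channel

# The same object lands on different speakers in different rooms.
small_room = {"Ls1": (0.0, 0.3, 0.0), "Ls2": (0.0, 0.7, 0.0), "Rs1": (1.0, 0.5, 0.0)}
print(dialogue.channel, render_object(shuttle, small_room))   # -> C Ls2
```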

Craig Berkey: For example, take a shuttle sound. In a 7.1 mix, when I panned it from the front to the left side and then to the left rear, it would go to the whole wall of the left side and the whole left-hand side of the back wall. When we did the Atmos mix, I could take the same sound, put it in an object track, and the same panning made it travel down the wall and along the back. There was a greater sense of motion.

AT: Placing sounds discretely in an individual speaker sounds wonderful, but movie theatres come in different shapes and sizes. How can you be sure the precise speaker to which you’re panning actually exists in every movie theatre?

CB: Good point. If someone on screen is responding to a sound in the left surrounds in my mix room, we can lock the sound to the fourth speaker in the left surround wall. But if I go to a theatre with a different number of speakers — if the source is locked to the fourth surround — it’s not going to be lined up to where this actor is looking. Depending on the size of the room, Dolby can allow you to lock the sound to that location in the room, rather than that speaker channel. There’s a mode on the panner that allows you to do that.
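As a back-of-envelope illustration of what ‘locking to a location rather than a speaker channel’ means (an assumption-laden sketch, not Dolby’s actual algorithm): express the position as a fraction of the wall’s length in the mix room, then map that fraction onto however many surrounds the playback theatre actually has.

```python
# Sketch of the idea Craig describes: author the position as a fraction of the wall,
# not as "surround #4". Speaker counts and the rounding rule are illustrative assumptions.

def wall_fraction(speaker_index: int, speakers_in_mix_room: int) -> float:
    """How far along the wall the sound sits: 0.0 = front end, 1.0 = back end."""
    return (speaker_index - 0.5) / speakers_in_mix_room

def speaker_for_fraction(fraction: float, speakers_in_theatre: int) -> int:
    """Map that wall fraction back onto whatever surround array the theatre has."""
    index = round(fraction * speakers_in_theatre + 0.5)
    return min(max(index, 1), speakers_in_theatre)

# Authored on the 4th of 8 left-surround speakers in the mix room...
frac = wall_fraction(4, 8)                 # about 0.44 of the way down the wall
print(speaker_for_fraction(frac, 5))       # -> speaker 3 in a 5-speaker theatre
print(speaker_for_fraction(frac, 12))      # -> speaker 6 in a 12-speaker theatre
```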

MIXING TO NEW HEIGHTS

Both new formats explore the vertical domain. Auro has three levels of speakers: an entire additional layer higher up the wall and a smaller complement (one channel) on the ceiling, while Atmos adds two ceiling channels to its 7.1 setup.

AT: What did you think about the extra height offered by both formats?

Craig Berkey: The overhead channels are tricky to deal with. The danger with Atmos, especially, is it can take away some of the width of your mix when you pan things up there — it feels like you’re mono’ing things up. I used them for specific FX. Like when Max [Matt Damon’s character] gets locked in the radiation cell. I have those FX sounds up the top and those extra channels are really effective for things like that. But when we placed other, less specific, components up there, we found it could sound cloudy or muddy. We just had to be careful.

Auro doesn’t have the Object mode; it has channels. It’s got an upper and lower surround layer and a couple of channels on the ceiling. Auro was good for creative ambiences because the upper level wasn’t too far overhead — it didn’t pull the sound up to mono. Auro also has three additional channels behind the screen, above the normal LCR channels. I used those for spaceships and other elements that were placed up-screen high. I could pan up there and that would help provide some separation for the dialogue — it would provide extra clarity. Atmos’s height channels are on the ceiling, not on the screen, so it doesn’t work as well in that regard.

Braving the wilds of Canada to record atmos (ambience recordings) for Elysium.

MIXING MUSIC IN AURO/ATMOS

AT: Chris, how did you attack your music mixing in the new formats?

Chris Scarabosio: I had my music on 12 stems. I started with the bed tracks — such as strings and horns — by putting them in the middle height position (there are height panners in Auro). In Atmos, the speakers just off the screen, not behind the screen — what we liked to call the ‘band shell’ channels — served as a really good place to put the orchestration (I created Objects out of those orchestration stems), and opened up the front three speakers quite a bit for dialogue.

Most of the time I would keep the big drums and percussion across the front, occasionally moving them into the room but then bringing them straight back.

AT: No doubt it will be interesting to mix in Auro and Atmos from scratch.

CS: That’s right. I think I would take an approach that was more dimensional. Elysium’s music track is quite dense and you could see how the additional channels would allow you to spread elements out in such a way that it still feels completely cohesive, but with a little more room to breathe. Saying that, I think the original 7.1 mix came out pretty good!

AT: Were you concerned that the extra channels of the new formats made it feel like the fabric of the music mix was getting stretched too far?

CS: That was the danger. And pulling it apart really does change how the music plays. There were times when I had to check myself: ‘Okay, now this feels like a different mix; it feels like a bunch of different stems’. It was a matter of having the time to carefully place the instruments then apply some reverb in between to glue it all together.

AT: So reverb was an answer?

CS: Reverb on specific channels, and using reverb not to create more ambience but to localise the ambience to begin to transform the entire room into one big speaker. I think that’s ultimately the goal: to utilise the whole theatre to make it sound like one big living entity. But it’s more involved than just panning some stuff around.

I never wanted to use the system as some kind of gimmick or novelty. I just want everything that’s happening on the screen to be more dramatic, so you truly experience it. It’s not like, ‘oh, I see what they’re doing there’. It’s more like: ‘I dunno what’s happening, but I’m totally involved in what’s going on’.

ADR: PARDON MY FRENCH

AT: Word has it that Jodie Foster’s dialogue was all re-recorded?

Chris Scarabosio: The studio wanted Jodie Foster’s accent changed. She had done her original performance with a French accent, and a lot of test audience people weren’t responding well to that, so all her lines were re-recorded in a more American accent.

AT: A huge task.

CS: Yes, a huge undertaking. Vince Renaud, the ADR supervisor/dialogue supervisor, did a great job working with Jodie Foster, who’s a phenomenal looper — her ADR skills are really good. So it was a matter of being vigilant about getting the performances as tight to her mouth movements as possible, while recognising that some of the different inflections meant getting a perfect lock every time was impossible.

AT: How do you smooth the edges?

CS: EQ and reverb are always the first stops — matching the tonal balance and space that’s captured on set in production. But I also use a plug-in called The Decapitator. It’s a plug-in designed to provide extreme distortion, but I use it like a preamp to colour the voice in such a way that it doesn’t sound like it just got recorded in the studio.

Ultimately, it has to sound like part of everything that happened that day on the set. Even with sound effects. I don’t want anyone to know that whatever we did happened in a studio. I just want it to feel like it happened right then, when the cameras were rolling.

Mixing the final on an Avid D-Control console at Sharpe Sound Studios, Vancouver. Photo: Lee Smith

OLD TECH REBORN

Dave Whitehead: Earth in Elysium is a wasteland, where all the technology is old. Neil said he wanted the PC noises to be more like Commodore 64-type noises — really old tech. Neil is a very good communicator in terms of what he wants and definitely what he doesn’t want. He’s good on the brief right from the start. From the beginning he said he loves CB radios going on constantly in the background. So you’ve got the constant babble of messages coming over radios, which contributes to the oppressive feel on Earth.

Chris Scarabosio: I’m a big fan of Speakerphone [the plug-in from Audio Ease]; some of the results you can get from that are pretty incredible.

IF YOU HAD TO CHOOSE

AT: Auro or Atmos?

Chris Scarabosio: In some ways Auro was pretty satisfying because, especially for the music, you can spread it out quickly — it’s easy and it sounds wider and bigger. It’s immediately a case of: ‘oh that’s great’. But it doesn’t give you the range and complexity of Atmos. So I think Atmos definitely would have an edge over Auro in the long run.

Craig Berkey: I liked different aspects of both. But I’d like to have something neither new format provides: more speakers horizontally across the front, so we can pan dialogue more easily — it would anchor the dialogue and sound better. I’d sacrifice the extra speakers around the theatre for that. I’ve never thought to myself, ‘I wish I had speakers on the ceiling’. I’ve never said that. Maybe others have.

AT: It’s not like you’ll be given more time to mix movies now there’s Auro and Atmos?

CS: You’d like to think you might, but you’re probably right! But I think Dolby is talking about ways to approximate the array systems in a smaller room so that even without all the speakers you can get close in a premix.

AT: What did Neil Blomkamp think?

CS: He liked the extra clarity and definition; the dialogue seemed clearer; and it felt like the room had become bigger.
