Horsing Around

The new Kinect-controlled Fable game features an epic journey of destiny undertaken in the company of a rather fine filly called Seren. AT saddled up and trailed the footsteps of Russell Shaw and Steve Brown.


17 September 2013

Story: John Broomhall

As an avid fan of the near-hallowed Fable franchise, Steve Brown has spent more time than most roaming the beautiful, enigmatic world of Albion. He joined the audio team for the fifth instalment in the Lionhead Studios-created series — commandeered by composer and music/audio head Russell Shaw — as sound supervisor. It was not only a dream come true for Brown, but a once-in-a-lifetime opportunity that many audio guys would give an ear for.


Fable: The Journey is a spin-off from Fable III, giving the creators licence to change up the format of the game. The main difference is that it’s an Xbox Kinect game, in which you use your limbs as the controls. It also means the game is played in first-person perspective, a big departure from the previous titles, requiring a level of detail in the sound design that would entice you to reach out and touch Fable’s virtual world.

Steve Brown: “Part of my ethos behind the sound design for a Kinect project is that the audio should tell the story and convey the actions of the user who is literally using their body to interact with the game. We don’t want lots of HUD (Heads-Up Display) abstract noises that don’t really communicate the physical action of the user. For instance, when you’re riding the horse, Seren, if you make a gentle whipping action with the reins, you’ll hear a nice whip crack sound that makes you think, ‘Good, I’ve got the action right.’ But errant players might keep whipping Seren till her health level drops. If they kept on, you would see her heart health indicator on the HUD eventually drop down to just one heart icon. At that point we would alter the mix and choice of cracking sounds to be much more abrasive, psychologically telling the player they’re overdoing it, rather than using a warning beep.”
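Brown's health-driven whip-crack idea can be reduced to a simple switch. This is an illustrative sketch only, not Lionhead's code; the threshold and the sound names are invented for the example.

```python
# Hypothetical model of the whip-crack selection Brown describes:
# the mix switches to an abrasive variant when Seren's health is low,
# signalling "you're overdoing it" without a warning beep.

def whip_sound(hearts):
    """Pick a whip-crack variant from Seren's remaining heart count."""
    if hearts <= 1:
        return "whip_crack_abrasive"   # psychologically harsh choice
    return "whip_crack_clean"          # confirms the gesture landed
```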

When gamers were feeling sorry for whipping Seren too hard, they were really thinking of this bloke


From a technical point of view, using Audiokinetic’s Wwise middleware for the audio ‘engine’ (the software system that controls sound replay based on instructions from the main game engine) enabled the team to deploy real-time parameter controls (RTPCs). In practice, this means the audio replay system can ‘poll’ or monitor data being sent from the main game and interpret this intelligently to determine replay of a sound element. For instance, when a player moves their arms as if to stroke the horse, the movement detected by Kinect would be interpreted to modulate and pan the audio on the fly, using sound to viscerally link the player to the graphics. “Wwise is geared towards the sound designer’s perspective rather than a programmer’s perspective,” said Brown. “And it influences the way game sound is designed.”
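The core of an RTPC is a mapping from a game-side value onto an audio property. In Wwise the curves are authored in the tool and driven from engine code; the Python below is just a minimal model of the idea, with all values invented for illustration.

```python
# Illustrative sketch of an RTPC-style mapping (not actual Wwise API code):
# a polled game parameter is clamped and linearly interpolated onto an
# audio property such as pan or volume.

def rtpc_map(value, in_min, in_max, out_min, out_max):
    """Linearly map a game parameter onto an audio property, clamped."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

# e.g. a Kinect hand position (-1 = far left, +1 = far right) could drive
# the stereo pan of the stroking sound (-100 = hard left, +100 = hard right).
pan = rtpc_map(0.5, -1.0, 1.0, -100.0, 100.0)
```

In a real title the mapping curves are rarely linear; designers draw them by hand in the authoring tool, which is part of what Brown means by Wwise being geared to the sound designer's perspective.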

Russell Shaw: “It’s very intuitive. You can get straight in and implement a sound, attaching or associating it to an object in the game. Then you start to think, ‘I wonder if it could do this or that?’ And as you delve deeper, you realise there’s a whole sub-layer of functionality allowing you to do some amazing things. You can get lost very quickly if you’re
not careful.”


Brown: “Fable’s always had a high level of detail, but Fable: The Journey required more. Because it’s ‘first person’, the creatures in the game can get right up to your nose, so we needed to re-design them by adding extra layers of detail. For instance, we listened to the Balverines which persisted from Fable III and although the essence of what we already had was right, we worked with Soundelux to create separate sound character elements to be mixed together on-the-fly — the roar, the guttural stuff, and the airy breath. These creatures come at you from 20m away. Now we can dial in the detail as they get closer, until they’re so in-yer-face, you can almost smell the sound elements!”

These multiple layers of sound design elements, switched in and out using Wwise’s ‘blend containers’, place a significant load on the CPU, so Brown and Shaw had to work smart, trading off voice-count against run-time DSP effects against data compression quality.
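A blend container crossfades layers under the control of a parameter, here distance. The sketch below models the Balverine idea in plain Python; the layer names, ranges and curve shapes are assumptions for illustration, not Lionhead's actual values.

```python
# Hypothetical distance-driven layer mix, in the spirit of the Balverine
# redesign: three character layers fade in as the creature closes from
# 20 m to point-blank range.

def layer_gains(distance_m):
    """Return per-layer gains (0..1) for the roar/guttural/breath layers."""
    def fade(start, end):
        # gain ramps from 0 at `start` metres up to 1 at `end` metres
        if distance_m >= start:
            return 0.0
        if distance_m <= end:
            return 1.0
        return (start - distance_m) / (start - end)
    return {
        "roar": fade(20.0, 10.0),      # audible from far off
        "guttural": fade(12.0, 4.0),   # mid-range detail
        "breath": fade(5.0, 1.0),      # only when it's in-yer-face
    }
```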

“The voice-count teetered around 130 with a memory limit of 30MB of data-compressed audio loaded at any one time,” said Shaw. “We use the Xbox’s XMA compression across the board and xWMA compression on the dialogue. Someone once asked Valentino Rossi the secret of his success. He said that throughout the whole race, he rides as fast as he possibly can until he’s just about to fall off. That’s pretty much our approach to pushing the Xbox hardware to its limit. We throw as much as we can at it until it ‘breaks’, then we rein it back a fraction and that’s how it’s running all the time. Our games are a real testament to the hardware given the number of people working on the game and the amount of content it’s handling.”


For Russell Shaw, the new title provided a fresh canvas on which to develop the much-loved Fable music canon — resulting in a staggering five hours’ worth of score, orchestrated and conducted by long-time collaborator, Allan Wilson, and featuring the Slovak National Symphony, percussionists from the London Philharmonia, The Pinewood Singers and Celtic instrument specialist, Bob White.

Shaw created 120 minutes of orchestration, recording multiple versions of some music cues e.g. ‘alts’ of strings only, or strings and brass. The result was some 200 minutes’ worth of actual recorded material. Added to this is the specific Celtic music, created with Bob White. Overall total: 4-5 hours.

Though the music is generally linear, reflecting the nature of the game, the combat music uses Wwise’s ‘interactive music’ engine to segue different versions of the same music, each characterised by a different intensity plus synchronous triggering of ‘stingers’ at key points in the gameplay.
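Interactive music engines like Wwise's switch between pre-rendered segments only on musical boundaries so the score never stumbles. The sketch below is a simplified model of that behaviour; the segment names and the bar logic are invented for illustration.

```python
# Hypothetical model of intensity-driven combat-music segueing: the engine
# waits for a bar boundary, then cues the segment matching the current
# combat intensity so transitions stay musically synchronous.

SEGMENTS = ["combat_low", "combat_mid", "combat_high"]

def next_segment(intensity, beat_in_bar, beats_per_bar=4):
    """Return the segment to cue at the next bar line, or None mid-bar.

    intensity: 0..1 combat heat from the game; beat_in_bar: 0-indexed beat.
    """
    if beat_in_bar != beats_per_bar - 1:
        return None                       # only segue on the bar boundary
    idx = min(len(SEGMENTS) - 1, int(intensity * len(SEGMENTS)))
    return SEGMENTS[idx]
```

Stingers work the same way at a finer grain, quantised to beats rather than bars, which is how key gameplay moments land musically on time.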

The Stomp-inspired combat music features complex rhythmic percussion for which Shaw and Wilson engaged the services of six percussionists from the London Philharmonia to play a huge range of instruments selected at Bell Percussion, suppliers of instruments for major motion pictures like Star Wars. The 60 instruments selected included Chinese, African, Samba and Celtic sets, including a monster 60-inch taiko. They were recorded at London’s Air Studios.

Much of Fable: The Journey’s music is written using the Mixolydian mode, which Shaw describes as “all the white notes on a piano keyboard, played over a G root. The beauty of this is two things — firstly, it gives the music a really nice Celtic feel and secondly, I don’t have to use any of the black keys on the keyboard. I find the black keys really scary!”



In the prototyping and placeholder phase, individual library samples were edited tightly around a hard attack transient, the quickest way to establish whether triggering an individual sample for each footfall would work.


The sounds were then placed in a random container so each footfall would trigger a different sample, avoiding the repetition that would draw the player’s ear. These sound events are incorporated into the Unreal game engine by tagging the horse’s locomotion animations, with every footfall tagged to specify which leg bone is hitting the ground.
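A random container with repeat avoidance is a small amount of logic; Wwise implements it internally, but the behaviour can be sketched as below. This is an illustrative model only, with the class and sample names invented.

```python
import random

# Minimal model of a Wwise-style 'random container': each trigger picks a
# sample at random while excluding the most recently played one(s), so the
# same footfall never sounds twice in a row.

class RandomContainer:
    def __init__(self, samples, avoid_last=1):
        self.samples = list(samples)
        self.avoid_last = avoid_last   # how many recent picks to exclude
        self.history = []

    def trigger(self):
        candidates = [s for s in self.samples if s not in self.history]
        choice = random.choice(candidates)
        self.history.append(choice)
        if len(self.history) > self.avoid_last:
            self.history.pop(0)        # forget the oldest exclusion
        return choice
```

In practice each surface type would get its own container of footfall variations, and the animation tags fire the trigger.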


To change the sound of the horse’s footsteps at different speeds using real-time parameter controls (RTPCs), the game engine calculates the horse’s speed through the world and feeds the value into the sound engine as a Wwise RTPC. That value then drives parametric EQ and volume changes on the footstep samples playing, conveying the heavy, dense feel of a horse’s hooves on the ground at speed and giving the player drama and a sense of pace.
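The speed-to-EQ step can be sketched as a pair of curves driven by one RTPC value. The dB ranges, maximum speed and curve shapes below are assumptions for illustration, not Lionhead's actual settings.

```python
# Hypothetical speed-driven footstep treatment: the horse's speed (m/s) is
# normalised and used to raise both footstep volume and a low-shelf EQ
# boost, adding weight and drama at a gallop.

def footstep_params(speed_mps, max_speed=12.0):
    t = max(0.0, min(1.0, speed_mps / max_speed))
    return {
        "volume_db": -6.0 + 6.0 * t,     # -6 dB at a walk, 0 dB at a gallop
        "low_shelf_gain_db": 4.0 * t,    # low-end boost for heavy hooves
    }
```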


Brown: “Another key area of detail was the bodyfall system we developed. There’s a spell which whips out a kind of tentacle that extends out from the player into the game world. You can grab your enemies with it, then à la Star Wars, you can force-push them up and chuck them around. When they land on the floor, we didn’t want just a simple bodyfall sample. We wanted to build a system that had limb dexterity. The game engine would ‘do the physics’ and calculate whether an individual arm or leg was hitting the ground, as well as the body, as they were all separately defined and identified physics ‘characters’ — head joint, two legs, two body parts, eight joints — all with individual samples, to be composited together according to the player’s actions.”
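The limb-aware bodyfall idea amounts to compositing one sample per physics part that actually hits the ground. The sketch below models that flow; the part names, sample names, thresholds and gain curve are all invented for illustration.

```python
# Hypothetical limb-aware bodyfall compositor: the physics step reports
# which tagged body parts struck the ground and how hard, and the sound
# system queues one sample per part, scaled by impact velocity.

SAMPLES = {
    "head": "bodyfall_head.wav",
    "leg_left": "bodyfall_leg.wav",
    "leg_right": "bodyfall_leg.wav",
    "torso": "bodyfall_torso.wav",
}

def composite_bodyfall(impacts):
    """impacts: list of (part_name, impact_velocity) from the physics engine."""
    events = []
    for part, velocity in impacts:
        if part in SAMPLES and velocity > 0.5:   # ignore glancing contacts
            gain = min(1.0, velocity / 10.0)     # harder hits play louder
            events.append((SAMPLES[part], gain))
    return events
```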

Shaw: “With the evil Hollowmen enemies, you can actually rip off their limbs with the tentacle and even their heads with all the associated gore detail you’d expect — you literally hear the tendons ripping and the limbs drop on the floor with a bespoke replay sequence of sound detail created on-the-fly according to the specifics of each incident.”

When it came to the environmental sounds of the world, the same modus operandi applied. Shaw: “We wanted to have that sense of things rushing past, so we littered the world with very individual pinpoint sounds placed in specific geographical positions in the 3D geometry. Take an area where you have to drive through lots of trees and there’s a storm raging — for this we had lots of 3D ‘point emitters’ with leaf rustle sounds attached to each. As the evil wind gets up, these emitters are triggered to play and as the gust becomes more intense we switch in more aggressive rustling sounds. If you’re riding along, you get this incredible sense of speed and the world whooshing by.”
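The gust-driven emitter switching Shaw describes is a threshold decision per emitter. The sketch below models it; the thresholds and variant names are invented for illustration.

```python
# Hypothetical storm-driven rustle switch: a global 'gust' intensity (0..1)
# decides which rustle variant each tree emitter should play, switching in
# the aggressive set as the wind rises.

def rustle_for_gust(gust_intensity):
    """Return the rustle variant for an emitter, or None in calm air."""
    if gust_intensity < 0.2:
        return None                  # emitter stays silent
    if gust_intensity < 0.7:
        return "rustle_gentle"
    return "rustle_aggressive"
```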


For Steve Brown, the audio equivalent of the big money shot was always going to be the horse and cart, which the player experiences for the vast majority of the game. A lack of detail or repetitive sound triggering for this game feature, above all others, would have been unforgivable.

Brown: “As you travel through the world, you cover many different surface types. For these, we recorded 19 sets of foley — not only horse footsteps, but also wheel loops for the cart. The technical approach for the cart was much the same as a racing car simulation game like Forza. We broke the cart down into its component parts — the wheels, the chassis, and its canopy complete with sundry dangling objects. We wanted the cart to be incredibly rich because you’re surrounded by it for nearly 14 hours and we needed it to evolve with the terrain it travelled. So we have a spindly element for the wood of the spokes which we layer on in Wwise, with an RTPC for velocity controlling pitch. Then we have to pick the gravel wheel loop or snow wheel loop, or whatever is appropriate to the surface. But if you go over cobblestones, as well as picking a cobblestone wheel loop, the whole cart will react — for instance, the lanterns will jiggle creating metallic sounds. If Seren is moving too fast over very rough terrain which could damage her, we bring in a very iconic footstep sound — very slatey, with a sharp, abrasive feel. Again, it’s all a question of embedding the sound into the things that you’re interacting with, as you use your hands and arms via Kinect.”
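Picking the wheel loop per surface is what Wwise calls a switch container, and the cobblestone case additionally triggers the cart's rattle layer. The sketch below models that decision; the surface and sound names are invented for illustration.

```python
# Hypothetical model of the surface-driven wheel-loop switch: the engine
# reports the surface under the cart, the matching loop is selected, and
# cobblestones also wake the cart's metallic rattle layer.

WHEEL_LOOPS = {
    "gravel": "wheel_gravel_loop",
    "snow": "wheel_snow_loop",
    "cobblestone": "wheel_cobble_loop",
}

def cart_sounds(surface):
    sounds = [WHEEL_LOOPS.get(surface, "wheel_dirt_loop")]  # fallback loop
    if surface == "cobblestone":
        sounds.append("lantern_jiggle")   # the whole cart reacts to cobbles
    return sounds
```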


In Wwise, individual sound events for each object or element are hand-placed within the game world’s 3D geometry maps, literally painting the soundscape of the world around you. Maps can contain thousands of pinpoint 3D sound details. The game calculates the player’s real-world position in relation to the objects on the maps and decides what the player should be hearing at any moment based on the replay rules and parameters you set. Then you can define roll-off and distance parameters of the 3D emitters — e.g. if the player is 20m away from a babbling brook you would perhaps dull the high frequencies a little and attenuate the volume until they get much closer.
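The roll-off behaviour for a 3D emitter can be sketched as a distance-driven pair of curves: volume attenuation plus a low-pass filter that dulls the highs. The reference distance, attenuation range and filter values below are illustrative assumptions, not values from the game.

```python
# Hypothetical 3D emitter roll-off: inside the reference distance the sound
# plays untouched; beyond it, gain falls and a low-pass filter closes down
# until the emitter fades out entirely at max distance.

def emitter_rolloff(distance_m, ref_dist=2.0, max_dist=30.0):
    if distance_m <= ref_dist:
        return {"gain_db": 0.0, "lpf_hz": 20000.0}
    t = min(1.0, (distance_m - ref_dist) / (max_dist - ref_dist))
    return {
        "gain_db": -24.0 * t,                   # fade toward max distance
        "lpf_hz": 20000.0 * (1.0 - 0.8 * t),    # dull the highs with range
    }
```

In Wwise these curves are drawn per-emitter in the attenuation editor, which is how a babbling brook can behave differently from a distant bell.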


Number of people working on Fable: The Journey sound design: 35 

Time taken: Eight months 

Foley artist for the project, Pete Burgis, also worked on all the Harry Potter movies 

Animal recording specialist, Ann Kroeber, who assisted with horse recording, also worked on the Gladiator and Horse Whisperer movies. 

There are 20,000 lines of dialogue in the game, managed within a bespoke in-house database called LHTS. The final versions of dialogue are recorded late in the process to ensure the script is completely finalised first. Before that, placeholder material from the motion capture stage may be used as well as placeholder performances by Lionhead staffers, and before that, synthesised robotic speech files generated by the database software text-to-speech function (Shaw jokes that it’s a toss-up as to whether ‘Robo-voice’ or the Lionhead staff deliver the most convincing performances).

Performance dialogue was recorded on the motion capture stage at Giant (of Avatar fame). However, it couldn’t be used due to the overspill of extraneous noise from the actual metal cart and a real live horse used during the mo-cap process. Instead, the performances were used as a guide and ADR’d using lavalier radio mics mounted just in front of the actors enabling them to physically ‘perform’ their lines again. 

The final voice recordings were also captured with a distant uni-directional shotgun microphone; these takes are used in-game when characters are shouting or in an exterior location.

Altiverb convolution reverb is used as an extra plug-in to Wwise to create a believable acoustic environment that helps bed the dialogue into the world of Albion. Pinewood Studios recorded specific impulse responses as did Steve Brown on his holiday in the south of France (much to his partner’s chagrin).


Meanwhile, the horse, Seren, was treated very much as a character in her own right, with Brown seeking to tell the story of her bond with main protagonist, Gabriel. There are times in the game when Seren will look after him and her vocals are used to pre-figure or even warn him about future events. This meant figuring out how, without articulating words, she could ‘speak’ and express feelings through emotive vocalisations.

Brown: “For me, horses in Hollywood movies have been misrepresented. There’s always this thing that they just whinny all the time — or blurt — the exhale sound you often hear. I really wanted to broaden the vocabulary of the horse so we did a lot of research into, and learned to recognise, all the individual sounds that horses make. We sent the Microsoft Games Studios (MGS) audio team out to do some field recording at local stables to get an idea of what was possible to capture from different horses. What they got was pretty good and some of it was used in the game, but it was mostly whinnies and blurts. However, there were some sounds which were like the horse communicating to you — more characterful noises that make people ask, ‘Is that really a horse?’

“I started to categorise what we had based on how each sound made me feel emotionally. I’d go through the library and think, that one sounds like the horse is distressed, that one sounds like it’s expressing agreement. I started to broaden out the vocabulary of the emotion of the horse, but there were still holes to be filled. So we contacted Ann Kroeber, a horse and animal recording specialist, who’s worked on some movies where horses appear to go through great pain. This was important because a nasty Fable: The Journey player can actually hurt Seren in the game, plus the horse may be hit by a splintering arrow which you then have to pull out to help her. The sounds we got from Ann were very shocking to the team. People couldn’t quite believe that the squealing, almost
pig-like sounds, were actually a horse.

“Thanks to a contact by one of our MGS creature designers, we were able to take some advice from sound design legend, Gary Rydstrom who had recently done the movie War Horse. He said one of the best ways of conveying horse emotions was to focus on the breath — so then I started wondering how on earth are we going to get breathing noises for the horse, not just when it’s static or when you’re petting it, but when it’s at high locomotion, breathing very heavily — what I call the ‘battle breath’. We did two recordings with a range of horses — all at night-time, as there was too much bleed in the day. With a horse, you have to capture everything you can, so we had four radio mics on the body, and a couple of digi-recorders on the saddle plus a radio mic along the brow of the horse, down its face between its eyes and also by its mouth. The result was a big ProTools session with lots of takes but the small on-board mics on the face didn’t actually work that well. There was too much bleed. So then the Soundelux guys simply got a shotgun mic on a boom and rode the horse one handed, holding the short boom in front of the horse’s mouth. This time, we got some really phenomenal breathing loops when the horse was at high exertion.”

Meanwhile, to create the sounds of Seren in complete pain and agony, the team sourced and manipulated recordings of horses in heat in close proximity to an enthusiastic stallion. There were even some recordings where the stallion was probably smiling, but we will draw a veil at this point…

“It’s actually a very powerful moment in the credits when the list of horses’ names comes past,” said Shaw. “It’s amazing when you realise the amount of work that went in to just producing these horse sounds for Seren.”

“We’ve had reports that players actually apologised to Seren,” said Brown. “Because the pain noises have such emotional intensity, they make you physically wince. But as a game sound designer, you know you’ve achieved something when there’s that level of emotional connection.”

