Audio Media International was in attendance at this year’s Game Music Connect conference on 15 September as it returned to London’s Southbank Centre for the third year, where hundreds of professionals from game audio and beyond gathered for a day of informative talks and discussions on contemporary issues in the industry.
Composer John Broomhall (pictured), one half of the team behind the event, opened proceedings before handing over to Chuck Doud, director of music for Sony Computer Entertainment America (SCEA) Worldwide Studios, for the day’s keynote.
Doud (pictured above), whose career highlights include overseeing big-budget audio projects such as the Uncharted series and The Last of Us, put forward the notion of music as an ‘active medium’ and a ‘unique vehicle of delivery’, one that is driven by logic and context.
Throughout his keynote, Doud attested to the importance of audio in video game development processes; citing The Last of Us as a prime example, he explained how creative director Neil Druckmann had emphasised that the purpose of music on the project was to tell the story of the connection between the two main characters. With this in mind, the production of certain music tracks during development actually prompted heavy alterations to elements of the game in order to better complement the accompanying music.
“It’s not enough to hire a composer, go to Abbey Road, record it, get it mixed and drop it into the game at the end of development. It just doesn’t work that way,” Doud stressed, calling for a production process with far deeper and more cohesive audio involvement.
Later, Alastair Lindsay and Joe Thwaites (pictured above), members of Sony’s in-house music team, took the stage to share their early findings on creating audio for the company’s virtual reality headset, Project Morpheus, following their short time working with the technology.
While they had determined that binaural audio – a method of sound reproduction that convincingly recreates the position of sounds in three-dimensional space – is the perfect companion for VR, maintaining user immersion remains an ongoing challenge. The pair divided the role of music in VR into three categories: linear, where audio is not affected by the player; reactive, where it is influenced by the player’s actions; and proactive, where the audio actively seeks to guide and prompt the player.
With this in mind, the team discussed where to place the audio in order to maintain maximum user immersion. While two-dimensional audio – which is not affected by the head movements of the player – is effective, starting and stopping audio can prompt a ‘reality check’ that breaks immersion. The team therefore determined that layered, dynamic mixing is imperative to VR experiences: changing the mix depending on where the player looks, and using gameplay elements to mask dramatic changes in sound.
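As a rough illustration of the layered approach described above, the sketch below crossfades between an ambient bed and a focused music layer based on how directly the player is looking at a sound source. The function and layer names are hypothetical, not from any real VR audio toolkit; the equal-power crossfade is one common way to avoid the abrupt starts and stops the speakers warned against.

```python
import math

def layer_gains(gaze_yaw_deg, source_yaw_deg, focus_width_deg=60.0):
    """Crossfade between an 'ambient' bed and a 'focused' layer based on
    how directly the player's gaze points at a sound source.
    Illustrative names only - not a real VR audio API."""
    # Smallest signed angle between gaze and source, folded into [0, 180]
    delta = abs((gaze_yaw_deg - source_yaw_deg + 180.0) % 360.0 - 180.0)
    # 1.0 when looking straight at the source, 0.0 beyond the focus cone
    focus = max(0.0, 1.0 - delta / focus_width_deg)
    # Equal-power crossfade keeps perceived loudness steady, so layers
    # swell and recede rather than switching on and off
    return {"focused": math.sin(focus * math.pi / 2),
            "ambient": math.cos(focus * math.pi / 2)}
```

Called every frame with the headset’s current yaw, this would let the mix track head movement continuously instead of toggling cues.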
Another highlight of the day came as key members of the in-house audio team from UK developer Creative Assembly took the spotlight to discuss the making of the BAFTA Award-winning survival horror hit Alien: Isolation, joined by The Flight, the game’s composers (all pictured above). Among those present was Byron Bullock, whom AMI interviewed earlier this year about the making of the game.
Music was cited as a key element of the project from the very beginning, with The Flight – composed of Joe Henson and Alexis Smith – meeting as many of the developer’s artists and designers as possible to ensure a solid marriage of music with the game’s other elements; the score, they explained, was greatly influenced by the visual cues from the game. While striving for a more minimalist approach to sound, the team tackled audio development with a very interconnected approach, with the sound team informing the game designers, and vice versa; an approach they believe was key to the game’s success and critical acclaim.
Audiokinetic’s interactive music expert Simon Ashby (pictured above) returned from last year’s event to discuss differentiation and variability in video game soundtracks, an area many previous speakers had cited as a major production challenge: while many games feature anywhere between six and fifteen hours of gameplay, and some offer potentially endless hours of multiplayer, the average game’s audio development generates only around one or two hours of source material.
Utilising examples, Ashby was keen to put across the idea that ‘games are software’, and like software they have systems and game states, and these can be advantageous when it comes to crafting a soundtrack. This can include, he told the audience, attaching sound to systems the game is already using – for instance a ‘stealth factor’ which monitors how stealthy a player is being. That factor can then drive the game’s mix automatically depending on the player’s actions, allowing for a much more varied and engaging audio experience.
Broomhall told AMI: “Co-founder James Hannigan and I are delighted with how Game Music Connect 2015 went down. It’s an honour and pleasure for us to be a catalyst to get all these amazing delegates and speakers in one room in London. The level and tone of discussion and the mix of topics felt really good, and people didn’t seem to mind that the day was longer than years 1 and 2, appreciating that extra opportunity for more content. We’re extremely grateful to organisations like Sony PlayStation and BAFTA for backing our efforts to bring all this together, and of course to all our amazing speakers for coming from around the globe to contribute their wisdom and expertise. For us, Game Music Connect 2015 was another great success.”