Garry Taylor, audio director at Sony Computer Entertainment Europe, explains why he believes the new PlayStation 4 Mastering Suite is good news for both gamers and developers…
I suppose some people might consider me an audiophile. I’m not naive enough to buy expensive blocks of wood to keep my speaker cables off the floor, but I do have the luxury of a decent 5.1 system at home, as well as a number of 7.1 studios at work where our sound and music teams create and mix soundtracks for some very successful video games.
However, I’m not everyone, and I understand that not everyone listens to games in the same way that I do.
Not knowing what sort of system the consumer is listening to our content on has been a challenge for audio teams in the games industry for years. Sound designers and engineers at most big game studios these days have good facilities, but producers and other team members tend to listen in less-than-ideal environments on less-than-ideal speaker systems, just like most of our audience.
Dynamic soundtracks sound great on a big system and may suit the title you’re working on, but a significant percentage of your audience will miss out if they’re listening to a dynamic mix on two tiny speakers on the back of a thin, flat-panel TV. How do we let developers ship big dynamic mixes, while ensuring players who may be listening on a small TV or tablet don’t miss anything?
Last year, a small group of us at various divisions across Sony Computer Entertainment (SCE) got together to try to solve this problem. The result is the PlayStation 4 Audio Mastering Suite.
We needed to give game audio developers the ability to modify the dynamic range, equalisation, limiting and gain of their finished mix, and to hit SCE’s average loudness standard for PlayStation 4 of -24 LKFS easily. We also needed this functionality to be available to all game developers, whether they’re big triple-A studios or small indie outfits, and we needed the mastering suite to have no impact on the performance of the title.
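Measuring integrated loudness in LKFS is itself a standardised process (K-weighting and gating, per ITU-R BS.1770), but once a mix has been measured, the make-up gain needed to hit a loudness target is just the difference between the two figures. A minimal sketch, assuming the measurement has already been done (`gain_to_target` is a hypothetical helper, not part of any Sony tool):

```python
def gain_to_target(measured_lkfs: float, target_lkfs: float = -24.0) -> float:
    """Return the gain, in dB, that brings a mix measured at
    measured_lkfs up or down to the target integrated loudness."""
    return target_lkfs - measured_lkfs

# A mix measuring -31 LKFS needs 7 dB of make-up gain
# to reach the -24 LKFS PlayStation 4 standard.
print(gain_to_target(-31.0))  # 7.0
```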
The Mastering Suite we’ve introduced to the PlayStation 4 consists of a four-band parametric equaliser, a three-band dynamics processor, a gain stage and a limiter, as well as loudness metering and spectral analysis. It runs independently of the game’s audio engine, meaning it works on all titles, and it runs on the system core of the CPU, separate from the cores running the game, meaning that it doesn’t affect the performance of the title.
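The key idea of a chain like this is that each stage takes the previous stage’s output, with the limiter last to catch any peaks the earlier stages produce. A heavily simplified sketch of that signal flow (the parametric EQ and three-band dynamics are omitted, and a real limiter uses look-ahead and smooth gain reduction rather than hard clipping; none of this reflects the Suite’s actual implementation):

```python
import numpy as np

def gain_stage(samples: np.ndarray, gain_db: float) -> np.ndarray:
    """Apply a fixed gain, specified in decibels."""
    return samples * (10.0 ** (gain_db / 20.0))

def hard_limiter(samples: np.ndarray, ceiling: float = 0.99) -> np.ndarray:
    """Crudely clamp peaks to a ceiling so the output can't clip."""
    return np.clip(samples, -ceiling, ceiling)

def master(samples: np.ndarray, chain) -> np.ndarray:
    """Run the mix through each processing stage in series."""
    for stage in chain:
        samples = stage(samples)
    return samples

mix = np.array([0.5, 1.4, -1.2, 0.2])
out = master(mix, [lambda s: gain_stage(s, 3.0), hard_limiter])
```

Running the whole chain on a dedicated system core, as the article describes, is what keeps this processing invisible to the game’s own frame budget.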
We also created a tool called Sulpha, part of the PlayStation 4 Software Development Kit (SDK), which, as well as giving developers audio system analysis tools, allows them to create presets for the Mastering Suite. It connects to the game and allows engineers to adjust parameters in real time to suit different types of playback devices.
This is all well and good, but it means nothing if the user cannot tell the PlayStation what type of system they’re listening to a specific game on, or if the player has to dig through lots of menus to find the audio settings and, even then, may not fully understand the ramifications of any choice they make. It boils down to a user experience (UX) and user interface (UI) issue.
SOMA, a sci-fi horror title developed by Frictional Games, was the first game released on PS4 to utilise the Mastering Suite, and it solved this problem in a simple and quite obvious way. The first time it runs, it asks the player to adjust the gamma settings to suit their screen, as many games do. It also asks what type of system the player is listening on, offering options such as ‘home cinema’, ‘small TV’ and ‘headphones’. The Mastering Suite then loads the preset to suit that speaker system. The choice takes the player a couple of seconds and is not intrusive.
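Mapping the player’s answer to a mastering preset can be as simple as a lookup table with a safe fallback. A hypothetical sketch, where the preset names mirror the options above but every parameter value is invented for illustration:

```python
# Hypothetical presets: small speakers get a hotter, more
# compressed mix; a home cinema gets the full dynamic range.
MASTERING_PRESETS = {
    "home cinema": {"gain_db": 0.0, "compression_ratio": 1.0},
    "small tv":    {"gain_db": 3.0, "compression_ratio": 4.0},
    "headphones":  {"gain_db": 0.0, "compression_ratio": 2.0},
}

def preset_for(device: str) -> dict:
    """Look up the preset for the player's answer, falling back
    to the least destructive option for unknown devices."""
    return MASTERING_PRESETS.get(device.lower(),
                                 MASTERING_PRESETS["home cinema"])
```

Defaulting to the uncompressed preset means a player who skips the question hears the mix as the engineers intended it.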
A final audio mastering process is something that most game audio teams have not had to deal with before and some developers may need educating as to the benefits it affords them. This also means there’s currently a lack of experienced audio mastering engineers working in the games industry. Our tool design means that it’s obvious to any audio engineer how it works, even if they’ve never been involved in the complexities of game audio development. So, now, a mastering engineer can take the game installed as a ‘package’ on a PS4 development kit and a laptop running Sulpha to any studio in the world and master the game to suit as many scenarios as they wish.
As for getting information about a particular player’s speaker system, all that’s left is for the developer to ask, as Frictional Games did: “What are you listening on?”
Garry Taylor is audio director at Sony Computer Entertainment Europe and co-founder of Sony Worldwide Studios’ Audio Standards Working Group. He is speaking about ‘Audio Mastering for Interactive Entertainment’ at this year’s Game Developers’ Conference (GDC) in March. Twitter: @tetley_uk