By Dan Daley, Audio Editor
Thursday, October 28, 2021 – 12:42 pm
It’s October, and the NHL is back on the ice and the air. And so is NHL 22, EA Sports’ sim version of the game. Powered by the Frostbite game engine, NHL 22 aims to bring gamers closer to the ice than ever before, this year with its most visually distinct upgrade to date, along with improved environments, overhauled player likenesses, and cool micro-details like improved skate spray.
What’s also startlingly realistic, though, is NHL 22’s newly immersive audio, and EA Sports Audio Director Eric Paul sat down with SVG to discuss how it came about.
How did EA source SFX for the game? The league? NHL Studios? Field-capture your own?
For the yearly title, the NHL audio team has generated and sourced audio in many different ways, with the focus generally being on the current feature set. In the past, we have had sessions where we would record players on ice with a couple of Sanken COS-11Ds, one at ice level and one from players’ POV. We bolster in-game sounds with broadcast recordings and our large central sound library. UI and broadcast sounds are individually designed by our audio artists using samples, synthesizers, and modern design tools and plugins like Ableton Live, Native Instruments, and Serum.
As you navigate around our game this year, you’ll hear contemporary UI sounds that often have layers of modified hockey samples blended in to maintain that feeling of a unified sound world. A few examples of this: the ambient sounds of the boot[-up] screen mix hockey organs with washed-out skating, carves, and player call-outs. When you press “start,” the transition sound is a mix of modulated whooshes, board checks, and a puck hitting the crossbar. These are mixed to be impressionistic, but you’ll hear these easter eggs throughout the title.
Of course, the big issue these past few cycles has been that our team is working from home and have had restrictions on field recording. Nowhere was this more challenging than recording our commentary team Ray Ferraro, James Cybulski, and Carrlyn Bathe. Our solution was to provide our talent at home with the equipment we would use in the studio: Neumann U87, Sennheiser HMD 300 PRO XQ 2 (for that sideline reporter sound), studio headphones, portable booths and laptops with Pro Tools.
During the session, which we run through a combination of Zoom and CleanFeed, our recordist logs into the talent’s PC and runs the Pro Tools session: recording, playback, and dropping in markers. Assistant Producer Danny Lopes will work with the talent on their performance, and another audio artist will listen for technical issues.
We take a break every few cues to output the audio and check that the lines are consistent and that they will “stitch” well in runtime. We run the whole pipeline in-house, from pre-databasing, recording, editing, mastering through to logic writing, implementation, and final mix. During NHL 22 [production], we were able to record 59 sessions this way, and we’re very happy with how it went and how well the new lines are stitching with legacy content.
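The stitching step described above can be pictured as a lookup-and-concatenate pass over pre-recorded fragments. Here is a minimal, hypothetical sketch of that idea; the event names, lines, and `stitch` function are all invented for illustration and are not EA’s actual pipeline:

```python
# Hypothetical sketch of commentary "stitching": pre-recorded line
# fragments are chosen by game events and joined at runtime.
# Event names and lines are invented for illustration.

LINES = {
    "goal":    ["He scores!", "What a finish!"],
    "save":    ["Great glove save!", "Turned aside!"],
    "penalty": ["That will be two minutes."],
}

def stitch(events, pick=lambda options: options[0]):
    """Return one commentary string stitched from per-event fragments."""
    fragments = []
    for event in events:
        options = LINES.get(event)
        if options:                      # skip events with no recorded line
            fragments.append(pick(options))
    return " ".join(fragments)

print(stitch(["save", "goal"]))  # Great glove save! He scores!
```

The consistency checks Paul describes would catch fragments whose level or tone would make a join like this audible as a seam.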
EA Sports’ Eric Paul: “We work towards reality, but our games play out more like a highlight reel, so more-pronounced swings in crowd excitement levels and stadium atmosphere make sense.”
Are they processed/enhanced in any way? Hi-res 24/96?
NHL Audio uses all the modern audio tools and techniques depending on the requirements of each particular task. Especially when it comes to sound design, anything goes. We love to grab new plugins and audio toys and try to innovate where we can. This year UI, broadcast wipes, and X-Factors got a lot of love in this area.
We also did a complete remaster of all our goal horns for NHL 22 to increase the hype and push us closer to the feeling you get from hearing those live. We did this by A/B-ing our horns with live recordings and using frequency analysis and our ears along with EQ, saturation, and spatial tools to get as close a match as we could.
Additionally, our commentary and music are mastered in-house to maintain a standard LUFS level. Last year, we had a big push to migrate to EA’s Frostbite engine, and that has opened up a number of tools (filtering, compression, delays, distortion, etc.) so we can perform more of our processing and enhancements during runtime.
This allows us to be more flexible and customize our processing to the needs of a particular game moment. A common example of this is sweeping down a low-pass filter in [the] “Be a Pro” [feature] on background sounds to draw attention to a decision your player needs to make. But this also goes deep into how the game is mixed every frame. Currently, we design with high-resolution sound up to 96 kHz and 24 bit, but, due to the limitations of game consoles, we implement at 48 kHz and 16 bit.
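A low-pass sweep like the “Be a Pro” example can be sketched with a simple one-pole filter whose cutoff changes over the buffer. This is an illustrative approximation only, not Frostbite’s implementation; the function name and linear sweep are assumptions:

```python
import math

def onepole_lowpass_sweep(samples, sample_rate, cutoff_start, cutoff_end):
    """Apply a one-pole low-pass whose cutoff sweeps linearly
    from cutoff_start to cutoff_end (Hz) across the buffer."""
    out = []
    y = 0.0
    n = len(samples)
    for i, x in enumerate(samples):
        t = i / max(n - 1, 1)
        cutoff = cutoff_start + (cutoff_end - cutoff_start) * t
        # Standard one-pole smoothing coefficient for the current cutoff.
        a = 1.0 - math.exp(-2.0 * math.pi * cutoff / sample_rate)
        y += a * (x - y)
        out.append(y)
    return out
```

Sweeping from a fully open cutoff down to a few hundred hertz progressively muffles the background sounds, which is the effect that pulls the player’s attention to the decision in the foreground.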
Are any NHL player/coach/ref voices included? Any vocalizations?
Vocalizations come from a number of different sources. You will hear player calls throughout gameplay that were sourced through broadcast recordings for verisimilitude. Coaches and refs are hired talent that we use to further the depth and authenticity of what you hear.
This year, with the move to Gen 5, we’ve added a number of our in-game voices to the PS5-controller speaker along with hits, shots, goal music, and goal horns. We’re loving the feeling this gives to players of being more “in the game” and will continue to evolve that interface.
Where did you get crowd sounds, and how are they mixed for reaction to play?
[EA Sports’ NHL gaming franchise] has a very deep crowd system that leverages Context, the logic system that also drives our commentary-stitching decisions. Essentially, [within] every frame, Context is looking at game states and making decisions about what is happening in the game, prioritizing events that reflect the current crowd sentiment.
The whole system is granulated into beds, major events (cheers, boos, ohs, ahs, applause, etc.), individual crowd heckles or support, and close crowd events. This is a feature we work on every year to make it much more reactive and to build anticipation to generate the most exciting and immersive gameplay experience. It’s also another area that will benefit from tools in the Frostbite engine in coming iterations.
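The per-frame decision Context makes can be pictured as scoring candidate crowd events against the current game state and letting the highest-priority event win. The sketch below is purely illustrative; the event names, scores, and state keys are invented, not Context’s actual data model:

```python
# Hypothetical sketch of a Context-style decision: each frame, candidate
# crowd events are scored against the game state and the highest-priority
# event wins. Names and weights are invented for illustration.

def pick_crowd_event(state):
    """Return the highest-priority crowd event for this frame."""
    candidates = []
    if state.get("goal_scored"):
        score_event = "home_cheer" if state.get("home_team_scored") else "away_groan"
        candidates.append((100, score_event))
    if state.get("fight"):
        candidates.append((90, "roar"))
    if state.get("big_save"):
        candidates.append((80, "oohs_and_ahs"))
    if not candidates:
        return "ambient_bed"            # fall back to the looping crowd bed
    return max(candidates)[1]           # highest score wins
```

Layering the winner over the beds, heckles, and close crowd events described above is what makes the crowd track game sentiment rather than loop passively.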
How do you achieve overall sonic ambience, the sound of the arena, etc.?
We achieve overall sonic ambience, first, by being very attentive to the range of sonic events that can be heard during games in a variety of locations. Whether it’s live hockey, broadcasts, or smaller local rinks, indoor or outdoor, we’re listening and taking note of the range of sound events in each context and paying special attention to those sounds that are related to excitement or that generate a certain emotion, whether it’s from a moment in the game — like a late-game tie-breaker — or related to nostalgia — think organ [sound].
We then focus on sourcing the sounds, editing and creating beds or granular patches, and finally layering and mixing it all together to re-create that initial listening experience. To add another layer of realism, we send our sounds into 7.1 arena impulse-response reverbs [an acoustic process that digitally models the sonic fingerprint of a space] that were captured by EA audio artists; any sound put through those IR reverbs will sound like it is in that space, with all the same reverberance and reflections.
We also have ambient arena room-tone “air” recordings that activate when the game sound gets to a low enough volume level, like at the end of a game. All this creates the experience of all the sounds living in the same arena environment.
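At its core, the IR reverb described above is convolution: the dry sound is combined with the captured impulse response so it inherits the arena’s reflections and decay. A minimal direct-form sketch follows; production engines use partitioned FFT convolution for efficiency, and this toy version is for illustration only:

```python
# Minimal sketch of impulse-response reverb: convolving a dry signal with
# a captured arena IR places that sound in the arena's acoustic space.
# Real engines use fast (FFT-based) convolution; this direct form is
# only meant to show the underlying operation.

def convolve(dry, impulse_response):
    """Direct-form convolution of a dry signal with an IR (mono lists)."""
    out = [0.0] * (len(dry) + len(impulse_response) - 1)
    for i, x in enumerate(dry):
        for j, h in enumerate(impulse_response):
            out[i + j] += x * h
    return out
```

Because every sound routed through the same IR picks up the same reverberant tail, the whole mix reads as one arena, which is the effect Paul describes.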
Are you using Atmos or other immersive audio formats?
We are currently shipping our game in the 7.1 format. Beyond mixing, we also capture and create our crowd assets and reverb in that format.
For NHL 22, we also began using Frostbite’s BigWorld tool that gives us the ability to randomly generate events or “one-shots” (voices, whistles, ambient sounds) around the player, elevating the surround experience.
We are looking at what Atmos can bring to our game but want to make sure it’s elevating our game-atmosphere experience or improving some aspect of gameplay and not just checking a box. More to come!
Are you striving for complete reality, or is there any aspect of the audio that is purposely hyperreal?
[Because it’s] a sim game, we do tend to work towards reality, but a typical game of NHL 22 takes place over the course of 15 minutes or so, considerably shorter than real life. Because of this, our games play out a little more like a highlight reel of all the best stuff, so more-pronounced swings in crowd excitement levels and stadium atmosphere make sense in our product.
We also do our best — using whatever tools we have at our disposal, regardless of whether they’re technically authentic — to bring out the excitement and emotion when designing our soundscape. For example, our “Play of Period” and “Play of the Game” replays feature the following runtime effects: as the replay shifts to slow motion, the music pitches down, the crowd is filtered out, and a bass sweep matches the player’s motion.
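The pitch-down as the replay shifts to slow motion can be approximated by fractional-rate resampling: playing a buffer back slower both stretches it and lowers its pitch. This sketch uses linear interpolation and an invented function name; it is an illustration of the effect, not EA’s runtime implementation:

```python
# Hypothetical sketch of the slow-motion pitch-down: resampling at a
# rate below 1.0 slows playback and lowers pitch together, the way the
# replay music drops as the footage slows. Linear interpolation and the
# rate parameter are illustrative choices.

def pitch_down(samples, rate):
    """Resample with linear interpolation; rate < 1.0 slows and lowers pitch."""
    out = []
    pos = 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        out.append(samples[i] * (1.0 - frac) + samples[i + 1] * frac)
        pos += rate
    return out
```

A rate of 0.5, for instance, roughly doubles the duration while dropping the pitch an octave; pairing that with the crowd filter and bass sweep gives the replay its hyperreal weight.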
Also, this year, with the addition of the X-Factor feature, we added some hyperreal sounds to elevate the experience of performing any X-Factor shots on goal. When a player performs a slapshot, wrist shot, backhand, or one-timer with that X-Factor activated, we add a sweetener layer to augment the sharpness, stereo width, and resonance of the shot.