New York, NY – March 2018… The Band’s Visit, a critically acclaimed new musical that celebrates the deeply human ways music and laughter connect us, opened in November to rave reviews at the Barrymore Theatre on Broadway. It is the story of the Alexandria Ceremonial Police Orchestra’s arrival in Israel for the opening of an Arab Cultural Centre, only to find out that they have boarded the wrong bus to the wrong town – with some inevitably amusing consequences.
A big part of the musical’s success is celebrated sound designer Kai Harada’s innovative sound design, at the heart of which is Astro Spatial Audio’s true object-based approach to immersive, three-dimensional audio.
Harada recounts how he first became aware of Astro Spatial Audio’s creative possibilities: “I have known (Astro Spatial Audio Director) Bjorn van Munster for a number of years, and although I had heard about the system, it was only last year that I saw and heard a demo of the system in California. A few years ago, I had used a competitor’s system on a Broadway show that required precise localisation, and then last year I found myself designing The Band’s Visit, which would require the same precision, and I thought it would be a perfect chance to try the Astro Spatial Audio system.”
At the heart of the Astro Spatial Audio (ASA) solution is the conversion of audio signals into audio objects. ASA’s SARA II Premium Rendering Engine – a 3U road- and rack-ready processor offering up to 128 MADI or 128 Dante configurable network pathways at 48kHz/24-bit resolution – utilises extensive metadata attached to each audio object. The result is a precise calculation of that object’s position within virtual 3D space, processed in real time up to 40 times per second for each individual object, as well as that object’s acoustic effect on the virtual space around it. For the engineer, the outcome is a truly three-dimensional audio canvas on which to play.
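ASA’s actual rendering algorithms are proprietary, but purely as a conceptual illustration of what “positional metadata attached to an audio object” can drive, the Python sketch below derives per-loudspeaker gains and delays from an object’s x, y, z coordinates. The names, coordinates and inverse-distance weighting are invented for the example and are not ASA’s implementation.

```python
from dataclasses import dataclass
from math import dist

SPEED_OF_SOUND = 343.0  # metres per second (approximate, at room temperature)

@dataclass
class AudioObject:
    name: str
    position: tuple  # (x, y, z) in metres – the positional metadata carried with the object

@dataclass
class Loudspeaker:
    name: str
    position: tuple  # (x, y, z) in metres

def render(obj: AudioObject, speakers: list) -> list:
    """Illustrative object renderer: closer loudspeakers receive more level,
    and each feed is delayed by the acoustic travel time from the object."""
    distances = [dist(obj.position, s.position) for s in speakers]
    weights = [1.0 / max(d, 0.1) for d in distances]  # simple inverse-distance weighting
    total = sum(weights)
    return [
        (s.name, w / total, d / SPEED_OF_SOUND * 1000.0)  # (speaker, gain, delay in ms)
        for s, w, d in zip(speakers, weights, distances)
    ]

# Example: a clarinet object placed stage left, rendered to three invented loudspeakers.
clarinet = AudioObject("clarinet", (-3.0, 1.0, 1.5))
pa = [Loudspeaker("prosc_L", (-4.0, 0.0, 6.0)),
      Loudspeaker("prosc_R", (4.0, 0.0, 6.0)),
      Loudspeaker("centre", (0.0, 0.0, 7.0))]
for name, gain, delay_ms in render(clarinet, pa):
    print(f"{name}: gain {gain:.2f}, delay {delay_ms:.1f} ms")
```

In the real system such calculations run continuously – up to 40 times per object per second – and include the room-modelling terms described above; the sketch only shows the basic idea of turning positional metadata into loudspeaker feeds.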
Harada explains how ASA was integrated into the production: “In The Band’s Visit, several musicians play their instruments in a variety of locations on the stage, and it was incredibly important to me to preserve a transparent sound system design – in my opinion, the more we attracted attention to the sound system, the less the audience would connect with the actors and the story on stage, so natural-sounding reinforcement was the goal.”
It quickly became apparent how easy the system was to set up: Harada’s associate, Josh Millican, drafted all the loudspeaker positions in CAD, and when it came time to commission the system, the measurements were verified and the values simply entered into SARA II.
“ASA allowed us to precisely place the instrument source as an audio object within a graphical interface, while it did all the calculations to make it sound correct. Changes to staging were easily accommodated. In addition, having used other acoustic enhancement systems on other shows, I was eager to try the ASA room enhancement to give the illusion that the theatre was a larger acoustic space for some key moments in the show.
“Also, there were a number of very localised sound effects – coming from a prop radio, or a jukebox, or a baby – and although we had many wireless loudspeaker systems to play with, we used SARA II to reinforce the localisation through the main PA: the initial waveform comes from the practical loudspeaker, but SARA II ensures that the sound is localised correctly for all audience members.”
All the stage band and practical sound effects inputs were routed, post-fader, from a Studer Vista 5 console into SARA II, where they were represented as audio objects; the Studer fired MIDI changes to QLab, which in turn fired OSC commands to SARA II to move between snapshots. SARA II’s outputs were returned to the console and fed to the appropriate loudspeaker systems, which were then processed using Meyer Galileo units. The system had to function first as a traditional reinforcement system, and secondarily integrate all of SARA II’s power.
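The final hop of that control chain – a cue system sending OSC to SARA II – can be sketched in a few lines. The example below uses the python-osc library; the host, port and OSC address pattern are illustrative assumptions only, not documented SARA II commands.

```python
from pythonosc.udp_client import SimpleUDPClient

# Illustrative values only – the real SARA II address, port and OSC namespace
# would come from the production's network and the engine's documentation.
SARA_HOST = "192.168.1.50"
SARA_PORT = 8000

client = SimpleUDPClient(SARA_HOST, SARA_PORT)

def recall_snapshot(snapshot_id: int) -> None:
    """Send one OSC message asking the rendering engine to move to a stored snapshot,
    much as QLab does after receiving a MIDI cue from the console."""
    client.send_message("/sara/snapshot/recall", snapshot_id)

recall_snapshot(12)  # e.g. fired manually by the operator on a musical cue
```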
The production is configured with 162 mono inputs and 24 stereo inputs, covering a cast of 15 performers, four musicians who play in a purpose-built room under the stage, and five additional stage musicians (who also play in that room when they are not on stage) – 68 band inputs in total – plus 26 playback (QLab) inputs, 36 SARA II returns into the console, and a host of reverb returns and utility channels. Reflecting the fact that Astro Spatial Audio is entirely brand agnostic, 90% of the loudspeakers used in the show are from Meyer Sound (M1D, LINA, UPJ-1P, UPJr, UPQ-1P, MM4, UPM-1P, UMS-1P, UPA-2P), with the rest from d&b audiotechnik, including E5s as surrounds.
“The ASA system is not tied to any one loudspeaker brand,” explains Bjorn van Munster. “We believe it is in the best interests of the market that Astro Spatial Audio remains brand independent. Users should benefit from object-based immersive audio regardless of which loudspeakers they invest in. Similarly, we support a range of protocols, including MADI and Dante, and we intend to continue working closely with our good friends in the industry to bring our technology to as many people as possible, and to create incredible experiences for audiences everywhere.”
While automation is a key feature of the ASA immersive solution, the system is equally focused on allowing the audio operator limitless creativity in a live environment. “The show is mixed manually,” says Harada. “My operator, Liz Coleman, mixes every word, line by line, and helps augment the dynamics of both the stage musicians and the musicians in the trap room. The console’s automation helps out by grouping inputs in logical ways, but Liz is very much performing along with the musicians. Nothing is on timecode on our end; sound effects are triggered manually by Liz, sometimes based on a visual cue, sometimes on a musical cue. All commands to the SARA II system are also triggered by Liz.”
ASA’s ultimate purpose is of course enhanced audience enjoyment. “The goal was not merely an immersive audio experience; the goal was a transparent audio experience, and I think we were very successful,” says Harada. “Many people have commented about the quality of the audio on the show, and I am quite proud of it. I do believe we have achieved our goal of creating an intimate, organic-sounding show but still delivering dynamics when appropriate. The story is so human and conversational that we needed to preserve that feeling but still ensure that everyone in the audience had a very good aural experience.
“The localisation algorithms helped create a very natural-sounding reinforcement system for the musicians onstage as well. I did appreciate the room acoustic enhancement feature, too, although I chose to use it sparingly and subtly, and only when dramatically appropriate for the piece. Most theatres I get to work in already have an acoustic – some, like the Barrymore Theatre’s, are quite nice – so it was never my intent to fight the acoustic, just to augment it.”
Harada sums up his experience with Astro Spatial Audio: “I think it’s a great tool – it is quick to set up and commission the system, which is VERY important in an industry where time is very, very expensive. Having what are essentially two separate features – object-based audio and acoustic enhancement – in one box is a great boon. Not having to manually calculate delay times to a given reference point was also a huge time-saver – just entering the x, y, z coordinates of the loudspeakers got us very close to having a functioning system in a short amount of time, and then we could spend the rest of the time listening and adjusting.
“Without the functionality of the ASA system, the sound of The Band’s Visit would have remained two-dimensional. I’m extremely pleased with the results. I am eager to find another show that would lend itself to using Astro Spatial Audio.”
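For context on the time saving Harada describes, the manual alternative is to compute, for every loudspeaker, the delay that lines its arrival up with the direct sound at a reference point – simple arithmetic, but tedious across dozens of boxes. A minimal sketch of that calculation, with invented coordinates and the usual distance-over-speed-of-sound arithmetic:

```python
from math import dist

SPEED_OF_SOUND = 343.0  # metres per second

def travel_time_ms(a, b):
    """Acoustic propagation time between two points, in milliseconds."""
    return dist(a, b) / SPEED_OF_SOUND * 1000.0

# Invented example geometry, in metres (x, y, z):
stage_reference = (0.0, 2.0, 1.5)      # acoustic source position on stage
listener = (0.0, 12.0, 1.2)            # representative seat
speakers = {
    "prosc_L":       (-4.0, 3.0, 6.0),
    "prosc_R":       ( 4.0, 3.0, 6.0),
    "under_balcony": ( 0.0, 18.0, 3.0),
}

# Delay each loudspeaker so its sound arrives at the listener no earlier than
# the direct sound from the stage, preserving localisation to the performer.
direct = travel_time_ms(stage_reference, listener)
for name, pos in speakers.items():
    delay = max(direct - travel_time_ms(pos, listener), 0.0)
    print(f"{name}: delay {delay:.1f} ms")
```

Entering the loudspeaker coordinates into SARA II hands this bookkeeping to the engine, which is the time saving Harada highlights.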
Harada acknowledges the support received throughout the project from the Astro Spatial Audio team, led by Bjorn van Munster, who says that 2018 is “proving to be a very significant year for ASA in the theatrical environment, and the success of The Band’s Visit is a very proud moment for us all. We have only just begun to demonstrate the extraordinary potential of true object-based audio. There is much more to come.”
About Astro Spatial Audio
Astro Spatial Audio (ASA) combines SpatialSound Wave (SSW) technology, developed by the Fraunhofer Institute for Digital Media Technology IDMT, and licensed to ASA, with the intelligence and power of the SARA II Premium Rendering Engine to bring a sophisticated spatial sound platform to the sound engineer. The result is the world’s leading independent solution for scalable and easy-to-operate fully object-based immersive audio. Delivering new creative options on tour with major artists and in theatres worldwide, ASA can also be found in venues as varied as houses of worship, planetariums, theme parks, museums, nightclubs, cruise ships and more. For more information, visit the company online at astroaudio.eu