A freelance audio engineer (A1) for Dome Productions in Canada, John Hunter first discovered his passion for audio with live music. Eventually, his path led him to sports: in normal times he mixes every Toronto Raptors home game on TSN and Sportsnet and has travelled with the production team to work every playoff game, including the 2019 NBA Finals. He was the A1 for the famous 2018 NBA Raptors/Pelicans game in 4K HDR and Dolby Atmos surround sound, the first to be produced in these formats and distributed live to North American households.
- We know you’ve been mixing audio for the NBA Toronto Raptors for quite some time now, but that music was your first passion. How did you get into professional audio and to where you are now?
My path to professional audio evolved from a passion for performing live music. After completing audio training at O.I.A.R.T. (1999), I furthered my education at the University of Toronto (Hon. B.A. 2006), while playing guitar in a band signed to a record label in New York City. Recording an album and doing select tours in support of the album opened my eyes to different opportunities available in music and the live production industry. During this period, I was mixing live music at the Distillery Historic District in Toronto and I believe this really gave me the chops to make amplified signals sound good. I was also working as a freelance audio-visual technician in hotels and conference centres, which taught me a great deal about building temporary AV systems from the ground up. In fall 2006 I was hired as a freelance audio technician by Courtney Ross, Senior Audio Engineer at Toronto’s busiest sports venue (now known as Scotiabank Arena), where the Maple Leafs and Raptors play. From 2010 to 2019 I worked as a full-time A1 for Maple Leaf Sports & Entertainment (MLSE). I currently work as a freelance A1 for Dome Productions, continuing my work with the Raptors, along with other live sporting events and projects.
My love of performing translates directly to mixing because ultimately, I see my job as one that connects people through technology. If I can translate the excitement of a live sporting event to fans watching the broadcast – if they can feel that excitement through the audio – then I have done my job successfully. I believe that mixing is similar to playing an instrument; there is a rhythm and creativity involved that requires critical listening, intense focus and great attention to detail.
- In more detail, what does your current role as the Home Show A1 for the Raptors entail?
In November of 2013 I began mixing the Toronto Raptors broadcasts on TSN and Sportsnet and haven’t missed a home game since. I believe mixing is a subjective process that makes every A1’s style unique. My aim is to bring continuity and cohesion to the audio “brand” of the Raptors home broadcasts. With viewership and resources expanded during playoffs, I have been fortunate to mix every Raptors playoff game, home and away, since 2014.
My role is to ensure all external audio sources, commentator mics, field of play (FX) mics, crowd/ambience mics, as well as internal sources (EVS, Spotbox, Chyron, etc.), are tested and working properly prior to the broadcast. Our broadcasts are somewhat unique compared to most regional NBA shows because of the number of commentators; during the playoffs, our shows can have up to 12 different commentators in various positions, inside and outside the venue. (The “watch party” outside Scotiabank Arena is often referred to as “Jurassic Park”.)
I am responsible for all communications for talent and crew. We employ a point-to-point RTS intercom system that I have programmed in AZEdit. Over the years I have refined and improved comms files for each mobile facility that we have used.
Our broadcast produces news hits as well as a pre-game show for the networks we air on. Once the game begins, I am following the game action visually and listening intently to the director and producer’s cues so that I can deliver an audio experience that matches the compelling images on the screen.
- Can you describe a typical broadcast audio workflow that you use?
To me, a successful audio workflow starts with great paperwork, which is a road map for the A2s and the A1. I try to include all of the venue patching as well as internal truck patching (i.e. console I/O, IFB and comms ports). It’s crucial that the EIC (Engineer-in-Charge) and the A1 are on the same page; I always provide an audio routing list with offline recordings for EVS operators, main transmission paths (M1 & M2) and various clean feeds as required by the NBA. The vast majority of Raptors home games on TSN and Sportsnet are broadcast in 4K video which requires a secondary pair of main audio paths (M3 & M4) which are delayed (by approx. 50 ms) so that any packaged items recorded by EVS through the switcher have audio and video in sync. Select games require NBA clean feeds where I employ a third transmission path (M5 & M6).
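The arithmetic behind the delayed M3/M4 pair can be sketched in a few lines. This is an illustrative example, not any console's actual implementation; the 50 ms figure comes from the interview, and a 48 kHz broadcast sample rate plus the helper names are my assumptions.

```python
# Sketch: converting the ~50 ms video-switcher latency quoted above into a
# fixed audio delay, so packages recorded by EVS through the switcher keep
# audio and video in sync. Helper names are hypothetical.

SAMPLE_RATE_HZ = 48_000  # assumed broadcast audio sample rate
DELAY_MS = 50            # approximate switcher latency from the interview

def delay_in_samples(delay_ms: int, sample_rate_hz: int) -> int:
    """Convert a millisecond delay to a whole number of audio samples."""
    return round(delay_ms * sample_rate_hz / 1000)

def apply_delay(samples: list[float], n: int) -> list[float]:
    """Prepend n samples of silence, modelling a simple fixed delay line."""
    return [0.0] * n + samples

n = delay_in_samples(DELAY_MS, SAMPLE_RATE_HZ)  # 2400 samples at 48 kHz
delayed_m3 = apply_delay([0.5, -0.5], n)        # delayed copy of the M1 path
```

In practice this delay would be dialled in on the console or an outboard processor rather than computed in code, but the sample math is the same.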
We currently use analog audio (DT12s) to connect to the venue’s I/O patch room and I typically feed the visiting broadcast A1 individual FX mics over MADI (using a Calrec JM series device). Again, I make sure to provide paperwork for the visiting A1 so they know exactly what mics I will be sending them.
In terms of microphone placement, I use Sennheiser MKH 416 shotgun mics placed strategically around the court, with each mic picking up a different but important area of action on the court. My goal is to highlight the bounce of the ball whenever possible since the eyes naturally follow the ball handler. The “squeak” of the sneakers on the court is another key sound that gets picked up by the shotgun mics. I use Sennheiser MKE 2 lav microphones taped under the basket rims to pick up the “swish” sound of a 3-point shot or an impactful “dunk” – with compression settings critical to achieving the desired effect. I prefer to use 416s on all handheld cameras, basket crowd mics and high crowd mics (mounted to the game follow camera). I find using the same shotgun for the court/crowd sounds is really helpful for maintaining a uniform sound image. The Sennheiser 416 is not new technology but it really captures the feel of basketball, in my opinion. I use Sennheiser MD46 microphones for all stand-up commentator positions because they are designed to reject background sound in a noisy environment. I use a combination of Sennheiser HMD-25s and HMD-26s for headsets, depending on the preference of the announcer.
- We know you were an A1 for the 2018 NBA Raptors/Pelicans 4K broadcast that was one of the first to be produced in the Dolby Atmos format for households. How did you feel about setting that up? What were the inherent challenges of working in the Dolby format, and what advantages did it bring?
When I started as an A1, surround sound (5.1) processing had moved downstream and was up-mixed at each network’s facility – not generated innately from the mobile’s audio console. I had training and knowledge in 5.1 but no practical broadcast experience. Essentially, in one broadcast, I made the jump from a stereo mix to Dolby Atmos (5.1.4). It was a steep learning curve that involved a significant amount of setup and routing, but I had great support from Mike Babbitt from Dolby and two A1s who had prior experience with Dolby formats – Andrew Roundy and John Rootes.
One of the main challenges was preserving the quality of the stereo mix for Sportsnet and TSN, while delivering a distinct Atmos mix for DirecTV. The first Atmos broadcast was on Dome Productions’ “Vista” mobile, which is equipped with a Calrec Apollo and Waves software integration. Waves really helped with up-mixing stereo sources to 5.1 and mono camera mics to a stereo image. The Apollo allows for efficient control of the Atmos mix for several reasons, not least ergonomically because of its dual-layer faders. I could quickly access the crowd mics and other sources feeding the overhead speakers on the top layer of faders, with the bottom layer assigned to my main commentator mics, FX mics and playback sources.
One of the most interesting elements was routing the arena’s direct P.A. feed into the overhead speakers. The direct sound perceptibly draws the listener’s attention away from the P.A. bleed and reflections that can muddy up the mix in the L-C-R speakers.
I had the chance to listen to my Raptors 5.1.4 mix in a Dolby Atmos theatre at Microsoft headquarters in Redmond, Washington – the experience was truly immersive!
- When did you first start working with Calrec products?
After months of training on mobiles with my colleague John Rootes, my first solo broadcast as an A1 was in December 2011 on Dome Productions’ “Thunder” for a Toronto Maple Leafs broadcast. This truck is equipped with a Calrec Sigma (with Bluefin). The first year as an A1 was definitely challenging and filled with new experiences, but as I continued to hone my skills, every mix became more detailed and richer sounding. The reliability and inherent familiarity of Calrec consoles gave me the confidence in my first years as an A1 to navigate the unfamiliar and unique terrain of live sports broadcasting.
- What Calrec consoles have you used over the years and for what projects?
I have operated many Calrec consoles over the past ten years: S2, Sigma, Omega, Alpha, Artemis, Apollo and Brio. My main projects have been for the following sports and leagues: basketball (NBA, NBA G-League, NBA Summer League), hockey (NHL and AHL), soccer (MLS and USL) and lacrosse (NLL).
- What’s a recent example of a project where you used Calrec technology?
One of my main projects outside of the Raptors broadcasts is for the Toronto Maple Leafs minor league team – the Marlies. These broadcasts are produced from Dome Productions’ Suite 1 studio using a Calrec Brio and RVON for communications. During my time as an MLSE A1, I helped implement an audio workflow that would not over-tax the single A2 on-site – their hands are full with setting up the commentary booth, ice rink FX mics, comms and sideline reporter mics/IFBs. We rely heavily on the EIC at the venue to embed individual audio channels over multiple video services to be mixed back at the remote studio. The EIC also routes the audio returns for IFBs, PGM and I.S. feeds.
One of the biggest challenges of a REMI is latency caused by sending signals to the commentators’ headsets from a remote studio. This was solved by routing dry IFB outputs into the Brio with discrete mix-minuses assigned to transmit audio return paths. These mix-minuses are then combined with a split of each commentator’s mic locally using a DirectOut Andiamo MC-1 at the venue.
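The mix-minus arrangement above can be sketched very simply: each commentator's return carries the full mix minus their own mic, so their round-trip-delayed voice never comes back from the studio, and a latency-free local split of their mic is summed back in at the venue. This is a toy illustration with hypothetical source names and per-sample linear levels, not the Brio's routing model.

```python
# Minimal mix-minus sketch for the REMI workflow described above.
# Source names and levels are hypothetical; values are one linear
# audio sample per source.

def mix(sources: dict[str, float], exclude: set[str] = frozenset()) -> float:
    """Sum one sample from each source, skipping any excluded names."""
    return sum(v for name, v in sources.items() if name not in exclude)

sources = {"comm_1": 0.8, "comm_2": 0.6, "fx": 0.3, "playback": 0.2}

# Dry IFB for commentator 1: everything except their own mic,
# so their delayed voice is never returned from the remote studio.
mix_minus_1 = mix(sources, exclude={"comm_1"})

# At the venue, a zero-latency local split of comm_1's mic is summed
# back in (in practice by a device such as the DirectOut Andiamo MC-1).
ifb_1 = mix_minus_1 + sources["comm_1"]
```

The key property is that the commentator hears everyone else via the studio path but hears themselves only through the local split, which has effectively no latency.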
The Brio is an ideal console for this size of production. The layout of the console makes it accessible for guest A1s who are hired to mix the show when I am assigned to other duties. My goal was to create a logical workflow with clear console labelling so that any A1 can load my file without show-specific training or having to build a file from scratch.
- What are the technological advantages of the Calrec console(s) that you’re using?
All Calrec consoles seem to be designed with the broadcast A1 in mind, but the newer designs (Artemis, Apollo, Summa, Brio) are especially user friendly. In many cases, A1s are under a time crunch to build a show from a default console file. Calrec consoles allow the user to efficiently create and re-arrange new channels (Fader Layout), route sources/destinations clearly, and copy and paste settings quickly. The EQ and dynamics processing have been fine-tuned on the newer consoles and the overall sound is quite transparent – clean, clear and crisp without clipping. The Automixer feature is unbelievably handy; I use it on all commentator mics, but it is especially crucial when I have a panel of four commentators using stick mics on a basketball court in a loud arena.
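For readers unfamiliar with automixing, the idea behind a gain-sharing automixer of the kind described above can be sketched in a few lines. This is a generic textbook gain-sharing model, not Calrec's implementation: each mic's gain is weighted by its share of the total input energy, so the active talker dominates while the summed gain stays roughly constant and bleed on idle mics is ducked.

```python
# Toy gain-sharing automixer sketch (generic model, not Calrec's algorithm):
# per-mic gains are proportional to each mic's share of the total energy.

def gain_share(levels: list[float]) -> list[float]:
    """Return per-mic linear gains proportional to each mic's energy share."""
    energies = [lvl * lvl for lvl in levels]
    total = sum(energies)
    if total == 0.0:
        return [0.0] * len(levels)  # all mics silent: nothing to share
    return [e / total for e in energies]

# Four stick mics on a noisy court: one active talker, three picking up bleed.
gains = gain_share([1.0, 0.1, 0.1, 0.1])
# The active mic receives nearly all of the shared gain, while the total
# gain across all mics stays constant, keeping ambience level steady.
```

Real automixers smooth these gains over time to avoid pumping, but the energy-sharing principle is the reason four open stick mics in a loud arena do not quadruple the crowd noise.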
- What have been the key technological milestones you’ve witnessed in your time in broadcast audio and how have they changed what you do/how you work?
In my opinion, Calrec’s introduction of the Artemis and Apollo consoles was a significant milestone in broadcast audio, from an operator’s perspective. The level of control and ease of building a new file on the Artemis/Apollo consoles really allows an A1 more time to focus on crafting the sound.
The shift to Remote Integration (REMI) broadcasts has also been significant. I have been on the ground floor of building a smaller but cost-effective REMI production for the Toronto Marlies broadcasts as mentioned above.
Dolby Atmos is another important milestone in the evolution of broadcast audio. As demand for Atmos increases and infrastructure to deliver this content expands, I believe there will be further experimentation in this immersive format.
- Can you talk us through the project you’ve most enjoyed working on and/or that really stands out in your career?
The 2019 NBA Finals is by far the most exciting and memorable event that I have been involved in. The Raptors are the only Canadian NBA team, so TSN and Sportsnet had the rights to produce and broadcast all playoff games in Canada. It was thrilling to find out that my audio mix was being heard across the nation in massive numbers – approximately 20.5 million viewers, or 56% of the Canadian population, watched all or part of the 2019 NBA Finals. With “watch parties” and “Jurassic Parks” organized in cities across Canada, I felt a great duty to make these broadcasts sound as full and exciting as possible.
Throughout the playoffs, I worked closely with the audio departments from TNT and ESPN and learned a great deal from these industry veterans. Since I was mixing my own FX and not taking a composite I.S. feed, I needed individual sends of court and crowd mics, as well as press conference feeds. The A1s from these American networks were very accommodating and helped to ensure that I had everything I needed to produce a great mix for the Canadian broadcast.
Being in San Francisco with the team after they won the NBA Finals was a perk of the job I will never forget. I was also one of the lucky staff members to receive a personalized NBA Championship ring from the organization.
- What are your thoughts about AoIP and where we’re at and what it means for the future?
I am excited about the opportunity to gain more experience with AoIP and to use it for projects in the near future. I understand eSports is a massive industry that is growing exponentially and I hope to become more involved in these projects. I see the burgeoning of eSports broadcasts/streaming as a great platform for the continued development and implementation of AoIP products.
In terms of traditional broadcast models, I see AoIP being introduced as facilities upgrade older infrastructure. This won’t happen overnight of course, so there will likely be hybrid systems in place. As an operator, I welcome any technology that helps make our jobs more efficient so we can concentrate our efforts on the craft of mixing. As with many industries, certain jobs will be lost to automation, but I believe others will be created in response to different demands. Training will be a key component to help people adapt to these new climates.
- How do you see audio evolving in the next five years?
As we know, the industry is migrating toward IP-based solutions for audio control, networking and distribution. With many live sports on hold in North America as a result of the COVID-19 pandemic, I am using this time to research the many changes taking place in our industry. With industry alliances like AIMS and technical standards like AES67, I would say we are in good hands to achieve “interoperability” between existing infrastructure and newer IP-based technology.
My hope is that the growth of remote productions and distributed audio will allow content producers to broadcast/stream more events, since this technology has the potential to make live event coverage more cost-effective. The global pandemic we are experiencing is pushing the development of remote production even further in response to an immediate need. However, once it is safe for live sports to resume, I believe there will always be events that require mobile facilities. Positioning mobile production units in close proximity to the venue has a lot of advantages, including intangible ones that can affect the feel and rhythm of a broadcast.
I consider myself lucky to be working alongside the knowledgeable engineering staff at Dome Productions who are always seeking to innovate and experiment with new technology in hopes of delivering excitement to fans, wherever they may be.