A High-flying Audiovisual Backbone, Part 2

The Red Bull Air Race is one of the most exciting TV events ever produced, with live video from the planes, giant viewing screens for thousands of fans, and a huge job of RF transmission and audio/video networking on MediorNet.


Oct 26, 2010 12:00 PM,
With Bennett Liles

Listen to the Podcasts: Part 1 | Part 2

Editor’s note: For your convenience, this transcription of the podcast includes Timestamps. If you are listening to the podcast and reading its accompanying transcription, you can use the Timestamps to jump to any part of the audio podcast by simply dragging the slider on the podcast to the time indicated in the transcription.

The Red Bull Air Race is one of the most exciting TV events ever produced, with live video from the planes, giant viewing screens for thousands of fans, and a huge job of RF transmission and audio/video networking on MediorNet. Riedel Communications made it all happen, and Thomas Riedel is here to explain MediorNet and how they used it to bring you the race.

 
Related Links

A High-flying Audiovisual Backbone, Part 1

Roaring planes, a huge crowd, and excitement building faster than g-forces in a tight turn: the Red Bull Air Race with live video and sound right from the cockpits…

OK, Thomas, thanks for being back with me for part 2 on the Red Bull Air Race and how Riedel’s MediorNet was used for all the communications. For anyone who wasn’t up on that, Riedel also bought the company that did RockNet, but MediorNet is the backbone of the whole operation at the Red Bull Air Race. So tell me a little bit about MediorNet. How does it all work?
Thomas Riedel:
Well, just think about nodes within a fiber network which can be connected in whatever topology you like. It might be a ring, a star, a daisy-chain, or just point-to-point, or any combination, and these decentralized nodes within that fiber network can be seen as one decentralized router. So basically, we combine two disciplines here, which are routing and fiber transport. But one more discipline comes on top, which is signal processing—which, in the broadcast world, is called the glue features—and that's basically all in one system, which really is a challenge for the technology, but also for people. It's a brand-new idea; that's why people are not used to it yet. [Timestamp: 1:49]
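
To make the decentralized-router idea more concrete, here is a minimal, purely illustrative Python sketch (the node names and links are hypothetical, and this is not Riedel's software): any input node can reach any output node as long as some chain of fiber links connects them, whatever the topology happens to be.

    # Illustrative only: decentralized nodes in an arbitrary fiber topology
    # (ring, star, daisy-chain, point-to-point) acting like one distributed
    # router. Node names and links are invented for the example.
    from collections import deque

    links = {
        "tower":        ["pontoon1", "media_center"],
        "pontoon1":     ["tower", "pontoon2"],
        "pontoon2":     ["pontoon1", "media_center"],
        "media_center": ["tower", "pontoon2"],
    }

    def find_path(src, dst):
        """Breadth-first search: a route exists between any two nodes
        as long as some chain of fiber links connects them."""
        queue, seen = deque([[src]]), {src}
        while queue:
            path = queue.popleft()
            if path[-1] == dst:
                return path
            for nxt in links.get(path[-1], []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])
        return None

    print(find_path("pontoon1", "media_center"))
    # ['pontoon1', 'tower', 'media_center'] (two hops, via whichever link BFS finds first)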

And this is a whole video, sound, and communications network. How is this different, say, from 802.3 Ethernet?
Well, the main difference is that we're talking about a realtime network, so it's not IP; it really runs in realtime. Talking about an event, the requirements for audio/video and data signals are realtime requirements, so you don't want a delay of, let's say, 20 or 40 milliseconds, but you might accept some microseconds of delay, which an IP system could never deliver. At the same time, you want the advantages of the topology you might have with IP switches, and that's exactly what MediorNet does. It provides you with all the advantages and flexibility you find in modern IT infrastructures, but at the same time, it has realtime capabilities, and that's exactly the difference between these kinds of systems. [Timestamp: 2:48]
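
As a rough back-of-the-envelope illustration of why microseconds rather than milliseconds matter (the figures below are generic assumptions, not measured MediorNet numbers): light in fiber travels at roughly 5 microseconds per kilometer, so a circuit-style realtime network adds only a small fixed per-node delay on top of that, while packet switches typically add buffering measured in milliseconds.

    # Rough, generic numbers for illustration only (not measured MediorNet figures).
    FIBER_DELAY_US_PER_KM = 5.0            # light in glass: ~5 microseconds per km

    def synchronous_latency_us(km, hops, per_hop_us=2.0):
        """A circuit-style realtime network adds only a small, fixed
        per-node processing delay on top of fiber propagation."""
        return km * FIBER_DELAY_US_PER_KM + hops * per_hop_us

    def packet_latency_us(km, hops, per_hop_buffer_ms=5.0):
        """A packet network adds queuing/buffering at every switch,
        typically milliseconds rather than microseconds per hop."""
        return km * FIBER_DELAY_US_PER_KM + hops * per_hop_buffer_ms * 1000

    print(synchronous_latency_us(km=3, hops=4))   # ~23 microseconds
    print(packet_latency_us(km=3, hops=4))        # ~20,015 microseconds, i.e. ~20 ms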

And one of the features I’ve heard a lot about is MediorNet’s ability to mix and cross-switch multiple media formats, and it handles all this in software. What video formats were you using for the air race coverage?
MediorNet accepts all kinds of digital video formats: HD-SDI, SD-SDI, whether it's 1080i or 720p, or whatever format; we can take it in at the inputs. And at the outputs, we just tell each output what kind of format it should deliver. So basically, that's exactly what I meant by glue features and signal processing: all conversion—up-, down-, and cross-conversion—is done inside the engine. Talking about the Red Bull Air Race, the production format was 1080i50, while in some areas we still needed SD signals, and some other HD formats as well. But again, to our system, it doesn't matter. You just feed in what you like, and you get out what you like as well. [Timestamp: 3:45]
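
A small illustrative sketch of the "tell each output what format to deliver" idea, with made-up feed and port names and a stand-in conversion step (this is not MediorNet's actual configuration interface):

    # Illustrative only: per-output target formats; conversion happens inside
    # the node. Feed names, port names, and formats are invented.
    input_feeds = {
        "plane_cam_3": "1080i50",      # the production format at the air race
        "archive_vtr": "576i50",       # a legacy SD source
    }

    output_targets = {
        "broadcast_out_1": "1080i50",
        "big_screen_feed": "720p50",
        "sd_monitor_wall": "576i50",
    }

    def convert(signal_format, target_format):
        """Stand-in for the up-/down-/cross-conversion done in the engine."""
        if signal_format == target_format:
            return signal_format
        return f"{signal_format} -> {target_format}"

    for port, target in output_targets.items():
        print(port, ":", convert(input_feeds["plane_cam_3"], target))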

I was wondering about that, because the aircraft are sending live video all the time, and they’re twisting and turning all over the place. How did you manage to avoid dropouts in the video?
Well, that's a different topic, since it's a little jump from MediorNet, but it's basically connected here as well. We're talking about COFDM technology being used for the transmission from the planes to the ground, with several antennas on the planes to really make sure that whatever situation you have in the air, you always get a proper signal down to the receivers on the ground. Then these receivers are connected to MediorNet, wherever they are around the event location, and all the signals are shipped over fiber optic in our network to the broadcast station, which is basically the central location where all the broadcast signals come together and where the feeds are produced. [Timestamp: 4:40]
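
For readers unfamiliar with receive diversity, here is a tiny illustrative sketch of the basic idea: with several antennas and receive sites, you simply use whichever copy of the downlink is cleanest at any given moment. The site names and quality figures are invented.

    # Illustrative selection-diversity sketch; sites and numbers are made up.
    def pick_best(receiver_reports):
        """Choose the receive site currently reporting the best signal quality."""
        site, quality = max(receiver_reports.items(), key=lambda kv: kv[1])
        return site, quality

    reports = {"tower_rx": 0.62, "pontoon_rx": 0.91, "shore_rx": 0.40}
    print(pick_best(reports))   # ('pontoon_rx', 0.91)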

You mentioned that the signal cross-conversion happens in the software. There may be some who might say that as versatile as this type of system is, maybe it’s not durable or rugged enough for doing live remotes where the gear gets banged around and it’s outside in the elements. How does it perform?
Well, on one hand, you always think about whether you want to put all your eggs in one basket or basically have several systems working together. Talking about this project, we can only say that we have never had any issue with that system, but basically you should ask our clients about that since we are the manufacturer, and we had the system in use at the World Cup in South Africa as well. The system is used in several fixed installations in the meantime—in Australia, in China, in Europe, as well as in the U.S.—and people feel that, well, putting all your eggs in one basket certainly sounds like a higher risk, but having many different systems chained together also gives you many more possibilities for failure. So the bottom line is, we feel that this concept is more rugged than what we have had in the past. [Timestamp: 5:56]



You also have a lot of control signal traffic going over the routing, and that’s a huge job in itself. Is there one central control point, or was it a distributed control system so that configuration could be done from different locations?
It can be distributed, and of course you can set user rights to make sure that any controller can only control what he is allowed to. In the case of the Red Bull Air Race, we had one person in charge of the signal distribution, and that person really managed it all. In this case, that was the best choice, but again, the system could do it either way. You can also have several people working on it at the same time, splitting it into partial areas where certain people manage only their own area, with user rights and so on. [Timestamp: 6:45]
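
A minimal sketch of the user-rights idea described here, with hypothetical operators and areas (this is not the actual MediorNet control software): an operator can only change crosspoints inside the area assigned to them.

    # Illustrative only: operators, areas, and signal names are invented.
    user_rights = {
        "signal_manager": {"tower", "pontoons", "media_center"},   # full access
        "pa_operator":    {"pontoons"},                            # PA area only
    }

    def set_crosspoint(user, area, source, destination):
        """Refuse any routing change outside the operator's assigned area."""
        if area not in user_rights.get(user, set()):
            raise PermissionError(f"{user} may not route signals in {area}")
        print(f"{user}: {source} -> {destination} ({area})")

    set_crosspoint("signal_manager", "tower", "plane_cam_3", "broadcast_out_1")
    set_crosspoint("pa_operator", "pontoons", "announcer_mix", "pa_zone_2")
    # set_crosspoint("pa_operator", "tower", ...) would raise PermissionError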

And that requires a lot of setup and planning ahead of time too. I was thinking about the huge crowd out there watching the show; they have to know what’s going on and not just be standing there watching a bunch of planes flying around. Where was the main PA system control point, and what mixer did you use for that?
Well, the main control point for audio was also in the tower—and the tower, as I already said, is basically a building that looks like an airport tower and really acts like one—and one floor was dedicated to audio signals, audio distribution, and audio control. There we had an audio engineer working with the DiGiCo SD8 console, which was connected to MediorNet as well as our RockNet infrastructure, really managing all the audio signals in and out. [Timestamp: 7:34]

Usually for a PA mixing operator, you want to be out in the sound environment where the crowd is so you can hear what you’re doing.
Well, we had one or two people walking around. But basically it's about having a lot of experience with the team and with the system in place—where you can listen to all the signals at any time, since that's again part of the features of our infrastructure—so you don't really need to be there. At the same time, for some things you do need to be there and really listen with your own ears to what's going on, and that's why we had a group of people with walkie-talkies walking around so they could listen to the sound and the sound guy could make adjustments based on what they reported. [Timestamp: 8:18]

 

What sources did they feed to the main PA system? Obviously the crowd’s hearing the announcers. What else was being fed to the main PA?
Not just the announcers; there was music, certainly, and since there were big screens in all areas, there was also the sound that goes with the big screens, and basically all the kinds of audio signals you would expect from a large PA system you would hear there as well. [Timestamp: 8:42]

And the production communication is a big job; so many things have to be coordinated on so many levels. What did you use for intercom, and how many com stations did you have to manage there?
Well, intercom is basically an expertise we have had for 15-plus years, since Riedel really started as a communications company before we moved into fiber-optic networks. Our product is called the Artist intercom, which is also nodes within a fiber network. Yeah, I know it sounds familiar, and there's a reason why we do these things, so it's also a realtime platform. We had six nodes of the Artist intercom system in several areas of the event and about 130 keypanels—all our 1000-series keypanels—connected to it, so it was really a huge system. [Timestamp: 9:32]

Right, and those Artist 1000 stations have several features that I’m sure came in very handy for this, and they’ve got some more sophisticated labeling and some sound-level features.
Well, most people know intercom systems which only have four-character labels; our labels have eight characters on an LED display, which also works in bright sunshine. You also have a listen control—a volume control, basically—next to each key, so these are key features which are really very different from the com systems other people might know, especially in North America. [Timestamp: 10:11]
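
As an illustration of what such a keypanel configuration might look like as a simple data structure (purely hypothetical; this is not Riedel's Artist configuration format), each key carries an eight-character label and its own listen level:

    # Hypothetical illustration only, not the Artist configuration format.
    from dataclasses import dataclass

    @dataclass
    class PanelKey:
        label: str                      # up to 8 characters on the LED display
        target: str                     # who this key talks/listens to
        listen_level_db: float = 0.0    # per-key volume control next to the key

        def __post_init__(self):
            if len(self.label) > 8:
                raise ValueError("label longer than 8 characters")

    keys = [
        PanelKey("RACE DIR", "race_director"),
        PanelKey("PA MIX", "pa_operator", listen_level_db=-6.0),
    ]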

A complex enough setup just on the hardwired side, but of course you also had wireless links, telephone interfaces, and walkie-talkies. How was all that done?
Well, we used a digital trunked radio network, and that was also connected to the com system intelligently—which means you could have point-to-point or group calls, individual calls, between each keypanel and a certain number of radios and vice versa, which is very different from just having a radio's four-wire connected to the com system. No, this was a really intelligent installation with about 400 radios in that network, talking via our RiFace interface to the Artist intercom. [Timestamp: 10:56]
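
A small sketch of what that "intelligent" integration means in practice compared with a fixed four-wire: a key press can resolve either to one specific radio or to a whole talk group. The IDs and group names below are invented for the example.

    # Illustrative only: radio IDs and talk-group names are made up.
    radio_groups = {
        "marshals": ["radio_101", "radio_102", "radio_103"],
        "medics":   ["radio_201", "radio_202"],
    }

    def place_call(source_panel, target):
        """Resolve a panel key press to an individual radio or a group call."""
        if target in radio_groups:                        # group call
            return {"from": source_panel, "to": radio_groups[target]}
        return {"from": source_panel, "to": [target]}     # individual call

    print(place_call("tower_panel_4", "marshals"))
    print(place_call("tower_panel_4", "radio_201"))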

And a huge amount of preplanning had to go into that. Several times during the coverage I would catch a glimpse of the big screens out there that the crowd was watching. What was used to transmit the video signal to those? That was wireless, right?
Yeah, that was a wireless video link; well figured out. Basically that's a product which our rental department has developed, and we only offer it as a rental service; it's called the Best Boys. Basically it's little waterproof boxes which you just put next to the videowalls, and they provide you with digital video signals, and that's basically the way we distribute these signals over a really wide, wide area. [Timestamp: 11:37]

And right at the beginning of part one, you mentioned that you also provided data services. What else did Riedel provide for the event in terms of Internet and servers, things we didn’t see in the TV coverage but that are obviously essential for the whole thing to work?
Yeah, the whole backbone behind the scenes—I would call it the nervous system of the event, basically—we were responsible for, and this includes not just all the broadcast and event-related technologies, but also all the IT stuff. So basically in the media center, all the photographers and journalists needed to connect to the Internet. We had connections to servers overseas, and all this complex IT network certainly needs to be protected as well; that's why we put in all the firewalls and such. So it was a huge network on the IT side, on top of what we did on the broadcast, entertainment, communications, and signal-distribution side. [Timestamp: 12:36]

A huge event, fantastic coverage; I was just knocked over by the show, and this was a tremendous challenge handling sound, video, and communications. I congratulate you, Thomas, on a well-deserved innovation award from IBC. It’s been great having you here on the SVC podcast. Thomas Riedel from Riedel Communications and the Red Bull Air Race, thanks so much for being here.
Yeah, thank you very much for your time.
