
4K Networked Video Image Quality, 1-Gig, and the Latency Discussion

Don’t miss a social beat: follow #MyInfoComm2018 and make sure you visit Crestron in the Central Hall of the Las Vegas Convention Center at booth C2562.

Last year, justifying 4K video product purchases was a stretch.

According to interviews we’ve done with end users this year, when it comes time to purchase projectors or flat-panel displays, many are opting for 4K as the price point comes down and the higher resolution provides future-proofing. Depending on the application, 4K camcorders and PTZ cameras are making their way into more corporate, medical, and higher-education classroom environments. Finally, the “no 4K content” argument is disappearing.

In addition, mission-critical video content viewed on 4K displays in applications such as simulation training, medical procedure streams, and command-and-control centers demands the highest quality and the lowest latency possible.

“If you think about these larger spaces where you have more complex AV, those are typically the customers that have higher requirements, because they not only have more sources, they have a higher demand for media in general,” says Daniel Jackson, Director of Enterprise Technology at Crestron. These customers are already investing in 4K because no one builds a room with a two-year lifecycle in mind. “You’re trying to build rooms for a minimum of five years, usually 7 to 10 years or longer, and 4K is expected.”

Customer demand for 4K has increased steadily over the past six to 12 months. Alex Peras, Manager of DigitalMedia, cites as key drivers the release of the Apple TV 4K and the arrival of the MacBook Pro and other 4K-capable laptops. “4K has become ubiquitous,” says Peras. “If you actually want to make full use of the computer that you carry around, you want to support 4K in all your spaces.”

The jury is no longer out: whether in large or smaller spaces, a 4K ecosystem is a must. The important question is how to put that 4K ecosystem on the network.

91 Times to the Moon and Back

Give or take a few trips to the moon and back, there are more than 70 billion meters of Cat5e and Cat6 cable deployed throughout the world. “That’s an absolutely staggering amount of installed cable in buildings that supports 1-Gig and, soon, 2.5-Gig and 5-Gig Ethernet,” Jackson says. Unless an installation is part of a major renovation or a new build, stepping up to Cat6a or Cat7 to support a 10-Gig infrastructure to every endpoint is a rip-and-replace, and it’s not necessary, especially when all of that bandwidth is taken up by a single application: video.
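The headline math holds up. A quick sanity check (a sketch, assuming the commonly cited average Earth-to-Moon distance of roughly 384,400 km; the cable figure is the article’s own estimate):

```python
# Back-of-the-envelope check on the headline figure. The 70-billion-meter
# cable estimate comes from the article; the lunar distance is the commonly
# cited average, so treat the result as a rough order-of-magnitude check.
INSTALLED_CABLE_M = 70_000_000_000    # >70 billion meters of Cat5e/Cat6 worldwide
EARTH_TO_MOON_M = 384_400_000         # average Earth-to-Moon distance (~384,400 km)

round_trips = INSTALLED_CABLE_M / (2 * EARTH_TO_MOON_M)
print(f"Roughly {round_trips:.0f} round trips to the moon")  # ~91
```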

New advanced video compression technologies enable the transport of 4K signals over standard 1-Gig Ethernet. The question has to be asked: why wouldn’t you use the existing 1-Gig infrastructure? If you don’t ask it, your CFO surely will.

There’s more than one way to transmit 4K60 4:4:4 HDR signals over a network. No matter how you slice it, the 4K signal that comes across HDMI or DisplayPort connections is a whopping 18 gigabits per second. “You either say I’m going to compress it a little bit and send the 4K signal around on a 10-Gig network,” notes Jackson. “Or, you say I’m going to compress it a little bit more and send it over standard 1-Gig Ethernet.” It all comes down to what the picture looks like on the other side.
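To see why those are the two natural options, it helps to run the numbers. The sketch below assumes 8 bits per color channel for the payload math (the 18 Gbps figure quoted above is HDMI 2.0’s total link rate, which also carries blanking and encoding overhead) and leaves roughly 10 percent of each link for protocol overhead; both assumptions are illustrative, not vendor specs:

```python
# Rough bandwidth arithmetic behind "compress a little" vs. "compress a bit more."
width, height, fps = 3840, 2160, 60
bits_per_pixel = 3 * 8                     # 4:4:4 sampling at 8 bits per channel

raw_bps = width * height * fps * bits_per_pixel
print(f"Active 4K60 4:4:4 payload: {raw_bps / 1e9:.1f} Gbps")   # ~11.9 Gbps

for link_gbps in (10, 1):
    usable_bps = link_gbps * 1e9 * 0.9     # leave ~10% headroom for protocol overhead
    print(f"{link_gbps}-Gig link: ~{raw_bps / usable_bps:.1f}:1 compression needed")

# Prints roughly 1.3:1 for 10-Gig and 13:1 for 1-Gig. With 10-bit HDR color,
# or measured against the full 18 Gbps HDMI link rate, the 1-Gig figure lands
# closer to the 20:1 ratio cited later in this article.
```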

Fuzzy Numbers

There’s no shortage of controversial topics in the AV industry. It’s widely accepted that when reading a video display spec sheet, the stated contrast ratio should be taken with a grain of salt and a pinch of skepticism. To a certain extent, these numbers can be gamed, depending on the ambient light when the contrast ratio is measured, as well as other factors.

Some contend that spec sheets mask a multitude of fuzzy math, especially when interpreting networked video quality. Comparing two products from different companies with the same specs, such as 20:1 video compression running on 1-Gig Ethernet, doesn’t present the full picture. “There’s no spec you can read for the quality of compression,” says Peras. “You’ll get vastly different performance results depending on the type of process used to achieve the compression.”

Jackson agrees, “You can’t just read it off of a spec sheet anymore because we’re in a world of compression. It’s all about what the video quality looks like on real-world content.” The best way to determine if a product meets your quality standards is to ask vendors for a demo in your facility using content familiar to you.
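If you do want numbers before the demo, objective metrics computed on your own content are a reasonable first pass, even though they don’t capture everything the eye sees. Here’s a minimal sketch using PSNR (peak signal-to-noise ratio), one common, if crude, fidelity metric; the load_frame calls are hypothetical placeholders for however you capture frames before and after a vendor’s encode/decode chain:

```python
import numpy as np

def psnr(reference: np.ndarray, decoded: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two 8-bit frames (higher is better)."""
    mse = np.mean((reference.astype(np.float64) - decoded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")               # bit-identical frames
    return 10 * np.log10(255.0 ** 2 / mse)

# Hypothetical usage: grab the same frame of your own real-world content before
# and after a round trip through each vendor's encoder/decoder, then compare.
# ref = load_frame("source_frame.png")        # load_frame is a placeholder
# out = load_frame("vendor_a_output.png")
# print(f"Vendor A: {psnr(ref, out):.1f} dB")
```

A single PSNR number won’t settle the argument on its own, since a codec can score well on one clip and fall apart on another, which is exactly why the same-content, in-facility demo remains the real test.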

The Latency Discussion

In 2014, a team of neuroscientists from MIT found that the human brain can process entire images that the eye sees for as little as 13 milliseconds. The study appeared in the journal Attention, Perception, & Psychophysics. “This ability to identify images seen so briefly may help the brain as it decides where to focus the eyes, which dart from point to point in brief movements called fixations, about three times per second,” says Mary Potter, an MIT professor of brain and cognitive sciences and senior author of the study. “Deciding where to move the eyes can take 100 to 140 milliseconds, so very high-speed understanding must occur before that.”

Anything below 13 milliseconds is considered imperceptible by mere mortals, and numerous terms are used to describe what is nearly imperceptible in the sub-50-millisecond latency range. Some refer to this as “near-real-time,” while others say “zero latency.” Crestron uses “no additional latency.”

Latency is typically introduced by scaling and compression, so any networked video system that compresses the signal inherently adds some latency. “If a manufacturer is not optimizing for it, you could have a ton of latency,” warns Peras. “There are solutions out there that can add anywhere from 50 to 150 milliseconds of latency.”

What counts as acceptable latency depends on the application. If video is being streamed to an overflow room, latency hardly matters at all; a couple of seconds of delay might be acceptable. “However, imagine trying to move a mouse around, and it’s dragging behind you ever so slowly on the screen,” says Jackson. “It becomes really frustrating.” For an interactive application like that, acceptable latency is under 50 milliseconds.
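To put those millisecond figures in context, it helps to translate them into frames of video. A quick sketch, assuming 60 fps content where each frame lasts about 16.7 ms:

```python
# Translate the latency figures cited above into frame counts at 60 fps.
FRAME_MS = 1000 / 60                      # ~16.7 ms per frame

for latency_ms in (13, 50, 150):
    print(f"{latency_ms:>3} ms ≈ {latency_ms / FRAME_MS:.1f} frames at 60 fps")

# 13 ms is under one frame (effectively imperceptible); 50 ms is about three
# frames (the interactive threshold above); 150 ms is about nine frames, which
# is where a dragging mouse cursor becomes obvious.
```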

Make sure you visit Crestron during InfoComm at booth C2562 to discuss these topics and check out a live demo, so you can learn to judge video quality beyond the spec sheet.

During InfoComm, watch for the fourth of six installments of the InfoComm Networked AV Series, in which Jackson and Peras discuss “The Single Point of Failure?” With several pieces of AV hardware, various software applications, multiple points of user access, and more, a single point of failure might actually be several points of failure. We discuss how to mitigate failure.

InfoComm18 Networked AV Series:

One of Six: Networked AV: More Than a Disruptor

Two of Six: Not All Digital AV Needs to be on the Network
