Deep Color
Sep 17, 2013
By Tom Kopin, CTS
A sometimes-unnecessary feature for today’s AV ecosystem.
The HDMI standard, now more than 10 years old, has brought with it some challenging interfacing twists and turns, not the least of which is deep color. (Deep color is often lumped in with xvYCC, the "extended color" gamut that arrived alongside it in HDMI 1.3, but deep color proper refers to greater bit depth.)
Deep color came about because today's direct-view and projection displays can show many more shades of color than yesterday's CRT displays could. While computer-generated imagery was stuck at 8 bits per color (24 bits per pixel, or 16.7 million colors) for many years, deep color blows past that barrier, making it possible to reproduce 10, 12, and even 16 bits per color (30, 36, and 48 bits per pixel), resulting in images with billions of color shades and wider color gamuts.
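The arithmetic behind those color counts is easy to verify. A quick back-of-the-envelope sketch in Python, purely for illustration:

```python
# Shades per channel and total colors at each HDMI bit depth,
# comparing standard 8-bit color with the deep color modes.
for bits_per_color in (8, 10, 12, 16):
    shades = 2 ** bits_per_color        # levels per R/G/B channel
    colors = shades ** 3                # combined RGB combinations
    print(f"{bits_per_color:2d}-bit: {shades:>6,} shades/channel, "
          f"{colors:>22,} colors ({bits_per_color * 3} bits/pixel)")
```

At 8 bits the total works out to the familiar 16.7 million; at 10 bits it already tops a billion, and 12- and 16-bit modes run far beyond that.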
This is all well and good, except that many pieces of digital AV gear can't support the pixel clock rates required to transport these signals. And for many years, they didn't have to: The Blu-ray format doesn't carry anything larger than 8-bit color. As a result, AV systems that transitioned from analog to HDMI during the past decade never needed to pass much more than about 1.5Gbps of data per color channel, or about 4.5Gbps total, the rate of a 1080p/60 signal at 8 bits per color. That number jumps up dramatically in deep color modes, often to the point where it exceeds the maximum bandwidth of a twisted-pair (category cable) extender or even just a long HDMI cable.
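To see where those figures come from, consider 1080p/60: its pixel clock is 148.5MHz (2200×1125 total pixels at 60Hz), and HDMI's TMDS encoding sends 10 bits on each channel for every 8 bits of data. Deep color scales the clock by the ratio of the new bit depth to 8. A rough calculation, assuming the standard CEA-861 timing:

```python
# TMDS bandwidth for 1080p60 at each bit depth. Deep color raises the
# effective TMDS clock by bits_per_color / 8; each of the three data
# channels carries 10 TMDS bits per clock.
PIXEL_CLOCK_HZ = 2200 * 1125 * 60           # 148.5 MHz for 1080p60

for bits_per_color in (8, 10, 12, 16):
    tmds_clock = PIXEL_CLOCK_HZ * bits_per_color / 8
    per_channel = tmds_clock * 10           # 10 TMDS bits per character
    total = per_channel * 3                 # three TMDS data channels
    print(f"{bits_per_color:2d}-bit: TMDS clock {tmds_clock/1e6:6.1f} MHz, "
          f"{per_channel/1e9:.2f} Gbps/channel, {total/1e9:.2f} Gbps total")
```

At 8 bits per color this lands right at the article's 1.5Gbps-per-channel, 4.5Gbps-total figure; at 12 bits it climbs to roughly 6.7Gbps total, past the 165MHz-class limit that much legacy extension gear was built around.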
To make matters worse, Blu-ray players switch into these deep color modes automatically, based on the Extended Display Identification Data (EDID) stored in the connected display and read by the player. And now the latest generations of desktop and laptop video cards do the same, activating deep color whenever they detect support for it in a connected monitor, TV, or projector.
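Where does the player or video card find that capability? In the HDMI Vendor-Specific Data Block of the EDID's CEA-861 extension, where byte 6 carries the DC_30bit, DC_36bit, and DC_48bit flags. The sketch below shows roughly how a source could check them; it's simplified, assuming a well-formed 256-byte EDID whose second block is the CEA extension, and the function name is mine:

```python
# Read the deep color support bits from the HDMI Vendor-Specific Data
# Block (IEEE OUI 00-0C-03, stored LSB first) in an EDID's CEA-861
# extension. Simplified sketch: assumes a 256-byte EDID whose second
# 128-byte block is the CEA extension.
HDMI_OUI = (0x03, 0x0C, 0x00)

def deep_color_support(edid: bytes) -> dict:
    cea = edid[128:256]                  # CEA-861 extension block
    dtd_start = cea[2]                   # offset where detailed timings begin
    i = 4                                # data block collection starts at byte 4
    while i < dtd_start:
        tag, length = cea[i] >> 5, cea[i] & 0x1F
        if tag == 3 and tuple(cea[i+1:i+4]) == HDMI_OUI and length >= 6:
            flags = cea[i + 6]           # byte 6 of the HDMI VSDB
            return {"DC_30bit": bool(flags & 0x10),
                    "DC_36bit": bool(flags & 0x20),
                    "DC_48bit": bool(flags & 0x40)}
        i += 1 + length                  # skip to the next data block
    return {"DC_30bit": False, "DC_36bit": False, "DC_48bit": False}
```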
While deep color can usually be disabled in a Blu-ray player's menus, there's often no way to turn it off in a desktop or laptop video driver. And end-users may unwittingly create a deep color connectivity problem simply by downloading the latest drivers for their video cards, resulting in a lost picture and much hair-pulling.
There's no easy fix for this problem short of disabling deep color entirely. Placing an EDID storage dongle between the video card and the display may work, but which EDID will be copied and stored in that dongle? You can't capture it from just any monitor, and you don't want to set your video cards to run at lower resolutions just to bring the clock rate back down, not with more and more displays moving to 1920×1080 or wide UXGA (1920×1200) resolution.
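For what it's worth, a well-chosen stored EDID amounts to the display's own EDID with the deep color flags cleared, so the video card never sees the capability in the first place. A hypothetical sketch, reusing the block layout from the parser above (each 128-byte EDID block must checksum to zero mod 256):

```python
# Clear the deep color flags in a captured EDID so the stored copy
# advertises 8-bit color only -- in effect, what you'd want an EDID
# dongle to present to the video card. Illustrative only, not a real
# dongle's firmware; assumes the same layout as the parser above.
def strip_deep_color(edid: bytes) -> bytes:
    out = bytearray(edid)
    cea = out[128:256]                   # CEA-861 extension block
    dtd_start, i = cea[2], 4
    while i < dtd_start:
        tag, length = cea[i] >> 5, cea[i] & 0x1F
        if tag == 3 and tuple(cea[i+1:i+4]) == (0x03, 0x0C, 0x00) and length >= 6:
            out[128 + i + 6] &= 0x8F     # zero DC_48bit/DC_36bit/DC_30bit
            break
        i += 1 + length
    # Each 128-byte EDID block must sum to 0 mod 256; fix the checksum.
    out[255] = (-sum(out[128:255])) % 256
    return bytes(out)
```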
One thing is for certain: Computer and video card manufacturers need to understand that (a) not every HDMI connection is made over a 6ft. cable to a TV, and (b) there is a much larger AV ecosystem using their products than they may realize.
Better communication between computer and interface manufacturers would help, and so would user controls to switch off consumer-electronics features, such as deep color, that many customers just don't need or want. Thanks, but no thanks!
Tom Kopin, CTS, is an engineer for Kramer Electronics. He works closely with Kramer's factory on product development, product quality assurance, and product application support.