Jan 1, 2006 12:00 PM, By Jeff Sauer
Number inflation obscures what really matters about image perception.
I don't know about you, but I'm sure becoming jaded by display industry contrast ratio specifications. No, I'm becoming rather annoyed. It was disconcerting enough when the industry started measuring contrast differently, and we were hearing numbers like 2000:1 and 3000:1. Now some manufacturers are throwing out numbers like 10,000:1, 20,000:1, and even 200,000:1.
Frankly, I have no idea what those numbers mean. It sounds as if they would hurt, like going to the beach on a sunny day. Maybe it's my blue eyes (which are more sensitive to bright light than brown eyes), but if I walk out into the bright sun, I seriously have to squint, and for a long time. If I try to look into the dark depths of a beach bag to find my sunglasses and the sunscreen, well, I might as well rely on touch, because my eyes will be worthless in that kind of contrast.
Physiologically, our eyes have an instantaneous or spot contrast ratio of somewhere between only about 100:1 and 200:1. Going beyond that can lead to the kind of pain I'm talking about at the beach. So, what good is a 200,000:1 contrast ratio from a display?
There is a good reason, of course, why manufacturers focus on strong contrast ratios as a measure of quality. And there is more to contrast ratio than literally meets the eye at any given instant. But leave it to marketers to rip the subtlety out of any positive concept and hammer away until it is nearly meaningless. Alas, that's what is happening with contrast ratio.
THE RISE AND FALL OF CONTRAST
Once upon a time, ANSI set forth an industry-standard way to measure contrast ratio using a black and white checkerboard of 16 equal-size rectangles in a 4×4 grid that covered the entire area of a screen. Introduced in the days of CRT projectors, that ANSI checkerboard was a direct reaction to an industry in which ad hoc measuring and reporting of contrast ran amok.
Of course, CRT projectors could achieve a brighter white if all the light were concentrated into a small point in the middle of the screen rather than if the light were spread across the entire screen. (The same is true today of plasma monitors, and it makes it problematic to compare plasma screens with LCD panels, which are equally bright whether there is a small white circle on a black background or a fully white screen — assuming a properly diffused backlight.) Without any established industry practices at that time, measuring “white” could mean anything from measuring a fully white screen to measuring a white circle in the center of a black screen. And that white circle got increasingly smaller as marketers got more aggressive.
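The difference between the two measurement philosophies is easy to show in code. Below is a minimal sketch, not the standard's exact procedure (the real ANSI measurement specifies meter placement, ambient-light handling, and more); the function names and the assumption of one luminance reading per rectangle are mine:

```python
# ANSI-style checkerboard contrast, assuming one luminance reading
# (e.g., in cd/m^2) from the center of each of the 16 rectangles
# in the 4x4 grid, listed row by row with white in the top-left.

def ansi_contrast(readings):
    whites, blacks = [], []
    for i, lum in enumerate(readings):
        row, col = divmod(i, 4)
        # In a checkerboard with a white top-left rectangle,
        # a rectangle is white when row + col is even.
        if (row + col) % 2 == 0:
            whites.append(lum)
        else:
            blacks.append(lum)
    # ANSI contrast: average white luminance over average black luminance.
    return (sum(whites) / len(whites)) / (sum(blacks) / len(blacks))

# A full-on/full-off ratio, by contrast, needs only two readings:
# one fully white screen and one fully black screen.
def full_on_off_contrast(white_lum, black_lum):
    return white_lum / black_lum
```

Because the checkerboard forces bright and dark areas onto the screen at the same time, it captures light leakage between adjacent regions; the full-on/full-off method measures the two extremes in isolation, which is why it produces such dramatically larger numbers.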
Today, after a few comfortable years of relying on the ANSI checkerboard, the display industry has effectively moved backward, and it's to the detriment of image quality in more than one way. I like to blame the full-on/full-off measurement that most companies now use to claim high contrast, but that measurement alone isn't the whole story.
Full-on/full-off contrast ratios gained favor, among marketers at least, with the rise of DLP technology, and the reason is fairly straightforward. Texas Instruments, recognizing that its mirror-based technology had an inherent advantage over the literal light-blocking nature of LCD panels and filters, jumped to full on/off as a way to accentuate that difference. OK, fair enough.
Unfortunately, we've reached the point of ridiculousness. Full-on/full-off measurement has no real-world application. (I never watch all-white or all-black screens, do you?) And it doesn't account for light leakage across the picture or outside the device. Far worse, however, many manufacturers now use electronic tricks to accentuate contrast ratio by forcing “black levels.” That's where this quest for marketing advantage has really become hurtful to true image quality.
Forcing blacks to be blacker often also makes dark grays black. That rich blackness might look stunning on the showroom floor when Joe Consumer gives carefully selected test footage a quick once-over, but it ultimately defeats the reason why high contrast ratios are important in the first place: wide grayscale range. Some companies even have clever marketing names and image presets like “dynamic” color or “film,” using film's reputation for rich blacks and high contrast.
IT'S THE IN-BETWEEN THAT COUNTS
Our eyes are amazing instruments, and their complexity exemplifies why a single contrast ratio number, especially when pushed to an extreme, is so awkward. Though our eyes have a limited ability for instantaneous contrast (remember the beach), we can nevertheless discern remarkable detail both in very dark corners and on sunlit beaches. Is that a contradiction?
There are two advantages that our eyes, with their limited ability for instantaneous contrast, have over the seemingly larger contrast ratio numbers of display makers. First, within that instantaneous contrast ratio of 200:1, we have the ability to discern very small changes in luminance — or grayscale — of the kind a projector or panel is trying to produce. That's how we can make out so much detail in shadows or in a dark alley. There is never any banding in the gradients our eyes see in nature: none across a blue sky, none across the shadows and light reflections on a white wall, and none in the countless shades of luminance and colors in a pool of water.
The emergence of 10-bit color is a big step in the right direction. Traditional 8-bit image processing yields a grayscale range of just 256 shades, while 10-bit processing gives 1,024 steps from black to white. New High Dynamic Range images use 16-bit processing that yields more than 65,000 shades of gray.
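Those shade counts follow directly from the bit depth: an n-bit pipeline can distinguish 2^n levels from black to white. A quick check:

```python
# Grayscale steps per channel as a function of processing bit depth:
# an n-bit pipeline distinguishes 2**n levels from black to white.
for bits in (8, 10, 16):
    print(f"{bits}-bit: {2 ** bits:,} shades")
# 8-bit: 256 shades
# 10-bit: 1,024 shades
# 16-bit: 65,536 shades
```

More steps between black and white means smoother gradients within the same contrast range, which is exactly what our banding-sensitive eyes reward.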
Second, 200:1 is indeed our eyes' capability at one instant in time. Yet our eyes are also constantly moving, and our irises are constantly adjusting to new scenes and new light levels. So while each momentary scene we see can have a limited effective contrast, our irises, and in turn our brains, are capable of an extremely wide contrast ratio over a short period of time — measured in the millions-to-one.
It's that very wide adaptive contrast range, and the goal of rendering all images and all scenes convincingly within it, that drives display manufacturers to strive for higher contrast in individual displays. And, yes, that does make it appropriate that contrast ratio remains one of the critical measures of quality.
Unfortunately, clamping blacks and forcing dark grays to appear black — and shining ever brighter yet uncontrolled light — to beget a high number for marketing purposes is leading the technology in exactly the wrong direction. And it's to the detriment of the image — just like forgetting to put on sunglasses as you walk over the dunes and see the sunlight reflecting on the water.