Piling On The Pixels
When will enough display resolution be enough?
Over the past 20 years, we've gone from 525-line video to 720p, 1080i, and now 1080p. The question is, "How many more pixels do we need?"
There's been a lot of buzz lately in the home theater marketplace about "1080p" — more specifically, projection and direct-view displays with native 1920x1080 pixel resolution, scanned progressively. LCD monitors 37 inches and larger have already achieved it, and a new crop of 50-inch, 55-inch, 60-inch, and larger 1080p plasma monitors and TVs will be arriving in stores this fall.
The microdisplay (MD) rear-projection TV industry, wary of the rapid decline in plasma prices, is moving to 1080p imaging as fast as it can. 1080p front projectors for consumer and professional applications are also coming to market, albeit mostly with hefty price tags.
But the rush to high resolution hasn't stopped there. LCD monitor manufacturers are now showing prototypes with 3K and even 4K resolution. In the projection arena, Sony skipped right by 2K resolution and introduced a pair of 4K (4096x2160 pixel) SXRD projectors a couple of years back for everything from digital cinema to rental & staging applications.
Is that enough, or should we move to 6K, or even 8K, as NHK and Olympus demonstrated at NAB 2006 with special interlaced high-resolution camera systems? It's not an easy question to answer, especially because it has more to do with the limits of human visual perception than anything else.
Resolution can be measured in a number of ways. For years, film-based imaging (and cathode ray tubes later on) defined image resolution in lines, or line pairs per millimeter.
Imagine a sequence of black and white lines, like a standard luminance multiburst pattern. (These could be arranged vertically or horizontally, but we'll go with the vertical orientation for now.) Each black-white set is considered a “pair,” and the limiting resolution of the projected image is the smallest pair of black and white lines that can be clearly discerned by a viewer.
Contrast vs. perceived resolution
If you think about it for a moment, you'll realize that such a test works best when the black is 100 percent black and the white is 100 percent white. If either line shifts toward gray, your ability to tell them apart degrades. The same number of lines is still present when the pair is rendered at, say, 25 percent and 75 percent gray, but you can't see them as clearly. And if both lines have the same gray value, they'll appear as a solid mass.
Right away, it's apparent that the perception of resolution is limited by contrast and by human visual acuity, which at its absolute best is about 1/60th of a degree, also known as 1 arc-minute. (Normal human vision takes in about 30 degrees of arc horizontally.) Industry consensus is that the average viewer can perceive about 30 line pairs per degree.
According to the late Kim Milliken's acclaimed "Angles of View" tutorials for Da-Lite Corp., visual acuity of 1 arc-minute would be equivalent to picking out a black dot of 20-point weight at a distance of 28 feet! Most of us don't have eyes that sharp. Milliken goes on to state that, as a general rule, text shown on a screen must be at least 9 arc-minutes in size, which would then allow us to clearly see that same black dot. Applying that "one-size-fits-all" factor of nine means we'd need the screen to be no more than 3.1 feet away for comfortable viewing.
A little bit closer now
Not coincidentally, that 30-line-pairs-per-degree figure is the basis for the recommended viewing distance (3.1X, or 3.1 times the screen height) for the original analog high-definition TV system proposed and developed by NHK: at 3.1 screen heights, a picture with about 1,080 visible lines delivers roughly 30 line pairs per degree of arc. That system, which used 1,125 interlaced scan lines (about 1,080 of them visible), was developed in part because rooms in Japanese homes were too small. That's right: the average viewer couldn't put in a standard-definition television with a larger screen, because the seating distance couldn't be extended to match, and they'd be more likely to see the picture's scan lines.
Our NTSC system, also used in Japan, called for a minimum viewing distance of seven times the screen height. That number was calculated in the 1940s, way back when the maximum screen size for a TV was never expected to exceed 20 inches diagonally. A 20-inch TV would have a screen height of about 12 inches. Applying the 7X rule results in a minimum viewing distance of 84 inches, or 7 feet. This was all well and good, except for the fact that TV screens did get larger — a lot larger. In fact, the largest NTSC set that ever made it to production was made by Mitsubishi in the 1990s and had a 40-inch diagonal screen.
Following the 7X rule would result in a minimum viewing distance of 14 feet for this TV! Hence, the move to HDTV, which wasn't developed so that screens could get larger, but so viewers could sit closer to the screen. The wide aspect ratio also helped, as it more closely approximated that 30-degree field of view. In tandem, higher resolution and wider viewing angles have resulted in a more life-like video presentation.
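The geometry behind these viewing-distance rules is easy to check. Here's a small back-of-the-envelope Python sketch (my own, not from any broadcast standard) that converts a visible line count and a viewing distance, expressed in screen heights, into line pairs per degree of arc:

```python
import math

def line_pairs_per_degree(visible_lines, screen_heights):
    """Line pairs per degree of vertical arc seen by a viewer sitting
    the given number of screen heights away from the display."""
    # Vertical angle subtended by the screen, in degrees
    vertical_angle = 2 * math.degrees(math.atan(0.5 / screen_heights))
    return (visible_lines / 2) / vertical_angle

# NTSC: ~480 visible lines viewed at the 7X rule
print(line_pairs_per_degree(480, 7))     # just under 30 lp/degree
# HDTV: 1,080 visible lines viewed at the 3.1X rule
print(line_pairs_per_degree(1080, 3.1))  # also just under 30 lp/degree
```

Both systems land right at the roughly 30-line-pairs-per-degree acuity limit cited above, which is exactly the point: the 7X and 3.1X rules mark the distances at which scan-line structure just disappears for the average viewer.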
Packin' those pixels
Even though Mitsubishi could make a 40-inch picture tube, the resulting TV weighed a ton — or, more accurately, 1/6 ton. Consequently, the move to higher picture resolution stalled until the first fixed-pixel imaging systems made their debut, starting with LCD overhead projection panels and high-temperature polysilicon LCD projectors in the early 1990s.
Other manufacturers harnessed combinations of cathode-ray tubes and liquid crystal light valves to produce products such as General Electric's Talaria and Hughes-JVC's Image Light Amplifier. At their peak, products like these were able to achieve 1280x1024 resolution, slightly more than 1K of horizontal resolution.
Along with fixed-pixel displays came another way to define resolution. This definition didn't use alternating black and white lines; it used rows of imaging pixels. That was fine as long as the source images contained an equal number of pixels, and the electronic display system's bandwidth was sufficient to get all those pixels to the screen.
A decade ago, 1280x1024 (also known as Super Extended Graphics Array) was considered the pinnacle of electronic display resolution. You needed a CRT projector with 9-inch tubes to do it justice, or one of those mechanically complex light valve projection systems that cost up to $80,000 — all to get a whopping 2,500 lumens onscreen.
Things have come a long way since then. SXGA isn't even widely used anymore; it's been largely replaced by SXGA+ (1400x1050) in notebook computers and several new projectors. The industry is now zeroing in on 1920x1080-pixel displays and variants, including 1920x1200, 2048x1080, and 2048x1536 pixels.
Threshold of Visual Acuity: The 1951 United States Air Force resolution test target uses groupings with progressively finer line spacing, and is used to visually ascertain the resolution of optical and imaging systems. ("Digital Cinema Resolution — Current Situation and Future Requirements," Matthew Cowan, Entertainment Technology Consultants, copyright 2002, www.etconsult.com.)
Clearing the bar
As we increase the number of imaging pixels in our electronic display systems, we need to maintain sufficient contrast and bandwidth — the former to ensure the display of as much image detail as possible, and the latter to make sure that detail doesn't get choked off before it hits the screen. One way to quantify the relationship between contrast and bandwidth is to measure the modulation transfer function (MTF). This is a way to plot the degradation in contrast ratio as the resolution, or frequency, of information in an image increases, and it is often used to describe the performance of camera and projection lenses, along with imaging devices such as charge-coupled devices (CCDs).
Think back to the luminance multiburst pattern I mentioned earlier. With sufficient bandwidth, we should be able to see the black-to-white transitions at higher and higher frequencies. The SMPTE recommendation for HDTV bandwidth is 30 MHz, but in real life the image-processing electronics in many HD displays start to roll off high-frequency detail above 20 MHz. If the roll-off is severe, contrast suffers, and the transitions between black and white are minimized to the point where the test pattern appears as solid gray. The point at which this occurs is essentially the limiting resolution of the camera or display system. The culprit may be the camera's imaging sensors, the display's imaging devices, or the projection lens. For that matter, it might even be the electronics used in between to process the signals.
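As a toy illustration of what an MTF measurement captures, the Python sketch below (my own simplification, not a SMPTE test procedure) low-pass filters sine-wave gratings of increasing frequency with a fixed-width moving average, a crude stand-in for a bandwidth limit, and reports the Michelson contrast that survives at each frequency:

```python
import math

def sine_grating(cycles, samples=720):
    # Luminance sweep between 0 (black) and 1 (white)
    return [0.5 + 0.5 * math.sin(2 * math.pi * cycles * i / samples)
            for i in range(samples)]

def box_blur(signal, width=15):
    # Periodic moving average: a crude model of limited bandwidth
    n, half = len(signal), width // 2
    return [sum(signal[(i + k) % n] for k in range(-half, half + 1)) / width
            for i in range(n)]

def michelson_contrast(signal):
    hi, lo = max(signal), min(signal)
    return (hi - lo) / (hi + lo)

# Contrast falls as spatial frequency rises: points on an MTF curve
for cycles in (6, 18, 36):
    print(cycles, michelson_contrast(box_blur(sine_grating(cycles))))
```

At 6 cycles nearly all of the contrast survives the filter; at 36 cycles the same filter leaves only a fraction of it, even though the grating is still present in the signal. That loss of modulation, plotted against frequency, is the MTF.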
So having 1920x1080 pixels in a display doesn't mean you'll automatically see all that resolution onscreen. Even if every other part of a 1080p display is working correctly, the content you're viewing may be the limiting factor if it hasn't been encoded correctly or has been format-converted from another picture standard — one that had lower resolution to begin with.
Through the screen door
One compelling reason to adopt higher resolution in a fixed-pixel display is that, at lower resolutions, the pixels themselves form a visible interference pattern over the content being displayed. This phenomenon is commonly known as the "screen door effect," a term first used to describe the look of low-resolution LCD projectors showing video. Just as NHK determined the closest viewing distance for 1080i video to be about three times the screen height, so too are there minimum distances for viewing fixed-pixel images. Depending on the imaging device used and the pitch of its pixels, those distances can be slightly longer or even shorter than the 3X rule would suggest.
Matt Cowan of Entertainment Technology Consultants in Ontario describes the perceived limiting resolution for digital projection systems in his paper, “Digital Cinema Resolution: Current Situation and Future Requirements” (2002). His calculations (see table below) show that the sweet spot for viewing projected images is typically 2X to 2.5X the screen height — not that much different from the HDTV recommendation. He also concludes that a projector with 4K (4096x2160-pixel) resolution hits its limiting resolution at less than two screen heights viewing distance. At this point, the pixel structure becomes apparent to the average viewer, manifesting as a “screen door” over the image being projected.
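Cowan's limiting-resolution figures can be reproduced with the same acuity geometry used earlier. This short Python sketch (mine, assuming the article's 30-line-pairs-per-degree average-viewer limit) solves for the viewing distance at which a given vertical pixel count reaches that limit:

```python
import math

ACUITY_LP_PER_DEGREE = 30  # average-viewer limit assumed above

def limiting_distance(vertical_pixels):
    """Viewing distance, in screen heights, at which a display of the
    given vertical resolution just reaches the acuity limit.  Sit any
    closer and the pixel structure starts to become visible."""
    vertical_angle = (vertical_pixels / 2) / ACUITY_LP_PER_DEGREE  # degrees
    return 0.5 / math.tan(math.radians(vertical_angle / 2))

print(limiting_distance(2160))  # 4K: ~1.5 screen heights
print(limiting_distance(1080))  # 1080p: ~3.2 screen heights
print(limiting_distance(720))   # 720p: ~4.8 screen heights
```

The 4K result lands under two screen heights, matching Cowan's conclusion, and the 1,080-line result essentially reproduces NHK's 3.1X recommendation.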
In my own tests with a variety of 720p displays (including 1280x720, 1366x768, and 1280x768), the rule of thumb seems to be between 3X and 4X the screen height. With LCD projectors, the pixel structure is more noticeable than with LCoS or DLP projectors. Likewise, pixel structure is more easily spotted in plasma TVs than on LCD TVs.
Tying it all together
As you can see from this discussion, there's a lot more to high-resolution imaging than pixels alone. The key point to remember is that human visual acuity can be measured (ever take an eye chart test?). Thus, the ability to detect grain, pixels, or other artifacts in an electronic image at a given distance can be predicted (see chart at left). Ultimately, the question of how much image resolution you need comes down to little more than (a) the viewing distance and (b) the screen height. (Ironically, the 7X viewing distance for NTSC was supposed to result in "high-resolution images.")
Surprisingly, in recent tests with a JVC DLA-HD10KU 1920x1080 D-ILA projector, I found that my own home theater setup (82-inch diagonal Stewart matte screen with a 12-foot viewing distance) didn't have a large enough screen to take advantage of all the picture detail in the HD-DVD and Blu-ray movies I was screening! My seats are positioned closer to the 4X minimum distance appropriate for 720p projection.
Based on these and other tests, it would seem the need for 1080p resolution in screens smaller than 42 inches is unwarranted, is debatable between 42 and 50 inches, may be a good idea over 50 inches, and is a must at 60 inches and larger. Indeed, more and more microdisplay rear projection HDTVs in the 50-inch to 70-inch size range are now making use of 1920x1080 devices as prices continue to fall.
There are no limits to how much resolution we can pack into a display system, other than technical and financial considerations; we're more likely to run out of real estate for the screen first. But as studies have shown, there's no advantage to higher resolution once a given combination of screen height and viewing distance is reached: beyond that point, no additional detail can be seen in the image. Improvements in contrast, accurate color shading, and bit depth can still enhance the picture. So can 3D — but not additional resolution.
That said, 2K imaging systems should find lots of applications in our industry over the next 10 years, whether they're used for screening high-definition video or as high-resolution image tiling systems to combine graphics, text, and video. In fact, it's safe to say that low-resolution video content is the biggest limiting factor right now for any mass movement to higher display resolution.
As for the marketing “hype” about 1080p, you may want to exercise caution before buying any such display. Having all of that pixel resolution will more clearly show the warts in scaled-up lower resolution video. However, as more high-definition content becomes available, and the quality of 1080 deinterlacing improves, 1080p imaging will come of age. (By then, manufacturers will probably be showing 8K projectors at InfoComm.)
Pete Putman is a contributing editor for Pro AV and president of ROAM Consulting, Doylestown, PA. Especially well known for the product testing/development services he provides manufacturers of projectors, monitors, integrated TVs, and display interfaces, he has also authored hundreds of technical articles, reviews, and columns for industry trade and consumer magazines over the last two decades. You can reach him at firstname.lastname@example.org.