If You Can’t See It, It’s Not There

Just how important is true 1920×1080 resolution in a display?

IF YOU follow the electronic display market as I do, you know that one of the hot topics now is “1080p.” More specifically, it means having a front projector, rear-projection monitor, or flat-panel display with true 1920×1080 pixel progressive-scan imaging. 1920×1080 is significant for a number of reasons. It’s a legacy HDTV format that evolved from the original 1,125-line analog standard used in Japan and is also a common display standard for TVs and personal computers. Finally, 1920×1080 interlaced TV is the most popular picture format for HDTV distribution via satellite, cable, or over-the-air.

It would seem that everyone wants to have electronic displays capable of showing every one of those 2,073,600 pixels, and that anything with lower resolution would be a compromise. But would it be? Right now, the most common display formats for computer screens and TVs are all grouped around 720 to 768 vertical lines/pixels. Most LCD TVs have 1280×768 (Wide XGA) resolution. Plasma monitors from 42 inches to 65 inches have 1024×768 (non-square pixel) or 1280/1365/1366×768 pixel matrices. Rear-projection HDTVs come with microdisplays that use 1280×768 or 1280×720 imaging devices. Widescreen front projectors for home theater and commercial use are equipped with 1280×720 and 1366×768 chips and panels.
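For a quick sense of scale, here's a minimal Python sketch (using only the resolutions listed above) that tallies the total pixel count of each format against a full 1920×1080 raster:

```python
# Total pixel counts for the common panel formats mentioned above,
# compared with a full 1920x1080 raster.
formats = {
    "1920x1080 (1080p)": (1920, 1080),
    "1280x720 (720p)": (1280, 720),
    "1280x768 (Wide XGA)": (1280, 768),
    "1366x768": (1366, 768),
    "1024x768 (non-square pixels)": (1024, 768),
}

full_hd = 1920 * 1080  # 2,073,600 pixels

for name, (w, h) in formats.items():
    total = w * h
    print(f"{name:30s} {total:>9,d} pixels  ({total / full_hd:5.1%} of 1080p)")
```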

There are plenty of products out there that can support native 720p (921,600 total pixels) HD programs, and of course all of these products will show 1080i as well, with some pixel decimation (about 29 percent in each dimension) to fit the lower-resolution screen. So why aren’t these products sufficient?
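The "about 29 percent" figure is the per-axis reduction when a 1920×1080 source lands on a 1366×768 panel, one of the panel sizes mentioned above; a quick Python check:

```python
# 720p frame size, plus the per-axis decimation needed to fit a
# 1920x1080 source onto a 1366x768 panel.
pixels_720p = 1280 * 720                # 921,600 pixels
src_w, src_h = 1920, 1080
panel_w, panel_h = 1366, 768

print(f"720p frame: {pixels_720p:,} pixels")
print(f"Horizontal: {(src_w - panel_w) / src_w:.0%} fewer pixels")
print(f"Vertical:   {(src_h - panel_h) / src_h:.0%} fewer pixels")
```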

To best answer the question of how much resolution is required to show HDTV, bear in mind that our current analog TV system assumed way back in the early 1940s that TV screen sizes would never exceed 20 inches diagonally (that works out to about 12 inches vertically with a 4:3 aspect ratio). The optimal seating distance was then calculated to be 7.1 times the screen height when showing 525-line interlaced video, or just over 7 feet with a 20-inch TV.
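The arithmetic behind that seating distance, as a short Python sketch built from the 7.1× multiplier and the 20-inch, 4:3 screen quoted above:

```python
import math

# Viewing-distance rule of thumb cited above: 7.1 x picture height
# for 525-line interlaced video on a 20-inch, 4:3 screen.
diagonal_in = 20.0
aspect_w, aspect_h = 4, 3
height_in = diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h)  # ~12 in
distance_in = 7.1 * height_in

print(f"Picture height: {height_in:.1f} in")
print(f"Optimal seating distance: {distance_in:.0f} in (~{distance_in / 12:.1f} ft)")
```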

It was determined that at that distance, the human eye would not be able to make out the interlaced picture scan lines and the images would appear to be smooth with high resolution. Consider that at a viewing distance of 12 inches, the smallest detail the normal human eye can resolve is about 0.0035 inches. At 120 inches (10 times as far), that figure grows by a factor of ten, to 0.035 inches.
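That linear scaling with distance is easy to verify; here is a minimal sketch using the 0.0035-inch figure quoted above:

```python
# The smallest detail the eye can resolve scales roughly linearly
# with viewing distance.
acuity_at_12in = 0.0035  # inches, the figure quoted above

for distance_in in (12, 120):
    detail = acuity_at_12in * (distance_in / 12)
    print(f"At {distance_in:>3d} in: smallest resolvable detail ~ {detail:.4f} in")
```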

In a 50-inch plasma display with an array of 1366×768 pixels, the pitch of individual pixels is typically less than 1 mm (about 0.9 mm), which equals roughly 0.035 inches. Do the math, and you’ll see that standing 10 feet from a 50-inch plasma means you can barely perceive the HD pixel structure, and that’s only if you have 20/20 vision.

To jam 1920×1080 pixels into that same 50-inch screen size means we’d have to shrink the pitch of each pixel to 0.025 inches. And I’ll bet the average person couldn’t tell the difference.
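Spelled out, the pitch comparison looks like this; the sketch starts from the quoted 0.9 mm pitch and the 0.035-inch acuity limit, and assumes the 1920×1080 panel fills the same screen width as the 1366×768 one:

```python
# The comparison made above: convert the ~0.9 mm pitch of a 1366x768
# 50-inch plasma to inches, compare it with the ~0.035-inch detail
# limit at 10 feet, then see what pitch 1920 columns would need on
# the same screen width.
pitch_768_mm = 0.9                            # quoted pitch for the 1366x768 panel
pitch_768_in = pitch_768_mm / 25.4            # ~0.035 in
pitch_1080_in = pitch_768_in * 1366 / 1920    # same width, more columns

eye_limit_at_10ft_in = 0.035                  # from the acuity figures above
print(f"1366x768 pitch: {pitch_768_in:.3f} in  (eye limit at 10 ft: {eye_limit_at_10ft_in} in)")
print(f"1920x1080 pitch on the same screen: {pitch_1080_in:.3f} in")
```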

Remember that the optimal viewing distance from an HD (1920×1080) display is 3.1X the screen height. In other words, you can sit farther back, but sit any closer and you’ll start to see the pixel structure. At a distance of 10 feet, the differences between a 1080p and a 720p image on a 50-inch screen will be hard to spot, particularly if the image is created with fine-pitch microdisplays such as LCoS and DLP.
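Applying the 3.1× rule to the same 50-inch, 16:9 screen from the earlier example gives a concrete number (a minimal Python sketch):

```python
import math

# Recommended minimum viewing distance for a 1920x1080 display:
# about 3.1 x picture height, applied to a 50-inch 16:9 screen.
diagonal_in = 50.0
height_in = diagonal_in * 9 / math.hypot(16, 9)   # ~24.5 in
min_distance_in = 3.1 * height_in

print(f"Picture height: {height_in:.1f} in")
print(f"Minimum distance for full 1080-line detail: {min_distance_in:.0f} in "
      f"(~{min_distance_in / 12:.1f} ft)")
```

That works out to roughly 6.3 feet, so a typical 10-foot seat is already well beyond the point where the extra 1080-line detail can be appreciated.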

There are other factors to consider. A high-resolution image with artifacts such as motion smearing, incorrect white balance or color points, and grayscale rendering problems may not look as realistic as a lower-resolution image without those problems. There have even been instances where higher-resolution displays look softer with HD content than lower-resolution models. In a recent test I conducted of CableCARD TVs, I had one wide VGA (852×480) plasma TV showing crisper-looking 720p and 1080i content than a native 1920×1080 LCD TV sitting nearby. I’ve also seen sharper HD pictures on those wide VGA plasmas than on higher-resolution 1024×1024 ALiS plasmas, probably because of the tricky scaling required to resize the HD images to the non-square ALiS pixel format.

The argument for higher and higher resolution only makes sense when all other image parameters are set correctly. And we’ve also got problems in the transport stream and original content to overcome. Live 1080i programming at lower bit rates can look worse than live 720p material at higher bit rates, since macroblocking and mosquito noise both eat into picture detail. In an ideal world, our 1080p content would be served up at a high bit rate (20 Mb/s or more) using MPEG-2 to a color-corrected, calibrated display with equal spectral response from the illuminating source and a nice, clean grayscale with no crushing of blacks or whites. The refresh rate would be either 24 Hz tripled to 72 Hz, or 60 Hz native, nothing lower. The signal interface from source to screen would be 100 percent digital to eliminate analog artifacts such as clipped bandwidth and ringing from standing waves. Many high-end home theater displays I’ve tested don’t have enough analog signal bandwidth to show even 720p HD signals.
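One way to see why bit rate matters as much as pixel count is to divide the channel bit rate by the pixel rate. In the sketch below, only the 20 Mb/s figure for 1080p comes from the text; the 1080i and 720p bit rates are purely illustrative assumptions meant to show how thinly the bits can get spread:

```python
# Rough compressed bits-per-pixel budget for a few format/bit-rate
# combinations. The 1080i and 720p bit rates are illustrative
# assumptions; 1080p at 20 Mb/s is the figure mentioned above.
cases = [
    ("1080i (30 full frames/s) at 12 Mb/s", 1920, 1080, 30, 12e6),
    ("720p  (60 frames/s)      at 18 Mb/s", 1280, 720, 60, 18e6),
    ("1080p (24 frames/s)      at 20 Mb/s", 1920, 1080, 24, 20e6),
]

for label, w, h, fps, bitrate in cases:
    pixel_rate = w * h * fps       # pixels delivered per second
    bpp = bitrate / pixel_rate     # compressed bits per pixel
    print(f"{label}: {bpp:.2f} bits/pixel")
```

Under those assumed rates, the 1080i feed gets noticeably fewer bits per delivered pixel than the 720p feed, which is when the compression artifacts start to show.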
