
Clearing the fog

May 1, 2000 12:00 PM,
Josh Kairoff

A primer on High Definition video and what it means to the systems integrator and his customers.

SMPTE 240M defines “High Definition” (HD) as video that has a width of 1,920 active pixels, 1,080 lines, a 60 Hz or 59.94 Hz frame rate, and 30 MHz of bandwidth. This is the original analog HD specification that is used as the baseline of HD quality. Additionally, SMPTE 260M describes the digital implementation of the 240M standard. HD is subsequently defined as “an image having approximately double the number of horizontal and vertical lines as current broadcast television with approximately the same frame rate.” The line rate is approximately doubled (from about 15 kHz to about 34 kHz), and the image must also have a 16:9 aspect ratio. The net result is high-quality and life-like imagery, better sharpness and detail, a truer reproduction of color, a film-like appearance, digital storage and transmission, variable bit rates for optimal balance of quality to medium, compatibility with film and computer image formats, global compatibility, and, perhaps most important, lower costs for production, post production and distribution.
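
For readers who like to check the math, here is a quick back-of-the-envelope sketch of where those line rates come from, assuming the conventional totals of 525 lines per frame for NTSC and 1,125 lines per frame for 1,080-line HD (both figures count lines outside the active picture).

# Back-of-the-envelope check of the "about 15 kHz to about 34 kHz" line rates.
# Assumes 525 total lines for NTSC and 1,125 total lines for 1,080-line HD.

def line_rate_hz(total_lines, frame_rate_hz):
    """Horizontal line rate = total lines per frame x frames per second."""
    return total_lines * frame_rate_hz

ntsc = line_rate_hz(525, 30000 / 1001)   # ~15,734 Hz (about 15 kHz)
hd   = line_rate_hz(1125, 30.0)          # 33,750 Hz (about 34 kHz)

print(f"NTSC line rate: {ntsc / 1000:.2f} kHz")
print(f"1080-line HD line rate: {hd / 1000:.2f} kHz")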

Further, within the scope of HD, there can be various levels of quality and different configurations. Although it was initially developed as an advancement of television, its quality and features have brought HD to the attention of other media industries. Film production, computer graphics, special effects, digital cinema, multimedia artists, medical imagery, post production and others have begun to make the transition to HD.

HD may have segmented configuration standards designed for specific applications. Within this article, HD refers to any High Definition video imagery. HDTV, on the other hand, is the specific subset configuration that is part of a broadcast television system. HDTV (High Definition Television), because of its quality and limitation on data rate, is more of a consumer-oriented product. HD can be used at whatever quality and data rate is desired. Thus, its flexibility (and the increased costs) makes HD more of a professional medium.

HDTV

At the end of 1995, after many false starts and modifications, the Advisory Committee on Advanced Television Service submitted its final report on advanced television (ATV). In this report, the committee recommended that the ATSC digital television standard be adopted as the United States ATV broadcast standard. The ATSC established that broadcast HDTV systems must be digital and part of the ATSC’s ratified formats, use video compression syntax that conforms with MPEG-2 at a bit rate of approximately 18.9 Mbps, provide a signal quality equal or superior to SMPTE 240M/274M, not take up more bandwidth than the existing 6 MHz of normal television channels, and use Dolby AC-3 for audio at a nominal data rate of 384 kbps.
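
As a quick reference, those constraints can be jotted down in a small table; the field names below are just illustrative labels, not anything defined by the ATSC.

# A minimal summary of the ATSC HDTV constraints listed above. Field names are
# illustrative only; values come straight from the paragraph.
ATSC_HDTV_CONSTRAINTS = {
    "video_codec": "MPEG-2",
    "approx_video_bit_rate_mbps": 18.9,
    "reference_quality": "SMPTE 240M/274M",
    "channel_bandwidth_mhz": 6.0,     # same as an existing analog TV channel
    "audio_codec": "Dolby AC-3",
    "nominal_audio_bit_rate_kbps": 384,
}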

In 1994, the Moving Pictures Experts Group defined a standard for the digital coding and handling of moving pictures and audio. Improving on an earlier standard (MPEG-1, ISO/IEC 11172), MPEG-2 (ISO/IEC 13818) was designed to be flexible, expandable, scalable and have a higher quality suitable for the broadcast industry. The original design of MPEG-2 was founded upon the requirements of normal video, but it soon expanded to include the increased requirements of HD. Although MPEG-3 was being developed for HD, it was realized that MPEG-2 could be modified to include HD, and MPEG-3 was dropped.

MPEG-2 has various profiles and levels that are set up to segment and classify the configuration, bit rate and resolution of the video. In this way, the most appropriate and least expensive encoding systems can be used while maintaining compatibility with the MPEG-2 standard. Table 1 shows the different profiles and levels in MPEG-2 video. Normal DTV video would be MP@ML, and ATSC HDTV 1,080i would be MP@HL.
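
Because Table 1 is not reproduced here, the sketch below lists the commonly cited Main Profile limits at each level; treat it as a rough guide and consult ISO/IEC 13818-2 for the authoritative figures.

# Rough sketch of the Main Profile (MP) limits at each MPEG-2 level; numbers are the
# commonly cited constrained parameters, not a reproduction of the article's Table 1.
MAIN_PROFILE_LEVELS = {
    # level: (max width, max height, max frame rate, max bit rate in Mbps)
    "LL":   (352,  288,  30, 4),    # Low Level
    "ML":   (720,  576,  30, 15),   # Main Level -- normal DTV video (MP@ML)
    "H-14": (1440, 1152, 60, 60),   # High-1440 Level
    "HL":   (1920, 1152, 60, 80),   # High Level -- ATSC 1,080i HDTV (MP@HL)
}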

MPEG-2 also defines a data transport stream structure that can be used to distribute the encoded image data. SMPTE 310M is a transport stream based on MPEG-2 that is used for the ATSC 19.4 Mbps HDTV standard. The DVB transport protocol used for satellite, cable and European digital video transmission is also based upon the MPEG-2 transport stream structure.

HD signal flow

HD begins life as either film transferred to HD, a production captured with HD equipment, computer animation rendered as HD or standard video upconverted to HD. Within the production and post-production environment, HD is usually transferred between equipment as an SDI-HD (serial digital interface-HD, SMPTE 292M, 1.5 Gbps) signal. SDI-HD has become the predominant standard for uncompressed digital HD signals.
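
The roughly 1.5 Gbps figure is easy to reconstruct, assuming the usual 1,080-line raster of 2,200 total samples per line by 1,125 total lines with 10-bit 4:2:2 sampling.

# Where the roughly 1.5 Gbps SMPTE 292M rate comes from, assuming a 2,200 x 1,125
# total sample raster at 30 frames per second with 10-bit 4:2:2 sampling
# (one luma word plus one multiplexed chroma word per sample).
samples_per_line = 2200
lines_per_frame  = 1125
frames_per_sec   = 30
words_per_sample = 2      # Y plus multiplexed Cb/Cr (4:2:2)
bits_per_word    = 10

bit_rate = (samples_per_line * lines_per_frame * frames_per_sec
            * words_per_sample * bits_per_word)
print(f"SDI-HD serial rate: {bit_rate / 1e9:.3f} Gbps")   # 1.485 Gbps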

Editing and post production can take place with uncompressed HD, but unless quality matters more than money, MPEG-2 encoding usually happens first. MPEG-2 encoding is logistically simple but technically complex. Functionally, an engineer simply hooks the SDI-HD up to the encoder’s input, selects the output format and runs with it. Technically, however, there is a wide variety of settings and adjustments that may need to be tailored to the specific use. In this case, professional encoding services should be consulted and used, especially for the first few projects.

Given good computing algorithms and enough processing power, the signal that goes in does not have to equal the signal that comes out. Although it is always true that you cannot completely recreate missing resolution, I have seen some extremely good attempts. Some of the NTSC-to-HD upconverters on display at NAB this year produced output that would be acceptable to all but the most critical viewer.

The MPEG-2 encoder’s output can be set to any number of established standards. For DTV in the United States, the standard is an ATSC-compliant MPEG-2 transport stream (SMPTE 310M). Some encoders can change bit rates by taking MPEG-2 in, processing it and outputting MPEG-2 at a different rate. This data stream output can be stored or distributed and broadcast as desired. The particular methods of storage or distribution can dictate the format and bit rate from the decoder. It should be noted that the many different ways of handling these large amounts of data have helped contribute to the confusion surrounding HD.

The telecommunications industry has always been in the business of controlling the synchronized distribution of large amounts of data. The equipment, terminology and data rates they established were folded into the early architecture of digital HD. Image data streams, unlike files or e-mail, work best with network protocols that maintain constant packet order and signal flow, like ATM’s point-to-point rather than TCP/IP’s send-it-everywhere-and-see-whatever-gets-there-whenever. The data rates for the various types and combinations of network connections became the bit rate for many MPEG-2 transport stream standards.

Storage

Once you have encoded your HD content, you can store it, stream it or decode it. In the computer world, HD is usually stored on hard drives, digital linear tape (DLT) and DVD-ROMs. At a data rate of 19.4 Mbps (2.425 MB/s) for ATSC-compliant HDTV MPEG-2 files, it takes 145.5 MB a minute and 8.73 GB per hour to store. As the bit rates rise, the storage requirements for HD grow quickly. Compressed but high-quality HD can run 64 Mbps to 120 Mbps (8 MB/s to 15 MB/s). An uncompressed HD data stream requires approximately 11.25 GB a minute to store.
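
Those storage figures can be recomputed in a few lines; the sketch below uses decimal units (1 GB = 1,000 MB), which is what the numbers above appear to assume.

# Recomputing the storage figures quoted above, using decimal units (1 GB = 1,000 MB).
def storage_per_minute_mb(bit_rate_mbps):
    """Megabytes needed to store one minute of a stream at the given bit rate."""
    return bit_rate_mbps / 8 * 60

atsc = 19.4            # Mbps, ATSC-compliant HDTV transport stream
uncompressed = 1500.0  # Mbps, roughly the SDI-HD serial rate

print(f"ATSC HDTV: {atsc / 8:.3f} MB/s, "
      f"{storage_per_minute_mb(atsc):.1f} MB/min, "
      f"{storage_per_minute_mb(atsc) * 60 / 1000:.2f} GB/hr")   # 2.425, 145.5, 8.73
print(f"Uncompressed HD: {storage_per_minute_mb(uncompressed) / 1000:.2f} GB/min")  # 11.25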

In the video world, tape is the preferred medium, with Panasonic’s D-5 format and Sony’s HDCam being the most popular. Incidentally, an HD version of DVCPRO running at 100 Mbps was demonstrated at NAB. Designing an HD VCR on a lower cost platform like DVCPRO will bring more people into the world of HD production.

For storage and playback, tape has the clear economic and flexibility advantage. Hard drives are better suited to nonlinear access and repetitive play, but with high bit rates or long content, they can become prohibitively expensive.

Streaming

This year at NAB, broadcasters, content creators and systems integrators were offered methods of using some kind of data network as a distribution pathway for streaming media. Data networks and their interconnective, open nature are currently being introduced as an upward migration path from traditional analog baseband and RF distribution. Analog is far from dead, but it is no longer the only way to deliver content. Real-time video, much less HD, is not yet realistic over the public network, but it is clear that the direction has been established. Within a private network, paying for the connection is the only real limit that you will find in streaming.

There are some protocol issues to keep in mind with streaming MPEG-2 content over the Internet. The TCP/IP data packets and MPEG-2 data packets are not the same size. This means that when MPEG-2 is sent over the Internet, there is unused space in some of the TCP/IP packets. Also, the Internet was never designed to maintain sequential packet delivery. Between fragmented MPEG-2 packets and Internet latency, reliability and transfer rates can suffer. To solve this issue, many companies provide software to pre- and post-process MPEG-2 so that it slips efficiently across the Internet.
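
To put rough numbers on the mismatch: MPEG-2 transport packets are 188 bytes, and a common (though not universal) practice is to pack as many whole packets as will fit into each datagram on a standard 1,500-byte Ethernet MTU. The framing figures below are general networking facts, not something taken from any particular product.

# Rough illustration of the packet-size mismatch, assuming 188-byte MPEG-2 transport
# stream packets carried over UDP/IPv4 on a standard 1,500-byte Ethernet MTU.
TS_PACKET_BYTES = 188
ETHERNET_MTU    = 1500
IP_UDP_HEADERS  = 20 + 8                                 # IPv4 header + UDP header

payload_space   = ETHERNET_MTU - IP_UDP_HEADERS          # 1,472 bytes per datagram
ts_per_datagram = payload_space // TS_PACKET_BYTES       # 7 whole TS packets
unused          = payload_space - ts_per_datagram * TS_PACKET_BYTES

print(f"{ts_per_datagram} TS packets per datagram, {unused} bytes left unused")  # 7, 156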

Decoding

How the data stream is decoded depends upon its ultimate destination. If it is going into a digital device of some sort, then transcoding or bit rate changing may be required. If analog signals are necessary, then an analog decoder is needed. In either example, the solution is a piece of interface equipment. Not too long ago, finding SDI-HD-to-analog decoders was extremely difficult because they were expensive and not readily available. Now, high-quality, inexpensive decoders are coming to market from many vendors.

PCs even offer solutions. Some computer graphics cards have inexpensive DTV tuners as options. With such a card, a PC and a TV antenna, watching HDTV on your computer will be about as easy as watching a DVD-video. The quality may not be at the presentation level, but as an inexpensive way to begin with HDTV, it simply cannot be beat.

Viewing HD signals

To display HD, the requirements are fairly straightforward. Any display device (CRT, projector or flat panel) that can accept an image with a 16:9 aspect ratio and 1,920 x 1,080 active pixels coming in at around 34 kHz horizontal and 60 Hz vertical can potentially show HD. Most XGA-compatible or higher displays can show HD, although aspect ratio and color distortion may occur. Variable-resolution displays will most likely do better than those with fixed resolution, unless the scaling devices on the fixed-resolution displays have an understanding of HD. For testing and occasional watching, a good multi-frequency computer monitor that can show 1,600 x 1,200 seems to work just fine. Using a display designed for HD should work better with fewer connection complications, but be prepared for a hefty price premium. For the near term, video projectors with HD-compatible inputs will probably be the most popular method of displaying HD signals.
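
As a rough rule of thumb, the scan-rate requirement can be reduced to a simple check; the function below is purely illustrative, and pixel-for-pixel reproduction additionally requires a full 1,920 x 1,080 raster.

# Hypothetical compatibility check based on the figures above: scanning a 1080i HD
# signal directly requires roughly 33.75 kHz horizontal and 60 Hz vertical (field rate).
def can_scan_1080i(max_h_khz, max_v_hz):
    return max_h_khz >= 33.75 and max_v_hz >= 60.0

print(can_scan_1080i(75.0, 85.0))   # True  -- a typical 1,600 x 1,200 computer monitor
print(can_scan_1080i(31.5, 60.0))   # False -- a basic VGA-only display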

Additionally, certain hardware is necessary to view HD signals. At the consumer and prosumer level, HDTV systems need to have a tuner or demodulator to separate the digital signal from the analog carrier wave (8VSB) on which it is transmitted, a demultiplexer to separate the audio and video portions of the ATSC transport stream, an MPEG-2 video and audio decoder, and an HDTV-compatible display. All of these functions, except the display, are performed within an ATSC DTV receiver. All of the DTV receivers on the market take in the ATSC MPEG-2 transport stream and produce analog outputs. The outputs range from NTSC composite to high-frequency component (RGB or YPbPr) at SMPTE 240M. Every manufacturer seems to have its own opinion on which connectors will carry which output signals, and some DTV receivers produce unique outputs designed primarily for specific television models.

Because many displays accept high-frequency RGB, not YPbPr, you may need either a DTV receiver with selectable outputs or a YPbPr-to-RGB converter. This brings up the ongoing issue of the term component video having a number of different meanings. RGB, RGBS, RGsB, RGBHV, YPbPr, Y/R-Y/B-Y, YCrCb, YUV and Y-C are all called component video by knowledgeable people from different industries. The first four are red, green and blue signals with different configurations of sync. The next four are luminance (Y) and various methods of calculating color-difference encoding. The last one is simply standard video with separate chroma and luminance channels, and hopefully, it will have nothing to do with HD. Given a choice, I always recommend RGBS as the means of establishing connections between devices.
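
For the curious, the arithmetic inside a YPbPr-to-RGB converter is a simple matrix. The sketch below uses the ITU-R BT.709 coefficients common to 1,080-line HD gear (SMPTE 240M specifies slightly different values), with Y normalized to 0..1 and Pb/Pr to -0.5..0.5.

# What a YPbPr-to-RGB converter has to do, sketched with the ITU-R BT.709 matrix.
# (SMPTE 240M defines slightly different coefficients; this is illustrative only.)
def ypbpr_to_rgb_709(y, pb, pr):
    r = y + 1.5748 * pr
    g = y - 0.1873 * pb - 0.4681 * pr
    b = y + 1.8556 * pb
    return r, g, b

print(ypbpr_to_rgb_709(1.0, 0.0, 0.0))   # peak white -> (1.0, 1.0, 1.0)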

HDTV content is already out there in a few different ways. HBO and Showtime currently broadcast HDTV content via satellite, but a compatible HDTV satellite receiver is required to view it. Most larger cities have at least one television station that performs some HDTV transmissions. Content can be put on an HD playback device and used like any other controlled video playback equipment. Such companies as Visual Circuits, Electrosonics, Videon Central, Alcorn McBride, Sencore, Quvis, Pluto, Panasonic and Sony offer equipment to store and play back HD and HDTV content. Some have 8VSB output and require a DTV tuner, some digital (SDI-HD or a DVB transport stream) and some analog component. It is best to make sure you understand the input and output formats for the equipment you are using. It is really easy (and somewhat frustrating) to get everything together only to discover that you have incompatible pieces of equipment and that you cannot get the converter from what you have to what you need.
