Phil Hippensteel on Video Codecs


In the AV industry, there are many interpretations of the term codec. In this, our fourth newsletter, we’re going to discuss video codecs. Whatever the interpretation, most systems we call codecs share some common characteristics:

  1. They can be based on hardware, software or a combination of both.
  2. Their principal purpose is to prepare an audio or video signal for transport across a circuit or link that is outside their own device or software system.
  3. They will convert an analog signal to a digital signal at the point of origin.
  4. After they convert the signal to digital, they may or may not compress the bit stream.
  5. They may also break the stream into blocks for placement in packets for transport over a network, such as an IP network (a simplified sketch of this pipeline follows the list).
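
To make these shared characteristics concrete, here is a minimal, purely illustrative Python sketch of that pipeline. The sample values, the 8-bit quantization, the use of zlib as a stand-in compressor, and the 1,400-byte block size (chosen so each block fits in a typical Ethernet/UDP packet) are assumptions for illustration, not any particular product's implementation.

    import zlib

    def quantize(samples, bits=8):
        """Map analog sample values in [-1.0, 1.0] to unsigned integers (A/D conversion)."""
        levels = (1 << bits) - 1
        return bytes(int((s + 1.0) / 2.0 * levels) for s in samples)

    def packetize(data, payload_size=1400):
        """Split a (possibly compressed) bit stream into blocks sized to fit IP packets."""
        return [data[i:i + payload_size] for i in range(0, len(data), payload_size)]

    # Illustrative "analog" signal: values a real codec would receive from its input stage.
    analog = [0.0, 0.25, 0.5, 0.25, 0.0, -0.25, -0.5, -0.25] * 500

    digital = quantize(analog)           # characteristic 3: analog-to-digital conversion
    compressed = zlib.compress(digital)  # characteristic 4: optional compression (zlib as a stand-in)
    packets = packetize(compressed)      # characteristic 5: blocks ready to be placed in IP packets

    print(len(digital), len(compressed), len(packets))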

The term codec is derived from its earliest use in voice communications. When AT&T Bell Labs showed that voice could be converted from analog to digital for transmission, it called the device that converted the signal to digital a coder and the device that recovered the signal at the destination a decoder. Hence, the combination was referred to as a set of codecs. When this happened in the 1950s and 1960s, the resulting bit stream was transferred at 64 kb/s: the voice signal was sampled 8,000 times per second at 8 bits per sample, and 8,000 × 8 = 64,000 bits per second. Soon afterward, the idea was conceived that the digital stream could carry data as well as voice, and the data transmission industry was born. Compression and multiplexing techniques were introduced so that many signals could share a single wire.
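
As a quick back-of-the-envelope check of those figures (a sketch assuming standard PCM telephony parameters, with T1 multiplexing shown only as one well-known example of carrying many channels on a single wire):

    # Standard PCM telephony parameters.
    SAMPLE_RATE = 8_000       # samples per second
    BITS_PER_SAMPLE = 8

    ds0 = SAMPLE_RATE * BITS_PER_SAMPLE    # 64,000 bits/s: one digitized voice channel
    t1_payload = 24 * ds0                  # 24 such channels multiplexed onto one T1 line
    t1_line_rate = t1_payload + 8_000      # plus 8 kb/s of framing overhead

    print(ds0, t1_payload, t1_line_rate)   # 64000 1536000 1544000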

Some of the first video codecs were introduced by RealNetworks in 1995 for RealPlayer, which was popular in the early days of the World Wide Web. Soon after, Adobe entered the competition with the Flash encoder and player. At nearly the same time, Microsoft introduced the Windows Media video and audio codecs, which created compressed .wmv and .wma files for video and audio, respectively. The codecs were revised several times and ultimately settled into the Windows Media 9 generation, the one that shipped with Windows Media Player 9 (WMP9).

Meanwhile, the standards groups were developing specifications in parallel with the greater video industry. Most of the early specifications came from the Moving Picture Experts Group (MPEG, mentioned in our last newsletter) and were targeted at the digitization, compression, and transport of TV signals over satellite and terrestrial links. First there was MPEG-1, but that codec didn’t support interlaced video. For example, typical standard-definition video is 480i, the i referring to the fact that the picture is built from interlaced half frames. A short time later came MPEG-2, which could be used for both interlaced and progressive video. It also provided a transport format called the MPEG-2 Transport Stream (also discussed in the last newsletter). MPEG-2 is used to store video files and for recording to secondary storage such as CDs and DVDs. The next generation of codecs compressed video using a method described in H.264, which added wide flexibility in quality level and frame rate along with more advanced compression techniques. It is currently, by far, the most popular of the standards-based compression codecs.
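
To give a feel for what the MPEG-2 Transport Stream looks like on the wire, here is a small sketch that decodes the fixed 4-byte header of a single 188-byte TS packet. The sample bytes are made up for illustration; the field layout (0x47 sync byte, 13-bit PID, 4-bit continuity counter) follows the published format.

    def parse_ts_header(packet: bytes) -> dict:
        """Decode the 4-byte header of one 188-byte MPEG-2 Transport Stream packet."""
        if len(packet) != 188 or packet[0] != 0x47:   # every TS packet starts with sync byte 0x47
            raise ValueError("not a valid TS packet")
        return {
            "transport_error":    bool(packet[1] & 0x80),
            "payload_unit_start": bool(packet[1] & 0x40),                  # a new payload unit begins here
            "pid":                ((packet[1] & 0x1F) << 8) | packet[2],   # 13-bit stream identifier
            "scrambling":         (packet[3] >> 6) & 0x03,
            "adaptation_field":   (packet[3] >> 4) & 0x03,
            "continuity_counter": packet[3] & 0x0F,                        # 4-bit counter used to spot lost packets
        }

    # A made-up packet: sync byte, PID 0x0100, payload only, continuity counter 5, dummy payload.
    sample = bytes([0x47, 0x41, 0x00, 0x15]) + bytes(184)
    print(parse_ts_header(sample))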

Proprietary vs. Standards-Based Codecs

First, there are two types of standards: de jure and de facto. A de jure standard is one that has undergone the scrutiny and ratification of an industry group representing a broad consensus of the industry. De facto standards are ones that simply develop a large following; they may be owned and licensed by a particular vendor who spent the time and money to develop the method. H.264 is a de jure standard. Adobe Flash is a de facto standard. Google’s VP8 and VP9 codecs are popular but are still de facto standards. Often such de facto standards are later ratified into de jure standards.

Which Path to Take

So, which should you use: proprietary (de facto) or standardized (de jure)? This is a controversial subject, but I lean heavily toward using standardized codecs such as MPEG-2 and H.264, which come from standards bodies such as MPEG and the ITU. Generally, you get wider interoperability and longer life from standards-based codecs. Recently, there have been web articles and discussions about the death of Flash, and many lesser-known de facto formats have been abandoned. In my own experience, the last Windows 10 update my computer installed gave me a warning that the upgrade might cause certain .wmv and .wma files to become unplayable. Meanwhile, the latest Windows Media Player fully supports the current H.264 standard.
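
If you have a library of files in a fading de facto format, one practical hedge is to transcode them to a standards-based codec while they are still playable. Here is a minimal sketch that drives the open-source ffmpeg tool from Python; it assumes ffmpeg is installed and on your PATH, and the file names are placeholders.

    import subprocess

    # Re-encode a legacy Windows Media file to H.264 video and AAC audio in an MP4 container.
    subprocess.run(
        [
            "ffmpeg",
            "-i", "old_clip.wmv",   # placeholder input file in the legacy format
            "-c:v", "libx264",      # encode the video with the H.264 codec
            "-c:a", "aac",          # encode the audio with AAC
            "old_clip_h264.mp4",    # placeholder output in a standards-based container
        ],
        check=True,
    )

Whatever tool you use, the point is the same: standards-based formats give your content the longest shelf life.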
