Phil Hippensteel on Video Codecs


In the AV industry, there are many interpretations of the term codec. In this, our fourth newsletter, we're going to discuss video codecs. Whatever the interpretation, most systems we call a codec share some common characteristics:

  1. They can be based on hardware, software or a combination of both.
  2. Their principal purpose is to prepare an audio or video signal for transport across a circuit or link that is outside their own device or software system.
  3. They will convert an analog signal to a digital signal at the point of origin.
  4. After they convert the signal to digital, they may or may not compress the bit stream.
  5. They may also segment the stream into blocks for placement in packets for transport over a network, such as an IP network (a rough sketch of this pipeline follows the list).
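
Those last three characteristics describe a pipeline: digitize, optionally compress, then packetize. Here is a minimal Python sketch of that pipeline; the function names, the 8-bit quantization, and the block size are illustrative assumptions, not any particular codec's API.

```python
# A minimal, illustrative sketch of the codec pipeline described above:
# digitize an analog signal, optionally compress it, then slice the bit
# stream into blocks sized to fit network packets. The names, block size,
# and 8-bit quantization are assumptions for illustration only.

import zlib

def digitize(samples, bits=8):
    """Quantize analog samples in the range -1.0..1.0 to unsigned PCM bytes."""
    levels = 2 ** bits - 1
    return bytes(int((s + 1.0) / 2.0 * levels) for s in samples)

def packetize(bitstream, payload_size=188):
    """Split a bit stream into fixed-size blocks for placement in packets."""
    return [bitstream[i:i + payload_size]
            for i in range(0, len(bitstream), payload_size)]

# Example: one millisecond of a signal sampled at 8 kHz (8 samples).
analog = [0.0, 0.38, 0.71, 0.92, 1.0, 0.92, 0.71, 0.38]
pcm = digitize(analog)            # step 3: analog-to-digital conversion
compressed = zlib.compress(pcm)   # step 4: optional compression
blocks = packetize(compressed)    # step 5: blocks ready for IP packets
print(len(pcm), len(compressed), len(blocks))
```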

The term codec derives from its earliest use in voice communications.  When AT&T Bell Labs showed that voice could be converted from analog to digital for transmission, it called the device that converted the signal to digital a coder, and the device that recovered the signal at the destination a decoder. Hence, the combination was referred to as a codec.  When this happened in the 1950s and 1960s, the resulting bit stream was carried at 64 kb/s. Soon afterward came the idea that the digital stream could carry data as well as voice, and the data transmission industry was born.  Compression techniques were introduced so that many signals could be multiplexed onto a single wire.
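
That 64 kb/s figure follows directly from the PCM parameters used for telephone voice: 8,000 samples per second at 8 bits per sample. A quick arithmetic check:

```python
# Sanity check on the classic PCM voice rate: 8 kHz sampling x 8 bits/sample.
sample_rate_hz = 8000     # samples per second for telephone-quality voice
bits_per_sample = 8       # 8-bit quantization per sample
bitrate = sample_rate_hz * bits_per_sample
print(bitrate)            # 64000 bits/sec, i.e. 64 kb/s
```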

Some of the first video codecs were introduced by RealNetworks in 1995 for the RealPlayer and were popular in the early days of the World Wide Web.  Soon after, Adobe entered the competition with the Flash encoder and player.  At nearly the same time, Microsoft introduced the Windows Media video and audio codecs, which created compressed .wmv and .wma files for video and audio respectively. Those codecs were revised several times before settling into the version 9 family shipped with Windows Media Player 9 (WMP9).

Meanwhile, the standards groups were developing specifications in parallel with the broader video industry.  Most of the early specifications came from the MPEG group (mentioned in our last newsletter) and were targeted at the digitization, compression, and transport of TV signals over satellite and terrestrial links.  First there was MPEG-1, but that codec didn't support interlaced video.  For example, typical standard definition is 480i, the i referring to the fact that the video is delivered as interlaced half frames.  A short time later came MPEG-2, which could be used for both interlaced and progressive video.  It also provided a transport protocol called the MPEG-2 Transport Stream (also discussed in the last newsletter).  This format is used to store video files and for recording to secondary storage such as CDs and DVDs. The next generation of codecs compressed using a method described as H.264, which added wide flexibility in quality level and frame rate along with more advanced compression techniques.  It is currently, by far, the most popular of the standards-based compression codecs.
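
To make the interlaced/progressive distinction concrete, here is a small, purely illustrative Python sketch that weaves two half-frame fields (one carrying the even scan lines, one the odd) back into a full progressive frame, which is essentially what a deinterlacer must do with 480i content. The tiny eight-line "frame" is an assumption for illustration; a real 480i frame has 480 lines split into two 240-line fields.

```python
# Illustrative only: weave two interlaced fields (half frames) into one
# progressive frame. Each "line" here is just a placeholder string; a real
# 480i field would carry 240 of the frame's 480 scan lines.

def weave(top_field, bottom_field):
    """Interleave even-line and odd-line fields into a full progressive frame."""
    frame = []
    for even_line, odd_line in zip(top_field, bottom_field):
        frame.append(even_line)   # lines 0, 2, 4, ... come from the top field
        frame.append(odd_line)    # lines 1, 3, 5, ... come from the bottom field
    return frame

top = [f"scan line {n}" for n in range(0, 8, 2)]      # even lines of a tiny frame
bottom = [f"scan line {n}" for n in range(1, 8, 2)]   # odd lines of the same frame
print(weave(top, bottom))   # full frame: scan lines 0 through 7 in order
```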

Proprietary vs. Standards-Based Codecs

First, there are two types of standards: de jure and de facto.  A de jure standard is one that has undergone scrutiny and ratification by an industry group representing a broad consensus of the industry.  De facto standards are ones that simply develop a large following; they may be owned and licensed by a particular vendor who spent the time and money to develop the method.  H.264 is a de jure standard.  Adobe Flash is a de facto standard.  Google's VP8 and VP9 codecs are popular but are still de facto standards.  Often such de facto standards are later ratified into de jure standards.

Which Path to Take

So, which should you use: proprietary (de facto) or standardized (de jure)?  This is a controversial subject, but I lean heavily toward standardized codecs such as MPEG-2 and H.264, which come from standards bodies such as MPEG and the ITU.  Generally, you get wider interoperability and longer life from standards-based codecs. Recently, there have been web articles and discussions about the death of Flash, and many lesser-known de facto standards have been abandoned.  In my own experience, the last Windows 10 update my computer installed gave me a warning that the upgrade might cause certain .wmv and .wma files to become unplayable.  Meanwhile, the latest Windows Media Player fully supports the current H.264 standard.
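
If you want to confirm what a given file actually uses before building a workflow around it, a quick codec check helps. The sketch below assumes the ffprobe tool (part of ffmpeg) is installed and on the PATH; the file name sample.mp4 is a hypothetical placeholder, not a file referenced in this article.

```python
# A minimal sketch (assumes ffprobe from the ffmpeg tools is installed and on
# the PATH): report the video codec of a file so you can confirm whether it
# uses a standards-based codec such as H.264 before committing to it.

import subprocess

def video_codec(path):
    """Return the codec name of the first video stream in the file."""
    result = subprocess.run(
        ["ffprobe", "-v", "error",
         "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1",
         path],
        capture_output=True, text=True, check=True)
    return result.stdout.strip()

print(video_codec("sample.mp4"))   # hypothetical file; prints e.g. "h264"
```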
