

No Big Loss

BANDWIDTH ISN’T free, but is it fair to give up some image quality in order to use fewer network resources? That question isn’t new, but a software company just coming out of stealth mode thinks that it has a fresh answer.

Fig. 1. Qbit’s lossless compression technology shrinks files anywhere from 3:1 to 20:1 with no deterioration of image quality. These screen shots (fig. 1 & fig. 2) feature scenes from films such as “Madagascar” and “Treasure Island,” with tools covering attributes such as entropy, which is a measurement of the number of bits of “useful” information. “If a 16-bit image has entropy of 8, then it’s easy to replace the 16-bit image with an equivalent image with only 8 bits per pixel,” says Daniel Kilbank, Qbit’s co-founder. “That’s equivalent to reducing the file size by a factor of two.”
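Kilbank’s entropy arithmetic can be checked directly. The sketch below computes Shannon entropy over a synthetic pixel sequence; the image data is hypothetical, chosen so that a nominally 16-bit image carries only 8 bits of entropy per pixel, matching his factor-of-two example.

```python
import math
from collections import Counter

def entropy_bits(pixels):
    """Shannon entropy in bits per pixel of a pixel-value sequence."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical 16-bit image whose values actually span only 256 levels:
pixels = [i % 256 for i in range(65536)]
h = entropy_bits(pixels)  # 8.0 bits/pixel
ratio = 16 / h            # lossless size-reduction bound: 2.0
```

Entropy is a floor, not a recipe: it says an 8-bit-per-pixel equivalent exists, while the encoder still has to find it.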

In January, Qbit announced its first product: Q-Image, which the company says losslessly encodes and decodes raw raster data — the collection of values that can be assigned to each pixel in an image. Qbit also says that its techniques can be paired with other, widely used codecs — such as MPEG-2 and MPEG-4 — to compress images and video even more. “We can achieve a 25- to 30-percent gain losslessly on top of those existing codecs,” says Daniel Kilbank, president and CEO of the Bethesda, MD-based company.

Analysts who have been following the company say that its technology is promising, both in terms of what it can do and its market potential.

“Qbit’s technology can be applied to very large files, such as those used by companies that are exploring for oil fields, or medical services storing X-Ray and MRI files,” says Gerry Kaufhold, a principal analyst at In-Stat, a research firm based in Scottsdale, AZ. “Qbit’s technology reduces the sizes of these very large files but can recover the original image with zero loss. This is a very important and fruitful advantage.”

Waste not

By storing only the differences between successive frames, compression codecs such as MPEG-4 — named after the Moving Picture Experts Group — shrink video down to the point that it can be streamed over networks with bandwidth as low as 40 kb/s. Lossless compression uses a different means to the same end: As its name implies, it saves every bit rather than discarding the ones that other techniques consider redundant.
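The defining property of lossless compression is a bit-exact round trip. The sketch below demonstrates it with Python’s standard zlib (a Lempel-Ziv/DEFLATE coder) standing in for Qbit’s proprietary codec, which the article does not describe; the sample data is invented for illustration.

```python
import zlib

# Synthetic "raster" data: a repeating 256-value ramp, 16 KB total.
raw = bytes(range(256)) * 64

packed = zlib.compress(raw, level=9)
restored = zlib.decompress(packed)

assert restored == raw          # lossless: every bit is recovered
assert len(packed) < len(raw)   # and the stored file is still smaller
```

A lossy codec would fail the first assertion by design; the whole question is whether the second one can still be made dramatic without it.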

In the case of Qbit, a lossless approach doesn’t mean less compression. “It offers anywhere from three to 10X compression, depending on what flavor of the codec you’re using,” says Kilbank, who co-founded Qbit in July 2003, along with John Sculley, the former CEO of Apple Computer and Pepsi. And depending on source content, even greater compression ratios are possible, Kilbank says.

Fig. 2.

Qbit’s technology can be embedded in a wide variety of hardware that handles audio and video, including cameras, scanners, and set-top boxes. The company won’t reveal many details about how exactly it achieves that efficiency, mainly because the techniques are proprietary and patent pending. At a high level, Qbit forgoes common compression techniques such as fractal compression and discrete wavelet transforms. (For an overview of those techniques, see “The Future of Video Compression” in the September 2005 issue of Pro AV.)

Qbit divides the pixels into two priorities, but instead of discarding bits that the human eye won’t miss, it groups them into a low-priority transform. “As a final step, we’ve built an adaptive arithmetic encoder that takes those different pieces, assembles them back together, and your end result is a 100-percent lossless conversion,” Kilbank says.
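Qbit’s actual transform is proprietary, but the priority-splitting idea it describes can be sketched in miniature: divide each pixel into high- and low-priority parts, keep both, and reassemble them exactly. Everything below is illustrative, not Qbit’s method; the 4-bit split point is an arbitrary assumption.

```python
def split_planes(pixels, low_bits=4):
    """Split each pixel into a high-priority and a low-priority part.
    Illustrative only: nothing is discarded, so the split is reversible."""
    high = [p >> low_bits for p in pixels]
    low = [p & ((1 << low_bits) - 1) for p in pixels]
    return high, low

def merge_planes(high, low, low_bits=4):
    """Recombine the two parts into the original pixel values."""
    return [(h << low_bits) | l for h, l in zip(high, low)]

pixels = [0, 17, 128, 255]
high, low = split_planes(pixels)
assert merge_planes(high, low) == pixels  # 100-percent lossless round trip
```

A lossy codec would quantize or drop the low-priority plane; keeping it (and entropy-coding each plane with a model suited to its statistics) is what preserves the lossless guarantee.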

Qbit isn’t the only company that’s developed lossless compression, but it appears to have gone a step further.

“Hewlett Packard Labs (HP Labs) has been offering Lossless JPEG for some time,” says In-Stat’s Kaufhold. “It’s based on the Lempel-Ziv lossless algorithms. These work just fine, but they only achieve about a 30- to 50-percent savings. The Qbit algorithms can provide lossless compression of up to 20:1.”
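Kaufhold mixes two ways of stating the same thing — percent savings versus an N:1 ratio — so it helps to convert between them. The one-liner below makes the comparison concrete: a 2:1 ratio is the 50-percent savings he cites for classic lossless coders, while 20:1 leaves only 5 percent of the data.

```python
def pct_savings(ratio):
    """Percent size reduction implied by a compression ratio of N:1."""
    return 100 - 100 / ratio

two_to_one = pct_savings(2)      # 50.0 — the upper end of classic lossless gains
twenty_to_one = pct_savings(20)  # 95.0 — a 20:1 ratio keeps just 5 percent
```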

Help from Hollywood?

Qbit says that this approach has a side benefit of security because the transforms don’t contain information about the encoder’s parameters. So even if the data stream is intercepted, it can’t be decoded. Built-in security should help Qbit target the digital cinema market, where studios and exhibitors are looking for ways to ensure that movies can’t be, say, grabbed off of a theater’s server and pirated.

Qbit hopes to leverage the trend toward digital cinema, where movies are shipped to theaters over a broadband pipe rather than trucked in as reels of film. Although service providers have spent the past decade laying fiber all over North America, the cost of bandwidth still hasn’t fallen to the point that compression is no longer needed.

Plus, digital movies are huge files. During production, a single movie can produce 100 Terabytes or more of footage, depending on factors such as the film’s length. Without some form of compression, the raw footage would be unwieldy, especially if it has to be handled by multiple production staffs at different sites. The edited version for theatrical release is smaller, but at an average of 6 to 10 Terabytes, it’s still a handful for even the fattest broadband pipes.
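A quick back-of-envelope calculation shows why those file sizes strain even fat pipes. The link speed below is an assumed figure for illustration, not one from the article.

```python
TB = 10**12  # terabyte in bytes

def transfer_hours(size_bytes, mbps):
    """Hours needed to move size_bytes over a link of mbps megabits/second."""
    bits = size_bytes * 8
    seconds = bits / (mbps * 10**6)
    return seconds / 3600

# A 6 TB release over a hypothetical 100 Mb/s link:
hours = transfer_hours(6 * TB, 100)  # ≈ 133 hours — more than five days
```

Even a modest 3:1 lossless ratio would cut that to under two days, which is the economic argument in a nutshell.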

Even though studios want to eliminate overhead costs such as striking prints and shipping them, those savings can’t come at the expense of image quality. Hence the appeal of a compression technique that claims to preserve all of film’s nuances — even when there’s no film involved.

A de facto standard

Qbit currently is in due diligence with several major Hollywood studios. Adoption there would help validate the company’s technologies in the eyes of potential customers skeptical about using a proprietary technology rather than a standards-based solution. “It’s hugely important in terms of validation,” Kilbank says.


For more information about compression in pro AV, including digital cinema, check out:

The Future of Video Compression – MPEG-2, MPEG-4, MPEG-7, and beyond — the selection of video compression standards just keeps growing. Here’s an update on the latest developments in existing and emerging technologies, image quality, and bandwidth, September 2005

What’s Next: 3D On The Big Screen – Digital cinema is finally coming, and 3D hopes to ride it into the mainstream, June 2005

Cheat Sheet: Sending Rich Media Over IP – New video compression standards and IP networking techniques have made it possible to deliver high-quality video and audio reliably over local area networks (LANs) or wide area networks (WANs). Here’s how it’s done, April 2005

Qbit offers white papers, upon request, that provide more details about its lossless compression technologies, including how they compare to other solutions. For more information, visit

One way to build a following in Hollywood is to deliver on the ideal of a low-cost distribution system that doesn’t require a tradeoff in image quality. “The short term hurdle for digital cinema is they’re very limited in terms of technology that has the capability of maintaining the integrity of those files,” Kilbank says. “A lot of the common infrastructure for MPEG-4 and MPEG-2, where they’re just throwing away a lot of the information, isn’t suited to that new agenda.”

Besides winning over studios, Qbit also would have to get the blessing of key players such as the American Society of Cinematographers and high-profile directors who have sworn off film, such as James Cameron and George Lucas. Their support would help build a following on the production side that then could spill over into distribution. “It gives us an opportunity to become kind of a de facto standard in the back-office world and then pushing for either standardization or de facto standardization in the distribution model,” Kilbank says.

Qbit also hopes to increase its market potential by developing technologies that can work with existing hardware. “This could be hooked directly into a KONA capture card instead of running a fiber optic from a digital camera back to a storage area network and then doing compression afterward,” Kilbank says. “This could be done in real time, right off of the camera, right to a flash card. So in terms of upfront, on-the-fly compression and storage, it has a huge impact on their workflow.”

Although Qbit’s algorithms are robust enough to keep up with real-time video, the company says that its technology is still light enough for a lot of off-the-shelf hardware. “We’re less complex than JPEG-2000, so we’re able to embed on standard systems-on-a-chip hardware,” Kilbank says.

Adoption outside of pro AV is worth noting because it would help drive up volumes to the point that the technology’s price starts to fall, improving the chances that it can be used in a wider range of applications. On the consumer side, for example, Qbit’s technology could be used to wring more bandwidth out of DSL services.

“I believe that the key early adopters for Qbit’s technology will be professional or medical imaging storage applications, where the lossless nature fits their needs the best,” says In-Stat’s Kaufhold, who has been tracking the company since its inception.

In a sense, Qbit is similar to Qualcomm, which almost single-handedly championed CDMA from a proprietary, niche wireless technology into the basis for the networks used by carriers such as Sprint and Verizon Wireless. Qualcomm overcame skepticism by proving that its approach was more efficient than incumbent technologies — even those that were already widely adopted.

Says Kaufhold, “If Qbit’s technology really is as good as they claim it is, then they should be in a position to follow the Qualcomm model, in which a proprietary technology from a single-source vendor provides such powerful benefits that the world beats a path to their door.”

Tim Kridel is a freelance writer and analyst who covers telecom and technology. He’s based in Kansas City and can be reached at [email protected]
