ProAVmag

A Practical Approach To Live Streaming

How to ensure a successful live streaming event. 12/25/2006 7:34 AM Eastern


In August 2001, newspaper editors from across the state of Texas were able to question Gov. Rick Perry while sitting comfortably in their own offices, as part of the 4th Annual Texas Transportation Summit in Irving, TX. They could then see and hear him respond during an interactive webcast of the governor's statewide editorial board meeting, which was held in conjunction with the summit. This interactive forum is believed to be the first time a governor used live streaming technology to hold such a meeting. Let's take a look at what went into staging the event.

Pulling it all together

Prior to the Transportation Summit, Dallas, TX-based ViewCast Corp. arranged for access to two of six ISDN circuits at the site for dial-up, multi-stream video uplinks to Seattle-based Activate Corp. An analog telephone line at the site also offered dial-up laptop access to the Internet for receiving questions from remote participants. A pair of ViewCast Niagara portable streaming encoders, each coupled to one of the ISDN circuits via a Cisco 804 ISDN router, was used to encode three simultaneous streams as follows: ISDN-1 (28K audio only, and 56K audio and video with playback in a QCIF-sized display window); ISDN-2 (100K audio and video with a CIF-sized display window).

Dallas Edit/Lone Wolf Productions managed the live production. For image capture, a three-camera setup was used, with one camera locked down on the governor and moderator. A second camera, mounted on a low tripod in the center of the room, showed individuals from the local audience. A third, wide-angle, fixed camera was occasionally used for “crowd shots.” The camera feeds were mixed with a Videonics switcher and fed to the two Niagara encoders. ViewCast and Dallas Edit also supplied lighting, lavaliere microphones for the governor and the moderator, cameras, promotional clips, and static content — not to mention all the necessary audio and video mixing, encoding, uplink equipment, and services for the webcast.

How it works

This event is a real-world example of the kinds of opportunities that production companies, video companies, and a host of other related firms can offer their customers. Thanks to the evolution of streaming technology, these systems are becoming easier to install and operate.


Basically, you need a video source, an encoder, a streaming video server, a content delivery network, and a web server (see Fig. 1). Installation can be as easy as cabling a camera to the encoding station, which is then cabled to the streaming server. Access to a content delivery network and a web server is gained via the Internet. While most encoding stations are located in a broadcast studio, portable units that weigh less than 10 pounds are now available, enabling anytime, anywhere streaming video.

The video source is typically one or more streams of analog video from cameras, DVD players, or VCRs. These video sources have an analog video connection to the video encoding station. It's common for live broadcasts to connect the cameras to video production and editing equipment, or to an interactive video network, before being passed on to the encoding station.

The encoding station is a computer workstation that captures and encodes the video and audio. These systems must have the power to encode one or more video and audio streams, either in software or via a hardware codec. Individual compressed streams can vary from 20 kilobits per second (Kb/s) to 500 Kb/s or more. The connection between the encoding station and the video streaming server must accommodate the combined bandwidth of the individual streams, and it must be a clear, reliable connection.

The video streaming server is responsible for delivering compressed video in response to each individual request for a particular video stream. This is usually handled by one of the commercial streaming media software packages, such as RealNetworks' Helix Server or Microsoft's Windows Media Services. The bandwidth connection to the video streaming server must accommodate the total bandwidth of all the requests for a video stream (unlike the encoding station, which must accommodate only one copy of each).

As a result, the video streaming server usually has a direct connection to a very high bandwidth line. For example, if there were 100 requests for a video stream compressed at 28.8 Kb/s, the server would require at least a 3 Mb/s connection. It's possible for the encoding station and the video streaming server to be one single system. However, this would typically be for a situation with limited performance requirements (e.g. a single input stream and a small number of viewer requests) and would still require a fairly high-performance system.
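That arithmetic is worth making explicit. Here's a minimal C++ sketch of the two bandwidth calculations; the stream rate and viewer count are the illustrative figures from the example above, not measurements:

    #include <iostream>

    int main() {
        // Illustrative values from the example above: one 28.8 Kb/s
        // stream, 100 simultaneous viewer requests.
        const double streamKbps = 28.8;
        const int    viewers    = 100;

        // The streaming server carries one copy of the stream per viewer...
        const double serverKbps = streamKbps * viewers;   // 2,880 Kb/s

        // ...while the encoding station uplinks only a single copy.
        const double encoderKbps = streamKbps;

        std::cout << "Server connection:  ~" << serverKbps / 1000.0
                  << " Mb/s (round up to at least 3 Mb/s)\n";
        std::cout << "Encoder connection: ~" << encoderKbps << " Kb/s\n";
        return 0;
    }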

In practice, it's much more common to have two separate systems. To make the streaming process easier, to ensure the best-quality video for viewers, and to streamline podcasting, ViewCast recommends using a content delivery network (CDN).





The web server for video streaming is no different from any other web server. The website contains a URL link to the video streaming server for each available video stream; typically, the viewer simply selects the link on the web page.

The systems requesting a video stream over the Internet (or an intranet) must have playback capabilities. Microsoft offers the most commonly used player, but Apple (QuickTime), RealNetworks, and others offer players as well. Some video streaming applications are implemented so that a player download is offered along with the stream. The players can be downloaded for free.

Making streaming a success

There are five keys to successful live streaming video, and they apply to both live and on-demand events (when content is played back at a later time):

  • The most critical part of the webcast is the uplink quality. It must be a continuous, uninterrupted link. It's possible — and common practice — to use the Internet, but it needs to be a broadband link, and you can run performance speed tests to verify it. A good rule of thumb: the uplink bandwidth needs to be at least four times the bit rate of your feed (see the sketch after this list).
  • The second most critical element is production value. Proper lighting, sound pickup, mixing, multi-camera use, and the use of teleprompters when needed are key elements of good production. This can be done with low-cost equipment, including new PDA-driven teleprompters for semi-pro cameras, such as those available from Canon and JVC.
  • Pre-arrange the network uplink with the content delivery network, and test the link in advance of the broadcast. Most delivery network providers will offer guidance on test methods, giving the end-user a test board and test feed to ensure everything runs smoothly.
  • Don't use ISDN for uplinks unless there's no other way; ISDN circuits are very difficult to work with. Because these links are often misconfigured by the telephone company that provides them, it's difficult to determine exactly what they're capable of. In 2001, ISDN was one of the only uplinks you could get; today, it's actually one of the least available options.
  • Have before and after content ready to stream to provide some flexibility in the schedule and give participants time to log in and get comfortable.
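The 4x rule of thumb from the first bullet is easy to sanity-check against a speed-test result. Here's a minimal C++ sketch; the feed rate and measured uplink speed below are placeholders, not recommendations:

    #include <iostream>

    // Rule of thumb from the list above: the uplink should be at least
    // four times the bit rate of the feed.
    bool uplinkIsAdequate(double feedKbps, double measuredUplinkKbps) {
        const double headroom = 4.0;
        return measuredUplinkKbps >= headroom * feedKbps;
    }

    int main() {
        const double feedKbps   = 100.0;  // e.g., the 100K stream described earlier
        const double uplinkKbps = 512.0;  // placeholder: substitute your speed-test result

        std::cout << (uplinkIsAdequate(feedKbps, uplinkKbps)
                          ? "Uplink passes the 4x rule of thumb.\n"
                          : "Uplink is too slow for this feed.\n");
        return 0;
    }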
Technology takes great strides

Live streaming media on the web is easier today than ever to provide and to view. Companies, schools, and government agencies are quickly learning the advantages of offering live or prerecorded video to their various audiences.

The fact that streaming media via the web is a fast, cost-effective way to provide direct contact with various constituencies means the demand will continue to grow. No other technology offers such a rich, convenient means for a host of entities to deliver their message. That's why systems integrators and pro AV content producers should embrace the technology and let it take them as far as it can.

CURES FOR COMMON LIVE STREAMING CHALLENGES

Despite the evolution of easier-to-install systems, problems do occur. Here are some of the most common problems — and their solutions:

Audio and video are out of sync. Audio/video capture cards perform audio/video capture only; AV synchronization is performed by the application. A few issues with the source video device can cause an application to lose AV sync. During capture, the audio/video capture card locks to the timing of the incoming source signal, which allows the card to properly time the video and audio sampling so the two are taken together. If the source video signal is unstable, or if the input is switched between different sources during capture (and those sources aren't locked to a common clock), then AV sync loss can occur. You may need to place a time base corrector (TBC) or frame synchronizer between the source and the card input to correct these problems.

For some cards, when capturing from digital sources via SDI, DV, etc., the sampling rate of the audio must match the sampling rate of the source. Some cards also require that the digital source be started before the card starts capturing.

Is it possible to extract Teletext captioning from a PAL source, the way closed captioning can be extracted from an NTSC source with audio/video capture cards? Closed captioning (CC) is specific to NTSC; the PAL standard doesn't support CC. Instead, PAL supports a scheme called Teletext, which is transmitted in the vertical blanking interval (VBI) portion of the analog signal. Most cards capture this part of the signal and output the raw VBI data from any analog source connected to them, from which Teletext (or any other kind of data that happens to be there) can be extracted. Processing raw VBI data into the desired form requires a custom application; the format of the data is defined not by the card but by the relevant published specification. Raw VBI data is captured on most cards in both NTSC and PAL modes.
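Because that processing is left to the application, any example is necessarily schematic. The C++ sketch below assumes the capture layer has already sliced one VBI line from raw samples into bytes (a hypothetical input format; real cards deliver raw samples that need clock recovery first) and scans for the Teletext framing code, given in the Teletext specification as 0xE4, to locate the 42-byte packet that follows:

    #include <cstddef>
    #include <cstdint>
    #include <optional>
    #include <vector>

    // Hypothetical input: one VBI line already sliced to bytes.
    // In EBU Teletext, a data line carries a clock run-in, a framing
    // code (0xE4), and then a 42-byte packet whose first two bytes are
    // the Hamming-coded magazine/row address (decoding omitted here).
    std::optional<std::vector<uint8_t>> extractTeletextPacket(
            const std::vector<uint8_t>& vbiLine) {
        const uint8_t framingCode = 0xE4;
        const size_t  packetLen   = 42;

        for (size_t i = 0; i + 1 + packetLen <= vbiLine.size(); ++i) {
            if (vbiLine[i] == framingCode) {
                return std::vector<uint8_t>(vbiLine.begin() + i + 1,
                                            vbiLine.begin() + i + 1 + packetLen);
            }
        }
        return std::nullopt;  // no Teletext packet found on this line
    }

    int main() {
        std::vector<uint8_t> line(64, 0x00);
        line[10] = 0xE4;  // fake framing code for demonstration
        return extractTeletextPacket(line) ? 0 : 1;
    }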

No audio is recorded. This could be related to the primary card preferences in the Windows control panel. To make certain that the audio/video capture card is the preferred recording device in Windows, do the following:




  • Using Windows Server 2003, you must make sure that audio services are enabled. To do this, go to Start, Control Panel, and Sounds and Audio Devices. If audio services are disabled (the default), this window will contain only a single check box that says “enable.” Check this box, press “OK,” and you'll be prompted to reboot. If the service is already enabled, you'll see the usual tabbed Sounds and Audio Devices window.
  • If you're connecting with Microsoft Remote Desktop, you must apply the following settings: when starting Remote Desktop, click the “Options” button; on the Local Resources tab, there's a setting for “Remote Computer Sound.” The default is “Bring to this Computer.” Change this setting to “Leave at Remote Computer.”
How can I create custom software applications with audio/video capture cards?

Custom application development on many cards is done using the Microsoft DirectShow SDK. DirectShow is a subset of DirectX and is primarily designed for the C/C++ programming language. Specific details and specs about DirectShow can be obtained from the Microsoft DirectX website, the Microsoft Developer Network site, and the DirectX/DirectShow documentation. Your card supplier should provide developer documentation about specific interfaces and functionality, as well as sample applications with source code in C++.
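To give a feel for what that development looks like, here's a minimal C++ sketch (not vendor code) that uses standard DirectShow COM interfaces to list the video capture devices Windows knows about; a real application would go on to build a filter graph around the chosen device:

    #include <dshow.h>
    #include <iostream>
    #pragma comment(lib, "strmiids.lib")
    #pragma comment(lib, "ole32.lib")
    #pragma comment(lib, "oleaut32.lib")

    int main() {
        CoInitialize(NULL);

        // Ask DirectShow for an enumerator over video capture devices.
        ICreateDevEnum* devEnum = NULL;
        HRESULT hr = CoCreateInstance(CLSID_SystemDeviceEnum, NULL,
                                      CLSCTX_INPROC_SERVER, IID_ICreateDevEnum,
                                      (void**)&devEnum);
        if (SUCCEEDED(hr)) {
            IEnumMoniker* monikers = NULL;
            // S_FALSE (not a failure) means the category is empty.
            if (devEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory,
                                               &monikers, 0) == S_OK) {
                IMoniker* moniker = NULL;
                while (monikers->Next(1, &moniker, NULL) == S_OK) {
                    IPropertyBag* props = NULL;
                    if (SUCCEEDED(moniker->BindToStorage(0, 0, IID_IPropertyBag,
                                                         (void**)&props))) {
                        VARIANT name;
                        VariantInit(&name);
                        // "FriendlyName" is the human-readable device name.
                        if (SUCCEEDED(props->Read(L"FriendlyName", &name, 0)))
                            std::wcout << name.bstrVal << std::endl;
                        VariantClear(&name);
                        props->Release();
                    }
                    moniker->Release();
                }
                monikers->Release();
            }
            devEnum->Release();
        }

        CoUninitialize();
        return 0;
    }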

I'm trying to capture from my audio/video capture card using Premiere Pro, but I can't select the device. Why? Adobe Premiere Pro (version 7.x) requires a special Premiere plug-in, which in turn requires card-specific WDM-based drivers for Windows XP or Windows 2003. This type of driver is not available for Windows 2000.

My video preview goes black during capture, but the encoded stream is fine. When capture is stopped, the preview comes back. Why? This is normal behavior, by design and in accordance with Microsoft's recommendations, and not a problem or bug with the card or the driver. The driver is somewhat restricted in the combinations of capture and preview video it can produce at the same time. If the capture pin or the preview pin alone is used, the driver can produce video in any size and rate.

Mark Hershey is vice president of engineering, ViewCast Corp., Plano, TX. He has served in various roles, including customer service, product management, marketing, and engineering management at Collins Radio/Rockwell, VMX, and Intecom. He can be reached at markh@viewcast.com.


