Phil Hippensteel on The Effect of Network Delay

In our last newsletter, we discussed the important differences that result from using TCP (Transmission Control Protocol) rather than UDP (User Datagram Protocol) as the transport protocol for IP video. You will recall that TCP guarantees delivery and adapts automatically to network conditions and to competing traffic levels. UDP, on the other hand, simply sends packets as fast as the application demands, without regard to the quality of the network or the level of traffic on it. In this newsletter, we focus on one network impairment, latency, and investigate its effect on the transfer of IP video.

First, let’s suppose the video is one of the common types that use UDP, such as IPTV, digital signage streams, or video conferencing. When the source application creates the stream, it sends it to the network at the bit rate of the video. Let’s say it is standard-definition video that is part of a video conference session, and suppose the actual playout rate is 3 Mb/sec. The sending encoder creates IP packets by adding an IP, UDP, and RTP header to each block of video payload. The headers add 40 bytes to each block, and the payload is typically 600-1200 bytes, so the headers add roughly 3-7% to the required bandwidth when the traffic enters the network. Consequently, about 3.1 to 3.2 Mb/sec is actually added to the network load. As long as the available network bandwidth averages at or above 3.2 Mb/sec, the video should play smoothly.
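To make the arithmetic concrete, here is a minimal sketch of the overhead calculation, using only the figures above (a 3 Mb/sec playout rate, 40 bytes of IP/UDP/RTP headers, and 600-1200 byte payloads):

```python
# Rough header-overhead estimate for a 3 Mb/s UDP video stream.
# Header sizes: IP (20 bytes) + UDP (8) + RTP (12) = 40 bytes per packet.

HEADER_BYTES = 20 + 8 + 12      # IP + UDP + RTP
PLAYOUT_RATE_BPS = 3_000_000    # nominal playout rate of the video

for payload_bytes in (600, 1200):
    overhead = HEADER_BYTES / payload_bytes
    wire_rate = PLAYOUT_RATE_BPS * (1 + overhead)
    print(f"{payload_bytes}-byte payload: "
          f"{overhead:.1%} overhead, about {wire_rate / 1e6:.2f} Mb/s on the wire")
```

Running this prints about 6.7% overhead (3.20 Mb/sec) for 600-byte payloads and about 3.3% (3.10 Mb/sec) for 1200-byte payloads.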

However, at the receiving end there is a receive buffer that temporarily holds the incoming video packets. If the network suddenly introduces significant delay, this buffer can empty and the picture will pause because there is nothing left to play out. Another impact of such a delay is that the network might drop packets, caused by overfilled buffers in the routers, switches, and gateways. These losses will degrade the picture quality but will not add to the problem of paused video.
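The pause is easiest to see with a toy model of the receive (playout) buffer. In the sketch below, the 200 ms of buffered video and the stall durations are illustrative assumptions, not figures from a real player:

```python
# Toy playout-buffer model: packets arrive from the network and are drained
# at the video's constant playout rate. A burst of network delay stalls
# arrivals; if the buffer empties, playback pauses (an underrun).

PLAYOUT_RATE_BPS = 3_000_000          # drain rate (the video's bit rate)
BUFFER_START_BITS = 3_000_000 * 0.2   # assume 200 ms of video already buffered

def simulate(stall_ms, step_ms=1):
    buffered = BUFFER_START_BITS
    for t in range(1000):  # simulate one second in 1 ms steps
        arriving = 0 if t < stall_ms else PLAYOUT_RATE_BPS * step_ms / 1000
        buffered += arriving - PLAYOUT_RATE_BPS * step_ms / 1000
        if buffered <= 0:
            return f"{stall_ms} ms stall: buffer empty at t={t} ms -> playback pauses"
    return f"{stall_ms} ms stall: buffer never empties, playback stays smooth"

for stall in (100, 300):
    print(simulate(stall))
```

With 200 ms of video buffered, a 100 ms stall goes unnoticed, but a 300 ms stall drains the buffer and the picture freezes.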

The impact of delay, or latency, on TCP video is completely different. TCP video is most often web-streamed video or video being sent to a recording device such as a DVR. First, network delay causes the TCP transmit rate to drop, because the rate at which TCP sends packets is governed by the rate at which the destination acknowledges receipt of those packets. If the network and the receiver are responding slowly, the sending TCP application will decrease its transmit rate. Now suppose there is also competing traffic on the links used by the TCP flow. That traffic delays both the delivery of the video and the acknowledgements of the delivered packets, so competing traffic likewise causes TCP to lower its transmit rate.
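One way to see why latency alone slows TCP is the window/RTT bound: a sender can keep at most one window of unacknowledged data in flight, so throughput cannot exceed the window size divided by the round-trip time. The 64 KB window in this sketch is an illustrative assumption:

```python
# TCP throughput is bounded by (window size) / (round-trip time): the sender
# can only keep one window of unacknowledged data in flight, so as delay
# grows the achievable rate falls even with no packet loss at all.

WINDOW_BYTES = 64 * 1024  # assume a 64 KB window (no window scaling)

for rtt_ms in (10, 50, 100, 200):
    max_rate_bps = WINDOW_BYTES * 8 / (rtt_ms / 1000)
    print(f"RTT {rtt_ms:>3} ms -> at most {max_rate_bps / 1e6:.1f} Mb/s")
```

Under that assumption, a path with a 100 ms round-trip time caps the connection at roughly 5 Mb/sec no matter how much link bandwidth is available.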

This behavior of TCP can create a significant issue on access links to the Internet. These links are often asymmetrical, with a download speed ten times the upload speed. Consequently, when you are playing Netflix on a 10 Mb/sec downstream link, you assume you have plenty of spare bandwidth. However, as noted above, your upstream link carries the acknowledgements, and it is very small by comparison. If someone else in the household starts an upload to YouTube while you are watching Netflix, you will likely have an unsatisfactory viewing experience.
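A back-of-the-envelope estimate shows how little upstream bandwidth the acknowledgements themselves need, and why a competing upload still hurts. The 1500-byte segments and one-ACK-per-two-segments behavior assumed here are typical but illustrative values:

```python
# Rough estimate of the upstream ACK traffic generated by a TCP download.
# Assumes 1500-byte segments and one 40-byte ACK for every two segments
# (typical delayed-ACK behavior); both values are illustrative.

DOWNLOAD_BPS = 10_000_000   # 10 Mb/s downstream video stream
SEGMENT_BYTES = 1500
ACK_BYTES = 40              # IP + TCP header, no options

segments_per_sec = DOWNLOAD_BPS / (SEGMENT_BYTES * 8)
acks_per_sec = segments_per_sec / 2
ack_bps = acks_per_sec * ACK_BYTES * 8

print(f"{segments_per_sec:.0f} segments/s downstream -> "
      f"{acks_per_sec:.0f} ACKs/s upstream, about {ack_bps / 1e3:.0f} kb/s")
```

Roughly 130 kb/sec of ACK traffic is tiny, but if a competing upload saturates the upstream link, those ACKs queue behind it; the round-trip time grows, and, as described above, the download rate of the video falls.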
