Streaming media is slowly driving traditional broadcast TV out of the market – it offers unparalleled convenience, quality and customizability. Platforms like Netflix, Hulu and HBO Go spoil their users with a plethora of titles available at the click of a button. This gives ample reason to cut the cord and never look back.
Today’s streaming technology has evolved to the point where it supports real-time live broadcasts, letting people share the video experience together – watching concerts, matches and other live shows as the event unfolds. Until recently, this was only possible with traditional cable TV.
But there is one thing that pops the bubble – there is no true real-time in video streaming. The “live” stream you watch from the comfort of your sofa may run up to 30 seconds behind what the audience at the venue is seeing. This is latency – the time it takes for video captured by the camera to be processed and displayed on a viewer’s screen.
Let’s have a closer look at the origins of the issue and find out why low latency is so important in streaming.

What Is Low Latency?

In streaming, latency is introduced at multiple stages of the pipeline: the type of camera and encoder used, the upstream network, the streaming server, and, at the viewer’s end, network conditions (buffering) and the video player itself. Each stage contributes its own delay, and those contributions add up, as the rough sketch below illustrates.
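To make that concrete, here is a minimal sketch in Python of how per-stage delays accumulate into the total “glass-to-glass” latency a viewer experiences. The stage names and numbers are illustrative assumptions, not measurements of any real setup.

```python
# Rough illustration of how end-to-end ("glass-to-glass") latency accumulates
# across a streaming pipeline. All per-stage values are hypothetical placeholders.

PIPELINE_DELAYS_S = {
    "capture_and_encode": 1.0,   # camera capture + encoder processing
    "first_mile_upload": 0.5,    # upstream network to the streaming server
    "server_processing": 2.0,    # transcoding / packaging on the server
    "cdn_delivery": 0.5,         # delivery over the last mile
    "player_buffer": 6.0,        # buffering in the viewer's video player
}

def glass_to_glass_latency(delays: dict[str, float]) -> float:
    """Total latency is simply the sum of every stage's contribution."""
    return sum(delays.values())

if __name__ == "__main__":
    for stage, seconds in PIPELINE_DELAYS_S.items():
        print(f"{stage:>20}: {seconds:4.1f} s")
    print(f"{'total':>20}: {glass_to_glass_latency(PIPELINE_DELAYS_S):4.1f} s")
```

Shaving the total down means reducing delay at every stage, and the player buffer alone can easily dominate the budget.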

Of course, what passes as low latency is quite subjective. HD cable TV typically runs with around a 5-second delay, while Apple’s own HLS streaming protocol defaults to 30-45 seconds. If we take cable TV as the benchmark, low latency would be anything under five seconds.
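Much of that HLS figure follows from segment-based delivery. As a back-of-the-envelope estimate (assuming roughly 10-second segments and a player that buffers about three segments before starting playback – common defaults rather than numbers from the spec):

```python
# Back-of-the-envelope estimate of classic HLS latency.
# Segment duration and buffer depth are assumed common defaults, not spec values.
segment_duration_s = 10    # assumed length of each media segment
buffered_segments = 3      # players typically buffer a few segments before playback
encode_and_delivery_s = 3  # rough allowance for encoding, packaging and delivery

estimated_latency_s = segment_duration_s * buffered_segments + encode_and_delivery_s
print(f"Estimated HLS latency: ~{estimated_latency_s} s")  # ~33 s
```

Shorter segments reduce the figure, but the segment-and-buffer approach makes it hard to get anywhere near real time without a different delivery strategy.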

Why and When Does It Matter?

In most situations, a few seconds’ delay wouldn’t make any difference to the viewer. But there are certain use cases that demand very low latency. The main situations where it matters include:

Social importance of low latency

Keeping latency low and making streaming video as close to real time as possible allows users to engage and interact with each other (e.g. on social media) as the streamed event unfolds. Take, for example, watching live events like football matches. The higher the latency, the greater the disconnect between the comments and the video.

In extreme cases (e.g. around a minute of latency), some viewers may even get spoilers – for instance, when commenters on a lower-latency feed have already seen a goal scored while everyone else is still waiting for the play.

Betting and real-time buying scenarios

In time-sensitive cases where you compete against others via live video, seconds are of critical importance. Think racetrack betting or live auctions – placing the final bid depends on perfect timing. Without near real-time video synchronization, taking part in a highly competitive auction largely defeats the purpose.

For instance, some horse-racing events use high-speed satellite feeds to allow people around the world to bet on horses online. However, most sites that offer online betting display disclaimers about external factors and decline responsibility for specific technical issues – latency included.

Low latency eliminates excessive delays and gives everyone an equal opportunity to place their bets. This is especially important because big money is at stake in some online auctions, and any undue delay can cause a bid to register too late. In such cases, time is money (literally).

Live video interviews

Remember the TV interview where the reporter spoke to someone at a remote location, and the other person sat silent for what felt like ages before answering the question? It wasn’t because the interviewee couldn’t hear or had nothing to say – it was a few seconds of latency at play. Roughly 150 milliseconds of delay in either direction is generally considered the upper limit for natural conversation: short enough to keep the exchange smooth, without awkward pauses.

Second screen apps

If you’re watching a live event on a second-screen app (such as a sports league or official network app), you’re likely running several seconds behind live TV. While there’s inherent latency in the television broadcast itself, the second-screen app must match that same level of latency to deliver a consistent viewing experience.

For example, if you’re watching your alma mater play in a rivalry game, you don’t want your experience spoiled by comments, notifications or even the neighbors next door celebrating the game-winning score before you see it. When that happens, the result is unhappy fans and dissatisfied (often paying) customers.

Video game streaming and e-sports

Game streaming has been gaining steam in recent years – thanks in large part to the growing popularity of Twitch. It also serves as a great example of why low latency matters: games are typically fast-paced experiences, and even a few seconds of delay can make excited fans’ comments completely irrelevant.

Low Latency Is Really a Thing Now

We attended this year’s NAB Show and stumbled upon an interesting presentation about low latency: Delivering Ultra-Low Latency at Scale with Reduced Bandwidth. The fact that entire talks are devoted to protocols for delivering low-latency video shows how current and important the topic is. Many companies are working on ways to lower streaming latency while keeping video delivery scalable.