What is latency?

In simple terms, latency (or lag) is delay.

When you're live streaming, you're capturing video and broadcasting it over the internet to your viewers.

Latency is the amount of time between:

Your camera recording your video . . .

. . . and your viewers being able to see what you just recorded.

Latency and your internet bandwidth

You can see your current latency on an online speed test. Just look for the "ping" or ping rate.
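If you'd rather measure it yourself, here's a minimal Python sketch that times a TCP connection handshake, a rough stand-in for ping (the hostname in the comment is just a placeholder; any server you can reach works):

```python
import socket
import time

def tcp_ping(host: str, port: int = 443) -> float:
    """Time (in ms) to open a TCP connection -- a rough latency estimate."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # connection opened; we only care how long the handshake took
    return (time.perf_counter() - start) * 1000

# e.g. tcp_ping("example.com") -- typically tens of ms on a wired connection
```

This measures a round trip, the same thing the "ping" number on a speed test reports.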

Note that your latency isn't really affected by your internet speed. Many confuse the two, but they're quite different.

Getting higher-speed internet won't reduce your latency. It might make a page load faster, and it might let you stream higher-quality video. But your livestream will still be delayed.

Broadcasting is basically sending a stream of data along a connection - so think of it like water going down a pipe.

  • Bandwidth is how wide the pipe is. A wider pipe lets more water travel through it. This is the theoretical maximum amount of data that can travel through the connection at any time.
  • Latency is how long it takes to go from one end of the pipe to the other. It's the time difference between turning on the water supply and actual water coming out the other end of the pipe.

And while we're on the subject:

  • Throughput is the actual amount of data transferred over a given time period, and it's usually less than the bandwidth. Maybe your pipe has a lot of bends slowing the water down. Maybe there's a leak or obstruction. Maybe your water source doesn't pump enough pressure to fill the pipe.
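To see how the three differ in practice, here's a quick back-of-the-envelope calculation (every number below is hypothetical, chosen just to illustrate the pipe analogy):

```python
# All numbers are hypothetical -- a worked version of the pipe analogy.
latency_s = 0.05          # 50 ms: time for the first byte to reach the far end
bandwidth_mbps = 100      # width of the pipe: the theoretical maximum
throughput_mbps = 60      # what actually gets through (bends, leaks, weak pump)

video_megabits = 8 * 300  # a 300 MB video segment is 2400 megabits

# Total delivery time: wait for the first byte, then drain the pipe.
total_s = latency_s + video_megabits / throughput_mbps
print(f"{total_s:.2f} s")  # 40.05 s -- throughput, not bandwidth, sets the pace
```

Notice that latency only adds a fixed 50 ms up front; the bulk of the wait is set by throughput, which is why a "faster" plan doesn't fix lag.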

Why do streamers have latency

  • Connection type: The type of connection affects both latency and speed. Fiber-optic connections, for example, have lower latency than wireless internet.
  • Encoding: A lot depends on the encoder; it needs to be optimized to send a signal to the receiving device with as little delay as possible.
  • Video format: Larger file sizes mean that it will take longer to transmit the file via the internet, increasing the streaming latency.
  • Distance: Your videos can have an increased delay if you are located far away from your ISP, internet hub, or satellites.

What's a "good" latency

When streaming live, you want to broadcast as close to real-time as possible. But because of the way the internet works there will always be some latency.

So how much is ok?

The truth is there is no standard for live streaming latency. When someone says "low latency", they mean lower than the average in their particular industry.

Even a live TV broadcast (think Super Bowl) has around a five-second latency, due to little delays building up over the transmission chain.

So what's the industry average you should be aiming for? Most online streamers see a latency of 30 seconds to a minute. When surveyed, 53% of video developers said they expect to achieve a latency of less than five seconds. That's the benchmark for "low latency" in live streaming.

Do you really need low latency?

For most live streamers today, not really. If you're teaching, demoing, or just entertaining, chances are no one will notice any lag.

The only exception is when there's a need for two-way communication. For example:

  • Q&A sessions - Say you're hosting a live Q&A session where the audience asks you questions by chat and you respond in the video. With a 30-second latency:
    • it takes 30 seconds for viewers to know you've opened the floor to questions
    • it takes another 30 seconds after they've typed a question for you to start responding
    • it takes another 30 seconds after you've started responding before your viewers can see what you're saying. Meaning you'd have to stall for 90 seconds for each question.
  • Online gambling - Anyone who's live on location has an advantage over online bettors, because latency means they know what's happening before anyone else does.
  • Live Auctions - With high latency, bidders won't have time to respond to raises. Here's what happens:
    • Bidder 1: I bid 200.
    • Auctioneer: We have 200. Do I hear 250? . . . anyone has 250? Final call for 250. Going once. Going twice. Sold for 200 to . . .
    • Bidder 2: I bid 250.
    • Auctioneer: . . . well. . . . this is awkward.
  • Video calls and chat - Same as live auctions: you want to talk to your friends in real time, not sit around awkwardly waiting for each other to hear what was said.
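The Q&A arithmetic above generalizes: every exchange costs three one-way trips at your stream's latency. A quick sketch:

```python
def qna_stall(latency_s: float) -> float:
    """Stall per question: your prompt reaches viewers, their question reaches
    you, and your answer travels back -- three one-way trips at stream latency."""
    return 3 * latency_s

print(qna_stall(30.0))  # 90.0 -- at 30 s latency, 90 s of stalling per question
print(qna_stall(5.0))   # 15.0 -- at "low latency" (5 s) the wait shrinks to 15 s
```

This is why two-way formats are the one place where chasing lower latency really pays off.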

How do I get lower latency?

  • Switch to a hard line: Wi-Fi routers add latency that old-fashioned Ethernet cables don't. We love wireless too, but we recommend an Ethernet connection for your main broadcast device.
  • Switch to old-fashioned fiber: An Ethernet line won't help if it feeds into satellite internet. No one likes stringing wires all over the house, but satellites are tens of thousands of miles away in space. Every request has to go to the satellite, then to the main ISP hub and the internet, and then back again, multiple times. As a result, satellite connections have very high ping rates.
  • Reduce network congestion: Latency is also affected by what else your network is doing. If you've got three downloads running and your neighbor or roommate is streaming Netflix on your connection at the same time, you'll see a lot more lag. Either kick every other device off your connection or get a dedicated streaming connection.
  • Upgrade your hardware: Even with the right connection and no congestion, you might get a bit of lag simply because your hardware can't keep up. Every device, from your laptop to your router, has hard limits on how much data it can transmit. You could benefit from upgrading your network devices.
  • Get closer to your viewers: The more distance your stream has to travel, the more latency your viewers will see. To reduce latency, you can use a CDN to get closer to your viewers.
  • Use a faster encoder: Your video encoder converts your live video into a compressed stream before transmitting, and this adds to the latency. You can cut down on lag with a dedicated high-powered machine for your software encoder, or a high-speed hardware encoder.
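Each fix above trims a different slice of the total delay. Here's a hypothetical glass-to-glass latency budget (every figure below is made up for illustration, not measured):

```python
# Hypothetical latency budget -- all figures are illustrative, not measured.
budget_s = {
    "capture + encode": 1.0,   # a faster encoder shrinks this
    "first-mile upload": 0.5,  # hard line, fiber, less congestion
    "CDN propagation": 1.0,    # getting closer to your viewers
    "player buffering": 2.5,   # often the biggest slice
}
print(f"{sum(budget_s.values()):.1f} s glass-to-glass")  # 5.0 s
```

Thinking in terms of a budget like this helps you spend effort where the delay actually lives instead of upgrading everything at once.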

The streaming protocol hack

You can lower the latency of your live streams by upgrading your connection and hardware, but that's a pretty big investment.

A much easier way is to change the streaming protocols you use.

In layman’s terms, a streaming protocol is the way data travels from one device or system to another. These protocols are layered on top of one another to form a protocol stack. That way, protocols at each layer can focus on a specific function and cooperate with each other.

Here are the major streaming protocols in use today, and which to use:


WebRTC

✔️ High speed ❌ Low quality
✔️ Low latency ❌ No CDN support

Ideal for: real-time data transfer and video conferencing

RTMP

✔️ High speed ✔️ High quality
✔️ Low latency ❌ Uses Flash

Ideal for: high-speed video transfer to people near you

HLS

✔️ High speed ✔️ High quality
✔️ Low latency ✔️ CDN support

Ideal for: live streaming in high quality

The quick and easy way to lower your latency

If you want to stream in high quality, with low latency, to multiple channels at once, your best bet is a live streaming service, and there's none better than Castr.io.

Castr supports HLS, and gives you Akamai CDN support out of the box. You also get your own embeddable player so you can stream from your own website, as well as every other streaming platform out there.

And the best part? Castr can get your latency as low as 5 seconds. That's the closest to real time streaming anyone can get today. Try it free for 7 days.