How to Get Low Latency on Your Live Streams
What is Latency?
In simple terms: latency or lag is delay.
When you’re live streaming, you’re capturing video and broadcasting it over the internet to your viewers.
Latency is the amount of time between:
1. Your camera recording your video
2. and your viewers being able to see what you just recorded.
Latency and Internet Bandwidth
You can see your current latency on an online speed test. Just look for the “ping” or ping rate.
Note that your latency isn’t really determined by your internet speed. Many people confuse the two terms, but they’re quite different.
Upgrading to higher-speed internet won’t reduce your latency. A faster connection might make pages load quicker and let you stream higher-quality video, but your livestream will still be delayed by the same amount.
Broadcasting is basically sending a stream of data along a connection: think of it as water going down a pipe.
Bandwidth is how wide the pipe is. A wider pipe lets more water through. It’s the theoretical maximum amount of data that can travel through the connection per second.
Latency is how long it takes to get from one end of the pipe to the other. It’s the time difference between turning on the water supply and water actually coming out the far end of the pipe.
And while we’re on the subject:
Throughput is the actual amount of data transferred over a given period, and it’s usually less than the bandwidth. Maybe your pipe has a lot of bends slowing the water down. Maybe there’s a leak or an obstruction. Maybe your water source doesn’t pump enough pressure to fill the pipe.
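To make the pipe analogy concrete, here’s a toy calculation (all numbers are illustrative, not measurements): total delivery time is roughly the latency plus the data size divided by the throughput, so a bandwidth upgrade shrinks only the second term.

```python
# Toy model of the pipe analogy: delivery time = latency + size / throughput.
# All numbers below are illustrative, not measurements.

def delivery_time(size_mb, throughput_mbps, latency_ms):
    """Seconds until the last byte arrives at the viewer."""
    transfer = (size_mb * 8) / throughput_mbps  # size in megabits / rate
    return latency_ms / 1000 + transfer

# Doubling bandwidth halves the transfer term...
fast = delivery_time(size_mb=5, throughput_mbps=100, latency_ms=50)
faster = delivery_time(size_mb=5, throughput_mbps=200, latency_ms=50)

# ...but the 50 ms latency floor is untouched by the upgrade.
print(round(fast, 3), round(faster, 3))  # → 0.45 0.25
```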
Why Does Latency Occur?
Connection type: The type of connection affects both latency and speed. Fiber-optic connections, for example, transmit data faster than wireless internet.
Encoding: A lot depends on the encoder, and it needs to be optimized to send a signal to the receiving device with as little delay as possible.
Video format: Larger files take longer to transmit over the internet, which increases streaming latency.
Distance: Your videos can have an increased delay if you are located far away from your internet service provider (ISP), internet hub, or satellites.
Overloaded connection: When many people share the same internet connection at the same time, your latency can suffer.
Malware or viruses: Computer viruses or suspicious files can wreak havoc on your internet speed and streaming latency as well.
What’s a “Good” Latency?
When streaming live, you want to broadcast as close to real-time as possible. But because of the way the internet works, there will always be some latency.
So how much latency is ok?
The truth is there is no standard for live streaming latency. When someone says “low latency,” they just mean lower than the average in their particular industry.
Even a live TV broadcast (think Super Bowl) has about five seconds of latency, from small delays building up along the transmission chain.
So what’s the industry average you should be aiming for? Most online streamers see a latency of 30 seconds to a minute. When surveyed, 53% of video developers said they expect to achieve a latency of under five seconds. That’s the benchmark for “low latency” in live streaming.
Do You Really Need Low Latency?
For most live streamers today, not really. If you’re teaching, giving demos, or just entertaining, chances are no one will notice the lag.
The only exception is when there’s a need for two-way communications. For example:
Q&A sessions: Say you’re hosting a live Q&A session where the audience asks questions via chat and you respond on video. With a 30-second latency:
- it takes 30 seconds for viewers to learn you’ve opened the floor to questions
- it takes another 30 seconds, once they’ve typed a question out, before you start responding
- and it takes another 30 seconds after you start responding before viewers see your answer.
That’s roughly 90 seconds of stalling for each question.
Online gambling: Anyone watching live on location gains an advantage over online bettors, because latency means remote viewers learn what happened after everyone in the room already knows.
Live auctions: With high latency, bidders won’t have time to respond to raises. Here’s a scenario of what might happen at auctions:
- Bidder 1: I bid 200.
- Auctioneer: We have 200. Do I hear 250? Does anyone have 250? Final call for 250. Going once. Going twice. Sold for 200 to…
- Bidder 2: I bid 250!
- Auctioneer: … well… this is awkward.
Video calls and chat: Same as live auctions. You want to talk to your friends in real-time, not sit around awkwardly waiting for everyone to hear what you just said.
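The 90-second figure from the Q&A example above can be sketched as a toy timeline. The 30-second numbers are just that example’s assumptions, including a rough guess for typing time:

```python
# Toy model of the Q&A round trip described above.
STREAM_DELAY = 30  # assumed seconds from your camera to the viewer's screen

def stall_per_question(stream_delay=STREAM_DELAY, typing_time=30):
    prompt_reaches_viewer = stream_delay  # they learn the floor is open
    question_reaches_you = typing_time    # typing time; chat itself is near-instant
    answer_reaches_viewer = stream_delay  # your reply rides the delayed stream
    return prompt_reaches_viewer + question_reaches_you + answer_reaches_viewer

print(stall_per_question())  # → 90
```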
How Do I Get Lower Latency?
Switch to a hard line: Wi-Fi routers can’t move data as quickly or reliably as old-fashioned Ethernet cables. We love wireless too, but we recommend an Ethernet connection for your main broadcast device.
Switch to old-fashioned fiber: An Ethernet line won’t help if it feeds into satellite internet. No one likes stringing wires all over the house, but geostationary satellites sit tens of thousands of miles away in space. Every request has to travel up to the satellite, down to the ISP’s hub and the wider internet, and back again, multiple times. As a result, satellite connections have much higher ping rates than fiber-optic internet.
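The distance argument checks out with back-of-envelope physics. Assuming a geostationary satellite at roughly 35,786 km altitude and a radio signal at the speed of light, one request/response pair crosses the up/down link four times:

```python
# Back-of-envelope: propagation delay over a geostationary satellite link.
# Assumptions: ~35,786 km altitude, signal at the speed of light,
# request goes up and down, then the response goes up and down again.
C = 299_792_458          # speed of light, m/s
ALTITUDE_M = 35_786_000  # geostationary orbit altitude, m

one_hop = ALTITUDE_M / C              # ground -> satellite, ~0.119 s
round_trip = 4 * one_hop              # 4 hops per request/response pair
print(f"{round_trip * 1000:.0f} ms")  # → 477 ms, before any processing delay
```

That half-second floor exists no matter how fast the rest of your gear is, which is why fiber wins on ping.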
Reduce network congestion: Latency is also affected by what else your network is doing. If you’ve got 3 downloads running while your neighbor or roommate is streaming Netflix on your connection at the same time, you’ll see a lot more lag. In this case, either kick every other device off your connection or get a dedicated streaming connection.
Upgrade your hardware: Even with the right connection and no congestion, you might still see a bit of lag simply because your hardware can’t keep up. Every device, from your computer to your router, has hard limits on how much data it can move. You may benefit from upgrading your network devices.
Get closer to your viewers: The more distance your stream has to travel, the more latency your viewers will get. To reduce latency you can use a CDN to get closer to your viewers.
Use a faster encoder: Your video encoder compresses your live video before transmitting it, and that step adds latency to your stream. You can cut down on lag with a dedicated, high-powered machine for your software encoder, or with a high-speed hardware encoder.
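As a sketch of the encoder trade-off, here’s how you might ask ffmpeg’s x264 encoder to prioritize latency over compression. The synthetic test pattern stands in for a real camera; adapt the input and output to your setup:

```shell
# Encode a one-second synthetic test pattern with x264 tuned for low latency.
# -preset ultrafast : fastest encoding, at the cost of a larger output
# -tune zerolatency : disables lookahead and B-frame buffering delays
ffmpeg -f lavfi -i testsrc=duration=1:size=640x360:rate=30 \
       -c:v libx264 -preset ultrafast -tune zerolatency \
       -f null - 2>/dev/null && echo encoded
```

`-tune zerolatency` trades some compression efficiency for delay; a slower preset would squeeze the file smaller but buffer more frames before sending anything.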
The Streaming Protocol Hack for Low Latency
You can lower the latency of your live streams simply by upgrading your connection and hardware, but that’s a pretty big investment.
A much easier way is to change the streaming protocols you use.
In layman’s terms, a streaming protocol is the way data travels from one device or system to another. These protocols are layered on top of one another to form a protocol stack. That way, protocols at each layer can focus on a specific function and cooperate with each other.
Here are the major streaming protocols in use today, and which to use:
WebRTC
|✔️ High speed|❌ Low quality|
|✔️ Low latency|❌ No CDN support|
Ideal for: real-time data transfer and video conferencing
RTMP
|✔️ High speed|✔️ High quality|
|✔️ Low latency|❌ Uses Flash|
Ideal for: high-speed video transfer to people near you
HLS
|✔️ High speed|✔️ High quality|
|✔️ Low latency|✔️ CDN support|
Ideal for: live streaming in high quality
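CDN-friendly protocols such as HLS work by cutting the stream into short segments listed in a rolling playlist; shorter segments generally mean lower latency at the cost of more requests. A minimal media playlist might look like this (segment filenames are hypothetical):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:142
#EXTINF:2.000,
segment142.ts
#EXTINF:2.000,
segment143.ts
#EXTINF:2.000,
segment144.ts
```

The player keeps re-fetching this playlist and downloading whichever segments it hasn’t seen yet, which is exactly the kind of plain HTTP traffic a CDN can cache and distribute worldwide.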
A Quick and Easy Way to Lower Your Latency
If you want to stream in high quality with low latency to multiple channels at once, your best bet is a livestreaming service such as Castr.
Castr supports HLS, and gives you Akamai and Fastly CDN support out of the box. You also get your own embeddable player to stream from your own website, as well as every other streaming platform out there.
And the best part? Castr can get your latency as low as 5 seconds. That’s the closest to real time streaming anyone can get today. Try Castr free for 7 days.