
Will Caching Still Impact Latency in a Post-HLS World?

Watch the complete panel from Streaming Media East Connect, Latency Still Sucks (and What You Can Do About It) on the Streaming Media YouTube channel.

Learn more about low-latency streaming at Streaming Media West 2020.

Read the complete transcript of this clip:

Casey Charvet: A lot of the latency discussion in the last couple of years has been shaped by the dominant technology that we use to distribute streaming content: HLS. The very nature of that protocol means that the video is chunked, and you're going to have to buffer a certain number of chunks. That adds a huge amount of latency to the whole process. With all of these UDP protocols that are coming out, we might be reshaping the way we deliver at scale, and we might be heading towards a post-HLS or post-chunked media world. If we get there, I think a lot of this discussion about caching is going to be reshaped. Hardware definitely does play a huge role in that--we can get ridiculous throughput through a box now. It's pretty easy to order up 40-gigabit line cards, SSDs, and all that. So these servers that we use for caching are immensely powerful. To turn that back around--if we move to a post-chunked media world, is that going to matter?
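Casey's point about chunked delivery can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only--the function name and the buffer-depth figures are assumptions, not numbers from the panel. In segmented protocols like HLS, the player typically buffers several segments before starting playback, so segment duration times buffer depth sets a latency floor before encoding and CDN overhead are even counted.

```python
def chunked_latency_floor(segment_seconds: float, buffered_segments: int) -> float:
    """Minimum latency contributed by segmentation alone, in seconds.

    Illustrative model: real-world latency adds encode, first-mile,
    CDN, and player overhead on top of this floor.
    """
    return segment_seconds * buffered_segments

# Classic HLS defaults: 6-second segments, player buffers three of them.
print(chunked_latency_floor(6.0, 3))  # → 18.0

# A common low-latency tuning: 2-second segments, same buffer depth.
print(chunked_latency_floor(2.0, 3))  # → 6.0
```

This is why shrinking segments (or abandoning segmentation entirely, as UDP-based protocols do) has such an outsized effect: the floor scales linearly with segment duration, independent of how fast the caching hardware is.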

Marc Cymontowski: Yeah. If we can handle the scalability for those large-scale events in a real-life scenario, that's a different challenge than rebalancing caches. I think a lot of acceleration on the edge could happen there, like optimized hardware for ingest and egress, so you can actually optimize certain protocols right at the edge.

Jason Thibeault: That's a good point. It's not one size fits all. One would think that your caching strategy has to be reflective of the kind of traffic and content you're pushing through it. To Casey's point, if we move away from segmented video delivery, or to something else, then obviously that would impact how those content deliverers optimize or tune those servers to improve the caching--or whether caching is even needed at all. I don't know. Crazy, crazy thought.

Related Articles

How Open Caching Solves Content Delivery Bottlenecks

How does open caching solve content delivery bottlenecks? According to Gautier Demond of Qwilt, the biggest problem-solving advantage of open caching is removing the traditional bottlenecks of CDN networks while reinventing the overall analytics-sharing relationship with ISPs.

How Much Do Low-Latency Codecs Reduce Latency?

GigCasters' Casey Charvet and CenturyLink's Rob Roskin discuss the efficacy of new low-latency revisions to existing protocols to decrease streaming latency in this clip from Streaming Media East 2020.

Encoding Best Practices to Reduce Latency

How can streaming professionals fine-tune their processes to prioritize low latency at the encoding stage? Streaming Video Alliance's Jason Thibeault, GigCasters' Casey Charvet, and Haivision's Marc Cymontowski discuss strategies for reducing latency in this clip from Streaming Media East Connect.
