LEAN CACHE

The Lean Cache server solution automatically routes stream requests to the appropriate server based on server utilization and routing tables.
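
A minimal sketch of how such a routing decision could look, assuming a static routing table that maps client networks to candidate caches and a utilization figure reported by each cache; the server names, subnets, and data structures below are illustrative and not part of the Lean Cache API:

```python
# Hypothetical routing sketch: pick a cache server for a stream request from a
# static routing table (client network -> candidate caches) and the current
# utilization reported by each cache. Names and numbers are illustrative.
import ipaddress

ROUTING_TABLE = {
    "10.0.0.0/16": ["cache-eu-1", "cache-eu-2"],
    "10.1.0.0/16": ["cache-us-1", "cache-eu-1"],
}

UTILIZATION = {"cache-eu-1": 0.72, "cache-eu-2": 0.35, "cache-us-1": 0.90}

def pick_server(client_ip: str) -> str:
    client = ipaddress.ip_address(client_ip)
    for network, candidates in ROUTING_TABLE.items():
        if client in ipaddress.ip_network(network):
            # Among the caches listed for this network, prefer the least loaded.
            return min(candidates, key=lambda name: UTILIZATION[name])
    raise LookupError(f"no route for {client_ip}")

print(pick_server("10.0.4.17"))  # -> cache-eu-2 (lower utilization than cache-eu-1)
```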

Multi-Hop Cache

To reduce the load on the streamer, a multi-hop cache is a good solution for a network of servers whose users are not concentrated in one location.

It can be used either at the headend to provide linear scalability of streaming throughput, or for delivering streams to distant locations while saving bandwidth between them.

Figure: streamer and end users
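
The sketch below illustrates the multi-hop idea under a simple assumption: each cache is configured with a single upstream it pulls from, so a stream crosses each inter-site link at most once regardless of how many viewers sit behind the edge. All node names are hypothetical.

```python
# Illustrative multi-hop topology: every cache pulls from its configured
# upstream, so the origin-to-region link carries a stream only once no matter
# how many edge viewers request it. Node names are made up.
TOPOLOGY = {
    "edge-branch-office": "regional-datacenter",
    "regional-datacenter": "origin-streamer",
}

caches = {name: {} for name in TOPOLOGY}           # per-hop segment caches

def fetch(node: str, segment: str) -> bytes:
    if node == "origin-streamer":
        return f"<bytes of {segment}>".encode()     # stand-in for the real stream
    store = caches[node]
    if segment not in store:                        # cache miss: go one hop up
        store[segment] = fetch(TOPOLOGY[node], segment)
    return store[segment]

# Two viewers behind the edge cache -> a single pull across each upstream hop.
fetch("edge-branch-office", "chunk_001.ts")
fetch("edge-branch-office", "chunk_001.ts")
```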

Pull Mechanism

The Lean Cache solution works on a pull mechanism: it pulls a stream once there is demand for it and then replicates it to end-user devices. This saves bandwidth drastically and enables smooth streaming even on low-bandwidth networks.
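
A rough illustration of the pull idea, with made-up class and method names: the first viewer request starts a single ingest, and every chunk pulled from the origin is replicated to all connected viewers.

```python
# Sketch only: one ingest per in-demand stream, fanned out to every viewer.
class PulledStream:
    def __init__(self, stream_id: str):
        self.stream_id = stream_id
        self.viewers = []                  # output connections fed from one ingest

    def add_viewer(self, viewer) -> None:
        self.viewers.append(viewer)

    def on_ingest_chunk(self, chunk: bytes) -> None:
        # One chunk pulled from the origin is replicated to every viewer, so
        # upstream bandwidth stays constant as the audience grows.
        for viewer in self.viewers:
            viewer.send(chunk)

active_streams: dict[str, PulledStream] = {}

def request_stream(stream_id: str, viewer) -> None:
    if stream_id not in active_streams:    # demand appears: start exactly one ingest
        active_streams[stream_id] = PulledStream(stream_id)
    active_streams[stream_id].add_viewer(viewer)
```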

Routing Tables

Alternative cache server prioritization based on routing tables makes it possible to build a robust, redundant content delivery network.
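
One way such prioritization could work, sketched with hypothetical server names: the routing table lists alternative caches in priority order, and the requester simply walks the list until a server answers, so losing one cache does not interrupt delivery.

```python
# Hypothetical failover sketch: try prioritized alternatives in order.
PRIORITIZED_CACHES = ["cache-primary", "cache-secondary", "origin-streamer"]

def fetch_with_failover(segment: str, fetch_from) -> bytes:
    last_error = None
    for server in PRIORITIZED_CACHES:
        try:
            return fetch_from(server, segment)   # first reachable server wins
        except ConnectionError as err:
            last_error = err                     # fall through to the next entry
    raise RuntimeError("all configured caches unreachable") from last_error
```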

Significant Bandwidth Savings

Cache servers ingest only the streams that are used by at least one end-user device.
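
In practical terms this can be thought of as a per-stream viewer count: ingest starts with the first viewer and stops with the last. The helpers below are placeholders, not the actual ingest control interface.

```python
# Sketch of the "only what is watched" rule via reference counting.
viewer_counts: dict[str, int] = {}

def viewer_joined(stream_id: str) -> None:
    viewer_counts[stream_id] = viewer_counts.get(stream_id, 0) + 1
    if viewer_counts[stream_id] == 1:
        print(f"start ingesting {stream_id}")    # first viewer -> begin pulling

def viewer_left(stream_id: str) -> None:
    viewer_counts[stream_id] -= 1
    if viewer_counts[stream_id] == 0:
        print(f"stop ingesting {stream_id}")     # no viewers left -> stop pulling
```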

Decoupling

The concurrent caching and streaming feature allows streaming to the end user before the content is completely downloaded from the origin server or streamer; Lean Cache behaves like a stream-through proxy with caching. Caching and streaming are fully decoupled, so the cache can fill as fast as the maximum download speed of the server or network allows. The result is practically zero added latency, which directly affects quality of service.
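
The following sketch shows the decoupling idea under a simple chunk-buffer assumption: one thread fills the cache from the origin at full speed while viewers read whatever chunks already exist, without waiting for the download to complete. It is an illustration, not the actual implementation.

```python
# Illustrative stream-through cache: filling and reading run independently.
import threading

class StreamThroughCache:
    def __init__(self):
        self.chunks: list[bytes] = []
        self.done = False
        self.cond = threading.Condition()

    def fill(self, source):
        # Runs at full download speed, independent of any viewer.
        for chunk in source:
            with self.cond:
                self.chunks.append(chunk)
                self.cond.notify_all()
        with self.cond:
            self.done = True
            self.cond.notify_all()

    def read(self):
        # Each viewer consumes chunks as soon as they exist in the cache.
        index = 0
        while True:
            with self.cond:
                while index >= len(self.chunks) and not self.done:
                    self.cond.wait()
                if index >= len(self.chunks):
                    return
                chunk = self.chunks[index]
            index += 1
            yield chunk

cache = StreamThroughCache()
threading.Thread(target=cache.fill, args=(iter([b"a", b"b", b"c"]),)).start()
for chunk in cache.read():          # playback starts before fill() has finished
    print(chunk)
```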

Grouping Requests

When the HLS protocol is used, many devices request the same stream at the same time. With traditional cache servers this can lead to multiple downloads of the same stream from the origin. Lean Cache pulls the stream once there is demand and then streams it to the end-user devices. This grouping of requests saves bandwidth extensively.
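
A possible shape for this request grouping, assuming HLS segments addressed by URL: concurrent requests for the same segment share a single origin download, with later callers waiting for the first one to finish. Error handling is omitted for brevity.

```python
# Hypothetical request-coalescing sketch: one origin download per segment,
# shared by all clients that ask for it while the download is in flight.
import threading

_cache: dict[str, bytes] = {}
_in_flight: dict[str, threading.Event] = {}
_lock = threading.Lock()

def get_segment(url: str, download) -> bytes:
    with _lock:
        if url in _cache:                    # already cached: serve immediately
            return _cache[url]
        event = _in_flight.get(url)
        if event is None:                    # first request: this caller downloads
            event = threading.Event()
            _in_flight[url] = event
            leader = True
        else:                                # later requests: wait for the leader
            leader = False
    if leader:
        data = download(url)                 # exactly one origin fetch per segment
        with _lock:
            _cache[url] = data
            del _in_flight[url]
        event.set()
        return data
    event.wait()
    return _cache[url]
```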