Time to Live and Eviction Policies in Caching

Get to know what time-to-live and eviction policies are in caching.

In this lesson, let’s extend our knowledge of core caching concepts that we will need to keep in mind while working on distributed systems.

Time to live (TTL)

Frequently accessed data can be cached and retrieved faster, but we cannot store all data in the cache forever. There are several reasons for this:

  • Unless data is absolutely static and never changes, it is supposed to be updated at some point. Continuing to serve the old copy from the cache will result in stale responses.
  • A piece of data that is frequently accessed today may not be of much use tomorrow. There is no point in storing it in the cache if it is not accessed much.
  • Finally, a cache is an expensive component because it is based on RAM, and it is much smaller than a database, so it has limited capacity.

Time to live (TTL) is the amount of time after which a piece of data (for example, a key-value pair) is evicted from the cache. The countdown starts from the moment the data was last written or updated.

TTL is generally pre-configured on the cache server and enforced automatically on the server side.
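To make this concrete, here is a minimal, illustrative sketch of a TTL cache in Python. The `TTLCache` class is our own toy construction for this lesson, not the API of any particular cache server:

```python
import time

class TTLCache:
    """Toy TTL cache: each entry expires a fixed number of seconds
    after it was last written or updated."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value):
        # The TTL countdown (re)starts on every write or update.
        self.store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            # The entry has outlived its TTL: evict it lazily on read.
            del self.store[key]
            return None
        return value

cache = TTLCache(ttl_seconds=60)
cache.set("user:42", {"name": "Ada"})
print(cache.get("user:42"))  # the value, if read within 60 seconds
```

Real cache servers handle this for you; in Redis, for example, you can attach a TTL to a key at write time with `SET key value EX 60`, and expiry is enforced server-side.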

Eviction policies

At some point, the cache will be full. We will have to strategically evict some data from the cache to make room for more relevant data. There are a few common algorithms that we can choose from to do this.

Least recently used (LRU)

This is the most commonly used eviction policy. The idea is pretty simple: if the cache is full, remove the least recently used data first.

The motivation is that if some data was last used a long time ago, the same data probably won’t be accessed much in the future. This means that the more ‘trendy’ data will be preferred in the cache.

For many use cases, this strategy works fairly well. Think of your Instagram photos. Recently posted photos are more likely to be accessed by your followers. Older photos eventually lose traffic. An LRU policy will keep the latest photos in the cache and evict the older ones.
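Python’s standard library already offers `functools.lru_cache` for memoizing function calls; the hand-rolled sketch below (a toy `LRUCache` class, our own construction) shows the underlying mechanism using an ordered dictionary:

```python
from collections import OrderedDict

class LRUCache:
    """Toy LRU cache: OrderedDict keeps entries in usage order,
    with the least recently used entry at the front."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)  # mark as most recently used
        return self.store[key]

    def set(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            # Evict the least recently used entry (front of the dict).
            self.store.popitem(last=False)

cache = LRUCache(capacity=2)
cache.set("photo:new", "...")
cache.set("photo:old", "...")
cache.get("photo:new")            # touched: "photo:old" is now the LRU entry
cache.set("photo:newest", "...")  # evicts "photo:old"
```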

Least frequently used (LFU)

Another common strategy for cache eviction is to evict the least frequently used data. This algorithm keeps track of how many times data items were accessed. If the cache requires eviction, the data items that were used the least number of times are removed.

For some systems, this algorithm might work, but it is not ideal for systems where recent data is more relevant than older data. Think of a stock exchange. Most users care about the current prices of stocks, not historical prices. Under LFU, a freshly updated price enters the cache with a low access count, so it keeps getting evicted until its count finally exceeds those of the long-lived historical entries.

On the other hand, when you type on your phone and get suggestions for possible words, there can be an LFU cache that stores words and their counts. This cache will prefer storing the most frequently used words and evicting the least used ones.
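Here is a minimal sketch of the idea as a toy Python class. Note that eviction below is an O(n) scan for simplicity; production LFU implementations typically use frequency buckets to evict in O(1):

```python
from collections import defaultdict

class LFUCache:
    """Toy LFU cache: tracks an access count per key and, when full,
    evicts the key with the smallest count."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}
        self.counts = defaultdict(int)

    def get(self, key):
        if key not in self.store:
            return None
        self.counts[key] += 1
        return self.store[key]

    def set(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            # Evict the least frequently used key (linear scan).
            victim = min(self.store, key=lambda k: self.counts[k])
            del self.store[victim]
            del self.counts[victim]
        self.store[key] = value
        self.counts[key] += 1

cache = LFUCache(capacity=2)
cache.set("the", "suggestion data")
cache.get("the")
cache.get("the")                      # "the" now has the highest count
cache.set("hello", "suggestion data")
cache.set("xylophone", "suggestion data")  # evicts "hello", the least used word
```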

Most recently used (MRU)

This is just the opposite of an LRU cache: the most recently accessed data is evicted first, and older entries are kept. For systems where data that has just been accessed is unlikely, or undesirable, to be accessed again, MRU can be a good choice.

As an example, think about Facebook friend suggestions. If you dismiss someone so that they no longer appear in the suggestions, that entry is the most recently touched one, and it is exactly the one the system should evict from the cache.
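A toy sketch mirroring the LRU class above, but evicting from the opposite end of the ordered dictionary:

```python
from collections import OrderedDict

class MRUCache:
    """Toy MRU cache: the inverse of LRU. When full, it evicts the
    entry that was used most recently (the back of the dict)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)  # most recently used sits at the end
        return self.store[key]

    def set(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        elif len(self.store) >= self.capacity:
            # Evict the most recently used entry before inserting.
            self.store.popitem(last=True)
        self.store[key] = value
```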


Based on your system and business requirements, you can choose one of the eviction policies above or go for simpler policies like first-in-first-out (FIFO) and last-in-first-out (LIFO). For some systems, more advanced eviction policies are suitable, for instance, a custom policy tuned for specific business use cases.

Key takeaways

  • Time to live (TTL) ensures a piece of data is evicted from the cache so that the data does not go stale.
  • Choosing the right eviction policy is critical for the performance of the caching system.
