
Solution: LRU Cache

Explore how to implement an LRU Cache by combining a doubly linked list and a hash map for efficient get and set operations. Understand the algorithm for evicting least recently used items when capacity is reached. This lesson helps you optimize cache performance by balancing access speed and memory use, preparing you to create custom data structures for specialized problems.

Statement

Implement an LRU cache class with the following functions:

  • Init(capacity): Initializes an LRU cache with the given capacity.
  • Set(key, value): Adds a new key-value pair or updates an existing key with a new value.
  • Get(key): Returns the value of the key, or -1 if the key does not exist.

If the number of keys has reached the cache capacity, evict the least recently used key and then add the new key.

As caches use relatively expensive, faster memory, they are not designed to store very large data sets. Whenever the cache becomes full, we need to evict some data from it. There are several caching algorithms to implement a cache eviction policy. LRU is a very simple and commonly used algorithm. The core concept of the LRU algorithm is to evict the oldest data from the cache to accommodate more data.
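The design described above can be sketched as follows. This is a minimal illustration, not the course's reference solution: it pairs a hash map (key to node) with a doubly linked list ordered from most to least recently used, so both operations run in O(1). Method names follow the statement's Get/Set interface in Python naming.

```python
class Node:
    """Doubly linked list node holding one key-value pair."""
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}  # key -> Node
        # Sentinel head/tail nodes avoid edge cases for empty lists
        self.head = Node(0, 0)  # most recently used side
        self.tail = Node(0, 0)  # least recently used side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        """Unlink a node from the list."""
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_front(self, node):
        """Insert a node right after head (most recently used)."""
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.cache:
            return -1
        node = self.cache[key]
        # Accessing the key makes it the most recently used
        self._remove(node)
        self._add_front(node)
        return node.value

    def set(self, key, value):
        if key in self.cache:
            # Updating an existing key also refreshes its recency
            self._remove(self.cache[key])
        elif len(self.cache) == self.capacity:
            # Evict the least recently used node (just before tail)
            lru = self.tail.prev
            self._remove(lru)
            del self.cache[lru.key]
        node = Node(key, value)
        self.cache[key] = node
        self._add_front(node)
```

Storing the key inside each node is what makes eviction O(1): when the list's tail node is removed, its key is immediately available for deleting the matching hash map entry.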

Constraints:

  • $1 \leq$ capacity $\leq 3000$
  • $0 \leq$
...