LRU Cache

Medium · Hash Table · Linked List · Design · Doubly-Linked List

Approaches

A hash map maps keys to nodes in a recency-ordered doubly-linked list. The list uses sentinel head/tail nodes so the splice and eviction code never needs None checks.

An LRU cache needs O(1) time for two different operations: keyed lookup, and moving an entry to the most-recent position. A hash map alone gives the first; a linked list alone gives the second. Combining them, with hash map values pointing to list nodes, gives both. The list is doubly linked because both promotion and eviction must unlink a node in O(1), which requires a pointer to the node's predecessor.
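In Python, `collections.OrderedDict` already bundles exactly this hash-map-plus-linked-list pairing, so the same idea can be sketched much more compactly. A minimal sketch (the class name `LRUCacheOD` is ours, for illustration only):

```python
from collections import OrderedDict


class LRUCacheOD:
    """LRU cache sketch on top of OrderedDict, which keeps keys in
    insertion order and supports O(1) move_to_end / popitem."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)  # mark key as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the least recently used
```

This hides the pointer surgery, but the hand-rolled node version below is what interviews typically ask for, since it shows you understand why the combination works.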

Algorithm

  1. Initialize an empty hash map and a doubly-linked list with sentinel head and tail nodes.
  2. On get(key): if key not in map, return -1. Otherwise unlink the node from its current position and splice it after head; return its value.
  3. On put(key, value): if key exists, update value and promote to MRU as in get. Otherwise create a new node, store in map, splice after head. If size now exceeds capacity, unlink tail.prev and remove its key from the map.

Solution

Time: O(1) per operation · Space: O(capacity)

class _Node:
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}
        # Sentinel head/tail keep splice + eviction free of None checks.
        self.head = _Node(0, 0)
        self.tail = _Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        # Unlink node from its current position. Both neighbors always
        # exist thanks to the sentinels.
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_to_front(self, node):
        # Splice node in immediately after head, making it the MRU entry.
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        node = self.cache.get(key)
        if node is None:
            return -1
        self._remove(node)
        self._add_to_front(node)
        return node.value

    def put(self, key, value):
        node = self.cache.get(key)
        if node is not None:
            node.value = value
            self._remove(node)
            self._add_to_front(node)
            return
        node = _Node(key, value)
        self.cache[key] = node
        self._add_to_front(node)
        if len(self.cache) > self.capacity:
            lru = self.tail.prev
            self._remove(lru)
            del self.cache[lru.key]
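As a sanity check, here is the classic capacity-2 eviction trace run against the class above (reproduced compactly so the snippet stands alone):

```python
# Compact copy of the LRUCache above, so this snippet runs standalone.
class _Node:
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None


class LRUCache:
    def __init__(self, capacity):
        self.capacity, self.cache = capacity, {}
        self.head, self.tail = _Node(0, 0), _Node(0, 0)
        self.head.next, self.tail.prev = self.tail, self.head

    def _remove(self, node):
        node.prev.next, node.next.prev = node.next, node.prev

    def _add_to_front(self, node):
        node.prev, node.next = self.head, self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        node = self.cache.get(key)
        if node is None:
            return -1
        self._remove(node)
        self._add_to_front(node)
        return node.value

    def put(self, key, value):
        node = self.cache.get(key)
        if node is not None:
            node.value = value
            self._remove(node)
            self._add_to_front(node)
            return
        node = _Node(key, value)
        self.cache[key] = node
        self._add_to_front(node)
        if len(self.cache) > self.capacity:
            lru = self.tail.prev
            self._remove(lru)
            del self.cache[lru.key]


cache = LRUCache(2)
cache.put(1, 1)
cache.put(2, 2)
assert cache.get(1) == 1   # promotes key 1 to MRU; list is 1 -> 2
cache.put(3, 3)            # over capacity: evicts key 2, the LRU
assert cache.get(2) == -1
assert cache.get(3) == 3
```

Note that the `get(1)` call is what saves key 1 from eviction: without it, key 1 would be at the LRU end when key 3 arrives.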