LRU Cache
Problem Statement
Design a data structure that implements a Least Recently Used (LRU) cache with O(1) time complexity for both get and put operations. The cache should have a fixed capacity and evict the least recently used item when the capacity is exceeded.
Requirements
Functional Requirements
- Store key-value pairs with fixed capacity
- Get value by key in O(1) time
- Put key-value pair in O(1) time
- Evict least recently used item when capacity is full
- Update item’s position when accessed (get or put)
- Support generic key-value types
- Thread-safe operations for concurrent access
Non-Functional Requirements
- O(1) time complexity for get and put
- O(n) space complexity where n is capacity
- Thread-safe implementation
- Memory efficient
- Support cache statistics (hit rate, miss rate)
Simplified Class Diagram
Simplified Overview
Detailed Class Diagram
Key Design Patterns
- Strategy Pattern: Different eviction policies (LRU, LFU, FIFO)
- Singleton Pattern: Cache instance management (a minimal sketch follows this list)
- Template Method: Common cache operations with customizable eviction
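The diagram sections below cover the Strategy and Template Method patterns; the Singleton bullet is not expanded anywhere on this page, so the following is only a rough sketch of one way to read it, using a hypothetical CacheManager that lazily creates and reuses a single shared LRUCache instance.

// Hypothetical illustration of the Singleton bullet above: a CacheManager
// that lazily creates and reuses one shared LRUCache instance.
public final class CacheManager {
    private static volatile LRUCache<String, String> instance;

    private CacheManager() {
        // No instances; the class only exposes the shared cache.
    }

    public static LRUCache<String, String> getInstance() {
        if (instance == null) {                       // first check without locking
            synchronized (CacheManager.class) {
                if (instance == null) {               // second check under the lock
                    instance = new LRUCache<>(100);   // capacity chosen arbitrarily
                }
            }
        }
        return instance;
    }
}

Double-checked locking with a volatile field keeps getInstance() cheap after initialization while staying safe under concurrent access.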
Design Pattern Diagrams
1. Strategy Pattern - Eviction Policies
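The strategy diagram from the original page is not reproduced here. As a stand-in, the sketch below shows how eviction policies could sit behind a common interface; the EvictionPolicy interface and both implementations are assumptions for illustration and are not used by the LRUCache implementation further down.

// Hypothetical EvictionPolicy abstraction sketching the Strategy pattern.
// LRU and FIFO variants are swapped behind one interface; an LFU variant
// would implement the same interface with a per-key frequency counter.
import java.util.LinkedHashMap;

interface EvictionPolicy<K> {
    void keyAccessed(K key);   // called on every get/put of a key
    K evictKey();              // returns the key that should be removed
}

class LruEvictionPolicy<K> implements EvictionPolicy<K> {
    // Access-ordered LinkedHashMap keeps the least recently used key first.
    private final LinkedHashMap<K, Boolean> order = new LinkedHashMap<>(16, 0.75f, true);

    @Override
    public void keyAccessed(K key) {
        order.put(key, Boolean.TRUE);             // moves the key to most-recent position
    }

    @Override
    public K evictKey() {
        K lru = order.keySet().iterator().next(); // eldest entry in access order
        order.remove(lru);
        return lru;
    }
}

class FifoEvictionPolicy<K> implements EvictionPolicy<K> {
    // Insertion-ordered map: the oldest inserted key is evicted first.
    private final LinkedHashMap<K, Boolean> order = new LinkedHashMap<>();

    @Override
    public void keyAccessed(K key) {
        order.putIfAbsent(key, Boolean.TRUE);     // keep original insertion position
    }

    @Override
    public K evictKey() {
        K oldest = order.keySet().iterator().next();
        order.remove(oldest);
        return oldest;
    }
}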
2. Template Method Pattern - Cache Operations
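The template-method diagram is also missing from this copy. The sketch below shows the shape it describes: an abstract base class that fixes the get/put flow and delegates recency bookkeeping and the eviction choice to hooks. All names here are assumptions, not part of the code snippets that follow.

// Hypothetical AbstractCache sketching the Template Method pattern:
// the base class fixes the put()/get() flow and subclasses supply the
// customizable steps (onAccess and chooseVictim are the hooks).
import java.util.HashMap;
import java.util.Map;

abstract class AbstractCache<K, V> {
    protected final int capacity;
    protected final Map<K, V> store = new HashMap<>();

    protected AbstractCache(int capacity) {
        this.capacity = capacity;
    }

    // Template method: the overall algorithm is fixed here.
    public final void put(K key, V value) {
        if (!store.containsKey(key) && store.size() >= capacity) {
            store.remove(chooseVictim());   // hook decides what to evict
        }
        store.put(key, value);
        onAccess(key);                      // hook records the access
    }

    public final V get(K key) {
        V value = store.get(key);
        if (value != null) {
            onAccess(key);
        }
        return value;
    }

    // Hooks customized by concrete caches (e.g. LRU, LFU, FIFO).
    protected abstract void onAccess(K key);
    protected abstract K chooseVictim();
}

A concrete LRU subclass would update a recency list in onAccess and return its oldest key from chooseVictim.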
Code Snippets
Core LRU Cache Implementation
Section titled “Core LRU Cache Implementation”public class LRUCache<K, V> implements Cache<K, V> { private final int capacity; private final Map<K, Node<K, V>> cache; private final DoublyLinkedList<K, V> list; private final ReentrantLock lock; private final CacheStatistics stats;
public LRUCache(int capacity) { this.capacity = capacity; this.cache = new HashMap<>(); this.list = new DoublyLinkedList<>(); this.lock = new ReentrantLock(); this.stats = new CacheStatistics(); }
@Override public V get(K key) { lock.lock(); try { Node<K, V> node = cache.get(key); if (node == null) { stats.recordMiss(); return null; }
// Move to front (most recently used) list.moveToFront(node); stats.recordHit(); return node.getValue(); } finally { lock.unlock(); } }
@Override public void put(K key, V value) { lock.lock(); try { Node<K, V> existingNode = cache.get(key);
if (existingNode != null) { // Update existing node existingNode.setValue(value); list.moveToFront(existingNode); } else { // Create new node Node<K, V> newNode = new Node<>(key, value);
if (cache.size() >= capacity) { evict(); }
cache.put(key, newNode); list.addToFront(newNode); } } finally { lock.unlock(); } }
private void evict() { Node<K, V> lru = list.removeLast(); if (lru != null) { cache.remove(lru.getKey()); stats.recordEviction(); } }
@Override public boolean remove(K key) { lock.lock(); try { Node<K, V> node = cache.remove(key); if (node != null) { list.removeNode(node); return true; } return false; } finally { lock.unlock(); } }}Doubly Linked List Implementation
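LRUCache implements a Cache interface and records statistics through a CacheStatistics helper, neither of which appears on this page. The sketch below is an assumed minimal version of both, sized just to support the recordHit/recordMiss/recordEviction calls above and the hit/miss rates printed in the usage example.

import java.util.concurrent.atomic.AtomicLong;

// Assumed shape of the Cache interface implemented by LRUCache above.
interface Cache<K, V> {
    V get(K key);
    void put(K key, V value);
    boolean remove(K key);
}

// Assumed CacheStatistics helper: simple counters plus derived rates.
class CacheStatistics {
    private final AtomicLong hits = new AtomicLong();
    private final AtomicLong misses = new AtomicLong();
    private final AtomicLong evictions = new AtomicLong();

    public void recordHit() { hits.incrementAndGet(); }
    public void recordMiss() { misses.incrementAndGet(); }
    public void recordEviction() { evictions.incrementAndGet(); }

    public double getHitRate() {
        long total = hits.get() + misses.get();
        return total == 0 ? 0.0 : (double) hits.get() / total;
    }

    public double getMissRate() {
        long total = hits.get() + misses.get();
        return total == 0 ? 0.0 : (double) misses.get() / total;
    }

    public long getEvictionCount() { return evictions.get(); }
}

AtomicLong counters keep the statistics safe to read without taking the cache lock.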
Doubly Linked List Implementation
public class DoublyLinkedList<K, V> {
    private Node<K, V> head;
    private Node<K, V> tail;
    private int size;

    public DoublyLinkedList() {
        // Dummy head and tail
        head = new Node<>(null, null);
        tail = new Node<>(null, null);
        head.next = tail;
        tail.prev = head;
        size = 0;
    }

    public void addToFront(Node<K, V> node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
        size++;
    }

    public void removeNode(Node<K, V> node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
        size--;
    }

    public Node<K, V> removeLast() {
        if (tail.prev == head) {
            return null;
        }
        Node<K, V> last = tail.prev;
        removeNode(last);
        return last;
    }

    public void moveToFront(Node<K, V> node) {
        removeNode(node);
        addToFront(node);
    }
}
Node Class
public class Node<K, V> {
    private K key;
    private V value;
    Node<K, V> prev;
    Node<K, V> next;

    public Node(K key, V value) {
        this.key = key;
        this.value = value;
    }

    public K getKey() {
        return key;
    }

    public V getValue() {
        return value;
    }

    public void setValue(V value) {
        this.value = value;
    }
}
Usage Example
// Create cache with capacity 3
LRUCache<Integer, String> cache = new LRUCache<>(3);

// Add items
cache.put(1, "one");
cache.put(2, "two");
cache.put(3, "three");

// Access item (moves to front)
String value = cache.get(1); // Returns "one"

// Add new item (evicts least recently used)
cache.put(4, "four"); // Evicts key 2

// Check statistics
System.out.println("Hit rate: " + cache.getHitRate());
System.out.println("Miss rate: " + cache.getMissRate());
Extension Points
- Add TTL (Time To Live) for cache entries (see the sketch after this list)
- Implement cache warming strategies
- Add write-through/write-back cache policies
- Support cache persistence to disk
- Implement distributed LRU cache
- Add cache entry size limits (weighted LRU)
- Support cache invalidation patterns
- Add metrics and monitoring integration
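As an illustration of the first extension point, the sketch below wraps values with an expiry timestamp so callers can treat expired entries as misses. ExpiringValue and its TTL handling are assumptions layered on top of the cache shown above, not part of it.

// Hypothetical TTL extension: a value wrapper that records its expiry time.
// A TTL-aware cache would store ExpiringValue<V> and treat expired entries
// as misses (removing them lazily on access).
class ExpiringValue<V> {
    private final V value;
    private final long expiresAtMillis;

    ExpiringValue(V value, long ttlMillis) {
        this.value = value;
        this.expiresAtMillis = System.currentTimeMillis() + ttlMillis;
    }

    boolean isExpired() {
        return System.currentTimeMillis() >= expiresAtMillis;
    }

    V getValue() {
        return value;
    }
}

// Usage sketch with the LRUCache shown earlier (values wrapped by the caller):
// LRUCache<String, ExpiringValue<String>> cache = new LRUCache<>(3);
// cache.put("a", new ExpiringValue<>("one", 5_000));   // expires in 5 seconds
// ExpiringValue<String> e = cache.get("a");
// String v = (e == null || e.isExpired()) ? null : e.getValue();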