How would you design and code an LRU cache policy in C++?
Question Analysis
The question asks you to design and implement an LRU (Least Recently Used) cache policy in C++. An LRU cache is a data structure that stores a limited number of items and supports fast retrieval and updates. Its core policy is to evict the least recently used item first when the cache reaches its capacity limit. This requires efficiently tracking the order in which items are accessed while still providing fast lookup.
For this task, you will need to:
- Use a data structure that provides fast access, insertion, and deletion.
- Maintain the order of usage to identify the least recently used item.
- Implement functions for adding, retrieving, and deleting items based on access patterns.
Answer
To implement an LRU cache in C++, we can use a combination of a hash map and a doubly linked list. The hash map provides O(1) average time complexity for access, and the doubly linked list allows us to efficiently manage the order of usage with O(1) operations for insertion and deletion.
Below is a C++ implementation of an LRU Cache:
#include <iostream>
#include <unordered_map>
#include <list>
class LRUCache {
private:
    int capacity;
    std::list<int> keys; // Doubly linked list of keys, ordered by recency of use
    std::unordered_map<int, std::pair<int, std::list<int>::iterator>> cache; // key -> (value, position in the list)

public:
    LRUCache(int cap) : capacity(cap) {}

    int get(int key) {
        auto it = cache.find(key);
        if (it == cache.end()) {
            return -1; // Key not found
        }
        // Move the accessed key to the front (most recently used)
        keys.erase(it->second.second);
        keys.push_front(key);
        it->second.second = keys.begin();
        return it->second.first;
    }

    void put(int key, int value) {
        auto it = cache.find(key);
        if (it != cache.end()) {
            // Key exists: update the value and move it to the front
            keys.erase(it->second.second);
            keys.push_front(key);
            it->second = {value, keys.begin()};
        } else {
            if (static_cast<int>(cache.size()) >= capacity) {
                // Cache is full: remove the least recently used item
                int lru = keys.back();
                keys.pop_back();
                cache.erase(lru);
            }
            // Insert the new key-value pair at the front
            keys.push_front(key);
            cache[key] = {value, keys.begin()};
        }
    }
};
// Example usage
int main() {
    LRUCache cache(2); // Cache capacity of 2
    cache.put(1, 1);
    cache.put(2, 2);
    std::cout << cache.get(1) << std::endl; // returns 1
    cache.put(3, 3); // evicts key 2
    std::cout << cache.get(2) << std::endl; // returns -1 (not found)
    cache.put(4, 4); // evicts key 1
    std::cout << cache.get(1) << std::endl; // returns -1 (not found)
    std::cout << cache.get(3) << std::endl; // returns 3
    std::cout << cache.get(4) << std::endl; // returns 4
    return 0;
}
Explanation:
- Doubly Linked List (keys): Maintains the access order, with the front being the most recently used item and the back the least recently used.
- Hash Map (cache): Maps each key to a pair of its value and its position in the linked list, giving O(1) average time complexity for both get and put.
- Operations: get retrieves the value for a key and moves that key to the front of the list; put inserts or updates a key-value pair and evicts the least recently used item when the cache is full.