What Is an LRU Cache, and How to Implement It in Python
Computers have cache memory that temporarily stores the most frequently used data. Retrieving data from cache memory is very fast, whereas retrieving it from random-access memory is more expensive.
However, cache memory is limited in size, so there needs to be a way to decide which data to remove from the cache to make room for new data. That's where the LRU cache comes into the picture. An LRU (Least Recently Used) cache is a replacement policy that evicts the least recently used data to make room for new data.
Let's discuss how we can implement this algorithm efficiently.✅
To implement the algorithm, we need to divide the code into the following functions.
Implement the LRUCache class:

- LRUCache(int capacity): Initialize the LRU cache with the given size capacity.
- int get(int key): Return the value of the key if the key exists; otherwise return -1.
- void put(int key, int value): Update the value of the key if the key exists. Otherwise, add the key-value pair to the cache. If the number of keys exceeds the capacity from this operation, evict the least recently used key.

The functions get and put must each run in O(1) average time complexity.
🎯Approach 1: Using Array and Dictionary => NOT Optimized (Brute Force)👇
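A minimal sketch of this brute-force approach: a plain dictionary holds the key-value pairs, and a list tracks recency order from least to most recently used. Because list.remove() and list.pop(0) scan the list, get and put here are O(n) rather than the required O(1), which is why this version is not optimized.

```python
class LRUCache:
    """Brute-force LRU cache: a dict for values plus a list for recency order.

    list.remove() and list.pop(0) are O(n), so get/put are O(n), not O(1).
    """

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}   # key -> value
        self.order = []   # keys, from least to most recently used

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        # Move the key to the most-recently-used end (O(n)).
        self.order.remove(key)
        self.order.append(key)
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.order.remove(key)
        elif len(self.cache) == self.capacity:
            # Evict the least recently used key (front of the list).
            lru = self.order.pop(0)
            del self.cache[lru]
        self.cache[key] = value
        self.order.append(key)
```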
🎯Approach 2: Using OrderedDict Dictionary => Optimized but NOT for Interviews as it is using built-in functions👇
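A sketch of this approach using collections.OrderedDict, which remembers insertion order: its move_to_end() method marks a key as most recently used, and popitem(last=False) evicts from the least recently used end, both in O(1).

```python
from collections import OrderedDict


class LRUCache:
    """LRU cache built on collections.OrderedDict.

    move_to_end() and popitem(last=False) give O(1) get and put.
    """

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()  # key -> value, in recency order

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # mark as most recently used
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            # Evict from the least-recently-used end.
            self.cache.popitem(last=False)
```

This is the shortest correct version, but interviewers usually want the underlying data structure built by hand, which leads to Approach 3.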
🎯Approach 3: Using LinkedList and Dictionary => Optimized for Interviews👇
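A sketch of the interview-friendly approach: a dictionary maps each key to a node of a doubly linked list, so lookup, reordering, and eviction are all O(1). The Node class and the sentinel head/tail nodes below are one common way to structure this; the sentinels avoid special-casing an empty list.

```python
class Node:
    """A doubly-linked-list node holding one key-value pair."""

    def __init__(self, key: int, value: int):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    """LRU cache: dict for O(1) lookup + doubly linked list for O(1) reordering.

    Nodes near the tail are the most recently used; head.next is the LRU node.
    """

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}         # key -> Node
        self.head = Node(0, 0)  # sentinel on the least-recently-used side
        self.tail = Node(0, 0)  # sentinel on the most-recently-used side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node: Node) -> None:
        """Unlink a node from the list in O(1)."""
        node.prev.next = node.next
        node.next.prev = node.prev

    def _insert_at_tail(self, node: Node) -> None:
        """Link a node just before the tail sentinel (most recently used)."""
        prev = self.tail.prev
        prev.next = node
        node.prev = prev
        node.next = self.tail
        self.tail.prev = node

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        node = self.cache[key]
        self._remove(node)
        self._insert_at_tail(node)  # mark as most recently used
        return node.value

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self._remove(self.cache[key])
        node = Node(key, value)
        self.cache[key] = node
        self._insert_at_tail(node)
        if len(self.cache) > self.capacity:
            lru = self.head.next  # least recently used node
            self._remove(lru)
            del self.cache[lru.key]
```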
Please drop any suggestions below, and follow me if you like my content :)