Implement the Least Recently Used algorithm for Cache Optimization


We need to implement the least recently used (LRU) algorithm in order to maintain a cache.
Suppose we have a cache memory of a given size, i.e. the cache capacity.
Say the cache can store only 4 items at any point; if we need to store a new item, we must first get rid of one stored item.
The one we choose to remove is decided by a cache eviction policy; in this case, we will remove the least recently used one!

More about cache eviction policies can be read here.

What Are We Aiming At?

We have to implement a data structure.
It should support the following operations at all times:

  1. Add a key-value pair into the Cache

  2. Tell the value of a key from the cache if it exists, else return -1

  3. Update the value for a key already in the cache.

  4. If the cache is full and you need to add a key-value pair, remove the least recently used item first, then add the new one!

Now that we understand what the problem needs us to do, let's think about which data structures we can use to implement this cache policy.

Thinking About What We Need

  • For fast access to key-value pairs, there is nothing better than a hash map (a dictionary)!

  • We also need to be able to easily delete an entry and insert an entry. These are the most frequent operations.

  • To do this in constant time, the best-suited data structure is a doubly linked list.

  • With a doubly linked list we can take an element out, do some rewiring, and add an element back in, all in constant time.
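The constant-time rewiring the bullets describe can be sketched as follows (`Node` here is a hypothetical minimal class, not the article's final `ListNode`):

```python
# Minimal sketch of O(1) removal from a doubly linked list:
# unlinking a node needs only two pointer updates, no traversal.
class Node:
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

a, b, c = Node(1), Node(2), Node(3)
a.next, b.prev = b, a
b.next, c.prev = c, b

# Unlink b in O(1): just rewire its neighbours around it.
b.prev.next = b.next
b.next.prev = b.prev

# The list is now a <-> c.
```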

Major Objectives

Following are the two operations that we need to take care of:
1. Put: To insert something into the cache
2. Get: Find an element's value from the cache

Note that we will need a dummy head and a tail to begin with.
Hence the initial representation of the linked list would be as follows.

[Figure: initial doubly linked list with dummy head and tail]
Other than this, we only need to keep track of the capacity and, of course, the key-to-node mapping that will be used constantly.
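That initial setup can be sketched like this (a minimal sketch of just the constructor; the field names mirror the full implementation later in the article):

```python
class ListNode:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.dic = {}              # key -> node mapping
        self.capacity = capacity
        # Dummy sentinels: an empty cache is just the two
        # sentinels pointing at each other.
        self.head = ListNode(0, 0)
        self.tail = ListNode(-1, -1)
        self.head.next = self.tail
        self.tail.prev = self.head

cache = LRUCache(4)
```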

PUT Operation

If the key-value pair we are about to put into the cache is already in the cache, then we just need to update its value.
Also, note that we have now touched this item, which means it has been used. So it should be brought to the front of the linked list: it is the most recently used one.

If it is not in the cache, we find ourselves in one of two situations:
1. Cache can still insert more elements
2. Cache has reached its capacity.
If it can insert more elements (i.e. size(cache) < capacity), we simply create a new key-value pair and put it at the front of the linked list.

However, if it can not insert anything else, then we first need to remove the least recently used one!
This is where the cache eviction policy comes into the picture.
The last element in the linked list is the least recently used one, for it has not been touched for the longest time.
So we first remove it safely, do the rewiring and then we can do our insertion operation as discussed above.
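The put flow above can be sketched with Python's `OrderedDict` standing in for the hash map plus doubly linked list combination (a simplification for illustration, not the article's final implementation; front = most recently used):

```python
from collections import OrderedDict

capacity = 2
cache = OrderedDict()

def put(key, value):
    if key in cache:
        # Already present: update the value and mark as most recent.
        cache[key] = value
        cache.move_to_end(key, last=False)
        return
    if len(cache) >= capacity:
        # Full: evict the least recently used item (the "tail").
        cache.popitem(last=True)
    cache[key] = value
    cache.move_to_end(key, last=False)  # new item goes to the front

put(1, 'a')
put(2, 'b')
put(3, 'c')  # capacity exceeded: key 1, the least recently used, is evicted
```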


GET Operation

If the key we are looking for is present in our hashtable, it is simple enough.
Just take it out, return its value and also bring it to the front of the linked list.
This will be a two-step process:
  1. Removing the item from its position

  2. Adding it to the front of the linked list.
However, if the key whose value we are looking for is not there, then we return -1.


Some Important Operations That Will Help Us

For doing all these operations we need the following crucial functions:
1. removeFromList
Used when we do the get operation, to pick the node up from its original location.
2. insertIntoHead
This will be the most used one; it simply puts an item at the head of the doubly linked list.
3. removeFromTail
This will be used when the cache has reached its maximum capacity.
Now that we understand what these functions will do, we can code them out easily, along with the other operations explained above.

Implementation in Python:

class ListNode:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity: int):
        self.dic = dict()
        self.capacity = capacity
        # Dummy head and tail sentinels: head.next is the most
        # recently used node, tail.prev the least recently used.
        self.head = ListNode(0, 0)
        self.tail = ListNode(-1, -1)
        self.head.next = self.tail
        self.tail.prev = self.head

    def get(self, key: int) -> int:
        if key in self.dic:
            node = self.dic[key]
            # Touching the node makes it the most recently used.
            self.removeFromList(node)
            self.insertIntoHead(node)
            return node.value
        else:
            return -1

    def put(self, key: int, value: int) -> None:
        if key in self.dic:
            # Key already present: move it to the front, update value.
            node = self.dic[key]
            self.removeFromList(node)
            self.insertIntoHead(node)
            node.value = value
        else:
            if len(self.dic) >= self.capacity:
                self.removeFromTail()
            node = ListNode(key, value)
            self.dic[key] = node
            self.insertIntoHead(node)

    def removeFromList(self, node):
        # Rewire the neighbours around the node in O(1).
        node.prev.next = node.next
        node.next.prev = node.prev

    def insertIntoHead(self, node):
        headNext = self.head.next
        self.head.next = node
        node.prev = self.head
        node.next = headNext
        headNext.prev = node

    def removeFromTail(self):
        if len(self.dic) == 0:
            return
        tail_node = self.tail.prev
        del self.dic[tail_node.key]
        self.removeFromList(tail_node)
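As an aside, the Python standard library ships this exact eviction policy for function memoization via functools.lru_cache, which is handy when you only need to cache the results of a pure function rather than build the structure yourself:

```python
from functools import lru_cache

@lru_cache(maxsize=4)            # the cache holds at most 4 distinct arguments
def square(n):
    return n * n

square(2)                        # miss: computed and cached
square(2)                        # hit: served from the cache
info = square.cache_info()
print(info.hits, info.misses)    # 1 1
```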

