
LRU Cache for Page Replacement Policy

The Least Recently Used (LRU) Cache is a popular algorithm used in memory management and page replacement policies. It works on the principle that the page that hasn't been used for the longest time is the one that should be replaced. In Java, it is commonly implemented using a combination of a doubly linked list and a hash map.
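Before looking at the cache itself, the page-replacement idea can be sketched by counting page faults for a reference string under LRU. This is an illustrative sketch, not part of the original article; the class name `LruPageFaults` and the sample reference string are invented for the example.

```java
import java.util.LinkedList;

public class LruPageFaults {
    // Counts page faults for a reference string under LRU with `frames` page frames.
    // The list front holds the most recently used page; the back holds the LRU page.
    static int countFaults(int[] refs, int frames) {
        LinkedList<Integer> memory = new LinkedList<>();
        int faults = 0;
        for (int page : refs) {
            if (memory.remove(Integer.valueOf(page))) {
                memory.addFirst(page);       // hit: move the page to the front
            } else {
                faults++;                    // miss: page fault
                if (memory.size() == frames) {
                    memory.removeLast();     // evict the least recently used page
                }
                memory.addFirst(page);
            }
        }
        return faults;
    }

    public static void main(String[] args) {
        int[] refs = {1, 2, 3, 2, 4, 1, 5, 2};
        System.out.println(countFaults(refs, 3));  // prints 7
    }
}
```

With 3 frames, the reference string above causes 7 faults: only the second access to page 2 is a hit, because LRU keeps the three most recently touched pages resident.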

Java Implementation: LRU Cache

Below is a Java implementation of an LRU Cache using a combination of a doubly linked list and a hash map.

1. Node Class

This class represents each node in the doubly linked list. Each node stores a key-value pair.

```java
class Node {
    int key;
    int value;
    Node prev;
    Node next;

    public Node(int key, int value) {
        this.key = key;
        this.value = value;
    }
}
```

 

2. LRUCache Class

This class implements the LRU Cache using a HashMap and a doubly linked list.

```java
import java.util.HashMap;

class LRUCache {
    private final int capacity;
    private final HashMap<Integer, Node> cache;
    private final Node head, tail;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        this.cache = new HashMap<>();
        this.head = new Node(0, 0);  // Dummy head
        this.tail = new Node(0, 0);  // Dummy tail
        head.next = tail;
        tail.prev = head;
    }

    public int get(int key) {
        if (!cache.containsKey(key)) {
            return -1;  // Return -1 if the key is not present in the cache
        }

        Node node = cache.get(key);
        remove(node);
        insertToFront(node);
        return node.value;
    }

    public void put(int key, int value) {
        if (cache.containsKey(key)) {
            remove(cache.get(key));
        } else if (cache.size() == capacity) {
            remove(tail.prev);  // Remove the least recently used element
        }

        Node node = new Node(key, value);
        insertToFront(node);
        cache.put(key, node);
    }

    private void remove(Node node) {
        cache.remove(node.key);
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void insertToFront(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }
}
```

 

3. Usage Example

Let's demonstrate how the LRUCache works with a simple example.

```java
public class Main {
    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2); // Capacity is 2

        cache.put(1, 1);  // Cache is {1=1}
        cache.put(2, 2);  // Cache is {1=1, 2=2}

        System.out.println(cache.get(1));  // Returns 1 and moves key 1 to the front, Cache is {2=2, 1=1}

        cache.put(3, 3);  // Evicts key 2, Cache is {1=1, 3=3}

        System.out.println(cache.get(2));  // Returns -1 (not found)

        cache.put(4, 4);  // Evicts key 1, Cache is {3=3, 4=4}

        System.out.println(cache.get(1));  // Returns -1 (not found)
        System.out.println(cache.get(3));  // Returns 3, Cache is {4=4, 3=3}
        System.out.println(cache.get(4));  // Returns 4, Cache is {3=3, 4=4}
    }
}
```

 

4. Explanation

  • get(key):
    • If the key exists in the cache, it returns the value associated with the key and moves that key-value pair to the front (most recently used position) of the doubly linked list.
    • If the key doesn't exist, it returns -1.
  • put(key, value):
    • If the key already exists in the cache, it updates the value and moves the node to the front of the list.
    • If the key doesn't exist, it checks whether the cache has reached its capacity. If so, it evicts the least recently used item (which is at the end of the list) before adding the new key-value pair to the front of the list.
  • Doubly Linked List:
    • The list is used to maintain the order of usage. The most recently used elements are at the front (near the head), and the least recently used elements are at the back (near the tail).
  • HashMap:
    • The map allows for O(1) time complexity for the get and put operations by providing a quick lookup of nodes.
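The same behavior described above can also be obtained from the standard library: `java.util.LinkedHashMap` supports an access-order mode in which every `get` and `put` moves the entry to the end of its internal linked list, and `removeEldestEntry` can evict once capacity is exceeded. This alternative sketch is not from the original article; the class name `LinkedHashLru` is invented for the example.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache built on LinkedHashMap's access-order mode.
class LinkedHashLru<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LinkedHashLru(int capacity) {
        super(16, 0.75f, true);  // accessOrder = true: get/put move entries to the end
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;  // evict the least recently used entry on overflow
    }

    public static void main(String[] args) {
        LinkedHashLru<Integer, Integer> cache = new LinkedHashLru<>(2);
        cache.put(1, 1);
        cache.put(2, 2);
        cache.get(1);        // touch key 1, so key 2 is now least recently used
        cache.put(3, 3);     // evicts key 2
        System.out.println(cache.containsKey(2));  // prints false
        System.out.println(cache.get(1));          // prints 1
    }
}
```

The hand-rolled version in this article makes the doubly-linked-list mechanics explicit, which is the usual interview expectation; in production code the `LinkedHashMap` form is shorter and equally O(1).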

Use Case

The LRU Cache is commonly used in scenarios where we need to store a limited number of items but want to ensure that the most recently accessed items remain available, such as in caching web pages, database queries, or file system blocks.

This implementation ensures that both insertion and lookup operations are efficient, operating in O(1) time, making it suitable for high-performance applications.

