Building an LRU Cache in Java
Step-by-Step Instructions for Creating a Least Recently Used Cache
An LRU (Least Recently Used) Cache is crucial for efficient data management. It helps maintain fast access to recently used data by evicting the least recently used items. This quick guide will walk you through the essentials of building an LRU Cache in Java.
What is an LRU Cache?
An LRU Cache is designed to manage a fixed amount of data, keeping the most recently accessed items readily available while evicting the least recently used items when the cache is full. This ensures that frequently accessed data is quickly retrievable, enhancing overall system performance.
Key Operations
The LRU Cache supports two main operations:
Get: Retrieve the value associated with a key, updating its usage in the cache.
Put: Add or update the value for a key. If the cache exceeds its capacity, remove the least recently used item.
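To make these operations concrete, here is a short hypothetical trace (it assumes a capacity of 2 and the LRUCache class we build below):

LRUCache cache = new LRUCache(2); // capacity of 2
cache.put(1, 10);                 // cache holds {1}
cache.put(2, 20);                 // cache holds {2, 1}, with 2 most recently used
cache.get(1);                     // returns 10; key 1 becomes most recently used
cache.put(3, 30);                 // cache is full, so key 2 (least recently used) is evicted
cache.get(2);                     // returns -1, because key 2 was evicted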
To implement an efficient LRU Cache, we use:
HashMap: Provides fast access to cache entries.
Doubly Linked List: Maintains the order of access, allowing for quick updates to the most recently used items.
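The two structures work together as sketched below. This is only a conceptual layout based on the fields used later, not a separate implementation: the map gives O(1) access to a node, and the node's position between two dummy sentinel nodes records how recently it was used.

// Conceptual layout of the cache internals (sketch only):
//
//   map:  key -> Node                                    // O(1) lookup by key
//
//   head <-> [most recent] <-> ... <-> [least recent] <-> tail
//             ^ new/updated nodes go here                  ^ eviction removes tail.prev
//
private Map<Integer, Node> map;   // key -> node reference
private Node head, tail;          // dummy nodes bounding the recency list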
Let’s build the LRU Cache step by step:
Create a Node class:
static class Node {
    int key;
    int value;
    Node prev;
    Node next;

    Node(int key, int value) {
        this.key = key;
        this.value = value;
    }
}

LRU Cache class:
Constructor: Sets up the cache with a specified capacity.
public LRUCache(int capacity) {
    this.capacity = capacity;
    this.count = 0;
    // Initialize the HashMap
    map = new HashMap<>();
    // Create dummy head and tail nodes
    head = new Node(0, 0);
    tail = new Node(0, 0);
    // Connect head and tail
    head.next = tail;
    tail.prev = head;
}

Methods:
deleteNode: To remove a node from the list, we connect the node before it (prev) directly to the node after it (next).

private void deleteNode(Node node) {
    // Remove node from the list
    node.prev.next = node.next;
    node.next.prev = node.prev;
}

addToHead: Moves an item to the start of the list (making it the most recently used).
private void addToHead(Node node) {
    // Add node right after the head
    node.next = head.next;
    node.next.prev = node;
    node.prev = head;
    head.next = node;
}

get: Retrieves an item by its key. If the item is found, it is moved to the start of the list to mark it as the most recently used.
public int get(int key) {
    Node node = map.get(key);
    if (node == null) {
        return -1; // Key not found
    }
    // Move the retrieved node to the head
    deleteNode(node);
    addToHead(node);
    return node.value;
}

put: Adds a new item or updates an existing item in the cache. If the item already exists, update its value and move it to the start. If it is new and the cache is full, remove the least recently used item from the end before adding the new one.
public void put(int key, int value) {
    Node node = map.get(key);
    if (node != null) {
        // Update the value of the existing node and move it to the head
        node.value = value;
        deleteNode(node);
    } else {
        // Create a new node
        node = new Node(key, value);
        if (count >= capacity) {
            // Remove the least recently used node (from the tail)
            map.remove(tail.prev.key);
            deleteNode(tail.prev);
            count--;
        }
        map.put(key, node);
        count++;
    }
    // Add the new or updated node to the head
    addToHead(node);
}

Complete LRU Cache code:
import java.util.HashMap;
import java.util.Map;

class LRUCache {

    static class Node {
        int key;
        int value;
        Node prev;
        Node next;

        Node(int key, int value) {
            this.key = key;
            this.value = value;
        }
    }

    // HashMap to store node references
    private Map<Integer, Node> map;
    private int count, capacity;
    private Node head, tail;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        this.count = 0;
        // Initialize the HashMap
        map = new HashMap<>();
        // Create dummy head and tail nodes
        head = new Node(0, 0);
        tail = new Node(0, 0);
        // Connect head and tail
        head.next = tail;
        tail.prev = head;
    }

    private void deleteNode(Node node) {
        // Remove node from the list
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void addToHead(Node node) {
        // Add node right after the head
        node.next = head.next;
        node.next.prev = node;
        node.prev = head;
        head.next = node;
    }

    public int get(int key) {
        Node node = map.get(key);
        if (node == null) {
            return -1; // Key not found
        }
        // Move the retrieved node to the head
        deleteNode(node);
        addToHead(node);
        return node.value;
    }

    public void put(int key, int value) {
        Node node = map.get(key);
        if (node != null) {
            // Update the value of the existing node and move it to the head
            node.value = value;
            deleteNode(node);
        } else {
            // Create a new node
            node = new Node(key, value);
            if (count >= capacity) {
                // Remove the least recently used node (from the tail)
                map.remove(tail.prev.key);
                deleteNode(tail.prev);
                count--;
            }
            map.put(key, node);
            count++;
        }
        // Add the new or updated node to the head
        addToHead(node);
    }
}
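As a quick sanity check, here is a small hypothetical driver (not part of the original post) that exercises the class above:

public class LRUCacheDemo {
    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2); // capacity of 2

        cache.put(1, 100);
        cache.put(2, 200);
        cache.put(1, 111);                // update key 1 and mark it most recently used
        cache.put(3, 300);                // evicts key 2, the least recently used entry

        System.out.println(cache.get(1)); // 111
        System.out.println(cache.get(2)); // -1 (evicted)
        System.out.println(cache.get(3)); // 300
    }
}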
Conclusion:
In this blog, we learned how to build an LRU Cache. By using a HashMap for quick lookups and a doubly linked list to track usage order, our cache manages data efficiently:
Fast Access: The HashMap ensures quick retrieval.
Efficient Eviction: The doubly linked list makes it easy to remove the least recently used item when the cache is full.
If you found this post helpful, consider subscribing for more updates or following me on LinkedIn and GitHub. Have questions or suggestions? Feel free to leave a comment or contact me. Happy coding!


