Understanding LRU Cache in Java
In today’s computing world, efficiency is key. One of the critical components that can make your application more efficient is caching. Among various caching strategies, the Least Recently Used (LRU) cache is a popular one. In this blog, we’ll delve into what an LRU cache is, why it is useful, and how you can implement one in Java.
What is an LRU Cache?
An LRU (Least Recently Used) cache is a type of cache algorithm that discards the least recently used items first. This is particularly useful when you have limited cache size and need to manage which data to keep and which to discard. The LRU cache keeps track of the order in which elements are accessed, ensuring that the most recently accessed items stay in the cache while the least recently accessed ones are removed when the cache reaches its capacity.
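You can see this "access order" idea directly in the JDK before building anything yourself. The sketch below (class and variable names are my own, for illustration) uses `LinkedHashMap` with its `accessOrder` constructor flag set to `true`, so iterating the map always goes from least to most recently used:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // accessOrder = true: iteration order is least- to most-recently accessed
        Map<String, Integer> recency = new LinkedHashMap<>(16, 0.75f, true);
        recency.put("a", 1);
        recency.put("b", 2);
        recency.put("c", 3);

        recency.get("a"); // touching "a" moves it to the most-recently-used end
        System.out.println(recency); // {b=2, c=3, a=1}
    }
}
```

Because the least recently used entry is always first in iteration order, evicting it is cheap; that is exactly the property the implementation below builds on.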
Why Use an LRU Cache?
Caches have limited space, so you need a policy for deciding what to evict when they fill up. LRU is a popular choice because many workloads exhibit temporal locality: data accessed recently is likely to be accessed again soon. Evicting the least recently used entry therefore tends to keep the most useful data in the cache, which makes LRU a good default for things like database query results, rendered pages, or expensive computed values.
Implementing an LRU Cache in Java
Java provides several ways to implement an LRU cache. One of the simplest is to use the LinkedHashMap class, which maintains a predictable iteration order and can be easily configured to work as an LRU cache.
Here’s a step-by-step guide to implementing an LRU cache in Java:
Step 1: Extend LinkedHashMap
We start by extending the LinkedHashMap class and overriding its removeEldestEntry method to ensure that the cache does not exceed the specified capacity.
```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    public LRUCache(int capacity) {
        // accessOrder = true makes iteration follow access order,
        // which is what lets LinkedHashMap behave as an LRU cache
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the eldest (least recently used) entry once capacity is exceeded
        return size() > capacity;
    }
}
```
Step 2: Add a Main Class to Test the LRU Cache
```java
public class Main {
    public static void main(String[] args) {
        LRUCache<Integer, String> lruCache = new LRUCache<>(3);

        lruCache.put(1, "one");
        lruCache.put(2, "two");
        lruCache.put(3, "three");
        System.out.println("Cache: " + lruCache);

        lruCache.get(1);
        System.out.println("Accessed 1: " + lruCache);

        lruCache.put(4, "four");
        System.out.println("Added 4: " + lruCache);
    }
}
```
Step 3: Compile and Run
Compile and run the program. You should see that when the cache exceeds its capacity, the least recently used entry is removed.
```
Cache: {1=one, 2=two, 3=three}
Accessed 1: {2=two, 3=three, 1=one}
Added 4: {3=three, 1=one, 4=four}
```
In this output, you can see that adding a fourth element (4=four) causes the least recently used element (2=two) to be evicted from the cache.
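One caveat worth noting: this LinkedHashMap-based cache is not thread-safe, and because get() reorders entries internally when accessOrder is true, even concurrent reads can corrupt the map. A minimal sketch of one option, assuming the cache is shared across threads, is to wrap it with Collections.synchronizedMap (the inline anonymous subclass here mirrors the Step 1 class so the example is self-contained):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class SynchronizedCacheDemo {
    public static void main(String[] args) {
        // Inline LRU map (same idea as Step 1), wrapped so every
        // get/put runs under a single lock.
        Map<Integer, String> cache = Collections.synchronizedMap(
                new LinkedHashMap<Integer, String>(16, 0.75f, true) {
                    @Override
                    protected boolean removeEldestEntry(Map.Entry<Integer, String> eldest) {
                        return size() > 3; // fixed capacity of 3 for this demo
                    }
                });

        cache.put(1, "one");
        cache.put(2, "two");
        cache.put(3, "three");
        cache.get(1);          // mark 1 as recently used
        cache.put(4, "four");  // evicts 2, the least recently used entry
        System.out.println(cache);
    }
}
```

Keep in mind that iterating over a synchronized map still requires an explicit synchronized block, and under heavy concurrency a purpose-built cache library may be a better fit than a single global lock.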
Conclusion
Implementing an LRU cache in Java is straightforward with the help of LinkedHashMap. By extending this class and overriding the removeEldestEntry method, you can create an efficient and easy-to-manage LRU cache. This technique is a powerful tool to enhance the performance and memory management of your applications.
I hope this blog helps you understand and implement an LRU cache in your Java applications. Happy coding!