The Least Recently Used (LRU) cache is one of the most commonly used cache eviction algorithms. When the cache is full and new data must be added, the LRU policy evicts the least recently used value to make room.
We use an LRU cache to save time. Suppose we have to access a specific item in an extensive database, and that item is searched for frequently. An LRU cache saves time by storing the most recently accessed results: the item can then be retrieved quickly from the cache instead of being looked up in the database over and over.
For instance, suppose a user manages a flower shop in an e-commerce store. When a customer searches for a specific flower type, the website reads it from the disk where the data is stored, which is time-consuming. It is even more costly if multiple customers search for the same flower simultaneously and it has to be fetched from the disk repeatedly.
Using an LRU cache saves time: the cache stores that flower's details, and when a customer searches for it again, the cache serves the result directly. When the cache becomes full, it deletes the least recently used value.
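As a sketch of this pattern, the code below caches flower lookups in front of a slow data store. The `FlowerCache` class and the `fetch_flower_from_disk` function are hypothetical names invented for this illustration; they stand in for the shop's actual lookup code.

```cpp
#include <iostream>
#include <list>
#include <string>
#include <unordered_map>
#include <utility>

// Hypothetical slow lookup standing in for the shop's disk/database read.
std::string fetch_flower_from_disk(const std::string& name) {
    return "details for " + name;  // imagine an expensive query here
}

// Tiny LRU cache of flower lookups: most recently used entry at the front.
class FlowerCache {
    size_t capacity_;
    std::list<std::pair<std::string, std::string>> items_;  // (name, details)
    std::unordered_map<std::string,
        std::list<std::pair<std::string, std::string>>::iterator> index_;

public:
    explicit FlowerCache(size_t capacity) : capacity_(capacity) {}

    std::string Lookup(const std::string& name) {
        auto it = index_.find(name);
        if (it != index_.end()) {
            // Cache hit: move the entry to the front and return it.
            items_.splice(items_.begin(), items_, it->second);
            return it->second->second;
        }
        // Cache miss: fetch from disk, evicting the LRU entry if full.
        std::string details = fetch_flower_from_disk(name);
        if (items_.size() == capacity_) {
            index_.erase(items_.back().first);
            items_.pop_back();
        }
        items_.emplace_front(name, details);
        index_[name] = items_.begin();
        return details;
    }

    bool Cached(const std::string& name) const {
        return index_.count(name) > 0;
    }
};
```

With a capacity of 2, looking up Lily, Tulip, Lily, then Iris evicts Tulip: Lily stays cached because its second lookup made it the most recently used entry.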
The above illustration shows an example of a cache that can store four elements. We stored the data for four flowers, Lily, Tulip, Hyacinth, and Jasmine, so the cache is fully populated.
Now we want to add a new flower, Iris, to the cache. Since it is an LRU cache, it deletes the least recently accessed (oldest) entry to make room.
The Code widget below gives the implementation of the LRU cache in C++.
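The driver code below assumes an `LRUCache` class exposing `Get(int key)`, `Set(int key, int value)`, and `Print()`. Those member names come from the driver itself; the implementation here is a minimal sketch, using a doubly linked list plus a hash map so both operations run in O(1), and returning -1 on a miss as an assumed convention.

```cpp
#include <iostream>
#include <list>
#include <unordered_map>
#include <utility>
using namespace std;

class LRUCache {
    int capacity_;
    list<pair<int, int>> items_;  // (key, value), most recent at the front
    unordered_map<int, list<pair<int, int>>::iterator> index_;

public:
    explicit LRUCache(int capacity) : capacity_(capacity) {}

    // Returns the cached value, or -1 if the key is absent.
    int Get(int key) {
        auto it = index_.find(key);
        if (it == index_.end()) return -1;
        // Accessing the key makes it the most recently used entry.
        items_.splice(items_.begin(), items_, it->second);
        return it->second->second;
    }

    void Set(int key, int value) {
        auto it = index_.find(key);
        if (it != index_.end()) {
            // Key already cached: update the value and move it to the front.
            it->second->second = value;
            items_.splice(items_.begin(), items_, it->second);
            return;
        }
        if ((int)items_.size() == capacity_) {
            // Cache full: evict the least recently used entry (the back).
            index_.erase(items_.back().first);
            items_.pop_back();
        }
        items_.emplace_front(key, value);
        index_[key] = items_.begin();
    }

    void Print() const {
        cout << "Cache contents (most recent first): ";
        for (const auto& kv : items_)
            cout << "(" << kv.first << ": " << kv.second << ") ";
        cout << endl;
    }
};
```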
#include <iostream>
#include <string>
#include <vector>
using namespace std;

int main() {
  // Create a cache of size 2.
  int cache_capacity = 2;
  LRUCache cache(cache_capacity);
  cout << "Initial state of cache" << endl;
  cout << "Cache capacity: " << cache_capacity << endl;
  cache.Print();

  // "Get" entries read the given key; all other entries set key -> value.
  vector<int> keys = {10, 10, 15, 20, 15, 25, 5};
  vector<string> values = {"20", "Get", "25", "40", "Get", "85", "5"};

  for (size_t i = 0; i < keys.size(); i++) {
    if (values[i] == "Get") {
      cout << "Getting by Key: " << keys[i] << endl;
      cout << "Cached value returned: " << cache.Get(keys[i]) << endl;
    } else {
      cout << "Setting cache: Key: " << keys[i] << ", Value: " << values[i] << endl;
      cache.Set(keys[i], stoi(values[i]));
    }
    cache.Print();
  }
}
We store data in the cache one element at a time, and whenever we access an element, it is moved to the front of the cache. When we then add another element to an already full cache, the element that has gone unaccessed the longest is evicted, as shown in the output. This is how the LRU cache stores data so it can be accessed again without the cost of repeated lookups.