Think like a caching mechanism

We keep the things we use around us, within reach, in all sorts of situations: stuff on the desk at work, on the coffee table in the living room, in the bathroom drawers, in the garage, and so on.

One day, while sorting and storing things around the house, a thought emerged: we should manage these items like a caching mechanism!

A cache, as described on Wikipedia, is:

a component that transparently stores data so that future requests for that data can be served faster

For example, when a blog post is opened, the system does not read the data from the database on every request; it caches the post and serves it in an instant from RAM. Or another example: instead of going to the basement for your favorite wine, keep one bottle in the fridge.
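To make that concrete, here is a minimal cache-aside sketch in Python; the plain dict stands in for RAM, and `load_post_from_db` is a hypothetical, slow database call:

```python
post_cache = {}

def load_post_from_db(post_id):
    # hypothetical stand-in for the slow database read
    return f"content of post {post_id}"

def get_post(post_id):
    if post_id in post_cache:          # cache hit: served straight from RAM
        return post_cache[post_id]
    post = load_post_from_db(post_id)  # cache miss: pay the penalty once
    post_cache[post_id] = post         # keep it around for next time
    return post
```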

How can we apply this kind of thinking around the house? Let’s see the characteristics of a cache:

Cache size: imagine your most accessible drawers, and the tables you reach for most often, as the cache, the ‘level one’ cache. You work out the area that can be used for storing things and estimate how much, and which, stuff to put there. You may populate it up front with the items you think need to be most accessible, or add items as you notice yourself using them.
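As a small sketch of that sizing decision (the capacity and the list of items are made up for illustration):

```python
# The 'level one' cache has a fixed capacity; we can warm it up front
# with the items we expect to need, or let it fill lazily as things get used.
CAPACITY = 5
most_used = ["keys", "charger", "pen", "notebook", "scissors"]  # assumed list

drawer = dict.fromkeys(most_used[:CAPACITY])   # eager warm-up
# drawer = {}                                  # ...or start empty and fill on use
```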

Eviction policy: imagine you have a drawer with perfumes, and some of them are never used at all. A cache would remove such items under either the ‘Least Recently Used (LRU)’ or the ‘Least Frequently Used (LFU)’ policy. In our case, we evict the never-used perfume to some more permanent and less accessible storage; let’s call it the ‘level two’ cache.
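Here is a rough sketch of the LRU variant, using Python’s `collections.OrderedDict`; the perfume names and the capacity of three are made up for illustration:

```python
from collections import OrderedDict

CAPACITY = 3
drawer = OrderedDict()   # the perfume drawer, ordered from least to most recently used

def use(item):
    if item in drawer:
        drawer.move_to_end(item)                  # mark as most recently used
        return
    if len(drawer) >= CAPACITY:
        evicted, _ = drawer.popitem(last=False)   # evict the least recently used
        print(f"{evicted} goes to the 'level two' storage")
    drawer[item] = True

for perfume in ["rose", "musk", "citrus", "rose", "oud"]:
    use(perfume)
# "musk" gets evicted: it was the least recently used when "oud" arrived
```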

Retrieving items: when we go to retrieve an item and it’s not in the cache, we add it, so that next time it will be there and we won’t pay the time ‘penalty’ again. To get an idea of item retrieval and time penalties, here is an illustration with the favorite wine:

Wine example
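A rough way to feel that penalty in code, with `time.sleep` standing in for the trip to the basement (the two-second delay is obviously made up):

```python
import time

fridge = {}

def fetch_from_basement(wine):
    time.sleep(2)            # the made-up 'penalty': walking to the basement
    return wine

def get_wine(wine):
    if wine not in fridge:
        fridge[wine] = fetch_from_basement(wine)   # slow the first time...
    return fridge[wine]

get_wine("favorite red")    # ~2 seconds: basement trip, then stored in the fridge
get_wine("favorite red")    # instant: served straight from the fridge
```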

Adding new items: when we add a new item to the cache (a new perfume!), we first check whether the cache is full; if it is, we remove an item by applying the eviction policy. If there is space in the cache, we just add it.
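The same add-with-eviction logic, sketched this time with a Least Frequently Used policy so it differs from the LRU drawer above; the use counts live in a plain `Counter`:

```python
from collections import Counter

CAPACITY = 3
cache = {}          # item -> value
uses = Counter()    # item -> how many times it has been retrieved

def add(item, value):
    if item not in cache and len(cache) >= CAPACITY:
        # cache is full: evict the least frequently used item first
        victim = min(cache, key=lambda k: uses[k])
        del cache[victim]
        uses.pop(victim, None)
    cache[item] = value    # there is room now, just add it

def get(item):
    uses[item] += 1
    return cache.get(item)
```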

Time to live: caches can remove items whose time to live (TTL) has expired, if the TTL isn’t set to infinity. For example, some items in the fridge expire after ‘n’ days or weeks and some don’t expire at all. The TTL is set for an item at the very moment it is added to the cache.
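A sketch of TTL handling, assuming we stamp each item with an expiry time when it goes in:

```python
import time

fridge = {}   # item -> (value, expires_at); expires_at of None means 'never expires'

def put(item, value, ttl_seconds=None):
    expires_at = None if ttl_seconds is None else time.monotonic() + ttl_seconds
    fridge[item] = (value, expires_at)   # TTL is fixed at the moment of adding

def get(item):
    if item not in fridge:
        return None
    value, expires_at = fridge[item]
    if expires_at is not None and time.monotonic() > expires_at:
        del fridge[item]                 # expired: treat it as a miss
        return None
    return value

put("milk", "1 carton", ttl_seconds=7 * 24 * 3600)   # expires in a week
put("honey", "1 jar")                                 # never expires
```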

Statistics: caches maintain statistics, giving information about the most accessed items, load time penalties, number of times items have been evicted, and more…
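Python’s standard library already keeps such statistics for you; `functools.lru_cache`, for example, exposes hits, misses, and the current size (the `find_item` function below is just a stand-in for something slow):

```python
from functools import lru_cache

@lru_cache(maxsize=32)
def find_item(name):
    # stand-in for a slow lookup (basement, database, ...)
    return f"found {name}"

find_item("wine")
find_item("wine")
find_item("perfume")

print(find_item.cache_info())
# CacheInfo(hits=1, misses=2, maxsize=32, currsize=2)
```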

So, those were a few ‘caching’ thoughts that apply when sorting and storing stuff around the house.

 