Saturday, 18 February 2023

Cache - System design notes

While considering using or building a cache for your systems, you need to consider the following aspects.

Use of Cache - A cache is mainly used to reduce latency by avoiding repeated DB calls and long network round trips. Plan to use the cache for static or pre-computed data.

Data Eviction - You should choose an appropriate eviction policy for your cache to make room for new objects; otherwise you will not benefit from the cache, and it can sometimes do more harm than good. A minimal LRU sketch follows this list.

    LRU - The least recently used object is evicted from the cache first.

    LFU - The least frequently used object is evicted from the cache, even if it was accessed very recently.

    FIFO - The object that was added to the cache first is deleted first to make room for new objects.

    TTL - Delete the object/item from the cache after a specified amount of time, irrespective of usage/access.
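
As an illustration, here is a minimal sketch of an LRU cache in Python, assuming a fixed capacity; the class and method names are just for illustration.

from collections import OrderedDict

class LRUCache:
    # Minimal LRU cache: evicts the least recently used key when full.
    def __init__(self, capacity):
        self.capacity = capacity
        self._items = OrderedDict()

    def get(self, key):
        if key not in self._items:
            return None                    # cache miss
        self._items.move_to_end(key)       # mark as most recently used
        return self._items[key]

    def put(self, key, value):
        if key in self._items:
            self._items.move_to_end(key)
        self._items[key] = value
        if len(self._items) > self.capacity:
            self._items.popitem(last=False)  # evict the least recently used entry

Both put() and get() count as "use" here, so an entry is only evicted when it has gone untouched the longest.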

Consistency - Your cache and DB should always be consistent; to achieve this, you can use one of the following mechanisms (a small sketch follows the list).

    Write Through Cache - Update the cache and the DB together as part of the same write operation.

    Read Through Cache - On a cache miss, load the data from the DB, populate the cache, and return it.

    Write Around - Write to the DB, and update the cache asynchronously in a separate call.
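
A rough sketch of read-through plus write-through behaviour, assuming a simple key-value DB client with get(key)/put(key, value) methods (the db object here is a placeholder, not a real driver):

class ReadWriteThroughCache:
    # Keeps the cache and DB consistent: reads populate the cache on a miss,
    # writes go to both the DB and the cache in the same call.
    def __init__(self, db, cache):
        self.db = db        # placeholder DB client with get(key)/put(key, value)
        self.cache = cache  # e.g. the LRUCache sketched above

    def read(self, key):
        value = self.cache.get(key)
        if value is None:
            # Read-through: on a miss, load from the DB and populate the cache.
            value = self.db.get(key)
            if value is not None:
                self.cache.put(key, value)
        return value

    def write(self, key, value):
        # Write-through: update the DB and the cache as one operation.
        self.db.put(key, value)
        self.cache.put(key, value)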

Where to place cache:

  Close to the Application:

        Good for small in-memory data; avoids network calls. Read-through is the best way of maintaining cache consistency here. A minimal in-process sketch follows this section.

  Close to the DB:

       Keep the cache outside the application layer and maintain common/global data in it. This layer is responsible for updating the cache when it goes out of sync with the DB. It also tolerates host failures better, as new application hosts do not need to warm up their own cache.
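
For the "close to the application" option, a minimal in-process sketch using Python's functools.lru_cache; load_user_from_db and fetch_user are hypothetical names standing in for a real DB call:

from functools import lru_cache

def load_user_from_db(user_id):
    # Placeholder for a real DB/network call.
    return {"id": user_id, "name": f"user-{user_id}"}

@lru_cache(maxsize=1024)
def fetch_user(user_id):
    # Cached close to the application: the result lives in this process's
    # memory, so repeated calls for the same user skip the network round trip.
    return load_user_from_db(user_id)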

Commonly used distributed Caches:

ElastiCache from AWS, Memcached, Redis. A short Redis usage sketch follows.
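
For example, a shared Redis cache can be read and written from any application host. This sketch assumes the redis-py client is installed and a Redis server is reachable at localhost:6379; the key name is illustrative:

import redis  # pip install redis

# Assumes a Redis server reachable at localhost:6379.
r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Store a pre-computed value with a 5-minute TTL so stale entries expire.
r.set("user:42:profile", '{"name": "Alice"}', ex=300)

# Any application host can now read the same shared entry.
profile = r.get("user:42:profile")  # None if expired or never set
print(profile)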

 

Cache Invalidation Methods: 

