Considerations for using cache

Sumit Rawal answered on May 20, 2023

Here are a few considerations for using a cache system:

Decide when to use a cache: consider using a cache when data is read frequently but modified infrequently. Because cached data lives in volatile memory, a cache server is not suited to persisting data: if the cache server restarts, all data in memory is lost. Important data should therefore be saved in persistent data stores.
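The read path for such a cache is commonly implemented with the cache-aside pattern: check the cache first, and fall back to the data store on a miss. A minimal sketch, assuming a dict-backed cache and a hypothetical `fetch_from_db` function standing in for the real data store:

```python
# Cache-aside read path (sketch). The in-memory dict stands in for a
# cache server such as Memcached or Redis; fetch_from_db is a
# hypothetical placeholder for a real database query.

cache = {}

def fetch_from_db(key):
    # Placeholder: a real implementation would query the data store.
    return f"value-for-{key}"

def get(key):
    if key in cache:                 # cache hit: serve from memory
        return cache[key]
    value = fetch_from_db(key)       # cache miss: read from the data store
    cache[key] = value               # populate the cache for later reads
    return value
```

Subsequent reads of the same key are then served from memory until the entry expires or is evicted.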

Expiration policy: it is good practice to implement an expiration policy. Once cached data expires, it is removed from the cache; without an expiration policy, cached data stays in memory permanently. Avoid making the expiration time too short, as the system will reload data from the database too frequently, and avoid making it too long, as the data can become stale.
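A simple way to enforce such a policy is to store an expiry timestamp alongside each value and treat expired entries as misses. A sketch (the TTL value is an arbitrary example; real cache servers such as Redis provide this natively via per-key TTLs):

```python
import time

TTL_SECONDS = 3600           # example default TTL; tune per workload

cache = {}                   # key -> (value, expires_at)

def put(key, value, ttl=TTL_SECONDS):
    cache[key] = (value, time.monotonic() + ttl)

def get(key):
    entry = cache.get(key)
    if entry is None:
        return None                        # miss
    value, expires_at = entry
    if time.monotonic() >= expires_at:     # expired: evict, report a miss
        del cache[key]
        return None
    return value
```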

Consistency: this means keeping the data store and the cache in sync. Inconsistency can arise because operations that modify the data store and the cache are not wrapped in a single transaction. When scaling across multiple regions, maintaining consistency between the data store and the cache is especially challenging. For further details, refer to the paper “Scaling Memcache at Facebook” published by Facebook [7].
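One common mitigation (not a full consistency guarantee) is to update the data store first and then invalidate the cached entry, so the next read repopulates the cache with fresh data. A sketch with hypothetical `db` and `cache` dicts:

```python
# Write path that invalidates the cache after updating the data store.
# This is a common mitigation, not a transactional guarantee: a crash
# between the two steps can still leave a stale entry in the cache.

db = {}      # stands in for the persistent data store
cache = {}   # stands in for the cache server

def write(key, value):
    db[key] = value          # 1. update the source of truth
    cache.pop(key, None)     # 2. invalidate any cached copy

def read(key):
    if key in cache:
        return cache[key]
    value = db.get(key)      # miss: read through to the data store
    if value is not None:
        cache[key] = value
    return value
```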

Mitigating failures: a single cache server is a potential single point of failure (SPOF), defined by Wikipedia as “a part of a system that, if it fails, will stop the entire system from working” [8]. Deploying multiple cache servers across different data centers is therefore recommended. Another recommended approach is to overprovision the required memory by a certain percentage, which provides a buffer as memory usage grows.

Eviction policy: once the cache is full, requests to add new items may cause existing items to be removed, which is called cache eviction. Least recently used (LRU) is the most popular eviction policy; others, such as least frequently used (LFU) or first in, first out (FIFO), can be adopted for different use cases.
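An LRU policy can be sketched in a few lines with `collections.OrderedDict`, which tracks key order and lets us evict the oldest entry (for caching function results, Python's `functools.lru_cache` offers this built in):

```python
from collections import OrderedDict

class LRUCache:
    """Tiny LRU cache: evicts the least-recently-used key when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)         # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used
```

Swapping which entry `get`/`put` promote or which end `popitem` removes is how other policies (FIFO, and with an added counter, LFU) differ from LRU.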
