Caching Strategies
Approaches to storing and invalidating cached data to balance freshness and performance.
Definition
Caching strategies define how data is stored in fast-access layers and when it is invalidated or refreshed. Common patterns include cache-aside (the application checks the cache before the database and populates it on a miss), read-through (the cache sits in front of the database and handles misses automatically), write-through (writes go to the cache and the database synchronously), write-behind (writes go to the cache first and are flushed to the database asynchronously), and write-around (writes bypass the cache and go straight to the database). Eviction policies (LRU, LFU, TTL) determine which cached items are removed when capacity is reached.
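The LRU eviction policy mentioned above can be sketched in a few lines. This is a minimal illustration using Python's `OrderedDict`, not tied to any particular cache library; the class name and capacity are illustrative:

```python
from collections import OrderedDict

class LRUCache:
    """Least-recently-used eviction: when capacity is exceeded,
    drop the item that was accessed longest ago."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)  # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")          # touch "a", so "b" is now least recently used
cache.put("c", 3)       # capacity exceeded: evicts "b"
print(cache.get("b"))   # None: evicted
print(cache.get("a"))   # 1: still cached
```

LFU would instead track access counts per key and evict the least-frequently-used item; TTL attaches an expiry timestamp to each entry.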
Example
“An e-commerce site uses cache-aside with Redis: the product service checks Redis first; on a cache miss it queries PostgreSQL and writes the result to Redis with a 15-minute TTL, dramatically reducing database load during high traffic.”
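The cache-aside flow in the example can be sketched as follows. This is a simplified illustration: a plain dict stands in for Redis, `query_database` is a hypothetical stand-in for the PostgreSQL query, and the entry stores its own expiry timestamp to mimic a TTL:

```python
import time

store = {}                 # stands in for Redis: key -> (value, expires_at)
TTL_SECONDS = 15 * 60      # the 15-minute TTL from the example

def query_database(product_id):
    """Hypothetical stand-in for the PostgreSQL query."""
    return {"id": product_id, "name": f"product-{product_id}"}

def get_product(product_id):
    """Cache-aside: check the cache first; on a miss, query the
    database and write the result back to the cache with a TTL."""
    entry = store.get(product_id)
    if entry is not None:
        value, expires_at = entry
        if time.time() < expires_at:
            return value           # cache hit
        del store[product_id]      # expired entry: treat as a miss
    value = query_database(product_id)              # cache miss
    store[product_id] = (value, time.time() + TTL_SECONDS)
    return value

first = get_product(42)    # miss: hits the "database", populates the cache
second = get_product(42)   # hit: served from the cache
print(first == second)     # True
```

With a real Redis client the write-back would typically be a single set-with-expiry call, so the server evicts the key itself rather than the application checking timestamps.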
Synonyms
- cache patterns
- caching techniques
- cache policy
- cache architecture
Antonyms / Opposites
- cache-less architecture
- always-fresh data
Related Terms
- redis
- cache-memory
- cdn-caching
- database-cs
