Cache
A hardware or software component that stores frequently accessed data for faster future retrieval.
Definition
A cache is a high-speed data storage layer that holds frequently accessed data so that future requests for that data can be served faster. Caching reduces latency, decreases load on origin data sources, and improves scalability. Caches operate at multiple levels: browser cache (local HTML, CSS, JS), CDN cache (edge servers), application cache (Redis, Memcached), and database query cache. Cache invalidation, knowing when to expire or update cached data, is one of the classic hard problems in computer science.
Example
“Redis cache stores the results of a complex database query for 60 seconds; 10,000 users requesting the same data in that window hit the cache, not the database.”
Synonyms
- buffer
- temporary storage
- fast memory
- data cache
Antonyms / Opposites
- persistent storage
- uncached data
- origin server
Related Terms
- cdn
- redis
- performance
- cache-invalidation
