A typical system consists of service hosts, data storage, and clients that make requests to the service hosts. Under normal circumstances, this abstraction performs well. However, as the number of users increases, so does the number of database queries, resulting in slow performance. In such cases, a cache is added to the system to deal with performance deterioration.

A cache is a temporary data store that can serve data faster by keeping data entries in memory. Caches store only the most frequently accessed data. When a request reaches the serving host, the host first tries to retrieve the data from the cache. If the requested data is found in the cache (a cache hit), the server responds with the data immediately. However, if the data is unavailable in the cache (a cache miss), the data is queried from the database.
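The read path above can be sketched as a small cache-aside routine. This is a minimal illustration, not a production implementation: a plain dictionary stands in for the in-memory cache, and `db_query`, `get`, and the `"user:42"` key are hypothetical names chosen for the example.

```python
cache = {}  # in-memory cache, keyed by request key

def db_query(key):
    # Stand-in for a real database lookup.
    return f"value-for-{key}"

def get(key):
    if key in cache:
        # Cache hit: the data is in memory, respond immediately.
        return cache[key]
    # Cache miss: fall back to the database, then populate the
    # cache so future requests for this key are served from memory.
    value = db_query(key)
    cache[key] = value
    return value

print(get("user:42"))  # first request: cache miss, queries the database
print(get("user:42"))  # second request: cache hit, served from memory
```

The first call for a given key pays the database cost; every subsequent call is answered from memory until the entry is evicted.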
