What are the different Next JS Data Caching Strategies?

In today’s ever-evolving web landscape, improving the performance of our applications is of utmost importance. In this Answer, we’ll explore web caching, a technique that plays an important role in enhancing the performance of our Next JS applications. We will cover three caching strategies: client-side caching, server-side caching, and Stale-While-Revalidate (SWR).

Web caching

Web caching is a mechanism that stores the results of expensive function calls or network requests and reuses them for subsequent identical requests. Serving repeat requests from the cache helps improve the speed and efficiency of data retrieval.
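As a minimal sketch of the idea, here’s a hypothetical in-memory cache wrapped around a fetch call (the getJSON name and the 60-second freshness window are assumptions for the example):

// A tiny illustrative in-memory cache (not production-ready)
const cache = new Map();

async function getJSON(url, ttlMs = 60 * 1000) {
  const entry = cache.get(url);

  // Reuse the cached response while it is still fresh
  if (entry && Date.now() - entry.storedAt < ttlMs) {
    return entry.data;
  }

  // Otherwise fetch fresh data, store it, and return it
  const data = await fetch(url).then((res) => res.json());
  cache.set(url, { data, storedAt: Date.now() });
  return data;
}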

Caching strategies

We’ll discuss three common caching strategies, each with its unique advantages and use cases.

Primary Caching Strategies

Client-side caching

Client-side caching involves storing the data on the client’s device (browser) after the initial request. This method is straightforward and reduces network traffic because subsequent requests for the same data can be satisfied without additional network calls.

Client-side caching

Example

Let’s see an example of how client-side caching can be achieved with Next JS. The component below fetches posts through a small api module that falls back to cached data when the network is unavailable (a sketch of such a module follows the component).

import { useEffect, useState } from 'react';
// api is a small client-side wrapper that caches responses
import api from './api/api';

const Posts = () => {
  const [posts, setPosts] = useState(null);

  useEffect(() => {
    const fetchData = async () => {
      try {
        const response = await api.get('/posts');
        setPosts(response.data);
      } catch (error) {
        // The wrapper signals a cache fallback by rejecting with { isCache: true, data }
        if (error.isCache) {
          setPosts(error.data);
        } else {
          console.error(error);
        }
      }
    };

    fetchData();
  }, []);

  return (
    <div>
      {posts && posts.map((post) => <div key={post.id}>{post.title}</div>)}
    </div>
  );
};

export default Posts;
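The component above imports an api module that isn’t shown. One possible sketch of it (an assumption, not the original module) is a wrapper that caches successful responses in localStorage and, when the network request fails, rejects with the cached copy marked isCache: true so the component’s catch branch can fall back to it:

// api/api.js -- a hypothetical client-side caching wrapper (sketch)
const cacheKey = (path) => `api-cache:${path}`;

const api = {
  async get(path) {
    try {
      // Try the network first and cache the successful response
      const response = await fetch(path);
      const data = await response.json();
      localStorage.setItem(cacheKey(path), JSON.stringify(data));
      return { data };
    } catch (networkError) {
      // Network failed: fall back to the cached copy if we have one
      const cached = localStorage.getItem(cacheKey(path));
      if (cached) {
        // Mark the result so callers know it came from the cache
        throw { isCache: true, data: JSON.parse(cached) };
      }
      throw networkError;
    }
  },
};

export default api;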

Pros

  • Lowers server load: Reduces the number of requests sent to the server, as the client can serve cached data locally.

  • Faster response time: Retrieving data from the local cache is generally faster than fetching it over the network.

  • Offline support: Allows users to access previously cached content when they are offline.

Cons

  • Limited storage: The cache size on the client-side is limited, which may lead to the eviction of older data.

  • Stale data: If not managed properly, cached data might become outdated and not reflect the latest server changes.

  • Security concerns: Sensitive data cached on the client-side could be accessible to malicious users.

Stale-While-Revalidate (SWR)

The Stale-While-Revalidate (SWR) strategy is a cache invalidation strategy popularized by the HTTP Cache-Control header (Cache-Control is a response header that carries caching directives, telling the client or intermediary caches how, or whether, a response should be cached). The idea is that you can serve the stale data immediately while sending a request in the background to check whether there is updated data.

Cache-Control: stale-while-revalidate
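In an HTTP response, the directive is typically combined with max-age. A header like the one below (values chosen only for illustration) tells the browser to treat the response as fresh for 60 seconds and, for a further 120 seconds, to serve the stale copy immediately while revalidating it in the background:

Cache-Control: max-age=60, stale-while-revalidate=120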

Example

Let’s use the SWR library in Next JS to see this strategy in action.

import useSWR from 'swr'

// A simple fetcher that requests a URL and parses the JSON response
const fetcher = (url) => fetch(url).then((res) => res.json())

function Profile() {
  const { data, error } = useSWR('/api/user', fetcher)

  if (error) return <div>Failed to load</div>
  if (!data) return <div>Loading...</div>

  // Render data
  return <div>Hello, {data.name}!</div>
}

export default function Home() {
  return (
    <div>
      <h1>Next JS App</h1>
      <Profile />
    </div>
  )
}

In the code above, the useSWR hook first returns data from the cache (stale), then sends the request via the fetcher (revalidate). If the request returns new data, SWR updates the component accordingly.
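SWR also lets you control when revalidation happens through options passed to the hook. For example, inside Profile the hook call could be extended like this (the specific values are only for illustration):

const { data, error } = useSWR('/api/user', fetcher, {
  revalidateOnFocus: true, // revalidate when the window regains focus
  refreshInterval: 30000,  // poll for fresh data every 30 seconds
  dedupingInterval: 2000,  // ignore duplicate requests made within 2 seconds
})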

Pros

  1. Improved performance: Allows serving stale data while simultaneously fetching fresh data in the background.

  2. Low latency for most requests: Users get quick responses from the cache while still getting updated data.

  3. Robustness: The application remains responsive even when the server is temporarily unreachable.

Cons

  1. Complexity: Implementing the logic for revalidation and handling stale data can be complex.

  2. Resource consumption: Concurrently revalidating data can put a strain on the server and the network.

  3. Data staleness: Users might still receive slightly outdated data, depending on revalidation intervals.

Server-side caching

In server-side caching, data is stored on the server after the initial request, reducing the load on your database or other data sources because subsequent requests can be satisfied directly from the cache.

Server-side caching

The first time a user visits the webpage, the server generates the page by fetching data and constructing it. Afterward, the server caches this generated webpage. On subsequent visits, the server serves the cached copy, resulting in faster loading times for the user.
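One built-in way Next JS supports this kind of page-level caching is Incremental Static Regeneration (ISR): the page is generated once, served from the server’s cache, and regenerated in the background after a revalidation window. Here is a minimal sketch, assuming a hypothetical getPosts data source and a 60-second window:

// pages/posts.js -- page-level server caching with Incremental Static Regeneration (sketch)

// Hypothetical data source for the example
const getPosts = () =>
  fetch('https://example.com/api/posts').then((res) => res.json());

export async function getStaticProps() {
  const posts = await getPosts();

  return {
    props: { posts },
    // Serve the cached page and regenerate it in the background
    // at most once every 60 seconds
    revalidate: 60,
  };
}

export default function PostsPage({ posts }) {
  return (
    <ul>
      {posts.map((post) => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  );
}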

Example

Here’s a simple example using the Node JS memory-cache package on the server. The DataPage component below fetches from an Express-style /api/data route; a sketch of that route follows the component.

import React, { useState, useEffect } from 'react';

const DataPage = () => {
  const [data, setData] = useState({});

  useEffect(() => {
    // Fetch data from the API route
    fetch('/api/data')
      .then((res) => res.json())
      .then((data) => setData(data));
  }, []);

  return (
    <div>
      <h2>Data from API</h2>
      <ul>
        <li>Message: {data.message}</li>
        <li>Timestamp: {data.timestamp}</li>
      </ul>
    </div>
  );
};

export default DataPage;
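The explanation below refers to a pages/api/data.js route that performs the server-side caching. That file isn’t shown above, so here is one possible sketch of it, written as a Next JS API route handler (Express-style req/res) using the memory-cache package; the fetchFromDataSource stub, the cache key, and the 60-second TTL are assumptions for illustration:

// pages/api/data.js -- server-side caching with memory-cache (sketch)
import cache from 'memory-cache';

const CACHE_KEY = 'api-data';
const CACHE_TTL_MS = 60 * 1000; // keep the cached response for 60 seconds

// Dummy data source; replace with a real API call or database query
const fetchFromDataSource = async () => ({
  message: 'Hello from the data source',
  timestamp: Date.now(),
});

export default async function handler(req, res) {
  // Serve the cached copy if we have one
  const cached = cache.get(CACHE_KEY);
  if (cached) {
    return res.status(200).json(cached);
  }

  // Otherwise fetch fresh data, cache it, and return it
  const data = await fetchFromDataSource();
  cache.put(CACHE_KEY, data, CACHE_TTL_MS);
  res.status(200).json(data);
}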

Explanation

Let's walk through the code step by step.

  • Server-side caching: The application uses the memory-cache package to implement server-side caching. The memory-cache module stores data in memory, making it available for future requests without fetching the data from the original source again. This caching helps reduce the load on the underlying data source and improves the performance of the application.

  • Data fetching: The application fetches data from a custom API endpoint defined in pages/api/data.js. The data.js file sets up an Express-style route handler for the /api/data endpoint. When a user accesses this endpoint, the server checks whether the data is already cached. If cached data exists, it is sent as the response; otherwise, the server fetches data from the data source (in this case, the fetchFromDataSource function) and caches it before sending it as the response.

  • Data source: The application uses a dummy data source represented by the fetchFromDataSource function. In a real-world application, this function should be replaced with an actual API call to an external data source, such as a RESTful API or a database query.

  • Data display: The fetched data is then displayed on the webpage. The DataPage component requests the data from the /api/data route when it mounts and, once the data is received, renders the message and timestamp on the page.

  • Date formatting: The API returns the timestamp as a raw value. To display it in a friendlier format (e.g., "date-monthName-year"), you can add a small formatDate helper to the DataPage component that creates a Date object from the timestamp, extracts the day, month, and year using the Date object methods, and builds the formatted string before rendering it alongside the fetched message.

Pros

  1. Scalability: Reduces server load and enhances the server’s ability to handle more concurrent users.

  2. Consistent data: All clients receive the same up-to-date data from the server’s cache.

  3. Centralized control: The server can manage cache eviction (removing or replacing cached items to make room for new or more relevant data) and data updates efficiently.

Cons

  1. Increased latency: Unlike client-side caching, cached data still has to travel over the network, which can add delay.

  2. Cost and complexity: Implementing and maintaining server-side caching requires additional infrastructure and monitoring.

  3. Cache synchronization: Ensuring data consistency across multiple cache instances can be challenging.

Summary

Here’s a brief summary to help you understand when to use each caching strategy effectively:

  • Client-side caching: Ideal for scenarios where you have limited data that can be cached and you want to reduce server load while providing a faster user experience. It works well for static assets, small datasets, and situations where data updates are infrequent.

  • Server-side caching: Suitable for applications that serve the same data to multiple clients, especially when the data changes infrequently or on a predictable schedule. It’s commonly used for content-heavy websites, APIs, and database-driven applications.

  • Stale-While-Revalidate (SWR): A great choice for applications where immediate response is crucial, and occasional stale data is acceptable. This strategy works well for frequently accessed data, like news articles or social media posts, where up-to-the-minute accuracy is not critical.

Conclusion

Caching is an essential aspect of any web application that fetches data. Depending on your specific use case, each of these strategies can prove beneficial. Client-side caching is great for reducing network traffic, SWR balances freshness and speed, and server-side caching can significantly reduce database load.
