Next.js is a fascinating framework that offers robust support for writing complex React applications with features like server-side rendering and static generation.
However, Next.js caching behavior is by far the most criticized aspect of the framework, and many developers hate how it works. While caching is a huge factor in how well a React application performs, it can easily lead to tricky bugs that take a lot of time to debug if you don't understand it.
Without a solid understanding of the caching mechanisms, you'll constantly be battling unexpected behaviors and bugs in your code, such as a page displaying stale data on the client.
That constant battle keeps you from taking full advantage of an otherwise excellent framework for optimizing and speeding up React applications, and it can leave you stuck in your projects.
In this article, we will examine the four different caching mechanisms (request memoization, the data cache, the full route cache, and the router cache), break them down, and learn how to control them effectively.
We will also explore cache invalidation, some Next.js caching tools, best practices, and a lot more!
Caching simply means storing fetched or computed data in a temporary storage location for future access instead of having to re-fetch or re-compute every time it’s needed. In Next.js, caching is very aggressive, which means that it caches everything possible, e.g., fetched data, visited routes, and much more.
To achieve this, Next.js has four distinct caching mechanisms that work at different stages of your React application, which we’ll discuss below.
Request memoization caches data on the server. This technique caches data fetched with an identical GET request during the lifespan of a single request from a single user. In other words, data is cached and reused only while one page render is in progress.

This way, when a route fetches the same data in multiple places in the component tree during one render, only one actual network request is made. This cache is a bit like short-term memory for fetched data.

Request memoization is great because developers no longer need to fetch data at the top of the tree and then pass it down through props. You can fetch the data wherever it's needed without worrying about making unnecessary duplicate HTTP requests for the same data.

However, request memoization only works with the native fetch function and only when the requests are exactly the same, i.e., the same URL and the same options object. This mechanism is a React feature, which means it must happen inside a React component and not in a server action or route handler.
Here is an example of request memoization in action. In this setup, the `getProducts` function fetches a list of products from an API endpoint. When first called, the data is cached; subsequent calls retrieve data from the cache rather than making additional network requests:
```js
// lib/products.js
export const getProducts = async () => {
  const res = await fetch('https://mystoreapi.com/products');
  const data = await res.json();
  return data;
};
```

```jsx
// The Product page component
import { getProducts } from '../../../lib/products';
import ProductList from '../productList/page';

const Product = async () => {
  const products = await getProducts();
  const totalProducts = products?.length;

  return (
    <div>
      <div>{`There are ${totalProducts} products in my store.`}</div>
      <ProductList />
    </div>
  );
};

export default Product;
```

```jsx
// The ProductList component (productList/page.jsx)
import { getProducts } from '../../../lib/products';

const ProductList = async () => {
  const products = await getProducts();

  return (
    <ul>
      {products?.map(({ id, title }) => (
        <li key={id}>{title}</li>
      ))}
    </ul>
  );
};

export default ProductList;
```
In the code snippets above, we have a `getProducts` function that fetches the products data from the data source. Then there are two components that consume the products data: `Product` and `ProductList`.

The `Product` component calls the `getProducts` function and uses the data to display the number of products, and it also renders a `ProductList` component instance that calls the same `getProducts` function.

The first time the `getProducts` function is called, the data is returned and then stored temporarily in the request memoization cache for future use.

The second `getProducts` call comes from the `ProductList` component, which is rendered inside the `Product` component. This time, no network request is made to the remote data source; the data comes straight from the request memoization cache.

It is important to note that the URL and options in both `getProducts` calls are exactly the same.
The data cache stores data fetched on the server, either for a specific route or from a single fetch request. The unique thing about the data cache is that the cached data stays there basically forever, unless you decide to revalidate the cache, which, remember, means purging the old data and replacing it with fresh data.

The data cache makes cached data available across multiple requests from different users, and it even survives redeployments. This caching mechanism is quite similar to static pages, where every user gets the same page.

This data is what Next.js uses to render routes statically. When the data is revalidated, the currently rendered static page is regenerated, which is the whole idea behind Incremental Static Regeneration (ISR).

The data cache is also great, as it boosts performance by preventing repeated network requests to the original data source. It's also a good choice for developers because it can be configured.
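For instance, the data cache can be configured per request through the options object of the native fetch call. Below is a minimal sketch of the most common settings, reusing the illustrative store API from the earlier examples:

```js
// lib/products.js: per-request data cache settings (illustrative endpoint)

// Cached in the data cache until explicitly revalidated
export const getCachedProducts = async () => {
  const res = await fetch('https://mystoreapi.com/products', {
    cache: 'force-cache',
  });
  return res.json();
};

// Cached, but revalidated at most once every 3,600 seconds (ISR-style)
export const getRevalidatedProducts = async () => {
  const res = await fetch('https://mystoreapi.com/products', {
    next: { revalidate: 3600 },
  });
  return res.json();
};

// Opted out of the data cache entirely; fetched fresh on every request
export const getLiveProducts = async () => {
  const res = await fetch('https://mystoreapi.com/products', {
    cache: 'no-store',
  });
  return res.json();
};
```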
The full route cache stores pages as HTML and RSC payloads at build time. Static pages only have to be built once and then served to many users, and it's this cache mechanism that enables static pages to work the way they do, basically serving as a storage mechanism for static routes.

Conceptually, the full route cache is nothing more than building static routes and storing them as HTML and RSC payloads. Since this cache is tied to the data cache, it persists until the data cache is invalidated, i.e., until the underlying data is cleared.

In the example below, full route caching renders the `Product` component as a static HTML page with its fetched data preserved:
```jsx
import Link from 'next/link';
import { getProducts } from '../../../lib/products';

const Product = async () => {
  const products = await getProducts();

  return (
    <div>
      <h1>Products</h1>
      <ul>
        {products.map(({ id, title }) => (
          <li key={id}>
            <Link href={`/products/${id}`}>{title}</Link>
          </li>
        ))}
      </ul>
    </div>
  );
};

export default Product;
```
This `Product` component will be cached at build time since it contains no dynamic data that would cause the page content to change every time it is rendered.
The router cache stores all of the prefetched pages and pages visited by the user while navigating around the application in the browser. This caching applies to static and dynamic routes because the browser doesn’t care how the route was generated.
The idea behind this cache is that having all the pages stored in memory allows for instant or almost instant navigation, giving the user the feel of a true single-page application with no hard reload.
The problem with this cache is that pages are not requested from the server again as the user navigates back and forth in the application, which can lead to stale data being displayed on the page.
Furthermore, pages are stored for 30 seconds if dynamic and five minutes if static, and these durations cannot be configured; short of a hard reload (or closing and reopening the tab), the user keeps seeing the cached version. This is arguably the biggest flaw in the Next.js caching system.
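There is a programmatic escape hatch, though: calling `router.refresh()` from a client component tells Next.js to drop the router cache entry for the current route and re-fetch it from the server (Server Actions that call `revalidatePath` or `revalidateTag` also purge it). A minimal sketch of a refresh button, assuming a hypothetical products page that displays frequently changing data:

```jsx
// app/products/refresh-button.jsx (hypothetical client component)
'use client';

import { useRouter } from 'next/navigation';

const RefreshButton = () => {
  const router = useRouter();

  // router.refresh() discards the cached router entry for the current route
  // and re-renders it with fresh data from the server
  return (
    <button onClick={() => router.refresh()}>
      Refresh products
    </button>
  );
};

export default RefreshButton;
```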
Next.js offers multiple caching strategies, and by utilizing request memoization, data caches, full route caches, and router caches, developers can effectively manage how data is stored and retrieved, depending on whether it is user-specific, static, or dynamic.
However, some caches, like router caches, may lead to stale data. In the next section, we discuss cache invalidation strategies to address possible issues that may arise.
Cache invalidation purges the cache of old data and updates it with new data, while cache revalidation checks whether the cached data has become stale; if it has, a new HTTP request is made to the original data source for fresh data, i.e., the cache is invalidated.
Cache invalidation ensures that users see up-to-date data on the page at all times without unnecessarily re-fetching data when the old one is still valid, consequently improving the user experience, especially for dynamic pages or pages that contain personalized user data.
Below are some strategies for implementing cache invalidation in Next.js:
This strategy is tied to ISR and involves setting a time limit in seconds, after which the cached data is invalidated and replaced with fresh data. It is a very straightforward approach, configured at the route segment level, as seen in the code snippet below:
```jsx
import { getProducts } from '../../../lib/products';

// Route segment config: revalidate the cached data for this route every hour
export const revalidate = 3600;

const Page = async () => {
  const products = await getProducts();

  return (
    <div className='space-y-8'>
      {products.map(({ id, title, description }) => (
        <div key={id}>
          <h2>{title}</h2>
          <p>{description}</p>
        </div>
      ))}
    </div>
  );
};

export default Page;
```
In the above code, the page uses ISR, so the HTML is generated at build time. Because of the exported `revalidate` value, the data cache is invalidated every 3,600 seconds, which is exactly an hour. This ensures the page stays up to date and in sync with the original data source.
Manual invalidation makes the most sense in a server action (or a route handler), where you can invalidate the cache each time the action runs. Manual cache invalidation can be done with either `revalidatePath` or `revalidateTag`.

In the example below, we implement manual invalidation using the `revalidatePath` function, ensuring that the next request to the `/users` route fetches fresh data from the server:
```js
'use server';

import { revalidatePath } from 'next/cache';

export const getUsers = async () => {
  const res = await fetch('https://mystoreapi.com/users');
  const data = await res.json();
  revalidatePath('/users');
  return data;
};
```
The `'use server'` directive at the top specifies that the functions in this file are intended to run only on the server. This marks them as server actions, where server-only features like `revalidatePath` and `revalidateTag` make sense.
Each time the `getUsers` function is called, it tells Next.js to invalidate the cache for the `/users` route, ensuring that fresh data is available on the page for users. The `revalidateTag` function works in a similar way, but targets cached data by tag rather than by route:
```js
'use server';

import { revalidateTag } from 'next/cache';

export const getProducts = async () => {
  const res = await fetch('https://mystoreapi.com/products', {
    next: { tags: ['products'] },
  });
  const data = await res.json();
  return data;
};

export const updateProducts = async () => {
  // ...perform the update logic, then purge everything cached under the 'products' tag
  revalidateTag('products');
};
```
In the code snippet above, the `getProducts` server action fetches data from the original data source and tells Next.js to cache it under the `products` tag.

The `updateProducts` server action performs the update logic and then tells Next.js to revalidate all cached data associated with the `products` tag. This strategy is especially useful when many pages or components depend on the same data.
Cache invalidation strategies are crucial to ensure users see up-to-date data. These strategies help optimize data fetching and, by extension, improve user experience.
In the next section, we will explore some tools that make caching much easier in Next.js and can optimize your workflow as a developer.
We’ve explored caching mechanisms and invalidation, and luckily there are tools that can help us implement the caching we need with Next.js. These tools make caching easier to manage and fine-tune for different types of content, plus they reduce the pain of debugging weird bugs that can arise in production due to caching.
In this section, we will be providing a step-by-step guide on two Next.js caching tools.
next-cache-toolbar
In Next.js, most caching (the full route cache in particular) only kicks in for production builds, not in development, which is one reason it can be such a pain to reason about. `next-cache-toolbar` is a great tool for displaying cache information about your pages during development.
This tool is comprehensive. Key features of `next-cache-toolbar` include showing, right in the toolbar, the cache status of each page (for example, whether it was generated from cached data or prefetched), the expiry time of the cached data, and more.

With this information, developers can adjust caching rules and configurations to ensure optimal performance, especially for dynamic pages that need careful caching.
`next-cache-toolbar` is very useful, particularly for inspecting pages generated using Next.js features like SSR or ISR. Here are step-by-step instructions for adding `next-cache-toolbar` to your Next.js workflow.
Install next-cache-toolbar
Add `next-cache-toolbar` to your Next.js project. This can be done using npm:
```bash
npm install next-cache-toolbar
```
Or Yarn:
```bash
yarn add next-cache-toolbar
```
Configure next-cache-toolbar
`next-cache-toolbar` requires the App Router, and a few pieces of configuration must be in place for it to integrate into your Next.js workflow effectively. The steps to configure `next-cache-toolbar` are as follows:
Create a toolbar.jsx or toolbar.tsx file
This file is important, as it will be lazy-loaded later on, which prevents `next-cache-toolbar` from being bundled in production. In the app directory, create a `toolbar.jsx` or `toolbar.tsx` file and add the code in the snippet below:
```jsx
// app/toolbar.jsx
import { NextCacheToolbar } from "next-cache-toolbar";
import "next-cache-toolbar/style.css";

const Toolbar = () => {
  return <NextCacheToolbar />;
};

export default Toolbar;
```
Use the toolbar in layout.tsx or layout.jsx
After creating and configuring the toolbar, it is time to use it:
```jsx
// app/layout.jsx
import dynamic from 'next/dynamic';

export const metadata = {
  title: 'Next.js',
  description: 'Generated by Next.js',
};

// Lazy-load the toolbar in development only, so it never ends up in the production bundle
let Toolbar = () => null;
if (process.env.NODE_ENV === "development") {
  Toolbar = dynamic(() => import("./toolbar"));
}

export default function RootLayout({ children }) {
  return (
    <html>
      <head />
      <body>
        {children}
        <Toolbar />
      </body>
    </html>
  );
}
```
When this is done, you should see and interact with the toolbar on your page only during development.
This is the simplest step. Open your terminal and run `npm run dev` to fire up the development server. The `next-cache-toolbar` will appear at the top of your page.
next-shared-cache
This tool, also known as `@neshca/cache-handler`, is a specialized ISR/data cache API crafted for Next.js applications. The library further simplifies the caching process, reducing the pain of managing cached data.
In addition to easy customization, `next-shared-cache` also provides on-demand revalidation, which would otherwise have to be wired up manually in your project's server actions.

Furthermore, unlike `next-cache-toolbar`, which only supports the App Router, it fully supports both the Pages Router and the App Router, which is a big plus.
Key benefits of `next-shared-cache` include easy customization, built-in on-demand revalidation, and full support for both the Pages Router and the App Router.
The following steps will guide you through the installation and usage steps to start using advanced caching in your Next.js projects.
Open up your terminal and navigate to your project folder, then run this command:
```bash
npm install @neshca/cache-handler
```
Create a cache handler file named `cache-handler.mjs` in the root of your project directory. This file will contain the configuration and is referenced by name in the later steps. Set it up with the code below:
```js
// cache-handler.mjs
import { CacheHandler } from '@neshca/cache-handler';

CacheHandler.onCreation(async () => {
  // A simple in-memory store for illustration; swap in a shared store (e.g., Redis) in real projects
  const cacheStore = new Map();

  const handler = {
    async get(key) {
      return cacheStore.get(key);
    },
    async set(key, value) {
      cacheStore.set(key, value);
    },
    async revalidateTag(tag) {
      for (const [key, { tags }] of cacheStore) {
        if (tags?.includes(tag)) {
          cacheStore.delete(key);
        }
      }
    },
    async delete(key) {
      cacheStore.delete(key);
    },
  };

  return {
    handlers: [handler],
  };
});

export default CacheHandler;
```
After installing and setting up the configuration, it is time to integrate the caching tool into your Next.js application.
To use the cache handler, which must be active only in production, update `next.config.js` (yours might be `next.config.mjs`; no worries, it's still the same file). Your `next.config.js` file should now look like this:
```js
// next.config.js
const nextConfig = {
  cacheHandler:
    process.env.NODE_ENV === 'production'
      ? require.resolve('./cache-handler.mjs')
      : undefined,
  experimental: {
    instrumentationHook: true,
  },
};

module.exports = nextConfig;
```
Static pages are pre-rendered: the HTML for the route is generated at build time or, with ISR, periodically in the background. For these pages, it is good practice to populate the tool's cache with the pre-rendered pages so it starts out with initial data.
Create an `instrumentation.js` (or `instrumentation.ts`) file in the root of your project folder and populate it with the code below:
```js
// instrumentation.js
export async function register() {
  if (process.env.NEXT_RUNTIME === 'nodejs') {
    const { registerInitialCache } = await import(
      '@neshca/cache-handler/instrumentation'
    );
    const CacheHandler = (await import('./cache-handler.mjs')).default;
    await registerInitialCache(CacheHandler);
  }
}
```
`@neshca/cache-handler` is only active in production, which means you need to build your project and run it in production mode. To avoid splitting this into two operations, building first and then starting, we can add an npm script that does both.

Create a new property in the scripts object and call it anything you want. In this code snippet, we have named the script `prod`:
"scripts": { "dev": "next dev", "build": "next build", "start": "next start", "lint": "next lint", "prod": "next build && next start" }
Now run `npm run prod` (or `npm run <your-script-name>`) in the terminal to get started with the cache handler.
Caching can greatly improve performance, and following a few best practices keeps your application secure, efficient, and reliable.
Avoid caching sensitive data, such as user authentication tokens or personal information, in any way that could expose it to unauthorized users. Store such data securely and always follow security best practices.
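In practice, that usually means opting per-user requests and routes out of the shared caches entirely. Here is a minimal sketch; the account endpoint, the response shape, and the API_TOKEN environment variable are assumptions for illustration:

```jsx
// app/account/page.jsx (hypothetical page that must never serve cached user data)

// Route segment config: always render this route dynamically, per request
export const dynamic = 'force-dynamic';

const Account = async () => {
  // 'no-store' keeps this response out of the shared data cache,
  // so one user's data is never reused for another request
  const res = await fetch('https://mystoreapi.com/account', {
    cache: 'no-store',
    headers: { Authorization: `Bearer ${process.env.API_TOKEN}` },
  });
  const account = await res.json();

  return <div>{`Signed in as ${account.email}`}</div>;
};

export default Account;
```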
Use cache-control headers in your responses to control how long data should be cached. These headers can define expiration times, whether data can be stored in a public cache (like a CDN), and more:
```js
res.setHeader('Cache-Control', 'public, max-age=3600, stale-while-revalidate=59');
```
If your application deals with frequently updated data, set appropriate revalidation times to ensure users see fresh content without sacrificing performance.
Regularly monitor cache performance using tools like `next-cache-toolbar` to detect issues like cache misses or over-reliance on stale data.
Cache misses occur when data is not found in the cache and needs to be fetched again. This can happen due to incorrect configuration or cache expiration. Use `next-cache-toolbar` to identify misses and optimize your cache durations.
If you have identified cache misses using `next-cache-toolbar`, you will want to optimize your cache durations based on those misses. Here's how to do that in two scenarios:
Pages using getStaticProps (the ISR strategy in the Pages Router)
You can simply adjust the revalidation time based on the frequency of the cache misses. If the data is being refreshed too slowly, reduce the revalidation time so it refreshes faster; if it is being refreshed more often than necessary, increase the revalidation time, as seen below:
```js
// Pages Router: getStaticProps with ISR
export async function getStaticProps() {
  // fetchBlogPost is a placeholder for your own data-fetching helper
  const data = await fetchBlogPost();

  return {
    props: {
      data,
    },
    revalidate: 3600, // 1 hour (3600 seconds)
  };
}
```
API routes using cache-control headers
For API routes and other server responses, set custom cache-control headers to control how long responses are cached in the browser or on a CDN. You can adjust your header values based on the cache misses you see, as in the example below:
```js
// Pages Router API route
export default async function handler(req, res) {
  const response = await fetch('https://api.example.com/data');
  const data = await response.json();

  // Cache for 1 hour, then serve stale for up to 59 seconds while revalidating
  res.setHeader('Cache-Control', 'public, max-age=3600, stale-while-revalidate=59');
  res.status(200).json(data);
}
```
Stale data issues arise when outdated content keeps being served from the cache after the underlying data has changed. This can be resolved by setting sensible revalidation times for static pages and ensuring dynamic content is refreshed periodically.
Over-caching occurs when too much content is cached, leading to stale pages or unnecessary storage consumption. To avoid this, be mindful of what information is cached and which is not. Always use short cache lifetimes for data that changes frequently.
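For example, you might give volatile data a short revalidation window while letting rarely changing data sit in the cache for much longer. A quick sketch, with hypothetical stock and category endpoints:

```js
// lib/store.js: match cache lifetimes to how often the data changes (hypothetical endpoints)

// Stock levels change frequently, so revalidate at most every 30 seconds
export const getStockLevels = async () => {
  const res = await fetch('https://mystoreapi.com/stock', {
    next: { revalidate: 30 },
  });
  return res.json();
};

// Category names rarely change, so a one-day cache lifetime is plenty
export const getCategories = async () => {
  const res = await fetch('https://mystoreapi.com/categories', {
    next: { revalidate: 86400 },
  });
  return res.json();
};
```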
Understanding how caching mechanisms work will help you configure them to suit your project’s needs. In addition to understanding how caching works, leveraging caching tools like the ones explained in this article will further boost productivity.