There’s been major controversy (arguably justified) surrounding Next.js’s openness, particularly the fact that it wasn’t designed to work well on serverless platforms other than Vercel. In response, developers created a new solution called OpenNext to make Next.js truly portable across all platforms.
If you’re new to the drama, the idea of making an open-source library truly open might confuse you. This article will clear it up. We’ll cover how Next.js wasn’t designed to be fully portable on serverless platforms, how OpenNext is fixing the issue, how to get started with OpenNext, and what the future of Next.js portability might look like.
Next.js was created to make it easier to build full-stack applications with React. React, by itself, is just a UI library; it doesn’t give you enough structure for building real-world apps. Next.js filled that gap. It introduced file-based routing, server-side rendering (SSR), static site generation (SSG), and even a way to build backend API routes inside the same project. Over time, it evolved from a frontend framework into a full-stack one.
Now, as Next.js added more backend-like features such as API routes, SSR, middleware, and Incremental Static Regeneration (ISR), it leaned toward serverless infrastructure. Serverless made sense because these backend features don’t require a long-running server. Instead, they can run as on-demand functions that spin up, do their job, and shut down. This aligns perfectly with the kind of small, stateless tasks that serverless handles best. So, the Next.js team started optimizing its build output for this model.
Vercel is the company that built and maintains Next.js. Naturally, they optimized the framework to run seamlessly on their own platform. When you deploy a Next.js app to Vercel, every API route becomes its own serverless function. Each SSR page is deployed the same way. ISR and middleware are handled behind the scenes by their infrastructure. The whole thing works beautifully, with no extra configuration needed.
However, when you try to deploy that same Next.js app to AWS Lambda or Cloudflare Workers, things start to break or behave unpredictably. This happens because those platforms don’t mirror Vercel’s internal structure, and Next.js wasn’t originally built to be fully portable. So you’re either forced to drop some features, build ugly workarounds, or stick with Vercel.
This is the exact problem OpenNext was created to solve.
OpenNext is a community-driven project that repackages the Next.js build so it can run on any serverless platform. It provides tooling that maps your app’s pages, API routes, and special features like ISR and middleware into formats compatible with AWS Lambda, Netlify Functions, Cloudflare Workers, and more. It basically simulates Vercel’s runtime behavior, so your app works the same way even outside their ecosystem.
When you build a Next.js app, it generates a .next/ folder with everything needed to run the app, including static assets, server-side logic, and routing metadata. OpenNext takes that build output and repackages it for other serverless platforms. This is done by splitting your app into platform-specific parts, like Lambda functions for SSR/API, static files for S3 or a CDN, and background jobs for ISR. It wraps the code in the correct runtime handlers for each target and re-implements Vercel-only features like ISR and image optimization using services like S3, DynamoDB, and Sharp.
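To make that concrete, here is a rough, hypothetical sketch of what "wrapping the build output in a runtime handler" means for an AWS Lambda target. This is not OpenNext’s actual code; the next-server-bundle module and its requestHandler function are placeholders for the server logic OpenNext extracts from .next/:

// Hypothetical sketch only; OpenNext's real handlers are far more involved
import type { APIGatewayProxyEventV2, APIGatewayProxyResultV2 } from "aws-lambda";
// Placeholder for the repackaged Next.js server logic taken from .next/
import { requestHandler } from "./next-server-bundle";

export async function handler(
  event: APIGatewayProxyEventV2
): Promise<APIGatewayProxyResultV2> {
  // 1. Translate the Lambda event into a framework-agnostic request
  const request = {
    method: event.requestContext.http.method,
    url: event.rawPath + (event.rawQueryString ? `?${event.rawQueryString}` : ""),
    headers: event.headers,
    body: event.body,
  };

  // 2. Let the Next.js server code (SSR page or API route) handle it
  const response = await requestHandler(request);

  // 3. Translate the result back into the shape Lambda expects
  return {
    statusCode: response.status,
    headers: response.headers,
    body: response.body,
  };
}

The same idea applies to other targets: only the outer translation layer changes, while the Next.js server code in the middle stays the same.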
The Vercel team has started collaborating with OpenNext contributors. They’re now working on standardizing deployment adapters, which should eventually make it easier to deploy Next.js apps anywhere.
That said, hosting isn’t the only thing people are worried about. There’s been growing drama around the direction of React, Next.js, and Vercel as a whole. Vercel has hired most of the React core team, and a lot of people are uncomfortable with how much control one company now has over the ecosystem. Some are worried that React could slowly shift to favor Next.js-specific ideas, leaving other frameworks out in the cold.
Another problem is the pace of change. Next.js keeps shipping new features, breaking changes, and API shifts without slowing down, and it’s getting harder for teams to build long-term projects without worrying that their entire approach could become outdated overnight.
Maybe people are overreacting. Or maybe it’s a real sign that the ecosystem needs better checks and balances. Either way, OpenNext shows that the community still has a say, and that’s a good thing.
Next, let’s look at how to use OpenNext with a variety of tools. OpenNext currently provides adapters for Cloudflare, AWS Lambda, and Netlify.
You can create a new Next.js app pre-configured for Cloudflare Workers using OpenNext by running:
npm create cloudflare@latest -- my-next-app --framework=next --platform=workers
This command will set up a new Next.js app with all the necessary configs and libraries to make your app work seamlessly on Cloudflare. After the installation, you can also run npm run preview to locally preview how your app will behave in the Cloudflare Workers runtime, rather than in Node.js.
Once you’re done with development, you can deploy the app by running:
npm run deploy
To configure OpenNext for Cloudflare Workers on an existing Next.js app, first install the Cloudflare adapter:
npm install @opennextjs/cloudflare@latest
Next, install Wrangler as a dev dependency:
npm install --save-dev wrangler@latest
Once both libraries are installed, update your package.json scripts to add commands for building, previewing, and deploying your app:
{ "scripts": { "preview": "opennextjs-cloudflare build && opennextjs-cloudflare preview", "deploy": "opennextjs-cloudflare build && opennextjs-cloudflare deploy" } }
With this update, you can now run npm run preview to see how your Next.js app behaves on Cloudflare Workers locally, or npm run deploy to deploy it live.
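Note that the Cloudflare adapter also relies on a Wrangler configuration file pointing at the worker entry and static assets OpenNext emits. As a minimal sketch based on the OpenNext Cloudflare docs at the time of writing (the app name is a placeholder; check the docs for the exact fields your adapter version expects):

// wrangler.jsonc (sketch; verify against the current OpenNext Cloudflare docs)
{
  "name": "my-next-app",
  "main": ".open-next/worker.js",
  "compatibility_date": "2024-09-23",
  "compatibility_flags": ["nodejs_compat"],
  "assets": {
    "directory": ".open-next/assets",
    "binding": "ASSETS"
  }
}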
SST provides one of the easiest ways to deploy a Next.js app to AWS. All you need to do is initialize SST in your existing Next.js app by running:
npx sst@latest init
Next, install the newly added dependencies:
npm install
Finally, deploy your app with:
npx sst deploy
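For reference, sst init generates an sst.config.ts at the project root. Assuming SST v3, a minimal config that deploys the Next.js app in the current directory looks roughly like this (the app and component names are placeholders):

/// <reference path="./.sst/platform/config.d.ts" />

export default $config({
  app(input) {
    return {
      name: "my-next-app", // placeholder app name
      home: "aws",
    };
  },
  async run() {
    // sst.aws.Nextjs builds the app with OpenNext and provisions the AWS resources
    new sst.aws.Nextjs("MyWeb");
  },
});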
You can also follow this tutorial for a step-by-step guide on deploying Next.js to AWS Lambda with SST.
OpenNext also provides an adapter that integrates with Netlify. However, no extra configuration is needed; Netlify automatically detects your Next.js project and applies the necessary settings, so everything works out of the box.
OpenNext allows you to customize how your Next.js app is built for different platforms using an open-next.config.ts file. This file should be placed at the same level as your next.config.js file. Once created, you can use it to modify configurations such as caching behavior, server wrappers, and how ISR is handled for your target platform.
If you’re deploying to AWS with SST, first install the @opennextjs/aws package:
npm install @opennextjs/aws
Then, create the open-next.config.ts file in your project root. For example, to enable Lambda streaming in your Next.js deployment:
import type { OpenNextConfig } from "@opennextjs/aws/types/open-next.js";

const config = {
  default: {
    override: {
      wrapper: "aws-lambda-streaming",
    },
  },
} satisfies OpenNextConfig;

export default config;
This configuration enables Lambda streaming, which allows your app to stream responses directly from AWS Lambda, thereby improving performance for dynamic content.
If Cloudflare is your target deployment platform, you should have already installed the @opennextjs/cloudflare adapter during your initial setup. If not, install it with:
npm install @opennextjs/cloudflare@latest
Then, create the open-next.config.ts file in your project root. For example, to enable caching with Cloudflare R2:
import { defineCloudflareConfig } from "@opennextjs/cloudflare";
import r2IncrementalCache from "@opennextjs/cloudflare/overrides/incremental-cache/r2-incremental-cache";

export default defineCloudflareConfig({
  incrementalCache: r2IncrementalCache,
});
This configuration sets up your app to use Cloudflare R2 for ISR caching, improving performance by storing and serving cached content.
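One thing to be aware of: the R2 incremental cache also needs an R2 bucket binding in your Wrangler configuration. At the time of writing, the OpenNext docs use a binding named NEXT_INC_CACHE_R2_BUCKET, roughly like this (the bucket name is a placeholder; confirm the binding name for your adapter version):

// wrangler.jsonc fragment (sketch; verify against the current OpenNext Cloudflare docs)
{
  "r2_buckets": [
    {
      "binding": "NEXT_INC_CACHE_R2_BUCKET",
      "bucket_name": "my-next-app-cache"
    }
  ]
}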
For more detailed configurations and advanced use cases, explore the OpenNext documentation.
In this article, we covered how Next.js wasn’t designed to be fully portable on serverless platforms outside of Vercel, with features like caching, ISR, and more breaking when deployed elsewhere. We explained how OpenNext is fixing this problem, looked at the future of Next.js portability, and discussed how to use OpenNext to deploy Next.js apps that work fully on AWS, Cloudflare, and Netlify.
OpenNext is a big step in the right direction. It proves that Next.js apps don’t have to be locked into Vercel’s hosting anymore.
Debugging Next.js applications can be difficult, especially when users experience issues that are hard to reproduce. If you’re interested in monitoring and tracking state, automatically surfacing JavaScript errors, and tracking slow network requests and component load time, try LogRocket.
LogRocket captures console logs, errors, network requests, and pixel-perfect DOM recordings from user sessions and lets you replay them as users saw it, eliminating guesswork around why bugs happen — compatible with all frameworks.
LogRocket's Galileo AI watches sessions for you, instantly identifying and explaining user struggles with automated monitoring of your entire product experience.
The LogRocket Redux middleware package adds an extra layer of visibility into your user sessions. LogRocket logs all actions and state from your Redux stores.
Modernize how you debug your Next.js apps — start monitoring for free.