Slow Next.js apps are more common than you think. Long load times frustrate users and kill engagement. But most performance issues come down to a handful of common causes — from heavy data fetching and routing delays to oversized bundles, caching mistakes, and unoptimized images.
In this article, I’ll pinpoint 8 common performance problems in Next.js apps — and share clear, practical fixes to help you deliver a faster, smoother experience your users will actually feel.
I’m assuming you already have a basic handle on React components, hooks like useState and useEffect, and the fundamentals of Next.js routing and data fetching. You should also be comfortable using browser dev tools and running build commands on the command line. If any of this sounds unfamiliar, you might want to brush up before diving in, though I’ll keep the explanations straightforward.
One quick note — I’ll use getServerSideProps in the examples here, since many existing projects still rely on the Pages Router, and the performance problems are not unique to Next.js versions. The optimization principles apply equally well if you’re using the newer App Router, even if the syntax changes a bit. The goal is to focus on the fixes that matter most, no matter your setup.
Let’s talk about perceived performance — how fast your app feels to users, not just how fast it actually is. Jakob Nielsen, in his 1993 book Usability Engineering, set some classic benchmarks for user patience:

- 0.1 seconds: the response feels instantaneous
- 1 second: the limit for keeping the user’s flow of thought uninterrupted
- 10 seconds: the limit for keeping the user’s attention at all
If your Next.js app takes longer than one second to show content, it’s officially “slow” to users, even if the data is technically loading behind the scenes. And waiting 10 seconds? That’s almost a digital eternity, enough time to lose users in today’s fast-paced world.
But here’s the catch. Is your app really slow? Or does it just feel slow?
That’s the idea of perceived slow performance. Sometimes, data fetching takes time no matter what, but how you handle that wait can make all the difference. This is where perceived performance matters most.
And that takes me to the first solution:
The trick here is to show users something immediately — even if it’s not the final content. A well-designed loading state can make a 2-second wait feel quicker than 1 second spent staring at a blank screen.
React Suspense makes implementing this easy in Next.js. Here’s how you can wrap your components to show fallback placeholders while content loads:
```javascript
import { Suspense } from 'react';

export default function Dashboard() {
  return (
    <div>
      <h1>Dashboard</h1>
      <Suspense fallback={<DashboardSkeleton />}>
        <DashboardContent />
      </Suspense>
      <Suspense fallback={<ChartSkeleton />}>
        <AnalyticsChart />
      </Suspense>
    </div>
  );
}
```
Look at this GIF. When users reload the page, they immediately see placeholders that reassure them something’s happening:
Now that we’ve talked about perceived performance, let’s check out some actual performance issues. As you know, Next.js isn’t just about server-side rendering — and it’s not purely a single-page app either. It’s a hybrid framework that gives the best of both worlds. And sometimes, the worst.
This hybrid model is powerful. It lets your app serve fully-rendered pages for SEO and fast initial loads, and then switch to SPA behavior for smooth, client-side navigation. But it also means you’re juggling two different performance profiles — and that’s where things can get messy.
On the first load, your Next.js app behaves like a traditional server-rendered site. It fetches data on the server, renders the full HTML, and sends it to the browser. That’s great for SEO and getting meaningful content on screen fast:
```javascript
// First visit: Server does all the heavy lifting
export async function getServerSideProps() {
  // This runs on the server, blocking the response
  const userData = await fetchUser();
  const dashboardData = await fetchDashboard(userData.id);
  const notifications = await fetchNotifications(userData.id);

  return { props: { userData, dashboardData, notifications } };
}
```
But once that initial page is loaded and React takes over, Next.js switches to SPA mode. Clicking around triggers client-side navigation with no full page reloads — just like React Router:
```javascript
// Subsequent navigation: Pure SPA behavior
import { useRouter } from 'next/router';

function Dashboard({ userData }) {
  const router = useRouter();

  const goToProfile = () => {
    // This doesn't hit the server - pure client-side navigation
    router.push('/profile');
  };

  return (
    <div>
      <h1>Welcome, {userData.name}</h1>
      <button onClick={goToProfile}>View Profile</button>
    </div>
  );
}
```
The tricky part is that you’re essentially running two applications:

- A server-rendered app that handles the first load of every page
- A client-side SPA that takes over after hydration and handles navigation
Each has its own performance profile. The server side governs time-to-first-byte and how quickly meaningful HTML arrives; the client side governs bundle download, hydration, and navigation speed. Neglect either one, and it can slow down your entire application.
Even worse, these problems compound each other. Let’s say a user visits your homepage (server-rendered), then clicks into a dashboard (client-side). That dashboard needs a 2MB JavaScript bundle — and it’s not cached yet. So now the user has to wait for the bundle and wait for data fetched client-side:
```javascript
// The performance trap: Heavy client-side page
import { useEffect, useState } from 'react';
import HeavyChart from './HeavyChart';     // 500KB
import ComplexTable from './ComplexTable'; // 300KB
import RichEditor from './RichEditor';     // 400KB

export default function Dashboard() {
  const [data, setData] = useState(null);

  useEffect(() => {
    // Now we're fetching data client-side after navigation
    fetchDashboardData().then(setData);
  }, []);

  // User waits for bundle + data loading
  if (!data) return <div>Loading...</div>;

  return (
    <div>
      <HeavyChart data={data.charts} />
      <ComplexTable data={data.tables} />
      <RichEditor content={data.content} />
    </div>
  );
}
```
When your server and client are both optimized, you get the best of both — fast initial loads, great SEO, and smooth, app-like navigation of SPAs. But ignore either side, and the whole experience will be sluggish.
Now that we’ve seen how hybrid rendering introduces hidden performance costs, let’s zoom in on one of the biggest culprits on the server side — slow, sequential data fetching.
You click something in an app and… nothing. No feedback, no content. Just waiting. Nine times out of ten, the issue is synchronous data fetching. That’s when your app politely loads one piece of data at a time. Here’s what that looks like in a typical Next.js setup:
```javascript
// The slow way - each request waits for the previous one
export async function getServerSideProps({ req }) {
  // Step 1: Get user (300ms)
  const user = await fetchUser(req.session.userId);

  // Step 2: Wait for user, then get profile (400ms)
  const profile = await fetchUserProfile(user.id);

  // Step 3: Wait for profile, then get dashboard data (600ms)
  const dashboardData = await fetchDashboardData(user.id, profile.preferences);

  // Step 4: Wait for dashboard, then get notifications (200ms)
  const notifications = await fetchNotifications(user.id);

  // Total time: 300 + 400 + 600 + 200 = 1,500ms (1.5 seconds!)
  return { props: { user, profile, dashboardData, notifications } };
}
```
This approach turns what could be a 900ms page load into a 1.5-second one. Each `await` is saying, “Hold everything — we’re not moving until this finishes.” But do those requests really need to happen one after another?
Promise.all()
If your requests don’t depend on one another, there’s no reason they can’t run at the same time. That’s where Promise.all() comes in:
```javascript
export async function getServerSideProps({ req }) {
  // Step 1: Get user first (still needed for other requests)
  const user = await fetchUser(req.session.userId);

  // Step 2: Fetch everything else in parallel
  const [profile, dashboardData, notifications] = await Promise.all([
    fetchUserProfile(user.id),   // 400ms
    fetchDashboardData(user.id), // 600ms
    fetchNotifications(user.id)  // 200ms
  ]);

  // Total time: 300ms (user) + 600ms (longest parallel request) = 900ms
  // We just saved 600ms!
  return { props: { user, profile, dashboardData, notifications } };
}
```
That simple change shaves off 600ms — and that’s just for one page load.
You can go even further. If some data doesn’t depend on the user, like system-wide settings or server status, you can fetch that in parallel with the user:
```javascript
export async function getServerSideProps({ req }) {
  // Fetch user-independent data alongside user data
  const [user, globalSettings, systemStatus] = await Promise.all([
    fetchUser(req.session.userId),
    fetchGlobalSettings(), // Doesn't need user
    fetchSystemStatus()    // Doesn't need user
  ]);

  // Now fetch user-dependent data in parallel
  const [profile, dashboardData, notifications] = await Promise.all([
    fetchUserProfile(user.id),
    fetchDashboardData(user.id),
    fetchNotifications(user.id)
  ]);

  return {
    props: {
      user,
      profile,
      dashboardData,
      notifications,
      globalSettings,
      systemStatus
    }
  };
}
```
With smarter parallelization, you cut load times and improve responsiveness — no fancy tools or libraries required. Just better JavaScript.
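You can see the payoff without any framework at all. This small Node sketch uses made-up delays (all function names and timings here are hypothetical stand-ins for the API calls above) to time the two strategies:

```javascript
// Simulated API calls with made-up delays
const delay = (ms, value) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

const fetchUserProfile = () => delay(40, { bio: '...' });
const fetchDashboardData = () => delay(60, { charts: [] });
const fetchNotifications = () => delay(20, []);

// Sequential: total time is the SUM of all delays (~120ms here)
async function loadSequentially() {
  const start = Date.now();
  await fetchUserProfile();
  await fetchDashboardData();
  await fetchNotifications();
  return Date.now() - start;
}

// Parallel: total time is only the SLOWEST delay (~60ms here)
async function loadInParallel() {
  const start = Date.now();
  await Promise.all([
    fetchUserProfile(),
    fetchDashboardData(),
    fetchNotifications()
  ]);
  return Date.now() - start;
}
```

Run both and the parallel version finishes in roughly the time of its slowest call, while the sequential version pays for every call in full.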
Even with faster data fetching, your app may still feel clunky. Why? Because your routing behavior might be doing unnecessary full server round-trips. Let’s look at that next.
With the Next.js App Router, every navigation can potentially trigger a server-side render — even for routes that could (and should) be handled client-side. This isn’t just inefficient; it’s a surefire way to make a fast app feel slow.
Let’s say you have a user browsing a product catalog. In a traditional SPA, clicking between product pages would be instant. JavaScript handles the state update, and the URL changes without reloading the page.
But with a poorly configured Next.js App Router setup, each click might make a round trip to the server. Here’s what that looks like:
```javascript
// app/products/[id]/page.js
// This runs on the SERVER for every product page visit
export default async function ProductPage({ params }) {
  // Network call to server on every navigation
  const product = await fetchProduct(params.id);
  const reviews = await fetchReviews(params.id);
  const recommendations = await fetchRecommendations(params.id);

  return (
    <div>
      <ProductDetails product={product} />
      <ReviewsList reviews={reviews} />
      <RecommendationGrid recommendations={recommendations} />
    </div>
  );
}
```
When this setup is in place, every click goes through the same slow sequence:

1. The user clicks a product link
2. The browser sends a request to the server
3. The server fetches the product…
4. …then the reviews…
5. …then the recommendations
6. The server renders the page and sends the result back to the browser
That’s six steps, each with potential latency. Multiply that by ten product views, and you’ve got ten server requests — instead of ten instant page transitions.
To avoid these round-trips, shift your routing to the client where possible. Here’s how:
```javascript
// app/products/[id]/page.js
'use client'; // This makes it client-side rendered

import { useEffect, useState } from 'react';
import { useParams } from 'next/navigation';

export default function ProductPage() {
  const params = useParams();
  const [product, setProduct] = useState(null);
  const [reviews, setReviews] = useState(null);

  useEffect(() => {
    // Fetch data client-side - no server round trip
    Promise.all([
      fetch(`/api/products/${params.id}`).then(res => res.json()),
      fetch(`/api/reviews/${params.id}`).then(res => res.json())
    ]).then(([productData, reviewsData]) => {
      setProduct(productData);
      setReviews(reviewsData);
    });
  }, [params.id]);

  if (!product) return <ProductSkeleton />;

  return (
    <div>
      <ProductDetails product={product} />
      <ReviewsList reviews={reviews} />
    </div>
  );
}
```
This way, navigation between product pages happens instantly — and data loads in the background without re-rendering the entire page on the server.
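There’s a middle path, too: keep pages server-rendered but let next/link do the waiting for you. In production builds, Link prefetches routes in the background as their links enter the viewport, so much of the work is already done by the time the user clicks. A sketch (the `products` prop and component name are assumptions for illustration):

```javascript
// components/ProductList.js
import Link from 'next/link';

export default function ProductList({ products }) {
  return (
    <ul>
      {products.map((product) => (
        <li key={product.id}>
          {/* Prefetched in the background when visible (production builds only) */}
          <Link href={`/products/${product.id}`}>{product.name}</Link>
        </li>
      ))}
    </ul>
  );
}
```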
Here’s a quick guide for deciding:
Use server-side rendering (SSR) when:

- The page must be indexable by search engines (product pages, articles, landing pages)
- The first paint needs real data, not a spinner
- Data changes per request and benefits from direct server-side access
Use client-side rendering (CSR) when:

- The page sits behind a login and SEO doesn’t matter (dashboards, settings)
- Users navigate between views frequently and expect instant transitions
- Data is highly interactive or updates often after the initial load
Sometimes, you need SSR for the initial page load but want the benefits of CSR for subsequent interactions. And Next.js App Router lets you combine both:
```javascript
// app/products/[id]/page.js
// Server-render the initial page for SEO
export default async function ProductPage({ params }) {
  const initialProduct = await fetchProduct(params.id);

  return (
    <div>
      <ProductClient initialData={initialProduct} productId={params.id} />
    </div>
  );
}

// components/ProductClient.js
'use client';

import { useState } from 'react';
import { useRouter } from 'next/navigation';

export default function ProductClient({ initialData, productId }) {
  const [product, setProduct] = useState(initialData);

  // Subsequent navigation is client-side
  const router = useRouter();

  const navigateToProduct = async (newId) => {
    // Update URL immediately (feels instant)
    router.push(`/products/${newId}`);

    // Fetch new data in background
    const newProduct = await fetch(`/api/products/${newId}`).then(res => res.json());
    setProduct(newProduct);
  };

  return (
    <div>
      <ProductDetails product={product} />
      <RelatedProducts onProductClick={navigateToProduct} />
    </div>
  );
}
```
But even with smart routing and optimized data fetching, your app can still feel sluggish if it’s dragging around oversized JavaScript bundles. Let’s talk about why that’s a problem — and how to fix it.
I once joined a Next.js project during a hackathon where the main JavaScript bundle was 2.3 MB. Two. Point. Three. Megabytes! For reference, that’s larger than the original Doom game. The previous developer had imported entire libraries just to use a couple of functions. No code splitting. No dynamic imports. Just one giant payload dumped on every user — whether they needed it or not.
JavaScript bundle size directly impacts your Time to Interactive (TTI) — the metric that measures when your page becomes fully functional. The bigger the bundle, the longer users stare at a loading spinner.
Here’s what often causes bundle bloat:
```javascript
// First bundle bloater: Importing entire libraries
import _ from 'lodash';              // Imports the entire 70KB library
import * as dateFns from 'date-fns'; // Another massive import

// Second bundle bloater: Importing heavy components everywhere
import { DataVisualization } from './DataVisualization'; // 500KB component
import { VideoPlayer } from './VideoPlayer';             // 300KB component
import { RichTextEditor } from './RichTextEditor';       // 400KB component

export default function HomePage() {
  return (
    <div>
      <h1>Welcome</h1>
      {/* These components might not even be visible on initial load */}
      <DataVisualization />
      <VideoPlayer />
      <RichTextEditor />
    </div>
  );
}
```
This approach loads everything to every user — even if they never interact with those components. Fortunately, there’s a better way.
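The cheapest win targets the library imports themselves. Importing per-method paths (e.g. `import debounce from 'lodash/debounce'`) keeps the rest of lodash out of the bundle — and for a single helper, you may not need the dependency at all. A minimal hand-rolled replacement, as a sketch:

```javascript
// A few lines replace `import _ from 'lodash'` when all you need
// is debounce - and keep ~70KB out of the client bundle.
function debounce(fn, waitMs) {
  let timer = null;
  return function debounced(...args) {
    clearTimeout(timer);                              // cancel the pending call
    timer = setTimeout(() => fn.apply(this, args), waitMs); // reschedule
  };
}

// Usage: rapid calls collapse into one trailing invocation
const onResize = debounce(() => console.log('layout recalculated'), 200);
```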
Next.js supports intelligent code splitting out of the box. But to make the most of it, you’ll want to use dynamic imports to load code only when it’s needed.
By default, Next.js splits your code by route. But you can optimize this further with next/dynamic:
```javascript
// pages/dashboard.js - Only loads when users visit /dashboard
import { useState } from 'react';
import dynamic from 'next/dynamic';

// Heavy components loaded only when needed
const AnalyticsChart = dynamic(() => import('../components/AnalyticsChart'), {
  loading: () => <ChartSkeleton />,
  ssr: false // Skip server-side rendering for client-only components
});

const DataExporter = dynamic(() => import('../components/DataExporter'), {
  loading: () => <p>Loading exporter...</p>
});

export default function Dashboard() {
  const [showAnalytics, setShowAnalytics] = useState(false);
  const [showExporter, setShowExporter] = useState(false);

  return (
    <div>
      <h1>Dashboard</h1>

      <button onClick={() => setShowAnalytics(true)}>
        View Analytics
      </button>
      {showAnalytics && <AnalyticsChart />}

      <button onClick={() => setShowExporter(true)}>
        Export Data
      </button>
      {showExporter && <DataExporter />}
    </div>
  );
}
```
With this pattern, users only download the charting or export logic when they ask for it — not before.
If you have components shared across routes but only needed in specific situations, you can lazy-load those too:
```javascript
// components/ConditionalFeatures.js
import dynamic from 'next/dynamic';

// Load only when user has premium subscription
const PremiumChart = dynamic(() => import('./PremiumChart'), {
  loading: () => <div>Loading premium features...</div>
});

// Load only when user clicks "Advanced Settings"
const AdvancedSettings = dynamic(() => import('./AdvancedSettings'));

export function ConditionalFeatures({ user, showAdvanced }) {
  return (
    <div>
      {user.isPremium && <PremiumChart />}
      {showAdvanced && <AdvancedSettings />}
    </div>
  );
}
```
This ensures your users aren’t paying for features they can’t even access.
@next/bundle-analyzer
To see what’s eating your bundle size, use the official bundle analyzer:
```javascript
// next.config.js
const withBundleAnalyzer = require('@next/bundle-analyzer')({
  enabled: process.env.ANALYZE === 'true'
});

module.exports = withBundleAnalyzer({
  // Your Next.js config
});
```
Run `ANALYZE=true npm run build` to see a visual map of your JavaScript — every oversized library, every massive component. It’s like an X-ray for your performance problems.
With dynamic imports, conditional loading, and bundle analysis, you can shrink your initial bundle by 50–70% without breaking a sweat.
Even when you are careful with your JavaScript bundles, there’s one performance killer that comes built into every server-rendered React application — hydration. After your server sends HTML to the browser, React needs to “hydrate” it by attaching event listeners and reconciling its virtual DOM with the server-rendered markup. This process can block interactivity and hurt performance.
This is what the problem looks like:
```javascript
// The traditional Next.js page with hydration bottlenecks
export default function ProductPage({ products }) {
  return (
    <div>
      <Header />                          {/* Must hydrate before user can interact */}
      <ProductGrid products={products} /> {/* Large component tree */}
      <FilterSidebar />                   {/* Complex interactive components */}
      <Footer />                          {/* Static content that doesn't need JS */}
      {/* Everything hydrates at once, blocking interactivity */}
    </div>
  );
}
```
During hydration, the browser’s main thread gets blocked while React processes your entire component tree. For complex pages, this can take hundreds of milliseconds, or even seconds, on lower-end devices, creating a frustrating delay where users can see your UI but can’t interact with it.
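You don’t have to guess at this cost. The browser’s Long Tasks API (supported in Chromium-based browsers) reports main-thread tasks over 50ms — exactly where heavy hydration shows up. A small diagnostic sketch you might call during development, for instance from _app.js:

```javascript
// Logs main-thread tasks longer than 50ms; heavy hydration work
// typically appears here right after page load.
export function observeLongTasks() {
  if (typeof PerformanceObserver === 'undefined') return;

  const observer = new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      console.log(`Long task: ${Math.round(entry.duration)}ms`);
    }
  });

  observer.observe({ type: 'longtask', buffered: true });
}
```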
The Next.js App Router brings React Server Components, which fundamentally change this dynamic by letting you choose which parts of your application require client-side JavaScript:
```javascript
// app/products/page.js - Server Component (no JS sent to client)
import { ProductGrid } from './components/ProductGrid';
import { ClientSideFilter } from './components/ClientSideFilter';

// This component runs on the server and sends only HTML
export default async function ProductPage() {
  // Data fetching happens on the server
  const products = await fetchProducts();

  return (
    <div>
      <h1>Products</h1>
      {/* Static parts remain as HTML only */}
      <ProductGrid products={products} />
      {/* Only interactive parts are hydrated */}
      <ClientSideFilter products={products} />
    </div>
  );
}

// components/ClientSideFilter.js
'use client'; // Marks this as needing hydration

import { useState } from 'react';

export function ClientSideFilter({ products }) {
  const [filters, setFilters] = useState({});
  // Interactive component logic...
}
```
This approach brings several major performance benefits:

- Less JavaScript shipped: Server Components send HTML, not component code
- Faster hydration: only the 'use client' islands need to attach event listeners
- Data fetching stays on the server, close to your database
- The main thread frees up sooner, so Time to Interactive improves
Implementing smart hydration techniques is a great start, but if your app keeps re-fetching the same data, like it has memory loss, your users will still feel the drag. Let’s talk about caching.
Caching is like giving your app a good memory. It prevents it from having to relearn information every single time. But I’ve seen plenty of Next.js apps that treat every request like it’s the first time they’ve ever seen it — especially when it comes to things like permissions, user data, or blog posts.
Poor caching doesn’t just slow down your app — it wastes server resources too. And the most common caching mistakes are often basic:
```javascript
// Mistake: refetching on every mount, with no caching
export default function UserProfile({ userId }) {
  const [user, setUser] = useState(null);

  // This runs on every component mount - no caching!
  useEffect(() => {
    fetch(`/api/users/${userId}`)
      .then(res => res.json())
      .then(setUser);
  }, [userId]);

  return user ? <div>{user.name}</div> : <div>Loading...</div>;
}
```
```javascript
export async function getServerSideProps({ params }) {
  // This hits the database on every single request
  const posts = await db.posts.findMany({
    where: { published: true },
    orderBy: { createdAt: 'desc' }
  });

  return { props: { posts } };
}
```
This is what I call system amnesia — where your app forgets everything it learned the moment the user refreshes or clicks away.
Effective caching works at different levels: API routes, page rendering, and even database queries. Let’s walk through how to make it work for you:
If your data doesn’t change every second, don’t refetch it every second. Use Incremental Static Regeneration (ISR) to serve pre-built pages and refresh them occasionally:
```javascript
// pages/blog/[slug].js
export async function getStaticProps({ params }) {
  const post = await fetchPost(params.slug);

  return {
    props: { post },
    revalidate: 3600, // Regenerate at most once per hour
  };
}

export async function getStaticPaths() {
  // Generate paths for popular posts
  const popularPosts = await fetchPopularPosts();

  return {
    paths: popularPosts.map((post) => ({ params: { slug: post.slug } })),
    fallback: 'blocking' // Generate other pages on-demand
  };
}
```
This keeps your content fresh and fast, with minimal load on your server.
For expensive API operations, use `unstable_cache` to cache server-side logic:
```javascript
// pages/api/posts.js
import { unstable_cache } from 'next/cache';

const getCachedPosts = unstable_cache(
  async () => {
    // Expensive database query
    return await db.posts.findMany({
      include: {
        author: true,
        comments: { take: 5 },
        tags: true
      },
      orderBy: { createdAt: 'desc' }
    });
  },
  ['posts-list'],
  {
    revalidate: 300, // Cache for 5 minutes
    tags: ['posts']
  }
);

export default async function handler(req, res) {
  const posts = await getCachedPosts();

  res.setHeader(
    'Cache-Control',
    'public, s-maxage=300, stale-while-revalidate=600'
  );
  res.json(posts);
}
```
Now your server doesn’t work overtime for the same queries, and your users get a faster experience.
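If you want an intuition for what `unstable_cache` is doing for you, the core idea is just a keyed, time-limited memo. Here’s a framework-free sketch of that idea — all names are illustrative, not a real Next.js API:

```javascript
// Minimal time-based memo cache: the core idea behind server-side caching
function createCache(ttlMs) {
  const store = new Map(); // key -> { value, expiresAt }

  return async function memoized(key, compute) {
    const hit = store.get(key);
    if (hit && hit.expiresAt > Date.now()) {
      return hit.value; // fresh hit: skip the expensive work
    }
    const value = await compute(); // miss or stale: recompute
    store.set(key, { value, expiresAt: Date.now() + ttlMs });
    return value;
  };
}

// Usage: repeated calls within the TTL reuse the first result
const postsCache = createCache(5 * 60 * 1000); // 5-minute TTL
let queryCount = 0;

const getPosts = () =>
  postsCache('posts-list', async () => {
    queryCount += 1; // stands in for an expensive database query
    return [{ id: 1, title: 'Hello' }];
  });
```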
When done right, caching makes your app feel like it already knows the user’s next move. But even with perfect caching, there’s one more trap that slows everything down — unoptimized images.
I once audited a Next.js app where a single hero image was 4.2MB — and it was being loaded on every page. For perspective, that’s bigger than the entire JavaScript bundle of most full apps.
The problem isn’t just file size. Poorly handled images cause layout shifts, delay page rendering, block the main thread during decoding, and push your Largest Contentful Paint (LCP) way beyond the acceptable range. It’s like trying to watch a movie that keeps buffering — technically it works, but the experience is terrible.
Here’s what I often see go wrong:
```javascript
// Mistake: a plain <img> with no optimization, no lazy loading,
// and no dimensions - a recipe for layout shifts
export default function ProductCard({ product }) {
  return (
    <div className="product-card">
      <img src={product.imageUrl} alt={product.name} />
      <h3>{product.name}</h3>
      <p>${product.price}</p>
    </div>
  );
}
```
```javascript
export default function Gallery({ images }) {
  return (
    <div className="gallery">
      {images.map((image, index) => (
        // All 50 images load at once, even if users only see 6
        <img key={index} src={image.url} alt={image.caption} />
      ))}
    </div>
  );
}
```
This strategy delivers way more than users actually need, wrecking both performance and UX.
Use next/image with responsive sizes

Next.js provides an image component that handles responsive sizing, lazy loading, and format conversion (like WebP/AVIF). It’s faster, more accessible, and saves a ton of bandwidth. Here’s how to use it effectively:
```javascript
// components/ProductCard.js
import Image from 'next/image';

export default function ProductCard({ product }) {
  return (
    <div className="product-card">
      <Image
        src={product.imageUrl}
        alt={product.name}
        width={300}
        height={200}
        priority={product.featured} // Load featured products immediately
        placeholder="blur"
        blurDataURL="data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wBDAAYEBQYFBAYGBQYHBwYIChAKCgkJChQODwwQFxQYGBcUFhYaHSUfGhsjHBYWICwgIyYnKSopGR8tMC0oMCUoKSj/2wBDAQcHBwoIChMKChMoGhYaKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCj/wAARCAAIAAoDASIAAhEBAxEB/8QAFQABAQAAAAAAAAAAAAAAAAAAAAv/xAAhEAACAQMDBQAAAAAAAAAAAAABAgMABAUGIWGRkqGx0f/EABUBAQEAAAAAAAAAAAAAAAAAAAMF/8QAGhEAAgIDAAAAAAAAAAAAAAAAAAECEgMRkf/aAAwDAQACEQMRAD8AltJagyeH0AthI5xdrLcNM91BF5pX2HaH9bcfaSXWGaRmknyJckliyjqTzSlT54b6bk+h0R//2Q=="
        className="rounded-lg object-cover"
      />
      <h3>{product.name}</h3>
      <p>${product.price}</p>
    </div>
  );
}
```
This alone improves LCP, prevents layout shifts, and helps users start interacting faster.
For images that appear at different sizes on different screens:
```javascript
// components/HeroSection.js
import Image from 'next/image';

export default function HeroSection() {
  return (
    <div className="hero relative h-screen">
      <Image
        src="/hero-image.jpg"
        alt="Hero image"
        fill
        priority
        sizes="(max-width: 768px) 100vw, (max-width: 1200px) 50vw, 33vw"
        className="object-cover"
      />
      <div className="absolute inset-0 flex items-center justify-center">
        <h1 className="text-white text-6xl font-bold">Welcome</h1>
      </div>
    </div>
  );
}
```
The `sizes` attribute ensures the browser chooses the best version for each screen size, saving bandwidth on smaller devices.
For image galleries, implement progressive loading:
```javascript
// components/ImageGallery.js
import Image from 'next/image';
import { useState } from 'react';

export default function ImageGallery({ images }) {
  const [visibleCount, setVisibleCount] = useState(6);

  const loadMore = () => {
    setVisibleCount(prev => Math.min(prev + 6, images.length));
  };

  return (
    <div className="gallery">
      <div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-4">
        {images.slice(0, visibleCount).map((image, index) => (
          <div key={image.id} className="aspect-square relative">
            <Image
              src={image.url}
              alt={image.caption}
              fill
              sizes="(max-width: 768px) 100vw, (max-width: 1200px) 50vw, 33vw"
              className="object-cover rounded-lg"
              priority={index < 6} // Prioritize first 6 images
            />
          </div>
        ))}
      </div>

      {visibleCount < images.length && (
        <button
          onClick={loadMore}
          className="mt-8 px-6 py-3 bg-blue-600 text-white rounded-lg"
        >
          Load More ({images.length - visibleCount} remaining)
        </button>
      )}
    </div>
  );
}
```
This way, users only download what they see — improving performance and reducing memory usage on mobile devices.
TL;DR — If you’re not using the <Image> component in your Next.js project, you’re leaving serious performance gains on the table. Optimize your images, and your users will feel the difference instantly.
Performance issues affect everyone — but mobile users get the worst of it. Slower CPUs make every kilobyte of JavaScript more expensive to parse and execute, flaky cellular networks magnify bundle and image sizes, and Google’s mobile-first indexing means your mobile performance is what gets ranked.

What this means for your Next.js performance strategy: test on real midrange devices (or with CPU and network throttling in dev tools), set bundle-size budgets, be aggressive about image optimization, and push heavy work to the server where mobile CPUs don’t pay for it.
Pro tip — If your app runs well on a cheap phone over 3G, it’ll fly everywhere else.
Performance optimization in Next.js isn’t about choosing one solution — it’s about recognizing where things go wrong and methodically cleaning up the mess. And the truth is, performance work isn’t a checkbox you tick once. It’s an ongoing balancing act between development velocity and user experience.
Every new feature, every additional dependency, and every shortcut taken under deadline pressure can slowly erode the progress you may have made. The best way to approach this is not to make performance an afterthought. Consider it from the start.
And don’t optimize in the dark. Use tools like:

- Lighthouse and the Chrome DevTools Performance panel
- @next/bundle-analyzer for bundle composition
- WebPageTest for real-device, real-network runs
- Next.js’ built-in Web Vitals reporting (reportWebVitals / useReportWebVitals)
Most of what we’ve covered in this guide — caching, images, code splitting, and SSR strategy — can solve about 80% of Next.js performance problems. The remaining 20% often involves more complex optimizations like edge rendering, CDN strategy, query optimization, and sometimes full-on architectural shifts.
But don’t start with the edge cases. Focus on the big wins first.
Here’s the tricky thing about performance: there’s a gap between how fast your app is and how fast it feels. Your app might technically load in 2 seconds — but if users are staring at a blank screen for 1.8 of those seconds, it feels painfully slow. Perception matters just as much as the metrics.
Keep this in mind. If it feels fast, it is fast — at least to your users.
So build with that in mind. Add loading states, show placeholders, give users visual feedback. That way, when someone visits your app, they won’t just see speed — they’ll feel it.
Debugging Next.js applications can be difficult, especially when users experience issues that are hard to reproduce. If you’re interested in monitoring and tracking state, automatically surfacing JavaScript errors, and tracking slow network requests and component load time, try LogRocket.
LogRocket captures console logs, errors, network requests, and pixel-perfect DOM recordings from user sessions and lets you replay them as users saw it, eliminating guesswork around why bugs happen — compatible with all frameworks.
LogRocket's Galileo AI watches sessions for you, instantly identifying and explaining user struggles with automated monitoring of your entire product experience.
The LogRocket Redux middleware package adds an extra layer of visibility into your user sessions. LogRocket logs all actions and state from your Redux stores.
Modernize how you debug your Next.js apps — start monitoring for free.