Hello! This is a column about the latest next.js release. Every release brings its mix of the new, the interesting and the controversial, and this version is no exception. Still, the new version is interesting not so much for its new functionality as for the shift in priorities and processes inside next.js. And yes, as you may have guessed from the title, much of this release's value lies in studying its mistakes and the fixes that followed.

I’ve been working with next.js since around version 8. All this time I have been following its development with interest (and, at times, with some disappointment).

Note: the article reflects the changes the author found most interesting. Commits and PRs in the next.js core, developer messages and issues were analyzed for it, so the article covers more changes than the official announcement presents.

Release next.js v15

First, a little about the changes in next.js's internal development processes. For the first time, the framework team published a release candidate (RC). Evidently, they did it because the React team had decided to publish React v19 as an RC.

Usually, the next.js team calmly ships its stable releases using React from the “canary” release branch (this branch is considered stable and is recommended for use by frameworks). This time they decided to do things differently (and, in hindsight, not without consequences).

The plan of both teams was simple: publish a pre-release version, let the community check it for problems, and publish a full release a couple of weeks later.

More than six months have passed since the React release candidate, but the stable version has still not been published. The postponement of the stable React release also hit next.js's plans. Therefore, contrary to tradition, the team published as many as 15 additional minor versions while already preparing version 15 (usually it is 3-5 minors and then the release). Notably, these minors did not include all the accumulated changes, only fixes for critical issues, which also deviates from the usual next.js processes.

The usual release flow in next.js: everything merges into the canary branch, and then, at some point, this branch is published as a stable release.

In the end, however, the next.js team decided to decouple from the React release cycle and publish a stable version of the framework before the stable version of React.

Documentation versioning

Another very useful organizational change: you can finally browse different versions of the documentation. Why is this so important?

First of all, upgrading next.js across major versions is often quite a difficult task. Largely for this reason, version 12 still has more than 2 million downloads and version 13 more than 4 million (to be fair, version 14 has more than 20 million) monthly.

Accordingly, users of previous versions need the documentation for their version, since the new one may be half-rewritten.

Another problem is that next.js essentially uses a single release channel, and the documentation changes along with it. As a result, descriptions of changes from canary versions used to appear immediately in the main documentation. Now they are displayed under a separate “canary” section.

Using React

At the beginning, I mentioned that next.js now uses the RC version of React. In fact, that's not quite the case. next.js now uses two React configurations: the v19 canary for App Router and v18 for Pages Router.

Interestingly, at one point they wanted to enable version 19 for Pages Router as well, but those changes were rolled back. Full support for React 19 is now promised after its stable release.

At the same time, the new version brings several useful improvements for server functions (yes, the React team has renamed server actions):

  • Weight and performance optimization;
  • Improved error handling;
  • Fixed revalidation and redirects from server functions.

Perhaps in the same section I will mention a next.js addition – the Form component. Essentially, this is the familiar form from react-dom, but with small improvements. This component is needed primarily when a successful form submission leads to navigation to another page. For the next page, the loading.tsx and layout.tsx abstractions will be preloaded.

import Form from 'next/form'

export default function Page() {
  return (
    <Form action="/search">
      {/* On submission, the input value will be appended to
          the URL, e.g. /search?query=abc */}
      <input name="query" />
      <button type="submit">Submit</button>
    </Form>
  )
}

Development experience (DX)

Speaking of next.js, it is impossible not to mention developer experience. In addition to the standard “Faster, Higher, Stronger” (which we will also get to, a little later), several useful improvements have landed.

Long-awaited support for ESLint v9. Until now, next.js did not support ESLint v9 at all, despite the fact that both ESLint v8 itself and some of its sub-dependencies are already marked as deprecated. This created an unpleasant situation where projects were essentially forced to keep deprecated packages.
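For reference, a minimal sketch of what an ESLint v9 flat config with next.js might look like. The plugin package name is real (@next/eslint-plugin-next), but the exact recommended wiring is an assumption here and may differ from what next.js generates:

```javascript
// eslint.config.mjs – a hedged sketch of an ESLint v9 flat config
// using next.js's eslint plugin; adjust to your project's needs
import next from '@next/eslint-plugin-next';

export default [
  {
    plugins: {
      '@next/next': next,
    },
    // reuse the plugin's recommended rule set
    rules: {
      ...next.configs.recommended.rules,
    },
  },
];
```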

The error overlay (already clear and convenient in next.js) has been slightly improved:

  • A button for copying the call stack has been added;
  • Added the ability to open the source of the error in the editor on a specific line.

A “Static Indicator” has been added – an element in the corner of the page showing that the page is built in static mode. It's a trifle, really, and it's funny that it was listed among the key changes as something new: an indicator for “pre-built” pages has existed in roughly this form since version 8 (2019); here it was just slightly updated and adapted for App Router.

A directory with debugging information has also been added – .next/diagnostics. There you can find information about the build process and all the errors that occur. It is not yet clear whether it will be useful day to day, but it will definitely come in handy when digging into problems together with Vercel developers (yes, they sometimes help analyze issues).

Build changes

After DX, it's worth talking about builds. And, along with them, about Turbopack.

Turbopack

And the most important news is here. Turbopack is fully ready for development mode! “100% of existing tests pass without errors with turbopack.”

Now the turbo team is working on the production version, gradually getting the tests to pass (at the moment, about 96%).

Turbopack also adds new features:

  • Setting a memory limit for turbopack builds;
  • Tree shaking (removal of unused code).
const nextConfig = {
  experimental: {
    turbo: {
      treeShaking: true,
      memoryLimit: 1024 * 1024 * 512, // in bytes / 512MB
    },
  },
}

These and other improvements in turbopack “reduced memory usage by 25-30%” and “sped up builds of heavy pages by 30-50%”.

Other changes

Significant problems with styles have been fixed. In version 14, there were frequent situations where styles ended up in a broken order when navigating, so style A would land either above or below style B. This changed their priority and, accordingly, the elements looked different.

The next long-awaited improvement: the configuration file can now be written in TypeScript – next.config.ts.

import type { NextConfig } from 'next';

const nextConfig: NextConfig = {
  /* config options here */
};

export default nextConfig;

Another interesting innovation is retries for static page generation. That is, if a page fails to build (for example, due to network problems), next.js will try to build it again.

const nextConfig = {
  experimental: {
    staticGenerationRetryCount: 3,
  },
}
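To make the semantics concrete, here is a plain-JavaScript sketch of what a bounded retry policy like staticGenerationRetryCount implies. The helper name and shape are illustrative, not next.js internals:

```javascript
// withRetries: run a task, retrying up to `retryCount` extra times on failure –
// an illustration of retry-count semantics, not next.js code
async function withRetries(task, retryCount) {
  let lastError;
  for (let attempt = 0; attempt <= retryCount; attempt++) {
    try {
      return await task(attempt); // success: return immediately
    } catch (err) {
      lastError = err; // remember the failure and try again
    }
  }
  throw lastError; // every attempt failed: surface the last error
}
```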

And at the end of the section, functionality the community has long wanted – the ability to specify paths to additional files for the build. With this option you can, for example, specify that files live not in the app directory, but in the modules/main and modules/invoices directories.

However, at the moment it has been added only for the team's internal purposes, and it will definitely not be presented in this version. Later it will either remain a tool for Vercel's own tasks, or be tested and presented by the next release.

Changing the framework API

The most painful part of next.js updates. This version, too, has breaking changes.

A number of internal APIs of the framework have become asynchronous – cookies, headers, params and searchParams (the so-called Dynamic APIs).

import { cookies } from 'next/headers';

export async function AdminPanel() {
  const cookieStore = await cookies();
  const token = cookieStore.get('token');

  // ...
}

A big change, but the Next.js team promises that all this functionality can be updated automatically by calling their codemod:

npx @next/codemod@canary next-async-request-api .

Another change, though probably not a painful one for most: the geo and ip keys have been removed from NextRequest (used in middleware and API routes). In fact, this functionality only worked on Vercel; elsewhere, developers wrote their own helpers. For Vercel, this functionality moves to the @vercel/functions package.
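For self-hosted setups, the do-it-yourself replacement usually boils down to reading proxy headers. A hedged sketch (the header names follow common reverse-proxy conventions and may differ in your infrastructure):

```javascript
// clientIp: derive the client IP from standard proxy headers –
// the kind of helper self-hosted apps wrote once request.ip was Vercel-only
function clientIp(headers) {
  const forwarded = headers.get('x-forwarded-for');
  if (forwarded) {
    // the first entry is the original client; later ones are proxies
    return forwarded.split(',')[0].trim();
  }
  return headers.get('x-real-ip') ?? null;
}
```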

And a few more updates:

  • Several tags can now be passed to revalidateTag at once;
  • The images.remotePatterns.search and images.localPatterns keys have been added to the next/image configuration. With their help, you can better restrict the addresses for which image optimization will work.
const nextConfig = {
  images: {
    localPatterns: [
      {
        pathname: '/assets/images/**',
        search: 'v=1',
      },
    ],
  },
}

Caching

In my personal opinion, this is where the most important changes in next.js took place. And the most important news: caching is now disabled by default!

Let’s go through all the main changes in caching:

  • fetch now defaults to the no-store value instead of force-cache;
  • API routes now work in force-dynamic mode by default (previously force-static, meaning they were pre-rendered into a static response at build time [if the route did not use dynamic APIs]);
  • Caching in the client router is also disabled. Previously, once a client visited a page within a path, it was cached on the client and stayed that way until the page was reloaded. Now the current page is fetched every time. You can reconfigure this behavior through next.config.js:
const nextConfig = {
  experimental: {
    staleTimes: {
      dynamic: 30, // defaults to 0
    },
  },
}
  • At the same time, even if client caching is enabled, the cache will still be refreshed at the right moment – namely, when the page's cache on the server has expired.
  • Server components are now cached in development mode, which makes updates during development faster. You can reset the cache simply by reloading the page, or disable the functionality entirely through next.config.js:
const nextConfig = {
  experimental: {
    serverComponentsHmrCache: false, // defaults to true
  },
}
  • You can now control the “Cache-Control” header. Previously, it was always hard-overridden by next.js's internal values, which caused artifacts with caching via CDN;
  • next/dynamic now caches modules and reuses them instead of reloading them every time.
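As an illustration of the Cache-Control change, a sketch of a route handler that sets its own CDN caching policy. The path and the header values are made up for the example; the point is that such a header is no longer overridden:

```javascript
// app/api/data/route.js – a hypothetical route handler returning JSON
// with an explicit Cache-Control header for CDN-level caching
// (in a real project this would be `export async function GET`)
async function GET() {
  return new Response(JSON.stringify({ ok: true }), {
    headers: {
      'Content-Type': 'application/json',
      // cache at the CDN for 60s, serve stale for up to 5 min while revalidating
      'Cache-Control': 'public, s-maxage=60, stale-while-revalidate=300',
    },
  });
}
```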

So much for the “historical misunderstandings”. New APIs are also coming to next.js – namely, the so-called Dynamic I/O. Almost nothing has been written about it anywhere yet, so what follows is the author's guesswork based on the changes.

Dynamic I/O is apparently an advanced dynamic rendering mode – something like PPR (partial prerendering), or rather an addition to it. In short, partial prerendering is a page rendering mode in which most elements are built and cached at build time, while individual elements are rendered on each request.

So, Dynamic I/O [probably] finalizes the architecture for this logic. It expands the caching capabilities so that caching can be switched on and off pointwise, depending on the mode and the place of use (inside a “dynamic” block or not).

const nextConfig = {
  experimental: {
    dynamicIO: true, // defaults to false
  },
}

At the same time, the "use cache" directive is added. It will be available in the Node.js and edge runtimes and, apparently, in all server segments and abstractions. By specifying this directive at the top of a function, or of a module that exports functions, you mark the result for caching. The directive is only available when dynamicIO is enabled.

async function loadAndFormatData(page) {
  "use cache"
  ...
}

Also, the cacheLife and cacheTag functions are added specifically for use with "use cache":

import { unstable_cacheLife } from 'next/cache'
import { unstable_cacheTag } from 'next/cache'

async function loadAndFormatData(page) {
  "use cache"
  unstable_cacheLife('frequent');
  // or
  unstable_cacheTag(page, 'pages');
  ...
}

cacheTag will be used for revalidation via revalidateTag, and cacheLife sets the cache lifetime. For the cacheLife value, you will need to use one of the pre-configured profiles. Several are available out of the box (“seconds”, “minutes”, “hours”, “days”, “weeks”, “max”), and additional ones can be defined in next.config.js:

const nextConfig = {
  experimental: {
    cacheLife?: {
      [profile: string]: {
        // How long the client can cache a value without checking with the server.
        stale?: number
        // How frequently you want the cache to refresh on the server.
        // Stale values may be served while revalidating.
        revalidate?: number
        // In the worst case scenario, where you haven't had traffic in a while,
        // how stale can a value be until you prefer deopting to dynamic.
        // Must be longer than revalidate.
        expire?: number
      }
    }
  }
}
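To illustrate how the three thresholds relate, a simplified plain-JavaScript model of the decision they describe. The function and the state names are the author's own, not next.js internals, and the real logic is certainly more involved:

```javascript
// cacheDecision: a toy model of the stale / revalidate / expire thresholds –
// given a cached value's age in seconds, decide how it would be treated
function cacheDecision(ageSeconds, { stale, revalidate, expire }) {
  if (ageSeconds < stale) return 'fresh';       // client uses it without asking the server
  if (ageSeconds < revalidate) return 'cached'; // served from cache, no refresh due yet
  if (ageSeconds < expire) return 'revalidate'; // served stale while refreshing on the server
  return 'dynamic';                             // too old: fall back to dynamic rendering
}
```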

Partial pre-rendering (PPR)

Probably the main character of the upcoming releases. As already mentioned, PPR is a page rendering mode in which most elements are built and cached at build time, while individual elements are rendered on each request. The pre-built part is sent to the client immediately, and the rest loads dynamically.

The functionality itself was presented six months ago in the release candidate as an experimental API. It will remain experimental in this release, and we will probably see it stable only in version 16 (which is fine – large functionality has often graduated to stable in six months to a year).

As for the changes: as already mentioned, the internals were reworked first of all, but from the point of view of using PPR this is barely noticeable. It also received several improvements.

Previously, there was just a boolean flag in the configuration; now you need to specify incremental to enable PPR. This was apparently done to make the logic more transparent – content can be cached by developers even with PPR, and to update it you need to call the revalidate methods.

const nextConfig = {
  experimental: {
    ppr: 'incremental',
  },
}

Also, whereas before PPR was enabled for the entire project, it must now be enabled per segment (layout or page):

export const experimental_ppr = true

Another change is Partial Fallback Prerendering (PFPR). It is thanks to this improvement that the pre-built part is sent to the client immediately, while the rest loads dynamically. In place of the dynamic elements, a fallback component is shown in the meantime.

import { Suspense } from "react"
import { StaticComponent, DynamicComponent } from "@/app/ui"

export const experimental_ppr = true

export default function Page() {
  return (
    <>
      <StaticComponent />
      <Suspense fallback={...}>
        <DynamicComponent />
      </Suspense>
    </>
  );
}

Instrumentation

Instrumentation is now marked as a stable API. The instrumentation file allows users to hook into the Next.js server life cycle. It works across the entire application (including all segments of Pages Router and App Router).

At the moment, instrumentation supports hooks:

register – called once when the next.js server is initialized. It can be used for integration with observability libraries (OpenTelemetry, Datadog) or for project-specific tasks.

onRequestError – a new hook called for all server errors. It can be used for integration with error-tracking libraries (Sentry).

export async function onRequestError(err, request, context) {
  await fetch('https://...', {
    method: 'POST',
    body: JSON.stringify({ message: err.message, request, context }),
    headers: { 'Content-Type': 'application/json' },
  });
}

export async function register() {
  // init your favorite observability provider SDK
}

Interceptor

Interceptor, aka route-level middleware. It is something like the full-fledged [already existing] middleware, but, unlike the latter:

  • It can work in the node.js runtime;
  • It works on the server (which means it has access to the environment and a shared cache);
  • It can be added multiple times and is inherited down the tree (roughly how middleware worked when it was in beta);
  • It also works for server functions.

At the same time, once an interceptor file is created, all pages below it in the tree become dynamic.

// interceptor.ts
import { NextRequest } from 'next/server';
import { auth } from '@/auth';
import { redirect } from 'next/navigation';

const signInPathname = '/dashboard/sign-in';

export default async function intercept(request: NextRequest): Promise<void> {
  // This will also seed React's cache, so that the session is already
  // available when the `auth` function is called in server components.
  const session = await auth();

  if (!session && request.nextUrl.pathname !== signInPathname) {
    redirect(signInPathname);
  }
}

// lib/auth.ts
import { cache } from 'react';

export const auth = cache(async () => {
  // read session cookie from `cookies()`
  // use session cookie to read user from database
})

As for Vercel, middleware will now be effective as a first, simple check at the CDN level (for example, immediately returning redirects when a request is not allowed), while the interceptor works on the server, performing full-fledged checks and complex operations.

For self-hosted deployments, such a division will apparently be less useful (since both abstractions run on the server); it may be enough to use only the interceptor.

Conclusions

Overriding fetch, aggressive caching, many bugs and ignored community requests. The next.js team made wrong decisions, rushed releases, and held its positions in spite of the community. It took almost a year to acknowledge the problems. And only now, finally, is there a feeling that the framework is solving the community's problems again.

On the other hand, there are the other frameworks. A year ago, at the React presentation, it seemed that all of them were about to catch up with next.js. React began to mention next.js less often as the default tool, and the frameworks showed off upcoming build systems, support for server components and functions, and a number of big changes and partnerships. Time passed, and in practice none of them got there.

Of course, final conclusions can only be drawn after a while, but so far it feels like the changes in React, instead of the then-expected leveling of the frameworks, led to an even greater dominance of next.js and a greater divergence of the frameworks (since the implementation of server components and actions was left to the frameworks' discretion).

At the same time, OpenAI switched to Remix (owing to “its greater stability and convenience”):

And they started, apparently, even before significant changes in next.js:

In general, in the next stateofjs and stackoverflow surveys, we are likely to see a significant reshuffling.

The conference itself will be held the day after tomorrow, October 24, at 19:00 Moscow time, streamed live at nextjs.org/conf and on YouTube. The talks promise to be interesting, and it will also be interesting to hear about the changes from the next.js team itself (with examples, animations and plans).

Credits

Code examples, or their foundations, are taken from the next.js documentation, as well as from commits and PRs in the next.js core.