Re-evaluating Next.js: Did it go the wrong path?

For all of my projects in recent years, Next.js was the framework of choice. For future projects I'm not so sure anymore; I'm in the midst of a Next.js crisis, so to speak.

Please note: I'm not bashing Next.js here. I still use Next.js every day, but I really needed to get my thoughts and criticisms out there to see if people share my concerns. There is a lot of good stuff about Next.js for sure, otherwise I wouldn't have used it for so long, but right now the bad things seem to overshadow the good.

Earlier, when people asked questions about Next.js (v12, Pages Router), it was easy to simply point them to the docs. Most of it was easy to understand: if you wanted server-side data, you used getServerSideProps or getStaticProps. Pretty clear, pretty intuitive.

I am an architect with 23 years of development experience now. I'm happy I could build an awesome skillset, but I've probably spent the most time with JavaScript. That being said, if I find Next.js unintuitive or too complex, I can only imagine how people with less experience struggle.

Since v13, Next.js has pushed server-side rendering to the fullest. It no longer feels complementary; instead, it seems to want to replace the frontend. That's cool for use cases where you actually want it, say a content-heavy news page or an e-commerce shop. But for an extremely fluid, reactive application with beautiful user interactions, things have gotten way more complicated.

The problem isn't SSR at all. SSR is lovely and important. The problem is that we are hyperfocused on enforcing SSR, and right now that doesn't seem reasonable anymore. If Next.js could trigger onClick on the server, they probably would. And this is where it gets ridiculous: solving a problem that didn't want to be solved. But let's get into the actual issues.

Caching without asking

With v13, Next.js started caching your internal API requests implicitly. That means: if you use fetch to grab weather data, you'll get old weather data the next time you fetch it, thanks to automatic caching. Sure, you can disable it. But this is a breaking change, and React itself follows a known guideline of avoiding breaking changes like that, precisely because it's tailored for enterprise use. So even though React and Next are extremely close, Next didn't seem to adhere to the same design principles. Having such a caching mechanism is surely awesome if you can enable it, either globally or partially. But I'm confused that among all those architects, no one said: it shouldn't be enabled by default, changing the way your application works, in a world where most API requests probably should not be cached.
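For reference, these are the per-request opt-outs in a Server Component (a sketch; the weather endpoint is made up):

```javascript
// Opting out of the implicit Data Cache per fetch request.
// The URL below is a placeholder.
async function getWeather() {
  const res = await fetch("https://api.example.com/weather", {
    cache: "no-store", // always fetch fresh data
  });
  return res.json();
}

// ...or keep caching, but bound its staleness:
async function getWeatherHourly() {
  const res = await fetch("https://api.example.com/weather", {
    next: { revalidate: 3600 }, // Next-specific option: refetch at most hourly
  });
  return res.json();
}
```

Workable, but note the direction: fresh data is the thing you have to ask for.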

The whole thing led to an answer in the form of a video explaining the caching. There's a saying though: if you need to explain it, it's not intuitive enough. But the thing is, it was already explained in the docs (which makes sense), so the saying rather becomes: if you need to explain it over and over again, it's definitely not the best solution. (Still, thanks for the video, it's a good one!)

Let me say this: from a technical perspective, seeing what they did, I'm amazed. But as shipped, it's something no one asked for. I would've rather had an explicit caching mechanism allowing me to CONTROL my app better, not letting my app be controlled by a simple upgrade. Especially because the existing mechanism means I cannot cache requests that I actually want to cache. For example, when I fetch data from Supabase, it will always skip the cache. That usually makes sense, but if Next had explicit caching, I could just put the result into the cache myself. Well, in fact, you can; it's just hidden and, again, comes with its own pitfalls. So much to know for a rather simple thing to achieve.
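To make concrete what I mean by explicit, opt-in caching, here's a minimal sketch in plain JavaScript. This is not a Next.js API, just the shape of control I'd like to have:

```javascript
// A tiny opt-in TTL cache for arbitrary async results, e.g. a Supabase
// query. Illustrative only; not part of Next.js.
const store = new Map();

function cached(fn, key, ttlMs) {
  return async (...args) => {
    const hit = store.get(key);
    if (hit && Date.now() - hit.time < ttlMs) return hit.value; // cache hit
    const value = await fn(...args); // cache miss: compute and remember
    store.set(key, { time: Date.now(), value });
    return value;
  };
}

// Usage: I decide what gets cached, and for how long.
const getProducts = cached(
  async () => [{ id: 1, name: "Widget" }], // stand-in for a Supabase query
  "products",
  60_000
);
```

The point isn't the implementation; it's that caching is something I ask for, not something an upgrade silently turns on.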

The other problem with this caching is that there are a lot of additional things to consider. For example, if you want to force a page to refresh, you can call router.refresh() on the same page, or revalidatePath. Unsure which one? Well, that's easy; here's what the docs say:

revalidatePath vs. router.refresh:

Calling router.refresh will clear the Router cache, and re-render route segments on the server without invalidating the Data Cache or the Full Route Cache.

The difference is that revalidatePath purges the Data Cache and Full Route Cache, whereas router.refresh() does not change the Data Cache and Full Route Cache, as it is a client-side API.

All clear? No? Well you're not alone.
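To untangle the quote, here's a toy model of the three caches involved. This is a didactic sketch, not Next internals:

```javascript
// Three caches: the client-side Router Cache, and the server-side
// Data Cache and Full Route Cache. true = warm, false = purged.
const warm = { router: true, data: true, fullRoute: true };

// router.refresh() is a client-side API: it drops the Router Cache and
// re-renders route segments on the server, but the server caches stay warm.
function routerRefresh(caches) {
  return { ...caches, router: false };
}

// revalidatePath() runs on the server: it purges the Data Cache and the
// Full Route Cache (and the client's Router Cache is invalidated on the
// next navigation), regardless of the previous state.
function revalidatePath(caches) {
  return { router: false, data: false, fullRoute: false };
}
```

So: refresh when your data on the server is already fresh; revalidate when it isn't. That this needs a diagram at all is kind of the point.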

No instantaneous UX feedback when loading

If you're a user of Next.js and have added standard pages, you will surely have noticed, even on localhost, that sometimes the page feels stale. You click a link, nothing happens, and then, after a few moments, swoosh. This was less of a problem in the old PHP days, where the browser indicated loading in the URL bar, then the page went white, and users knew: "aha, the browser is loading a new page." With Next.js it feels like nothing happens, and then suddenly: "ah, OK, it did work after all."

Again, Next.js "solved" this by letting you add a loading.js at the level of the page you are navigating to. The problem is: the loading UI itself has to be loaded first, for the sake of performance. I'm not sure if I should laugh about this. The loader's goal, UX-wise, is to be instantaneous and provide user feedback. Period.

Now, Lee Robinson says this is by design: otherwise they'd have to preload all loading spinners, since they wouldn't know when to load which, and if you had something like 100kb of spinners, that would be way too much. This sounds like "well, we sat in a cave and thought about unrealistic use cases to prevent, instead of asking real people". See, my current loading spinner is 100 bytes. Take 10 of them and make each 10 times the size; that's 10kb in total (!!). That's already exaggerated for my use case, but 10kb of additional bundle size so my web application feels fluid? Not a problem at all. It's a joke of an additional payload, especially if it were lazy-loaded after the critical content.

Logically, the current implementation of the loading spinner doesn't make sense at all. Here, Next is basically patronizing developers: "I know better what's good for you."

Here are potential solutions:

  • Let me choose which loading spinners to include in the bundle (e.g. by exporting a const or setting a global flag like bundleAllLoaders: true in the next.config.js)

  • Let me prefetch the spinner only, not the page. Here's a very reasonable case: A ticket creation form which will lead to the ticket details page afterwards. The details page cannot be prefetched fully because the ticket is about to be created so there is NO WAY of prefetching the page - but there is sense in prefetching the loader. So, let me call router.prefetchLoader('/ticket/details/[id]')

Server Actions were published without security advice

I don't want to go too deep into this topic, as I've made a video about it. The thing is: if you use Server Actions as part of your component and load data with credential keys, the credentials are bridged via the frontend without you noticing. Some people commented on the video: well, that's obvious, because it's a closure. But in fact, it's not a normal closure. This isn't standard JavaScript. Nothing here is obvious. Server Actions themselves aren't obvious, because they're extracted from the actual component. In times where bundlers do the work, and something like Server Actions is presented without any warning or advice on this, I could reasonably expect them to be extracted without carrying my credentials along.

Knowing what happens, it's easy to avoid, but the problem, again, is that the upgrade essentially delivered a new problem. Maybe it would've been more clever to not allow the action definition inside the component itself.
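Here's a sketch of the pattern I'd consider safer: define the action in a separate, server-only module, so secrets live at module scope instead of being closed over inside a component. The file and names are made up for illustration:

```javascript
// app/actions.js: a separate module for Server Actions.
"use server";

// Module scope: this value stays on the server and is never part of a
// closure that Next would have to serialize toward the client.
const API_KEY = process.env.MY_API_KEY ?? "dev-key";

export async function createTicket(title) {
  // Call your backend with API_KEY here; only the action's *reference*
  // is handed to the client, not its environment.
  return { title, authorized: Boolean(API_KEY) };
}
```

Components then import createTicket and pass it to a form, and there's simply no closure that could leak anything.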

It's sometimes painfully slow

When an SPA was slow, it was pretty easy to debug: either the server response took too long, or something in your frontend code was rubbish. With the hybrid approach and the implicit caching mechanisms, sometimes you're just waiting, and you have to dig deep to find the root cause, sometimes without any success.

What I find especially troublesome is that the dev experience doesn't reflect the prod experience at all. It's well known that Next produces an optimized output for npm run build, but it's less known that the dev server sometimes hangs, even on a machine that can handle gaming, video editing and coding at the same time. The worst part is that it's not clear why this happens, and it doesn't just happen to me.

Serverifying frontend

I was thinking about this a lot. I'm a huge fan of SSR. But what is SSR worth for interactive apps like, say, Canva?

I remember the times when headless was the way to go. We all built SPAs with React or Angular and loaded data from the server. A pretty good approach, with the disadvantage of having a lot of code in the frontend. That, however, was solved by the option to lazy-load. Back then, we noticed we needed immediately server-rendered pages, especially for search engines, so we rendered the SPA's output on the server as well and hydrated it.

Now I feel like Next pushes people hard to use server-side features, to the point of moving frontend functionality to the backend. I like having the option, but using the server for everything might not be the best advice.

Let's take React's useOptimistic as an example. The sample in the docs shows how to add a chat message and display it in the frontend even before the backend is done adding it. It sounds good, and it isn't bad by any means. But the problem is that you're not in control of the state. Once the server is done, it will update the state and re-render the list. Sounds good so far as well, but here's my take:

If I wanted to create a to-do list that allows adding an entry and immediately dragging and dropping it, this approach would interfere with the interaction and re-render the list. If there were some kind of middleware for the useOptimistic hook, like useOptimistic(..., { onBeforeUpdateFromServer: (...) => finalState }), as well as a setState to control its content, things would be different. To work around it, you'd need to add so much complexity that I'd choose another approach anyway, e.g. using my unglitch library.
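Reduced to a pure function, the merge "middleware" I'm wishing for could look like this. onBeforeUpdateFromServer is my invented name, not a React or Next API:

```javascript
// When the server's final state arrives, let the app decide how to merge
// it with local state (e.g. a drag'n'drop ordering) instead of having the
// server state overwrite it unconditionally.
function mergeServerState(localItems, serverItems, onBeforeUpdateFromServer) {
  return onBeforeUpdateFromServer
    ? onBeforeUpdateFromServer(localItems, serverItems)
    : serverItems; // default: server wins, which is today's behavior
}

// Example: adopt the server-assigned ids, but keep the user's ordering.
const local = [{ tempId: "a", text: "Buy milk" }];
const server = [{ id: 42, text: "Buy milk" }];
const merged = mergeServerState(local, server, (loc, srv) =>
  loc.map((item, i) => ({ ...srv[i], order: i }))
);
```

A hook with this shape would let the server confirm data without stomping on an interaction that's still in progress.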

But that was just one very specific and, frankly, not limiting example because I'm not forced to use it.

What is a limiting factor, however, is that every page request now hits the server. Transitioning between pages with SPA routers was rather simple: we were just replacing components in a parent wrapper and had the appropriate listeners for easy transitions. With Next.js this is still possible, but it honestly feels like there's more mental load to it.

Also, as stated, for super-interactive apps like Spotify, I wouldn't want a frontend page switch to trigger the backend at all, as my Server Component might have fetched initial data that isn't supposed to be fetched again once we're already on the frontend (e.g. a user id). Hence, I asked myself if I could trick portions of my Next.js app into acting like an SPA. Indeed I could, but I'm not sure about the implications:

"use client";

import { useRouter, useSearchParams } from "next/navigation";

export default function SPAPageInsideNext({ searchParams }) {
  const p = useSearchParams();
  const r = useRouter(); // unused here, kept from the original experiment

  // searchParams holds the query of the initial server-rendered request,
  // useSearchParams() reflects the current client-side URL
  const isFooInitially = searchParams.foo === "snoo";
  const isFooNow = p.get("foo") === "snoo";

  return (
    <div>
      <h1>Is this an SPA?</h1>
      <div>isFooInitially = {isFooInitially ? "yes" : "no"}</div>
      <div>isFooNow = {isFooNow ? "yes" : "no"}</div>
      <button
        onClick={() => {
          if (p.get("snoo") === "foo") {
            history.pushState({ foo: "snoo" }, "foo-snoo", "/spa?foo=snoo");
          } else {
            history.pushState({ snoo: "foo" }, "sno-foo", "/spa?snoo=foo");
          }
        }}
      >
        Toggle Param
      </button>
    </div>
  );
}

So instead of router.push, I'm using the native history.pushState, ensuring that on reload the URL is correct (deep-linking), and it indeed triggers a re-render without a server request (I thought it would skip the re-render, but it makes sense considering onClick runs inside the React lifecycle).

Is this good? I haven't dug deeper, so I can't tell yet. But it's definitely good to know. However, I'm afraid it comes with pitfalls I haven't discovered yet.

This is indeed an official solution, as stated in the docs (thanks to a Reddit user for pointing to it).

Vercel Marketing vs Vercel Target group

I know it has been some time since Turbo was announced, but the thing is: I feel taken for an idiot if the claims made are far beyond the truth. Developers are looking for things of value, and if something is just 10% faster than before, that's cool. But if you say something is like 1000% faster and it's in fact not true, it doesn't create a trustworthy bond. Just my two cents on this one.


Lately, I've been seeing some React-only or even Angular apps built without any SSR approach. My first thought: why not choose SSR, Next.js, whatever? But at second glance, I noticed they were extremely speedy. Everybody nowadays says that rather than choosing React, one should choose Next. I'm not so sure anymore. If you create a wonderfully split SPA, even with manually controlled lazy-loading, React can serve you well, depending on your use case.

Sure, for an e-commerce app with contentful pages, Next.js would probably be my first choice. I'm just re-evaluating the "Next.js for everything" case.

There are too many open questions for me right now, so I really want to try another framework (I love React, so maybe I'll give Remix a shot) to see if there's one that gives me less mental load and more confidence.

These are just the things that came to my mind recently.

So my primary questions are: What are your thoughts? And would you consider Next.js an awesome choice for a fluid, interaction-rich application like Spotify or Canva?