await headers() in the root layout forced every page into dynamic rendering. Next.js responded with Cache-Control: private, no-store. Google read private as personalized content and stopped indexing. 100+ pages crawlable, 1 indexed.
This is the most expensive lesson we learned building on Next.js App Router. We are writing it down so you do not have to learn it the same way.
Google Search Console: 100+ pages crawlable, 1 URL indexed. We had launched with SEO metadata on every public page, a valid sitemap, a minimal robots.txt. Everything looked correct. Nothing was indexing.
One line in src/app/layout.tsx: const nonce = (await headers()).get('x-nonce'). In the Next.js App Router, calling any dynamic API (headers(), cookies(), draftMode()) in the root layout forces every downstream page into dynamic rendering. Dynamic rendering means Next.js sets: Cache-Control: private, no-cache, no-store, max-age=0, must-revalidate. This overrides anything set in middleware or vercel.json; it is applied at the framework level.
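A sketch of what the broken layout roughly looked like. The x-nonce header name is from our setup; the component body and the /init.js script are illustrative assumptions, not our exact code.

```tsx
// src/app/layout.tsx (sketch of the broken version; body trimmed)
import { headers } from "next/headers";
import type { ReactNode } from "react";

export default async function RootLayout({ children }: { children: ReactNode }) {
  // This single dynamic API call opts every page under this layout into
  // dynamic rendering, and with it the private Cache-Control header.
  const nonce = (await headers()).get("x-nonce");

  return (
    <html lang="en">
      <body>
        {/* hypothetical usage: nonce attached to a script tag */}
        <script nonce={nonce ?? undefined} src="/init.js" />
        {children}
      </body>
    </html>
  );
}
```

Nothing else in the file matters. The headers() call alone is enough to make every route dynamic.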
Google reads Cache-Control: private as personalized content. It crawled our pages. It read private. It moved on.
We tried setting an explicit Cache-Control: public in middleware. It appeared to work under npm start. In Vercel production, the framework-level header won: Next.js applies it after middleware runs, and the last write wins. You must verify cache headers on a live deployment URL, not via npm start.
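The live-deployment check is easy to script. A minimal sketch: the helper below flags the directives Google treats as personalized content, and the commented usage assumes Node 18+ (global fetch) and a placeholder deployment URL.

```typescript
// Flag Cache-Control values that signal personalized, non-indexable content.
// "private" or "no-store" anywhere in the directive list is the problem case.
function blocksIndexing(cacheControl: string | null): boolean {
  if (!cacheControl) return false;
  const directives = cacheControl
    .toLowerCase()
    .split(",")
    .map((d) => d.trim());
  return directives.includes("private") || directives.includes("no-store");
}

// Usage, against the LIVE deployment (never npm start):
// const res = await fetch("https://your-site.example/", { method: "HEAD" });
// console.log(blocksIndexing(res.headers.get("cache-control")));

// The header Next.js sets for dynamic pages trips the check:
console.log(blocksIndexing("private, no-cache, no-store, max-age=0, must-revalidate")); // true
// The static-page header does not:
console.log(blocksIndexing("public, max-age=0, must-revalidate")); // false
```

Run it against the production URL after every deploy; localhost and npm start will lie to you.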
Remove the await headers() call from the root layout. Replace the per-request nonce CSP with a static CSP policy. Move any inline scripts to static files in /public/. Rebuild. Pages reclassify from dynamic to static, and Cache-Control becomes public, max-age=0, must-revalidate.
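One way to ship the static CSP without touching request headers is the headers() option in the Next.js config. A sketch, assuming a Next.js version that supports next.config.ts (otherwise next.config.js works the same); the policy value is an illustrative example, not our exact policy or a recommendation.

```typescript
// next.config.ts — a static CSP applied to every route at build config level.
// No per-request nonce, so the root layout needs no dynamic API call.
import type { NextConfig } from "next";

// Example policy only; tailor to your own script and style sources.
const csp = [
  "default-src 'self'",
  "script-src 'self'", // inline scripts moved to static files in /public/
  "style-src 'self' 'unsafe-inline'",
].join("; ");

const nextConfig: NextConfig = {
  async headers() {
    return [
      {
        source: "/:path*",
        headers: [{ key: "Content-Security-Policy", value: csp }],
      },
    ];
  },
};

export default nextConfig;
```

The trade-off is real: a static CSP cannot use per-request nonces, so any script you keep must be allowed by source, not by nonce.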
Before shipping any Next.js App Router project: does the root layout call headers(), cookies(), or draftMode()? If yes, every page is dynamic. If the site has public pages that should be indexed, this is a blocking issue.
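That pre-ship question can be automated with a small script. A sketch only: the scan is a plain-text search, so it can false-positive on comments or strings, and the layout path in the usage note is an assumption about project structure.

```typescript
// Scan layout source for the dynamic APIs that force dynamic rendering.
// Plain-text check: may false-positive on comments; good enough as a CI tripwire.
function findDynamicApis(source: string): string[] {
  const apis = ["headers(", "cookies(", "draftMode("];
  return apis.filter((api) => source.includes(api));
}

// Usage against a real project (path assumed):
// import { readFileSync } from "node:fs";
// const hits = findDynamicApis(readFileSync("src/app/layout.tsx", "utf8"));
// if (hits.length) console.warn("Root layout forces dynamic rendering:", hits);

const sample = 'const nonce = (await headers()).get("x-nonce");';
console.log(findDynamicApis(sample)); // contains "headers("
```

Wire it into CI and the one-line mistake becomes a failed build instead of a month of missing index coverage.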