Configure robots.txt using a Route Handler
In the Content SDK Next.js App Router template, robots.txt is delivered by a Route Handler rather than a physical file. This approach has the following benefits:
- Per‑site resolution (multisite support) based on the incoming Host header (illustrated in the sketch after this list).
- Controlled caching and revalidation of responses.
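For illustration, the following minimal sketch shows how a Route Handler can use the Host header to serve different robots.txt content per site. This is a conceptual example only, not the SDK's implementation: the domains and robots.txt content are hypothetical, and the createRobotsRouteHandler function described below performs the actual site resolution for you.

// app/api/robots/route.ts (conceptual sketch only, not the SDK implementation)
// Hypothetical per-site content, standing in for the SDK's site lookup.
const robotsBySite: Record<string, string> = {
  'www.example.com': 'User-agent: *\nDisallow:',
  'shop.example.com': 'User-agent: *\nDisallow: /checkout',
};

export async function GET(request: Request): Promise<Response> {
  // In a multisite setup, the Host header identifies the requested site.
  const host = request.headers.get('host') ?? '';
  const body = robotsBySite[host] ?? 'User-agent: *\nDisallow:';
  return new Response(body, { headers: { 'Content-Type': 'text/plain' } });
}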
Before you begin, review how to configure the robots.txt file for your site.
Handler implementation
The robots.txt handler is created with the createRobotsRouteHandler function from @sitecore-content-sdk/nextjs/route-handler, which scaffolds the GET implementation. The handler responds at /api/robots, and a Next.js rewrite exposes it at the canonical /robots.txt path. createRobotsRouteHandler requires the following:
- A Sitecore client instance.
- A list of sites (SiteInfo), commonly imported from .sitecore/sites.json.
See the following example implementation of the handler:
import { createRobotsRouteHandler } from '@sitecore-content-sdk/nextjs/route-handler';
import sites from '.sitecore/sites.json';
import client from 'lib/sitecore-client';
export const { GET } = createRobotsRouteHandler({
  client,
  sites,
});

Add this to app/api/robots/route.ts and define a Next.js rewrite in next.config.ts as shown:
const nextConfig: NextConfig = {
  ...
  rewrites: async () => {
    return [
      ...
      {
        source: '/robots.txt',
        destination: '/api/robots',
        locale: false,
      },
    ];
  },
};
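With the rewrite in place, requests to /robots.txt are served by the handler. As a quick sanity check, you can fetch the path directly. The following sketch assumes a local development server running on http://localhost:3000:

// Illustrative check, run against a local dev server.
const res = await fetch('http://localhost:3000/robots.txt');
console.log(res.status); // expect 200
console.log(await res.text()); // robots.txt content resolved for the matching site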
Caching and revalidation
By default, responses are cached for 60 seconds. To adjust or disable revalidation:
- Pass a number (seconds) to set a new duration.
- Pass false to skip revalidation and cache indefinitely (only do this if the content is truly static).
- Omit revalidate to use the default.
See the following example with a custom duration:
export const { GET } = createRobotsRouteHandler({
  client,
  sites,
  revalidate: 90, // cache for 90 seconds
});
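To cache indefinitely instead, pass false, as described above. The following sketch shows the same handler with revalidation disabled:

export const { GET } = createRobotsRouteHandler({
  client,
  sites,
  revalidate: false, // cache indefinitely; only if content is truly static
});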