# robots.txt
Add or generate a `robots.txt` file that matches the Robots Exclusion Standard in the root of the `app` directory to tell search engine crawlers which URLs they can access on your site.
## Static robots.txt
```txt
User-Agent: *
Allow: /
Disallow: /private/

Sitemap: https://acme.com/sitemap.xml
```
## Generate a Robots file

Add a `robots.js` or `robots.ts` file that returns a `Robots` object.
> **Good to know**: `robots.js` is a special Route Handler that is cached by default unless it uses a dynamic function or dynamic config option.
```ts
// app/robots.ts
import type { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
      disallow: "/private/",
    },
    sitemap: "https://acme.com/sitemap.xml",
  };
}
```
```js
// app/robots.js
export default function robots() {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
      disallow: "/private/",
    },
    sitemap: "https://acme.com/sitemap.xml",
  };
}
```
Output:
```txt
User-Agent: *
Allow: /
Disallow: /private/

Sitemap: https://acme.com/sitemap.xml
```
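
Because the file is produced by a function, the rules can also be computed rather than hard-coded. As a rough sketch (the `APP_ENV` variable name is an assumption for illustration, not part of the convention), you could block all crawlers on non-production deployments; since no dynamic function is used, the check effectively runs once at build time:

```ts
// app/robots.ts — illustrative sketch; APP_ENV is a hypothetical environment variable
import type { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  // Keep preview and staging deployments out of search indexes entirely.
  if (process.env.APP_ENV !== "production") {
    return {
      rules: {
        userAgent: "*",
        disallow: "/",
      },
    };
  }

  return {
    rules: {
      userAgent: "*",
      allow: "/",
      disallow: "/private/",
    },
    sitemap: "https://acme.com/sitemap.xml",
  };
}
```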
## Customizing specific user agents

You can customize how individual search engine bots crawl your site by passing an array of user agents to the `rules` property. For example:
```ts
// app/robots.ts
import type { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      {
        userAgent: "Googlebot",
        allow: ["/"],
        disallow: "/private/",
      },
      {
        userAgent: ["Applebot", "Bingbot"],
        disallow: ["/"],
      },
    ],
    sitemap: "https://acme.com/sitemap.xml",
  };
}
```
```js
// app/robots.js
export default function robots() {
  return {
    rules: [
      {
        userAgent: "Googlebot",
        allow: ["/"],
        disallow: "/private/",
      },
      {
        userAgent: ["Applebot", "Bingbot"],
        disallow: ["/"],
      },
    ],
    sitemap: "https://acme.com/sitemap.xml",
  };
}
```
Output:
```txt
User-Agent: Googlebot
Allow: /
Disallow: /private/

User-Agent: Applebot
Disallow: /

User-Agent: Bingbot
Disallow: /

Sitemap: https://acme.com/sitemap.xml
```
## Robots object
```ts
type Robots = {
  rules:
    | {
        userAgent?: string | string[];
        allow?: string | string[];
        disallow?: string | string[];
        crawlDelay?: number;
      }
    | Array<{
        userAgent: string | string[];
        allow?: string | string[];
        disallow?: string | string[];
        crawlDelay?: number;
      }>;
  sitemap?: string | string[];
  host?: string;
};
```
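
The `crawlDelay` and `host` fields are not shown in the examples above. As a brief sketch (the values here are placeholders, not recommendations), they are emitted as `Crawl-delay` and `Host` directives in the generated file:

```ts
// app/robots.ts — illustrative sketch of the optional crawlDelay and host fields
import type { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
      crawlDelay: 10, // placeholder value; not every crawler honors Crawl-delay
    },
    host: "acme.com", // preferred host for the site
    sitemap: "https://acme.com/sitemap.xml",
  };
}
```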
## Version History

| Version   | Changes               |
| --------- | --------------------- |
| `v13.3.0` | `robots` introduced.  |