Robots.txt | Yext Hitchhikers Platform

A robots.txt file can be an important part of your SEO strategy. According to Google, a robots.txt file tells search engine crawlers which URLs the crawler can access on your site. Google's introduction to robots.txt is an informative guide for reference.

A robots.txt file should be served from the root of your site, and it must be plain text. To serve a plain text file, we recommend using a TypeScript file that exports a render function. This non-React template module will be rendered as plain text, so it is ideal for creating a robots.txt. For more information about rendering from non-React templates, check out the article on non-React templates.

Note: The Vite-based dev server does not handle hot-reloading for non-React templates, so it is not possible to preview the robots.txt with the dev server. To preview your robots.txt locally, run npm run build:serve to generate a local production build.

Below is an example of a robots.ts template module:

// src/templates/robots.ts <-- Note the .ts suffix

import { TemplateConfig } from "@yext/pages";

/**
 * Not required depending on your use case.
 */
export const config: TemplateConfig = {
  // The name of the feature. If not set the name of this file will be used (without extension).
  // Use this when you need to override the feature name.
  name: "robots",
};

export const getPath = () => {
  return `robots.txt`;
};

export const render = (data: any): string => {
  /**
   * Return a string that will be served at <your-site-domain>.com/robots.txt.
   * For more information about robots.txt, check out this resource:
   * An empty robots.txt will NOT prevent any pages from being crawled.
   */
  return ``;
};

A few notes about the example above:

  1. The getPath export returns the string “robots.txt”. This is required so that search engines can find the file at its standard location.
  2. The file has no default export and instead exports a render function that returns a string. As explained in the non-React templates article, non-React template modules use a named export for the rendered content. The return value from render will display in the browser as plain text.
  3. The file must have a .ts suffix. Because the template returns a plain string rather than JSX, it should use the “ts” suffix, not “tsx”.
  4. The file should live under src/templates to ensure the page is built and rendered. Full path: src/templates/robots.ts.
  5. The above file will be served at <your-site-domain>.com/robots.txt.
  6. A good default, if you do not have specific robots requirements, is to provide an empty robots.txt. An empty robots file will NOT prevent any pages from being crawled.
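If you do have crawling rules to publish, the render function can assemble the robots.txt body line by line. The sketch below is a hypothetical helper (buildRobotsTxt, the directive values, and the sitemap URL are all placeholders, not part of the Yext Pages API) showing the kind of string render could return:

```typescript
// Hypothetical helper for building a robots.txt body.
// All directive values below are placeholders -- substitute your own rules.
const buildRobotsTxt = (): string => {
  const lines = [
    "User-agent: *",                                // rules apply to all crawlers
    "Disallow: /drafts/",                           // example: hide a drafts section
    "Allow: /",                                     // everything else is crawlable
    "Sitemap: https://www.example.com/sitemap.xml", // placeholder sitemap URL
  ];
  // robots.txt is plain text, one directive per line
  return lines.join("\n") + "\n";
};

// Inside the template module, render could then simply delegate:
// export const render = (): string => buildRobotsTxt();
```

Because render just returns a string, you can build the file however you like: hard-code it, read values from config, or generate per-environment rules.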