Search engine indexing

By default, a robots.txt file is generated for your site that allows Google and other search engines to index all of the blog posts and pages on your site.

If you’d like to disable generating the robots.txt file, you can override this setting with a custom theme. Create a file named config.json in your theme with the following contents:

  {
    "enableRobotsTXT": false
  }

You can also add custom rules to the robots.txt file. Create a new template named layouts/robots.txt in a custom theme with contents such as:

User-agent: *
Disallow: /some-page-here

Note that the custom robots.txt layout only takes effect when enableRobotsTXT is set to true (the default).
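Because layouts/robots.txt is a regular Hugo template, it can also use template functions rather than only static text. A minimal sketch, assuming you want to block a hypothetical /drafts/ section and advertise the standard Hugo sitemap (the /drafts/ path is illustrative; absURL is a built-in Hugo function):

  User-agent: *
  Disallow: /drafts/
  Sitemap: {{ "sitemap.xml" | absURL }}

When the site is built, the absURL call expands to the full sitemap address for your domain, so the same theme works across sites without hard-coding a hostname.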