Search engine indexing

By default, a robots.txt file is created for your site that allows Google and other search engines to index all the blog posts and pages on your site.

If you’d like to disable generating the robots.txt file, you can override it with a custom theme. Create a file named config.json in your theme with the following contents:

  {
    "enableRobotsTXT": false
  }

You can also add custom rules to the robots.txt file. Create a new template named layouts/robots.txt in a custom theme with contents such as:

User-agent: *
Disallow: /some-page-here
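
As a fuller sketch, a custom robots.txt can block one section while still pointing crawlers at your sitemap. The /drafts/ path and the example.com domain below are placeholders; substitute your own:

  User-agent: *
  Disallow: /drafts/

  Sitemap: https://example.com/sitemap.xml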

I submitted the sitemap link from robots.txt in Google Search Console, but it shows an error. I submitted the RSS feed for now, and that worked. How do I get Search Console to read the full site?