[IMPLEMENTED] Custom robots.txt

I’ve been running some mobile validation tools in Google Search Console, and it appears Google doesn’t like that Squidex prevents the Googlebot from crawling and indexing assets (images).

“Googlebot blocked by robots.txt”

So, in addition to

I would like to request a “custom robots.txt” feature. One idea is to have a UI/page under Settings where you can edit the content of robots.txt, which is then saved in MongoDB. That way we don’t need to worry about writing to the file system or a backup strategy for it.

The robots.txt I’m looking for would be something like this:

User-agent: *
Allow: /api/assets/
Disallow: /
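The idea of serving robots.txt from a stored setting rather than the file system can be sketched in a few lines. This is a generic illustration, not Squidex code; the `robots_text` key and the default rules above are assumptions:

```python
# Hypothetical sketch (not Squidex code): serve robots.txt from a stored
# setting instead of the file system, falling back to a sensible default.

DEFAULT_ROBOTS = "User-agent: *\nAllow: /api/assets/\nDisallow: /\n"

def get_robots_txt(settings: dict) -> str:
    """Return the configured robots.txt text if present, else the default.

    `settings` stands in for whatever key/value store (e.g. MongoDB-backed
    app settings) holds the custom text; the key name is made up here.
    """
    text = settings.get("robots_text", "").strip()
    return (text + "\n") if text else DEFAULT_ROBOTS
```

A handler for `GET /robots.txt` would then just return this string with a `text/plain` content type.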

Good point, thank you for the input. I would just add it to the settings; then it is a 20-minute task.

See: https://github.com/Squidex/squidex/blob/master/src/Squidex/appsettings.json#L56
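If it were exposed as a setting alongside the others in that file, the appsettings.json entry might look like this (the `robots` section and `text` key are hypothetical, not an existing Squidex option):

```json
{
  "robots": {
    "text": "User-agent: *\nAllow: /api/assets/\nDisallow: /"
  }
}
```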