[IMPLEMENTED] How to add robots.txt to a Squidex installation


After deploying Squidex, I would like to prevent Google and other crawlers from crawling and indexing my Squidex web app installation. Is there an easy way to do this without recompiling and redeploying the app?
Maybe this can be done through the shell - by creating a new robots.txt file in each running pod/container?

Note: One prerequisite is that I am using the squidex:dev image from Docker Hub and don't want to build Squidex manually; I want to keep using the stock image.
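Since the goal is to avoid rebuilding the image, one possible workaround is to bind-mount a robots.txt into the container's web root. This is only a sketch: the `/app/wwwroot` path and the port mapping below are assumptions about the layout of the squidex:dev image, not verified against it.

```shell
# Create a robots.txt that asks all crawlers not to index anything.
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /
EOF

# Start Squidex with the file bind-mounted read-only into the web root.
# /app/wwwroot is an ASSUMED path inside the squidex:dev image; adjust it
# if the static files live elsewhere. Run this against your own deployment:
# docker run -d -p 80:80 \
#   -v "$(pwd)/robots.txt:/app/wwwroot/robots.txt:ro" \
#   squidex/squidex:dev
```

The same idea carries over to Kubernetes by mounting the file from a ConfigMap instead of a host path, which also answers the "each running pod" concern, since every pod gets the mount.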

Hi, can you provide a pull request for that? I think it would be useful for all users.

I’ll see what I can do.

Just add the file to this folder: https://github.com/Squidex/squidex/tree/master/src/Squidex/wwwroot
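For reference, a minimal robots.txt that asks all well-behaved crawlers to skip the entire site looks like this (whether the file added upstream uses exactly these rules is not confirmed here):

```
User-agent: *
Disallow: /
```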

@spacecat: I have added the robots.txt for you: https://github.com/Squidex/squidex/blob/master/src/Squidex/wwwroot/robots.txt
