Hi,
After deploying Squidex, I would like to prevent Google and other crawlers from crawling and indexing my Squidex web app. Is there an easy way to do this without recompiling and redeploying the app?
Maybe this can be done through the shell, e.g. by creating a robots.txt file in each running pod/container?
Note: One constraint is that I am using the squidex:dev image from Docker Hub and don't want to build Squidex manually - I want to keep using the squidex:dev image.
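For reference, here is a minimal sketch of what I had in mind: create a robots.txt that disallows all crawlers and bind-mount it into the container so no rebuild is needed. The `/app/wwwroot` path is an assumption about where the image serves static files from (Squidex is an ASP.NET Core app), so the mount target may need adjusting:

```shell
# Create a robots.txt that tells all crawlers not to index anything
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /
EOF

# Bind-mount it read-only into the container's static web root.
# /app/wwwroot is an assumption and may differ in the squidex:dev image:
#   docker run -d -v "$(pwd)/robots.txt:/app/wwwroot/robots.txt:ro" squidex/squidex:dev
```

In Kubernetes the same file could be delivered as a ConfigMap mounted at that path, which would survive pod restarts, unlike copying the file into a running pod with `kubectl cp`.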