Deploy Published Content

If I want to include published Squidex content items in a website, but want a design that is fully decoupled from Squidex, what is considered best practice? Should I use a trigger to deploy published JSON content items to folders of the website hosting the content?

You mean software design, right?

There are several strategies:

  1. You can just use caching and refresh in the background. Then, even if the refresh fails, you can still serve the old content. This is the easiest solution (see the first sketch after this list).

  2. You could pull the content periodically.

  3. You can use rules to send the published content to another service, which then stores it in a separate database (see the second sketch below). You can also place a queue in between, but Squidex has its own internal queue and retry mechanism, so it might not be needed.

  4. You can use rules and push the content directly to Elastic or Algolia and use this as a data store for your content.

  5. You can place a custom proxy between Squidex and your server to serve as a cache. I made a simple demo here: squidex-samples/node/proxy at master · Squidex/squidex-samples · GitHub

  6. You can place a CDN (or the Squidex CDN, if you use the cloud) between your service and Squidex. This is very useful if you want to use edge computing and have your frontend directly on Cloudflare servers, for example.
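
A minimal sketch of option 1, assuming a small Node/TypeScript layer sitting in front of the site; the content URL, refresh interval, and token handling are placeholders, not anything Squidex-specific:

    // Sketch of option 1: serve cached content, refresh in the background, and keep
    // the old copy when a refresh fails. CONTENT_URL, REFRESH_MS and the token
    // handling are placeholders for illustration only.
    type Cached<T> = { value: T; fetchedAt: number };

    const CONTENT_URL = 'https://your-squidex-host/api/content/your-app/your-schema'; // placeholder
    const REFRESH_MS = 60_000;

    let cache: Cached<unknown> | undefined;

    async function refresh(): Promise<void> {
      try {
        const res = await fetch(CONTENT_URL, {
          headers: { Authorization: `Bearer ${process.env.SQUIDEX_TOKEN ?? ''}` },
        });
        if (!res.ok) throw new Error(`Squidex returned ${res.status}`);
        cache = { value: await res.json(), fetchedAt: Date.now() };
      } catch (err) {
        // A failed refresh leaves the previous copy in place, so readers still get content.
        console.error('Background refresh failed, serving stale content', err);
      }
    }

    // Serve whatever is cached; trigger the first load if the cache is empty.
    export async function getContent(): Promise<unknown> {
      if (!cache) await refresh();
      return cache?.value;
    }

    // Refresh on a timer; the cache is only replaced on success.
    setInterval(refresh, REFRESH_MS);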

I usually choose a combination of 1 and 3/4.
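
For 3/4, a rough sketch of the receiving end, assuming an Express endpoint registered as the rule's webhook target; the payload field names (type, payload.id, payload.data) are assumptions about the event shape, so check them against what your rule actually sends:

    // Sketch of option 3: a webhook endpoint that a Squidex rule posts published
    // content to, which is then stored in your own data store.
    import express from 'express';

    const app = express();
    app.use(express.json());

    // Placeholder store; swap in your real database client.
    const store = new Map<string, unknown>();

    app.post('/squidex/webhook', (req, res) => {
      const event = req.body;

      // Only mirror published content; the exact type string depends on your schema,
      // so treat this check as an assumption to verify.
      if (typeof event?.type === 'string' && event.type.endsWith('Published') && event.payload?.id) {
        store.set(event.payload.id, event.payload.data);
      }

      // Acknowledge quickly; Squidex's own retry mechanism handles transient failures.
      res.sendStatus(200);
    });

    app.listen(3000, () => console.log('Webhook receiver listening on :3000'));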

Thank you for the quick response.

You are correct, yes, I meant software design.

Our team is currently building an external-facing customer website with authentication, based on Angular. It includes a mixture of dynamic content from line-of-business applications plus content from Squidex, created by internal users/employees as contributors. We have two basic scenarios for viewing Squidex content.

*We are using a self-hosted Squidex instance, via IIS.

  1. Via an Angular app, our customers will view Squidex content, which they cannot update. The customers access the Angular app via the internet.

  2. Our publishers/employees access the Angular app via our internal corporate network and can manage content in any number of ways.

Which design would be most appropriate, in your opinion, to deliver the content managed by Squidex in these two scenarios? Currently, we have calls directly from the Angular app to the Squidex system via GraphQL. Is publishing to a cached medium the ideal approach overall, or are direct calls from the browser to Squidex more appropriate for viewing published content?
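
For context, those direct calls look roughly like the service below (the app name, schema name, and field names are placeholders here, not our real ones):

    // Rough illustration of our current setup: the Angular app querying the
    // Squidex GraphQL endpoint directly from the browser. "my-app", "Articles"
    // and the field names are placeholders.
    import { Injectable } from '@angular/core';
    import { HttpClient } from '@angular/common/http';
    import { Observable } from 'rxjs';

    @Injectable({ providedIn: 'root' })
    export class SquidexContentService {
      private readonly endpoint = '/api/content/my-app/graphql'; // placeholder app name

      constructor(private http: HttpClient) {}

      getArticles(): Observable<unknown> {
        const query = `{
          queryArticlesContents {
            id
            data { title { iv } }
          }
        }`;

        // The bearer token normally comes from an interceptor or auth service.
        return this.http.post(this.endpoint, { query });
      }
    }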

Lastly, should we include links to published Squidex assets directly from the internet? We see special features of the asset endpoints, such as image resizing, and wondered if that meant your intent was for developers to link directly to these endpoints, or whether we should use the resizing feature during the publishing step and then move the final asset out to a web server or hosting system.

Thank you for your help.

Following up on the original post… We want to separate content management from content viewing, so having tokens and live CMS calls in the web browser seemed at odds with that objective.

The current path being developed uses rules to publish the JSON output as static files stored/hosted in IIS virtual directories, and then binds those static files to the client SPA via HTTP GET requests.
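
For illustration, the consuming side would look roughly like this (the file path and interface are placeholders, not our exact names):

    // Rough sketch of the SPA side of this approach: load the pre-published JSON
    // from an IIS virtual directory with a plain HTTP GET and bind it to the view.
    // No CMS call and no token ever reaches the browser.
    import { Injectable } from '@angular/core';
    import { HttpClient } from '@angular/common/http';
    import { Observable } from 'rxjs';

    // Placeholder shape for the JSON that the rule writes out.
    export interface PublishedItem {
      id: string;
      data: Record<string, unknown>;
    }

    @Injectable({ providedIn: 'root' })
    export class StaticContentService {
      constructor(private http: HttpClient) {}

      // "/content/articles.json" stands in for wherever the rule publishes the files.
      getArticles(): Observable<PublishedItem[]> {
        return this.http.get<PublishedItem[]>('/content/articles.json');
      }
    }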

We welcome validation of this solution or criticism and direction.