[DUPLICATE] Improve the API log download

I have…

  • [ ] Checked the logs and uploaded a log file and provided a link because I found something suspicious there.

I’m submitting a…

  • [ ] Regression (a behavior that stopped working in a new release)
  • [x] Bug report
  • [ ] Performance issue
  • [ ] Documentation issue or request

Current behavior

We are running the free version of the cloud-hosted Squidex. Yesterday we did some tests and hit the request limit. I would like to download the request log, but it simply doesn’t work.
The download fails and stops after about 8 MB (which takes around 10-15 minutes for some reason). When I click download again, it downloads the log file, but all it contains is the text
“Another process is running, try again later”
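
Not part of the original report, but since the busy message comes back as the downloaded file itself, a client-side workaround is to retry with a delay until real data arrives. A minimal sketch, assuming Python with requests; the endpoint path, the token handling, and the idea that the route streams the CSV directly are assumptions rather than confirmed API details.

```python
import time

import requests

BASE_URL = "https://cloud.squidex.io"
APP = "my-app"   # hypothetical app name
TOKEN = "..."    # access token obtained separately


def download_request_log(max_attempts: int = 10, delay_seconds: int = 120) -> bytes:
    """Fetch the request log, backing off while another export is running."""
    headers = {"Authorization": f"Bearer {TOKEN}"}
    for _ in range(max_attempts):
        # Assumed endpoint that streams the request-log CSV for the app.
        response = requests.get(
            f"{BASE_URL}/api/apps/{APP}/usages/log",
            headers=headers,
            timeout=1800,  # the export itself can take many minutes
        )
        response.raise_for_status()
        body = response.content
        # The busy reply arrives as a tiny text file rather than an HTTP error,
        # so detect it by content and wait before trying again.
        if b"Another process is running" in body:
            time.sleep(delay_seconds)
            continue
        return body
    raise TimeoutError("The log export was still busy after all attempts.")
```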

Expected behavior

To retrieve the log file.

Minimal reproduction of the problem

Environment

  • [ ] Self hosted with docker
  • [ ] Self hosted with IIS
  • [ ] Self hosted with other version
  • [x] Cloud version

Version: [VERSION]

Browser:

  • [x] Chrome (desktop)
  • [ ] Chrome (Android)
  • [ ] Chrome (iOS)
  • [ ] Firefox
  • [ ] Safari (desktop)
  • [ ] Safari (iOS)
  • [ ] IE
  • [ ] Edge

Others:

Hi,

I use the logging infrastructure of Google Cloud and have to respect a lot of rate limits there. Therefore it is very slow, and I can only run one download at a time. I would suggest trying again later.

I will think about how to provide a custom solution, but request logs are just a huge amount of data.

EDIT: I have found a way to improve the speed a little, but unfortunately I can only download around 100 log entries per second.
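
To put that throughput in perspective, a throttled export loop along these lines shows why large logs take so long: at roughly 100 entries per second, an app that made 500,000 requests (a hypothetical figure) needs about 5,000 seconds, i.e. well over an hour, just to stream its entries. `fetch_page` is a hypothetical stand-in for one Google Cloud Logging query page, not a real API call.

```python
import time
from typing import Iterator, List, Optional, Tuple

ENTRIES_PER_SECOND = 100  # observed throughput after the speed improvement


def fetch_page(cursor: Optional[str]) -> Tuple[List[dict], Optional[str]]:
    """Hypothetical stand-in for one rate-limited Google Cloud Logging query."""
    raise NotImplementedError


def export_entries() -> Iterator[dict]:
    """Stream log entries while staying under the upstream rate limit."""
    cursor: Optional[str] = None
    while True:
        started = time.monotonic()
        entries, cursor = fetch_page(cursor)
        yield from entries
        if cursor is None:
            break
        # Spread each page over at least len(entries) / 100 seconds so the
        # logging backend's quota is not exceeded.
        min_duration = len(entries) / ENTRIES_PER_SECOND
        elapsed = time.monotonic() - started
        if elapsed < min_duration:
            time.sleep(min_duration - elapsed)
```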


Will be tracked here: API Call Downlog Logs fails - Rebuild using MongoDB