API download function seems unreliable :( #375
Comments
Note that it doesn't appear to be a lack of resources, at least not CPU (plentiful), disk I/O (plentiful), nor memory. I tripled the RAM given to the API VM under test (from 8GB to 24GB) and the failure happens in exactly the same spot.
Live databases are just as bad, timing out after about 20 seconds:
Small databases work fine though:
As a data point, I just changed the version of Go being used for this from the Go 1.19 provided by Debian 12's packages to the latest official Go release... and the problem still remains. It looks like an actual problem with either our code or one of the components in use. The backend server daemon itself shows this at the time of the broken download connection:
It does that once per failed download attempt, so 3 attempts gave:
Next, I'll see if the problem shows up when running from my local development workstation. Probably not today though.
As part of post-migration testing in the new data centre, I'm running some sanity tests to verify everything is working OK.
However, it appears the API download() function doesn't work properly with larger database files:
That's happening multiple times in a row with databases of non-trivial size.
The backend isn't showing the problem with any useful detail either:
It's happening on both the old and new API servers(!), and none of MinIO, Memcached, nor PostgreSQL are displaying any kind of error either.
After the migration is completed we'd better take a look at this.