We have been using Tilestream on a micro EC2 instance for about half a year now, without any problems. Now that we are getting more users and requests, some tiles fail with a 404 at peak moments. The error returned by Tilestream alongside the 404 reads `Error: SQLITE_CANTOPEN: unable to open database file`.
I've done some searching on this error, and #139 seems very relevant. The TileLive issue mapbox/tilelive#90 mentioned there suggests this could be fixed by raising the open-file limit in Linux with `ulimit -n`. It was limited to 1024 on my machine, so I raised it to 10000 by following the steps @apollolm took. I can confirm that after rebooting and logging back in, `ulimit -n` reports 10000.
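One thing worth ruling out (this is a guess on my part, not something confirmed in the linked issues): `ulimit -n` in a login shell only shows the limit for that shell, and a daemonized Tilestream process started by an init script may run with a different limit. A minimal sketch for checking the limit of the running process itself, assuming it shows up under a `tilestream` pattern in `pgrep` (adjust the pattern for your setup):

```shell
# The limit that matters is the one on the running tilestream (node)
# process, not the login shell. "tilestream" is an assumed process
# name here; adjust the pgrep pattern to match your service.
pid=$(pgrep -f tilestream | head -n1)
if [ -n "$pid" ]; then
  grep 'Max open files' "/proc/$pid/limits"   # limit the process actually has
  ls "/proc/$pid/fd" | wc -l                  # fds it currently holds
fi
```

If the `Max open files` line for the process still shows 1024, the raised shell limit never reached the service and it would need to be set where the process is launched.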
After a while, though, the problem happens again, as if the solution didn't work in this case. I have tried monitoring how many files are open at a time with the `lsof | awk '{ print $2; }' | sort -rn | uniq -c | sort -rn | head` command proposed in this Stack Overflow question. The count never exceeds 6000, so it shouldn't be hitting the limit.
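For what it's worth, `lsof` output can over-count (it lists one row per thread on some systems) and the `ulimit -n` limit is per process, not system-wide. A sketch of counting open descriptors per process directly from `/proc`, which sidesteps both issues:

```shell
# Per-process open-fd counts straight from /proc: the ulimit is per
# process, so a system-wide lsof total can hide which single process
# is approaching its own limit.
for d in /proc/[0-9]*; do
  n=$(ls "$d/fd" 2>/dev/null | wc -l)
  [ "$n" -gt 0 ] && printf '%6d %s\n' "$n" "${d#/proc/}"
done | sort -rn | head
```

Comparing the top PID's count against that PID's own `Max open files` limit would show whether it is really staying under the limit at peak moments.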
Is there another reason this error could happen, other than the `ulimit -n` limit?
For context: I am serving two `.mbtiles` files, each 52 GB in size, on an EC2 micro instance, with CPU not going over 15% and ~135 MB of free memory.
Thanks in advance!