
Improve performance for large files #355

@mtlynch

Description


Reddit user /u/bdpna reports that PicoShare takes 10 hours to upload a 7 GB file to a well-resourced server (Supermicro 4U, dual Xeon CPUs, 64 GB RAM). PicoShare doesn't appear to be saturating the CPU or RAM.

They tried increasing the buffer size of SQLite data entries, but that had a negligible or negative impact on performance.
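For context, the storage pattern in question splits an uploaded file into fixed-size chunks, one SQLite row per chunk; the "buffer size" above is the chunk size. Here's a minimal Python sketch of that pattern (the table and column names are illustrative, not PicoShare's actual schema):

```python
import io
import sqlite3

CHUNK_SIZE = 32768  # bytes per row; the tunable "buffer size"


def store_file(db, entry_id, reader, chunk_size=CHUNK_SIZE):
    """Write a file into SQLite as a sequence of fixed-size BLOB rows."""
    db.execute(
        "CREATE TABLE IF NOT EXISTS entries_data ("
        "  id TEXT, chunk_index INTEGER, chunk BLOB,"
        "  PRIMARY KEY (id, chunk_index))"
    )
    index = 0
    while True:
        chunk = reader.read(chunk_size)
        if not chunk:
            break
        db.execute(
            "INSERT INTO entries_data (id, chunk_index, chunk) VALUES (?, ?, ?)",
            (entry_id, index, chunk),
        )
        index += 1
    db.commit()
    return index  # number of chunks written


def read_file(db, entry_id):
    """Reassemble a stored file by concatenating its chunks in order."""
    rows = db.execute(
        "SELECT chunk FROM entries_data WHERE id = ? ORDER BY chunk_index",
        (entry_id,),
    )
    return b"".join(row[0] for row in rows)


# Example: round-trip a 100 KiB payload through an in-memory database.
db = sqlite3.connect(":memory:")
payload = bytes(range(256)) * 400  # 102,400 bytes
n_chunks = store_file(db, "demo", io.BytesIO(payload))
assert read_file(db, "demo") == payload
```

With this layout, a larger chunk size means fewer rows and fewer INSERTs per file, but each statement moves more data at once, so throughput doesn't necessarily improve; that matches the result reported above.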

Other ideas:


After another year of using PicoShare on a 1x shared CPU, 256 MB RAM Fly.io instance, PicoShare seems to perform well on files below 1 GB. Above that, it sometimes crashes when receiving or serving files.


Update (2024-03-16)

As a test to see whether there was an inherent size limit in PicoShare, I spun up a Scaleway PRO2-L instance (32 vCPUs / 128 GB RAM), and I was able to upload an 11 GB file without issue:

(screenshot: 11 GB file uploaded successfully)

I'm sure there are ways to make PicoShare more efficient so that larger files work on smaller servers, but PicoShare demonstrably supports files up to at least 11 GB.


Update (2026-02-01)

I ran some tests using AI, but they were inconclusive. I can run a new suite when I have time:

https://github.com/mtlynch/picoshare/blob/442857aef0796f857f742e04e4676a0b5e290a5f/perf-test/2026-02-01-summary.md

Metadata



Labels

enhancement, help wanted
