I know this sounds extreme, but I'm building a device for detecting cells in microscope images. Each image contains roughly 50k cells, which I'd like to track across 30 frames.
I've managed to get Ultralytics/SAHI and supervision (SV) working together for this, but I've found that the Detections object grows to around 160 GB in RAM. I have 250 GB of RAM, so I'm OK for now, but I'm wondering if there are any out-of-core tricks for tracking this many detections. For example, with a huge dataframe I could use something like Dask to handle arrays too large to fit in memory. Is there anything like this for SV?
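For context, the kind of trick I mean is something like spilling the per-frame detection arrays to disk with `numpy.memmap`, so only the frames currently being associated stay in RAM. This is just a sketch of the idea, not supervision's API; the shapes mirror the `sv.Detections` fields (`xyxy` is `(N, 4)`), and all the names/paths here are illustrative:

```python
# Sketch (assumption, not supervision API): store per-frame boxes in a
# disk-backed numpy.memmap so only the frames a tracker needs are resident.
import os
import tempfile

import numpy as np

n_frames, n_cells = 30, 50_000
path = os.path.join(tempfile.mkdtemp(), "xyxy.dat")

# float32 halves the footprint of float64 boxes
boxes = np.memmap(path, dtype=np.float32, mode="w+",
                  shape=(n_frames, n_cells, 4))

# write one frame at a time (e.g. detections.xyxy from each SAHI pass);
# random data stands in for real detections here
for f in range(n_frames):
    boxes[f] = np.random.rand(n_cells, 4).astype(np.float32)
boxes.flush()

# later, pull back only the two frames needed for frame-to-frame association
prev, curr = np.array(boxes[0]), np.array(boxes[1])
```

With this layout, the full 30-frame box array is only ~24 MB per field at float32, and a tracker would touch two frames at a time instead of holding everything at once. Is there a supported way to get SV's tracking to consume frames like this?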
Thanks.
-Tony