I'm facing the same issue. I've trained a German model on around 560 MiB of plain text from the Leipzig Corpora Collection, and the resulting model is 488 MiB. On a Linux cloud machine with 16 GB RAM and 4 CPUs it takes 5-10 minutes to load the model. Is it possible to speed this up?
I've trained my model (I've tried versions from the `master` and `0.0.11` branches) on a 10 MiB plain-text part of the English Wikipedia (enwiki-latest-pages-articles_10MiB.txt) and got a 41 MiB bin file (enwiki.bin.zip). I'm loading it in Python, but it takes 12 GiB of memory and still doesn't finish loading in any foreseeable time.
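For reference, this is roughly how I'm loading it; a minimal sketch assuming a fastText-style `.bin` model and the `fasttext` Python package, so the exact library and API in your setup may differ:

```python
# Minimal load-timing sketch. Assumes `pip install fasttext`;
# the model path and package are placeholders, not the exact ones from this thread.
import time

import fasttext

start = time.time()
model = fasttext.load_model("enwiki.bin")  # the 41 MiB model mentioned above
print(f"loaded in {time.time() - start:.1f} s")

# Sanity check that the vectors are usable after loading.
print(model.get_word_vector("wikipedia")[:5])
```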