Replies: 4 comments
-
Further compression is possible though...
Suggested by DarthGizka here. The logic for bit compression & decompression would be:

```ts
function compress(a: number) {
  return a & 1 ? a : a & 254 | a >>> 8; // put bit 8 (=256) into bit 0 (=1), skip 1
}

function decompress(a: number) {
  return a & 254 ? a & 254 | (a & 1) << 8 : a; // put bit 0 (=1) back into bit 8 (=256), skip even numbers
}
```
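As context for why this works, here is the scheme as a standalone sketch with a brute-force round-trip check. The observations in the comments are mine, not from the original post: prime gaps after the pair 2, 3 are always even, so bit 0 is free to carry bit 8, and the even value 256 is the one gap the trick cannot represent.

```ts
// The bit-compression scheme from above, restated with a round-trip check.
// Gaps after 2 -> 3 are even, so bit 0 is free to hold bit 8, extending the
// 1-byte range to even gaps up to 510 (plus the lone odd gap 1).
function compress(a: number): number {
  return a & 1 ? a : a & 254 | a >>> 8;
}

function decompress(a: number): number {
  return a & 254 ? a & 254 | (a & 1) << 8 : a;
}

// Note: 256 itself compresses to 1 and would collide with gap 1; every other
// even gap up to 510 survives the round trip in a single byte.
for (let gap = 2; gap <= 510; gap += 2) {
  if (gap === 256) continue;
  const byte = compress(gap);
  if (byte > 255 || decompress(byte) !== gap) {
    throw new Error(`round-trip failed for gap ${gap}`);
  }
}
```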
-
This extra bit compression has been implemented in version 0.3.3, so cachePrimes now supports up to 100 million primes, each compressed into a 1-byte gap.
-
I had an idea last night when I measured the gaps between primes up to 2^32: there are 151 distinct gap sizes. You could build a look-up table mapping the range [0, 150] to those gap sizes for decompression, with the inverse mapping for compression. The compressed data would then store the 1-byte keys into the look-up table. Naturally, the scheme scales to 256 distinct gap sizes; I have not yet calculated how far up the list of primes that would take us.
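The look-up-table idea could be sketched like this, assuming the distinct gap sizes are collected in advance. The `gapSizes` list below is a small illustrative sample, not the actual 151 values:

```ts
// Look-up-table codec sketch: each distinct gap size gets a 1-byte index.
// gapSizes is an illustrative sample, not the real 151 distinct gaps < 2^32.
const gapSizes: number[] = [1, 2, 4, 6, 8, 10, 12, 14, 18, 20];
const gapToIndex = new Map<number, number>(gapSizes.map((g, i) => [g, i]));

function encodeGap(gap: number): number {
  const idx = gapToIndex.get(gap);
  if (idx === undefined) throw new Error(`unknown gap size: ${gap}`);
  return idx; // one byte suffices while there are at most 256 distinct gaps
}

function decodeGap(byte: number): number {
  return gapSizes[byte];
}

// Round trip over the gaps of the primes 2, 3, 5, 7, 11, 13.
const gaps = [1, 2, 2, 4, 2];
const decoded = gaps.map(encodeGap).map(decodeGap);
```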
-
Thanks for this update! I haven't touched this project in ages, and it is unlikely that I will be doing any further development. After all, this is all purely theoretical stuff.
-
I had an idea about caching primes compressed via the gap, which I quickly implemented here.
It seems to be working really well: it consumes 8 times less memory than normal while also being very fast. Iterating through the whole array takes about 30 times less time than the fastest method of generating the same primes.
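As a rough illustration of the approach (a simplified sketch, not the project's actual format): start the cache at 3 so that every gap is even, store half-gaps as bytes, and expand them lazily with a generator. The prime 2 would be handled separately.

```ts
// Simplified gap cache: store the first prime plus one byte per subsequent
// prime, holding half the gap to its predecessor (valid for even gaps <= 510).
// This is an illustration of the idea, not the project's actual format.
function compressPrimes(primes: number[]): { first: number; halfGaps: Uint8Array } {
  const halfGaps = new Uint8Array(primes.length - 1);
  for (let i = 1; i < primes.length; i++) {
    const gap = primes[i] - primes[i - 1];
    if (gap % 2 !== 0 || gap > 510) throw new Error(`gap ${gap} does not fit`);
    halfGaps[i - 1] = gap / 2;
  }
  return { first: primes[0], halfGaps };
}

function* iteratePrimes(cache: { first: number; halfGaps: Uint8Array }): Generator<number> {
  let p = cache.first;
  yield p;
  for (const halfGap of cache.halfGaps) {
    p += halfGap * 2;
    yield p;
  }
}

// Example: 8 primes stored as the first prime plus 7 bytes of gap data.
const cache = compressPrimes([3, 5, 7, 11, 13, 17, 19, 23]);
const restored = [...iteratePrimes(cache)];
```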