We effectively double RAM usage by copying everything from Redis into PHP memory.

Raised in #13 by @msigley.

This becomes a problem once we're talking about on the order of 10k objects: if the values total 128M of serialized data, they occupy roughly 128M in MySQL (possibly), another 128M in Redis, and another 128M in every PHP process.

Needs some benchmark love. But how can we limit the `mget` batch size, or make the `mget` more intelligent? Maybe keep a stack of LRU keys per URL?
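The per-URL LRU idea could be sketched roughly like this (illustrative Python, not the plugin's actual code; the class and method names are hypothetical): track which cache keys each URL actually touched, capped at an LRU limit, so the next request for that URL can `mget` only those keys instead of pulling the whole keyspace into PHP memory.

```python
from collections import OrderedDict


class UrlKeyTracker:
    """Tracks the most recently used cache keys per URL, so a later
    request for the same URL can mget() just those keys instead of
    copying everything from Redis into process memory.

    Hypothetical sketch: names and the max_keys default are assumptions.
    """

    def __init__(self, max_keys=100):
        self.max_keys = max_keys
        # url -> OrderedDict of keys, oldest first (LRU order)
        self.urls = {}

    def touch(self, url, key):
        """Record that `key` was read while serving `url`."""
        lru = self.urls.setdefault(url, OrderedDict())
        lru.pop(key, None)
        lru[key] = True  # re-insert at the most-recent position
        while len(lru) > self.max_keys:
            lru.popitem(last=False)  # evict the least-recently used key

    def keys_for(self, url):
        """Keys worth prefetching with a single bounded mget()."""
        return list(self.urls.get(url, ()))
```

On each request the cache layer would call `keys_for(url)` once, issue one bounded `mget` for those keys, then `touch()` each key actually read so the stack stays warm. Bounding `max_keys` caps both the `mget` payload and the PHP-side copy, at the cost of cache misses for keys that fell off the stack.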