Support SpreadSASLMemcachedCache: sharding values > 1MB #32
Comments
I mean... no. This would be very difficult (and probably not desirable) to do exactly, since pickle is very Python-specific. However, doing something similar, where values are broken up, would be relatively straightforward (I've done it for Dalli). Not sure if this makes sense as a separate library/code snippet. If you get around to this I'll definitely appreciate a pull request. Otherwise, I might get around to it. In general, there are some tricky tradeoffs with the kinds of strategies you might use to spread values across multiple keys, and I'm not sure how best to generalize them (as opposed to giving code snippets for how to write a custom thing for each app, or a higher-level library with multiple options or something). But I'm very interested in seeing what can be done.
Can you post the code you used? The implementation doesn't matter (pickle vs. something else) as long as I can replicate the result.
Bump
Just curious why you closed this since the issue isn't solved.
Hrm... mostly I was just doing cleaning, and hadn't touched this issue in almost a year... but I wasn't convinced this is a good feature to have at this level of library, since it would need to provide a specific choice amongst a set of imperfect semantics: what happens when one of the keys is evicted? Do you store keys sequentially (i.e. key-1, key-2, ..., key-n)? My hunch is that all of the answers are correct for some people, and I'm not sure there is a "good" one for most. However, I can mention what I've done in a certain app algorithmically, as a set step and a get step roughly along the lines of the sketch below.
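A minimal sketch of such a split-on-set / reassemble-on-get scheme, written against an assumed generic async cache interface rather than memjs's actual API (the cache.get/cache.set signatures, the key:i part-key naming, and the spreadSet/spreadGet names are all assumptions for illustration): the parts are written under derived keys and the part count is recorded in the root key.

// Assumed generic interface: cache.set(key, buffer) -> Promise,
// cache.get(key) -> Promise<Buffer|null>. Not memjs's real signatures.
async function spreadSet(cache, key, value, maxSize) {
  // split() is the Buffer-splitting helper shown further down this thread
  const parts = split(value, maxSize);
  // write the parts under derived keys, then record the part count in the
  // root key; the eviction and ordering questions raised above apply here too
  for (let i = 0; i < parts.length; i++) {
    await cache.set(key + ':' + i, parts[i]);
  }
  await cache.set(key, Buffer.from(String(parts.length)));
}

async function spreadGet(cache, key) {
  const count = await cache.get(key);
  if (count === null) return null;              // no root key: plain miss
  const n = parseInt(count.toString(), 10);
  const parts = [];
  for (let i = 0; i < n; i++) {
    const part = await cache.get(key + ':' + i);
    if (part === null) return null;             // an evicted part voids the whole value
    parts.push(part);
  }
  return Buffer.concat(parts);
}

Whether a missing part should surface as a miss or an error, and what happens when parts from two different writes interleave, are exactly the imperfect-semantics choices raised above.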
Well, I'm far from an expert on this topic, but why not just do it the way the Python version does it? There has to be a Node.js equivalent of pickle, right? At the least, can you post code showing how you worked around this issue before?
And to answer one of the questions, you would select a
pickle doesn't help much here: it's just a serialization format, and in any case memjs only deals with values of type Buffer, so splitting a value up would look something like this:

// largeValue is a `Buffer` to split, maxSize is the maximum size of any part
function split(largeValue, maxSize) {
  var parts = [];
  while (largeValue.length > 0) {
    parts.push(largeValue.slice(0, maxSize));
    largeValue = largeValue.slice(maxSize);
  }
  return parts;
}

Here is a gist of some Ruby skeleton code for spreading values across the cache like this.
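For what it's worth, the inverse of split is just Node's Buffer.concat; here is a small usage sketch (the 1 MB constant mirrors memcached's default item size limit mentioned in this issue, the 2.5 MB test buffer is purely illustrative):

// Usage sketch for split(): break a ~2.5 MB Buffer into parts of at most
// 1 MB and reassemble it with Buffer.concat, the inverse of split().
const MAX_PART = 1024 * 1024;               // memcached's default max item size

const big = Buffer.alloc(2.5 * 1024 * 1024, 'x');
const parts = split(big, MAX_PART);

console.log(parts.length);                  // 3
console.log(parts.map((p) => p.length));    // [ 1048576, 1048576, 524288 ]
console.log(Buffer.concat(parts).equals(big)); // true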
The problem with just iterating until you get a miss is two-fold:
I think spreading values automatically is problematic because it breaks performance assumptions. I expect that when I call a memcache function on a low-level library, I get the performance of a single request (which should be on the order of 1 ms). If all of a sudden I get a 2x or more performance hit because my value is slightly larger, that's less intuitive behavior (for a performance-sensitive library) IMO than an error.
Having said all that, I think in this discussion you've helped highlight some missing lower-level features that would make doing this easier and would also be useful on their own (specifically, having a way to get at the version field of get and set responses, as well as supporting compare-and-swap). I'll open new issues for those (and /cc you). I don't mind at all including a utility function (although probably in a different namespace) to split up values in a particular way -- we've basically already pseudo-coded the algorithm in this thread :).
I think you've misunderstood me. Spreading values wouldn't happen automatically. It would be an option in the setup, e.g.,
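As a purely hypothetical illustration of what such an opt-in could look like: the spreadingCache wrapper and the spreadValues / maxPartSize option names below are invented for this sketch and are not part of memjs; it reuses the assumed generic cache interface and the spreadSet/spreadGet helpers sketched earlier in the thread.

// Hypothetical opt-in wrapper; "spreadValues" and "maxPartSize" are invented
// option names for illustration only and do not exist in memjs.
function spreadingCache(cache, options) {
  const enabled = !!(options && options.spreadValues);
  const maxPartSize = (options && options.maxPartSize) || 1024 * 1024;
  return {
    set(key, value) {
      // spreading disabled: keep the plain single-request behavior untouched
      if (!enabled) return cache.set(key, value);
      return spreadSet(cache, key, value, maxPartSize);
    },
    get(key) {
      if (!enabled) return cache.get(key);
      return spreadGet(cache, key);
    },
  };
}

Callers who don't ask for spreading keep the one-request-per-call behavior discussed above, so the performance objection only applies to callers who explicitly opt in.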
Oh interesting. OK, I don't mind that as much... reopening.
Does this library support SpreadSASLMemcachedCache? See http://pythonhosted.org/Flask-Cache/:
SpreadSASLMemcachedCache – spreadsaslmemcachedcache
Same as SASLMemcachedCache; however, it has the ability to spread a value across multiple keys if it is bigger than the memcached threshold, which by default is 1 MB. Uses pickle.