Open · Ralith opened 5 years ago
Hey @Ralith! I don't think I have a clear grasp of what your use case is. Would you mind providing an example?
From what I do understand, it sounds like you could use an additional method, similar to `put`, that only inserts the value if the key doesn't already exist in the cache, and that indicates whether the new value was inserted by returning a reference to the value stored in the cache. Something like the following, perhaps:
```rust
use lru::LruCache;

let mut cache = LruCache::new(2);
// `put_if_missing` is the proposed method: insert only when the key is absent,
// returning a reference to whatever value ends up stored in the cache.
let first = cache.put_if_missing(1, "a");
assert_eq!(first, &"a");
// The key already exists, so "b" is not inserted and the stored "a" is returned.
let second = cache.put_if_missing(1, "b");
assert_eq!(second, &"a");
```
I don't like the name `put_if_missing` but couldn't think of anything better right now :)
The use case, and desired semantics, are exactly those of the std `HashMap::entry` API (see also `BTreeMap`, etc.). `put_if_missing` isn't a solution because my use of `LruCache` is to avoid repeating expensive computations, which requires waiting until the entry is known to be missing before computing its value.
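For reference, a minimal sketch of the std pattern being asked for (`expensive_computation` is a placeholder, not something from this crate):

```rust
use std::collections::HashMap;

fn expensive_computation(n: u64) -> u64 {
    n * n // stand-in for the real work
}

fn main() {
    let mut results: HashMap<u64, u64> = HashMap::new();
    // `or_insert_with` only runs the closure when the key is missing, so the
    // expensive work is skipped entirely on a hit.
    let value = results.entry(42).or_insert_with(|| expensive_computation(42));
    assert_eq!(*value, 1764);
}
```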
Gotcha, definitely seems useful! I'll take a crack at it when I get some cycles, but it will probably be a larger change so I don't anticipate I'll be able to get to it in the near future.
Actually, this is even more needed because I can't get a value and return a reference to it, or, if it's missing, create the value and return it, because that borrows the cache as mutable more than once. It's quite annoying for a cache not to be able to do this basic operation.
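A minimal sketch of the pattern being described, with placeholder key/value types and an `expensive` stand-in; as written, the borrow checker rejects it, which is exactly the complaint:

```rust
use lru::LruCache;

fn expensive() -> String {
    "computed".to_owned() // stand-in for the real work
}

// Does not compile today: the reference returned from `get` keeps `cache`
// mutably borrowed for the rest of the function, so the `put` on the miss
// path is flagged as a second mutable borrow.
fn get_or_create(cache: &mut LruCache<u32, String>, key: u32) -> &String {
    if let Some(v) = cache.get(&key) {
        return v;
    }
    cache.put(key, expensive());
    cache.get(&key).expect("just inserted")
}
```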
I think one question for this API will be at what point an occupied entry is considered "used" and moved to the back of the queue. I think there are broadly three options:

1. when the `OccupiedEntry` is created (i.e. when `LruCache::entry(&mut self, key: K)` is called for an extant key);
2. when the value is accessed through the `OccupiedEntry` (e.g. when `OccupiedEntry::get(&self)` etc. are invoked); or
3. when the `OccupiedEntry` is dropped. This has the advantage that no extraneous work is performed if the entry was removed from the cache.
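To make the distinction concrete with std's `HashMap` (which has no recency order to maintain): creating an entry, inspecting it, and dropping it are all separate moments, and an LRU cache could treat any of them as the point where the key becomes most-recently-used:

```rust
use std::collections::hash_map::{Entry, HashMap};

fn main() {
    let mut map = HashMap::from([("k", 1)]);

    // An OccupiedEntry can be created, inspected, and dropped without the
    // value ever being read through it; for an LRU cache each of those
    // moments is a different candidate for "this key was just used".
    if let Entry::Occupied(e) = map.entry("k") {
        let _key = e.key(); // entry created and inspected...
    } // ...and dropped here, value untouched
}
```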
Would love to see this implemented! The stdlib has a method `.entry(key).or_insert(val)` which is extremely nice to have. Otherwise I can't think of a better way to write the following:
```rust
// Without an entry API: check for the key, insert on a miss, and look the
// value up again (note that the key also has to be cloned for the insert).
let value = if cache.contains(&key) {
    cache.get(&key).expect("just checked")
} else {
    let val = foo();
    cache.put(key.clone(), val);
    cache.get(&key).expect("item should exist")
};
```
It would be much better to do:
```rust
let value = cache.entry(key).or_insert_with(foo);
```
Note that `LruCache` recently got a `get_or_insert` method; however, that method always requires a fully-owned key. This means that when using, for example, `String` as keys, it's not possible to allocate only when a new entry actually needs to be inserted.

There's also no `get_or_insert_mut` that would return a mutable reference to the (possibly newly-inserted) value.

Finally, the entry API has other advantages, like working better when the function that creates the missing value can fail.
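As a small illustration of the owned-key point, assuming the `get_or_insert(key: K, f: impl FnOnce() -> V)` shape described above:

```rust
use lru::LruCache;

fn main() {
    let mut cache: LruCache<String, usize> = LruCache::unbounded();
    let name = "alice";

    // `get_or_insert` takes the key by value, so a fresh `String` must be
    // allocated on every call, even when "alice" is already cached and the
    // newly allocated key is immediately discarded.
    let len = cache.get_or_insert(name.to_owned(), || name.len());
    assert_eq!(*len, 5);
}
```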
I also miss `get_or_insert_mut`. When trying to use this as a multimap, I first have to do a `get_or_insert` followed by a `get_mut`: first to populate a `Vec` for a key, and then to actually place a new item in the `Vec`.
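A sketch of that two-step workaround (the key and the `Vec` payload are just for illustration):

```rust
use lru::LruCache;

fn main() {
    let mut index: LruCache<String, Vec<u32>> = LruCache::unbounded();
    let key = "bucket".to_owned();

    // Without `get_or_insert_mut`, two lookups are needed: one to make sure a
    // Vec exists for this key, and a second to get a mutable handle to it.
    index.get_or_insert(key.clone(), Vec::new);
    index.get_mut(&key).expect("just inserted").push(7);

    assert_eq!(index.get(&key), Some(&vec![7]));
}
```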
Another use case for the entry API is referencing the key when computing the value to insert, without reallocating it:
```rust
let k: String = "some-key".into();
let v = match map.entry(k) {
    Entry::Occupied(e) => e.into_mut(),
    Entry::Vacant(e) => {
        // The vacant entry still owns the key, so it can be borrowed to build
        // the value without reallocating the `String`.
        let value = query_my_server_for_whatever(e.key().as_str())?;
        e.insert(value)
    }
};
```
A common operation for a cache is to search for an element and insert it if it's missing before returning a reference to the element. This shouldn't require hashing the key ~~twice~~ three times.