I thought this would be a much easier method of communicating than over GitHub issues; it just makes things faster
@Jonas Schievink In regards to the
lasso crate, would a hard cap on the memory used by string interning be something you'd be interested in? If so, how accurate would you like it to be? There are varying levels of accuracy it can reach (trading implementation time and speed, of course)
`(size_of::<Key>() * self.map.len() * 2) + (size_of::<&str>() * self.map.len() * 2) + memory_in_arena` (about as close as it's possible to get to being perfectly accurate, though it does involve much more complexity)
I'm not sure, what would happen if the limit is reached?
(we wouldn't really need a terribly good accuracy here fwiw)
Well, it'd either make the
.get_or_intern call return a
None or it'd change all the functions that currently return
None to start returning
Result<T, LassoError> with a variant for
OutOfKey. That's, in my opinion, the more elegant solution, since the user always has control. Panicking when the limit is hit could also be an option, albeit a bad one. As for handling such OOM errors, that's largely up to the user of the library; if there's enough need for it, some method of inspecting the interner and removing unneeded strings could probably be implemented to let users cull whatever strings they don't want
I think how the error would be handled by rust-analyzer is the most difficult part here. We cannot garbage-collect strings since we don't know which ones are still referenced, so that limits our options for handling allocation failure due to the limit too. I think we'd have to delete the entire database, since we don't know which queries contain
Names, at least not in a way that can be machine-checked.
Perhaps rust-analyzer would work better with reference-counted interned strings such as those provided by Servo's
string_cache crate. I briefly tried that out in https://github.com/rust-analyzer/rust-analyzer/pull/5411.