Closed ceesem closed 2 years ago
Because I know that having to roll your own solution every single time is a good way to ensure something never gets used, I added a small function, tools.caching.CachedClient,
that generates a client using the type of cache I just mentioned.
As a bonus, this saves ~0.6 seconds on subsequent initializations!
Small fix to allow info_cache to be passed to the caveclient.
The idea here is that if one is working programmatically, with many CAVE clients being initialized all the time, the info cache could itself be cached (e.g. in an LRU cache keyed by datastack and server name, with a one-hour expiration time). This avoids an extra lookup round trip, and an extra hit on the info service to fetch the same dict, on every caveclient initialization.
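A minimal sketch of that caching pattern, using only the standard library: an `lru_cache` keyed by datastack and server name, plus a coarse time bucket so entries expire after roughly an hour. The `fetch_info` function here is a hypothetical stand-in for the real info-service lookup, not the caveclient API.

```python
import time
from functools import lru_cache


def fetch_info(datastack: str, server: str) -> dict:
    # Hypothetical placeholder for the real info-service lookup;
    # in practice this would be a network round trip.
    return {"datastack": datastack, "server": server}


@lru_cache(maxsize=128)
def _cached_info(datastack: str, server: str, ttl_bucket: int) -> dict:
    # ttl_bucket increments once per expiration window, so stale
    # entries simply miss the cache and trigger a fresh lookup.
    return fetch_info(datastack, server)


def get_info(datastack: str, server: str, ttl_seconds: int = 3600) -> dict:
    return _cached_info(datastack, server, int(time.time() // ttl_seconds))
```

Within one expiration window, repeated calls for the same datastack and server return the cached dict without touching the info service; after the window rolls over, the next call refreshes it.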