Serving cached tag autocomplete results is probably not a good idea due to how quickly tags can be added/changed/removed. Other bots that have this feature also do not cache their results, so tag autocomplete results will most likely keep being fetched from the database.
Item autocomplete results will definitely be cached, however.
I'm also considering caching kona tag results, if it's feasible. These typically don't get created or deleted, so they're good candidates for caching. An LRU cache seems ideal for all of the cached autocomplete results.
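A rough sketch of how that split could look, using Python's `functools.lru_cache`; the table name, sample data, and function names below are placeholders rather than the bot's actual schema:

```python
from functools import lru_cache
import sqlite3

# In-memory stand-in for the bot's real database (hypothetical schema/data).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tags (name TEXT)")
db.executemany("INSERT INTO tags VALUES (?)", [("forest",), ("fox",), ("frog",)])

ITEMS = ("apple pie", "apple juice", "banana bread")         # rarely changes
KONA_TAGS = ("landscape", "long_hair", "looking_at_viewer")  # effectively static


def autocomplete_tags(prefix: str) -> list[str]:
    """Tags get added/changed/removed too often to cache, so always hit the database."""
    rows = db.execute("SELECT name FROM tags WHERE name LIKE ?", (prefix + "%",))
    return [name for (name,) in rows]


@lru_cache(maxsize=1024)
def autocomplete_items(prefix: str) -> tuple[str, ...]:
    """Item names rarely change, so repeated prefixes are answered from memory."""
    return tuple(name for name in ITEMS if name.startswith(prefix))


@lru_cache(maxsize=1024)
def autocomplete_kona_tags(prefix: str) -> tuple[str, ...]:
    """Kona tags are effectively static, so they get the same LRU treatment."""
    return tuple(name for name in KONA_TAGS if name.startswith(prefix))
```

With the cache keyed on the typed prefix, only the first lookup of each prefix hits the underlying data, and `maxsize` keeps memory use bounded.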
Settings will be cached later, once they are actually implemented.
Is your feature request related to a problem?
The autocomplete function gets called every single time the user types a letter, so querying the database on every keystroke will get pretty slow.
Describe the solution you'd like
You'll want to aggressively cache results so you don't have to query every single time.
We want to cache the types of autocomplete results whose underlying data, generally speaking, never changes at run time. Since that data is effectively static, it is safe to cache it aggressively.
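A minimal sketch of the idea, assuming the data can be loaded once and then filtered in memory on every keystroke (the items table and query here are made up for illustration):

```python
import sqlite3
from functools import lru_cache

db = sqlite3.connect(":memory:")  # stand-in for the bot's real database
db.execute("CREATE TABLE items (name TEXT)")
db.executemany("INSERT INTO items VALUES (?)", [("sword",), ("shield",), ("shoes",)])


@lru_cache(maxsize=1)
def all_item_names() -> tuple[str, ...]:
    """Query the database once; the data never changes at run time."""
    return tuple(name for (name,) in db.execute("SELECT name FROM items"))


def autocomplete_items(current: str) -> list[str]:
    """Runs on every keystroke but never touches the database after the first call."""
    current = current.lower()
    # Cap the suggestion list so it stays short.
    return [name for name in all_item_names() if name.startswith(current)][:25]
```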
Additional Context
There are only two hard problems in computer science: cache invalidation and naming things.