An option to cache datasource reads
BenjyWiener opened 4 weeks ago
datasource reads are already cached, though I don't think it's documented... If you use the `--context` flag to load datasources, they're read once at startup and always held in memory. If you use `--datasource` and then reference it with the `datasource`/`ds` function, the data is cached at first read.
Thanks for the reply.
Looking at the source code, I see that the string content of the datasource is cached, but not the parsed result. I'm working with a large JSON file, so re-parsing it on every read results in a ~100x slowdown compared to using `--context`.
@BenjyWiener Ah, is using context an option? I usually recommend that anyway, given the simpler syntax.
Reopening since there may be a possible feature request here...
I'm currently using context, but I use a lot of nested/recursive templates, which requires a lot of `{{ template "..." (dict "current_item" . "top_level" $.top_level) }}`.

I have a large JSON datasource that's used in a nested template, which is executed many times in a loop. I currently load the datasource at the top level and use `dict` to pass the value along with the actual context to the nested template. I would like to be able to load the datasource from the nested template where it's used, but the runtime impact of reading and deserializing the JSON file on each iteration is huge (50-100x slower in my case). An option to cache datasource reads would solve this.
Possible mechanisms:

- A `cache` option in the config file and/or as a query parameter
- A new function that caches datasource reads (`datasourceCached`)
- A more general set of functions that allow writing to and reading from some global "state"