Closed gdower closed 3 months ago
I will add a new `x-remove-cache-control` header to the API response which the backend can use to tell Varnish to remove the `Cache-Control` headers.
This allows us to cache the response in Varnish while avoiding any client-side caching. I will activate it for external datasets, but not for releases, which make up most of our load and are still allowed to be cached for a week in both Varnish and the browser.
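The header could be honoured in Varnish's `vcl_deliver`, which runs after the object has already been cached with the backend's TTL. A minimal sketch, assuming the header name above; the actual deployed VCL is not shown in this issue:

```vcl
sub vcl_deliver {
    # If the backend flagged this response, Varnish has already cached
    # the object using its Cache-Control/TTL. Strip the caching headers
    # here so the browser does not cache the response as well.
    if (resp.http.x-remove-cache-control) {
        unset resp.http.Cache-Control;
        unset resp.http.Expires;
        # Remove the internal flag so clients never see it.
        unset resp.http.x-remove-cache-control;
    }
}
```

Doing this in `vcl_deliver` rather than `vcl_backend_response` keeps the backend's `Cache-Control` available to Varnish for computing the object's TTL, while the client receives a response without any caching headers.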
I have tested this on dev and it works; deploying to prod now.
Description
If you browse the classification of an external dataset and then import a new version of the dataset with changes to the classification, the frontend seems to cache the backend requests and displays the old data in the classification. Even after a hard reload, the frontend still loads the cached requests. This makes it hard to develop datasets: you end up debugging issues that were actually already fixed in the new dataset version, or having to launch a totally new browser profile to force the frontend to load the new data.
To reproduce it, import this archive, then browse down the classification to Dictyosteliomycetes and note the Incertae sedis:
Then import this archive and do a hard reload. At least in Firefox, you should still see the same Incertae sedis name in the classification even though it was removed in the 2nd archive.
If you manually send the backend request, the Incertae sedis is gone, while the frontend continues to load it as in the screenshot above, so it seems to be a frontend issue rather than a Varnish cache server issue.
I see similar issues with the issues page, although there a hard reload usually does fetch the new issues report. It would be better if, after importing a new version of a dataset, the backend requests were not cached.
What browsers are you seeing the problem on?
Firefox
ChecklistBank URL
https://www.checklistbank.org/dataset/298076/browse
https://www.checklistbank.org/dataset/298076/issues