Open NXXR opened 4 months ago
As far as I am aware this issue isn't too old, so something recent (this year?) may be the cause. Since it seems to originate in the API, the switch to Vite won't fix this, but the switch to the new API might.
When we upload new case data, new rows are added to the database table. Although the frontend only accesses the new entries, the old entries don't seem to get deleted, and I think this accumulation could be the issue. Do you know if there is a way to delete old entries, either in the current or the new backend?
I know @JonasGilg has been manually cleaning up the backend every now and then.
If the new entries are duplicates, using an upsert should reuse the existing row and update its values, while still creating new rows for genuinely new data. This might help if we can just overwrite the data.
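A minimal sketch of the upsert idea, using SQLite syntax for illustration — the real backend's DBMS may differ, and the table/column names (`case_data`, `date`, `district`, `value`) are assumptions, not the actual schema:

```python
import sqlite3

# Illustrative in-memory DB; the (date, district) primary key is what lets
# ON CONFLICT decide between "update existing" and "insert new".
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE case_data (
        date TEXT,
        district TEXT,
        value INTEGER,
        PRIMARY KEY (date, district)
    )
""")

def upsert(rows):
    # Existing (date, district) pairs get their value overwritten;
    # genuinely new pairs still create new rows.
    conn.executemany("""
        INSERT INTO case_data (date, district, value)
        VALUES (?, ?, ?)
        ON CONFLICT (date, district) DO UPDATE SET value = excluded.value
    """, rows)

upsert([("2024-03-02", "Berlin", 10)])                               # initial upload
upsert([("2024-03-02", "Berlin", 12), ("2024-03-02", "Hamburg", 5)]) # re-upload
print(conn.execute("SELECT COUNT(*) FROM case_data").fetchone()[0])  # 2, no duplicates
```

With this approach, repeated uploads for the same day would stop growing the table at all, so no separate cleanup would be needed for duplicates.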
Other than this, we might need a pruning script that runs after your upload script and sorts out obsolete entries in the DB.
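If upserts aren't an option, a pruning step could look roughly like this — again just a sketch against a hypothetical schema, assuming each upload appends duplicate rows per `(date, district)` and the newest row is the one to keep:

```python
import sqlite3

# Hypothetical schema: uploads append rows instead of replacing them,
# so older rows for the same (date, district) become obsolete.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE case_data (date TEXT, district TEXT, value INTEGER)")
conn.executemany("INSERT INTO case_data VALUES (?, ?, ?)", [
    ("2024-03-02", "Berlin", 10),   # stale upload
    ("2024-03-02", "Berlin", 12),   # latest upload
    ("2024-03-02", "Hamburg", 5),
])

def prune(conn):
    # Keep only the newest row (highest rowid) per (date, district),
    # deleting every older duplicate in one statement.
    conn.execute("""
        DELETE FROM case_data
        WHERE rowid NOT IN (
            SELECT MAX(rowid) FROM case_data GROUP BY date, district
        )
    """)

prune(conn)
print(conn.execute("SELECT COUNT(*) FROM case_data").fetchone()[0])  # 2
```

Something like this could replace the manual cleanup and just run as a cron job or as the last step of the upload script.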
The district map takes a significant time to load. Poking around on the website, the response from the backend API seems to take far too long to generate. https://zam10063.zam.kfa-juelich.de/api/v1/rki/2024-03-02/?all&groups=total&compartments=MildInfections was the request I tested; it fetches the infection data for a single day to fill the district map. The request completed after 25-26 seconds, which is unreasonably long for roughly 100 KB of data.
@annawendler, you poked around in the backend a little — did you come across anything that would explain why compiling the response takes this much time?
I reckon either the query has a mistake somewhere that forces unnecessary work on each request, or the database is badly indexed and the date filter ends up scanning full tables.
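The indexing hypothesis is easy to check: most databases can show whether a query hits an index or falls back to a full scan. A sketch using SQLite's `EXPLAIN QUERY PLAN` (the real backend may use a different DBMS with an equivalent `EXPLAIN`; table and column names are assumptions):

```python
import sqlite3

# Hypothetical table mirroring the per-day request above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE case_data (date TEXT, compartment TEXT, value REAL)")

query = "SELECT * FROM case_data WHERE date = '2024-03-02'"

# Without an index on date, the planner reports a full table scan,
# which would explain multi-second responses on a large table.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(before)  # e.g. detail column says "SCAN case_data"

# After adding an index, the same query becomes an index search.
conn.execute("CREATE INDEX idx_case_date ON case_data (date)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(after)   # e.g. "SEARCH case_data USING INDEX idx_case_date (date=?)"
```

Running the backend's actual query through its planner would tell us quickly whether a missing index on the date (and maybe compartment/group) columns is the culprit.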