
Slow/long load times of district map. #337

Open
NXXR opened this issue Mar 4, 2024 · 3 comments
Comments

@NXXR
Collaborator

NXXR commented Mar 4, 2024

The district map takes a significant amount of time to load.
Poking around on the website, it seems the response from the backend API takes far too long to generate.
https://zam10063.zam.kfa-juelich.de/api/v1/rki/2024-03-02/?all&groups=total&compartments=MildInfections
was the request I tested; it fetches the infection data for a single day in order to fill the district map.
The request completed after 25-26 seconds, which is unreasonably long for roughly 100 KB of data.
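For reference, a minimal sketch (not part of the original report) that times the request above with Python's `requests` library, so the measurement can be repeated after backend changes:

```python
import time

import requests

# Times the request from the report and prints the response size and duration.
URL = (
    "https://zam10063.zam.kfa-juelich.de/api/v1/rki/2024-03-02/"
    "?all&groups=total&compartments=MildInfections"
)

start = time.perf_counter()
response = requests.get(URL, timeout=60)
elapsed = time.perf_counter() - start
print(f"status={response.status_code}, "
      f"{len(response.content) / 1024:.1f} KB in {elapsed:.1f} s")
```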

@annawendler you poked around in the backend a little; did you come across anything that would explain why compiling the response takes this much time?

I reckon either the request has a mistake somewhere in the query that causes misses and delays the response, or the database is poorly indexed.
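If it turns out to be an indexing problem, a composite index on the filter columns would be one option. A sketch only, assuming a PostgreSQL backend and a hypothetical `rki_cases` table with `date`, `group`, and `compartment` columns; the real schema may differ:

```python
import psycopg2

# Sketch only: table and column names are hypothetical, not taken from the issue.
conn = psycopg2.connect("dbname=esid")  # hypothetical connection string
with conn, conn.cursor() as cur:
    # A composite index matching the query's filter columns avoids a full
    # table scan if the table is currently unindexed on these columns.
    cur.execute(
        'CREATE INDEX IF NOT EXISTS idx_rki_cases_date_group_compartment '
        'ON rki_cases (date, "group", compartment)'
    )
```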

@NXXR
Collaborator Author

NXXR commented Mar 4, 2024

As far as I am aware, this issue isn't very old, so a recent change (this year?) may be the cause.
Since it seems to originate in the API, the switch to Vite won't fix this, but the switch to the new API might.

@annawendler
Contributor

When we upload new case data, new rows are added to the table in the database. Although we only access the new entries from the frontend, the old entries don't seem to get deleted. I think this could be the issue. Do you know if there is a way to delete old entries, either in the current or in the new backend?
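One possibility would be a post-upload cleanup that keeps only the newest row per key. A sketch only, assuming the same hypothetical `rki_cases` table plus an `updated_at` column recording when a row was inserted:

```python
import psycopg2

# Sketch only: table and column names are assumptions, not from the issue.
conn = psycopg2.connect("dbname=esid")  # hypothetical connection string
with conn, conn.cursor() as cur:
    # Keep only the most recently inserted row per (date, group, compartment).
    cur.execute(
        """
        DELETE FROM rki_cases a
        USING rki_cases b
        WHERE a.date = b.date
          AND a."group" = b."group"
          AND a.compartment = b.compartment
          AND a.updated_at < b.updated_at
        """
    )
```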

@NXXR
Collaborator Author

NXXR commented Mar 12, 2024

I know @JonasGilg has been manually cleaning up the backend every now and then.
If the new entries are duplicates, using an upsert should reuse the existing row and update its values, while still creating new rows for genuinely new data. This might help if we can simply overwrite the data (see the sketch below).
Other than that, we might need a pruning script that runs after your upload script and sorts out obsolete entries.
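Roughly what that upsert could look like, assuming PostgreSQL and a unique constraint on (date, group, compartment) in the hypothetical `rki_cases` table; none of these names are from the actual backend:

```python
import psycopg2

# Sketch only: schema names and the unique constraint are assumptions.
conn = psycopg2.connect("dbname=esid")  # hypothetical connection string
row = ("2024-03-02", "total", "MildInfections", 123.0)  # example values
with conn, conn.cursor() as cur:
    # Insert a new row, or update the existing one if the key already exists.
    cur.execute(
        """
        INSERT INTO rki_cases (date, "group", compartment, value)
        VALUES (%s, %s, %s, %s)
        ON CONFLICT (date, "group", compartment)
        DO UPDATE SET value = EXCLUDED.value
        """,
        row,
    )
```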
