For example: the large N&S sample monitoring network contains over 5000 timeseries. We can choose to:
send 1 request with page_size = 10000 to retrieve all timeseries at once
send 5 or 6 requests with page_size = 1000 to build all timeseries records
send > 50 requests with page_size = 100 to build all timeseries records
Test:
Run a test to see which option gives the best result.
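Before running the real test, the tradeoff can be sketched with a simple cost model: every request pays one network round trip, plus a per-record transfer cost. The latency and per-record numbers below are made-up placeholders, not measurements.

```python
import math

def estimated_fetch_time(total_records, page_size,
                         round_trip_ms=50.0, ms_per_record=0.02):
    """Rough model: each request costs one round trip; every record
    adds a fixed transfer/serialisation cost on top of that.
    The default numbers are illustrative assumptions, not measurements."""
    requests = math.ceil(total_records / page_size)
    return requests * round_trip_ms + total_records * ms_per_record

# Compare the three options for ~5000 timeseries records.
for page_size in (10_000, 1_000, 100):
    requests = math.ceil(5_000 / page_size)
    t = estimated_fetch_time(5_000, page_size)
    print(f"page_size={page_size:>6}: {requests:>2} request(s), ~{t:.0f} ms")
```

Under this model the round-trip count dominates once pages get small, which is why the single big request tends to win; the real test should confirm (or refute) that with actual backend response times.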
Worth to consider (Tom):
1 request with 5000 records needs only one round trip over the internet, while 50 separate requests need 50 round trips.
the backend currently sends back all the data for those records, but we only use the lat/long (and maybe the label). The backend could let us pass a list of required fields, which would make the response smaller.
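As a sketch of what that could look like on the client side, assuming the backend grows support for a `fields` query parameter (hypothetical; the parameter name, base URL, and field names below are placeholders):

```python
from urllib.parse import urlencode

def timeseries_url(base_url, page_size, fields):
    """Build a list URL that asks the backend for only the given fields.
    The `fields` parameter is a hypothetical API extension, not an
    existing backend feature."""
    params = {"page_size": page_size, "fields": ",".join(fields)}
    return f"{base_url}?{urlencode(params)}"

# Only request what the map actually renders: position plus label.
url = timeseries_url("https://api.example.com/timeseries/",
                     1000, ["uuid", "lat", "lon", "label"])
print(url)
```

Dropping the unused fields shrinks every record in the response, which compounds nicely with the page-size choice above.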
we will soon investigate vector tiles. I am not sure yet, but I think vector tiles solve this by showing fewer assets when zoomed out and more when zoomed in. Whether or not we end up using vector tiles, we could think about applying similar strategies to this problem.