geodesymiami / insarmaps


Why are all datasets being uploaded as high resolution? #105

Closed stackTom closed 1 month ago

stackTom commented 1 month ago

If you click on the actual size button, you get an error saying the dataset was uploaded as high resolution. This is happening for all recently uploaded datasets. It affects the density we pass to tippecanoe when creating the mbtiles: higher-density tiles are generated, which slows down on-the-fly recoloring because more points are rendered. Visually they look the same to our eyes when zoomed out, but more points are actually being rendered.

High resolution is enabled when either x_step or y_step is missing, see https://github.com/geodesymiami/insarmaps_scripts/blob/master/hdfeos5_2json_mbtiles.py#L276
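For reference, the check described above amounts to something like the following sketch (the real code lives at the linked line in `hdfeos5_2json_mbtiles.py`; the function name and attribute keys here are illustrative, not the script's exact names):

```python
# Hedged sketch: high-resolution mode is inferred from the dataset
# attributes -- it is enabled whenever x_step or y_step is absent.
def needs_high_res(attributes):
    """Return True when either geocoding step attribute is missing."""
    return attributes.get("x_step") is None or attributes.get("y_step") is None

# A radar-coordinate dataset without geocoding steps triggers high-res mode:
print(needs_high_res({}))                                 # True
print(needs_high_res({"x_step": 0.0008, "y_step": 0.0008}))  # False
```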

Is this on purpose? Just curious. Before, high-res mode was only used for certain datasets like the one in Miami Beach, where we needed all points to render once zoomed in far enough.

falkamelung commented 1 month ago

Yes, everything we did lately is full resolution. It is the right approach since all information is kept, which is important for Miami Beach. For the volcanoes we could use a grid for geocoding, but we would have to select a spacing in degrees.

I forget what happens for geocoded data. We may not need this anymore, but for now let's just keep it for simplicity.

The optimal spacing depends on latitude. I need to write a function to automatically select the optimal spacing, but as long as we don't have that, staying in radar coordinates is the safest. We will need equal spacing to infer the vertical and horizontal displacement, so eventually we may go back to geocoded data for most cases.
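One possible shape for the spacing helper mentioned above (a sketch, not existing code in insarmaps_scripts): convert a target ground spacing in meters into degree steps, where the longitude step widens with latitude because one degree of longitude spans roughly 111.32 km times cos(latitude).

```python
import math

def degree_steps(ground_spacing_m, latitude_deg):
    """Return (lat_step, lon_step) in degrees for a target ground spacing.

    One degree of latitude is ~111.32 km everywhere; one degree of
    longitude shrinks with cos(latitude), so the longitude step must
    grow to keep the ground spacing roughly equal in both directions.
    """
    meters_per_deg_lat = 111_320.0
    lat_step = ground_spacing_m / meters_per_deg_lat
    lon_step = ground_spacing_m / (meters_per_deg_lat * math.cos(math.radians(latitude_deg)))
    return lat_step, lon_step
```

At the equator the two steps are equal; at 60 degrees latitude the longitude step is twice the latitude step.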

stackTom commented 1 month ago

Maybe we should add a flag for high resolution instead of relying on whether x_step and y_step are encoded? We could then have datasets without x_step and y_step but still specify which ones need all their points rendered past a certain zoom level (high resolution, like Miami Beach) and which ones don't.

falkamelung commented 1 month ago

We could do this, but I don't understand why. It works well at the moment?

I ingested a geocoded dataset:

http://149.165.153.50/start/19.4043/-155.2520/9.7637?flyToDatasetCenter=false&startDataset=S1_IW12_087_0527_0531_20180902_XXXXXXXX_N18805_N20230_W156283_W154398&minScale=-8&maxScale=8

I don't see any benefit of the "actual size" and "reset size" buttons anymore. The Pixel Size button takes care of everything. Let's remove these buttons.

stackTom commented 1 month ago

The generated mbtiles for high-resolution datasets are denser at certain zoom levels. It doesn't affect how we perceive the dataset, but more points are rendered for a given zoom level, which slows down the on-the-fly recoloring.

So we should determine which datasets need the denser rendering, such as Miami Beach, and which ones don't. Using x_step and y_step is not a good idea, unless every dataset that is not geocoded requires the same density as the Miami Beach one...

falkamelung commented 1 month ago

In principle we want the high-res ability, like for Miami Beach, for all non-geocoded datasets. But I don't know how much we would lose. It seems to be a trade-off: high resolution versus recoloring speed. So we could add an option to json_*.py: --highest-res or --no-highest-res?
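A minimal sketch of how that option pair could look in the json_*.py argument parsing (flag names taken from the suggestion above; the three-state default is an assumption, chosen so that omitting both flags falls back to the existing x_step/y_step check):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--highest-res", dest="highest_res", action="store_true",
                    help="force dense (Miami Beach style) tiles")
parser.add_argument("--no-highest-res", dest="highest_res", action="store_false",
                    help="force the coarser default tile density")
# None means: neither flag given, fall back to the x_step/y_step check
parser.set_defaults(highest_res=None)

args = parser.parse_args(["--no-highest-res"])
print(args.highest_res)  # False
```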

stackTom commented 1 month ago

I don't know how different the speed is. I just looked at the generated mbtiles in detail and honestly, the difference looks negligible. If we need high-res mode for all these datasets, then so be it. I don't think we need to change anything in the ingest scripts etc.