ajnisbet / opentopodata

Open alternative to the Google Elevation API!
https://www.opentopodata.org
MIT License

ERROR in api: Input is not a transformation #22

janusw closed this issue 3 years ago

janusw commented 3 years ago

So, I'm trying to add yet another dataset (non-public data), like this:

Unfortunately I only get errors in the end:

{
  "error": "Server error, please retry request.",
  "status": "SERVER_ERROR"
}

The other datasets on the server can be queried alright, only the new one throws this error, and I don't see what's wrong.

What is the best way to debug this? Is there a logfile somewhere that has further details? (Did not find any in the docker container.)

ajnisbet commented 3 years ago

How are you running the server? If you're using make run the logs are sent to stdout so should appear in your terminal.

If you're running in the background with make daemon or docker run -d then you can get the container id by running docker ps then get the logs from docker logs <container_id>.

I don't think either nginx or uwsgi are logging to file.

I'd be grateful if you could share the logs, it would be good to fix this bug!

janusw commented 3 years ago

> How are you running the server? If you're using make run the logs are sent to stdout so should appear in your terminal.
>
> If you're running in the background with make daemon or docker run -d then you can get the container id by running docker ps then get the logs from docker logs <container_id>.

Yeah, I run the server in the background, and indeed docker logs seems to have the relevant output.

> I'd be grateful if you could share the logs, it would be good to fix this bug!

The error message that I see is: [2020-12-11 16:50:42,885] ERROR in api: Input is not a transformation. Do you know what that means?

janusw commented 3 years ago

Sounds a bit like it is related to the CRS. FYI, I'm using:

  filename_epsg: 3044
  filename_tile_size: 20000
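For context, a tiled dataset like this is declared in opentopodata's config.yaml roughly as follows (the `name` and `path` values here are placeholders; `filename_epsg` and `filename_tile_size` are the option names from the snippet above):

```yaml
datasets:
- name: my-dataset        # placeholder name
  path: data/my-dataset/  # placeholder path to the tiles
  filename_epsg: 3044
  filename_tile_size: 20000
```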
ajnisbet commented 3 years ago

Yeah sounds like an issue transforming the lat,lon coords to the raster crs.

Could you run gdalsrsinfo raster_filename.tif on one of the tiles and the original .asc file?

Because you're converting them from .asc anyway you could add -a_srs epsg:3044 if they are all in the same UTM zone.
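As a sketch, that conversion could look like this (filenames are placeholders, and all tiles are assumed to be in EPSG:3044):

```shell
# Assign the CRS while converting each .asc tile to GeoTIFF.
# Filenames are placeholders; all tiles assumed to share EPSG:3044.
for f in *.asc; do
    gdal_translate -a_srs EPSG:3044 "$f" "${f%.asc}.tif"
done
```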

But it would be nice in any case if Open Topo Data gave a better error message.

janusw commented 3 years ago

> Yeah sounds like an issue transforming the lat,lon coords to the raster crs.
>
> Could you run gdalsrsinfo raster_filename.tif on one of the tiles and the original .asc file?

$ gdalsrsinfo dgm10_32920_5700_20.asc
ERROR 1: ERROR - failed to load SRS definition from dgm10_32920_5700_20.asc

$ gdalsrsinfo N5220000E580000.tif

PROJ.4 : ''

OGC WKT :
LOCAL_CS["unnamed",
    UNIT["unknown",1]]

> Because you're converting them from .asc anyway you could add -a_srs epsg:3044 if they are all in the same UTM zone.

Will try that.

> But it would be nice in any case if Open Topo Data gave a better error message.

True.

janusw commented 3 years ago

The dataset actually contains the following .prj file:

$ cat utm32s.prj
PROJCS["ETRS_1989_UTM_Zone_32N",GEOGCS["GCS_ETRS_1989",DATUM["D_ETRS_1989",SPHEROID["GRS_1980",6378137.0,298.257222101]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]],PROJECTION["Transverse_Mercator"],PARAMETER["False_Easting",500000.0],PARAMETER["False_Northing",0.0],PARAMETER["Central_Meridian",9.0],PARAMETER["Scale_Factor",0.9996],PARAMETER["Latitude_Of_Origin",0.0],UNIT["Meter",1.0]]

I hope that epsg:3044 is correct here.

janusw commented 3 years ago

> Because you're converting them from .asc anyway you could add -a_srs epsg:3044 if they are all in the same UTM zone.
>
> Will try that.

In fact this seems to do the trick!

Now I get:

$ gdalsrsinfo N5220000E580000.tif

PROJ.4 : '+proj=utm +zone=32 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs '

OGC WKT :
PROJCS["ETRS89 / UTM zone 32N (N-E)",
    GEOGCS["ETRS89",
        DATUM["European_Terrestrial_Reference_System_1989",
            SPHEROID["GRS 1980",6378137,298.257222101,
                AUTHORITY["EPSG","7019"]],
            TOWGS84[0,0,0,0,0,0,0],
            AUTHORITY["EPSG","6258"]],
        PRIMEM["Greenwich",0,
            AUTHORITY["EPSG","8901"]],
        UNIT["degree",0.0174532925199433,
            AUTHORITY["EPSG","9122"]],
        AUTHORITY["EPSG","4258"]],
    PROJECTION["Transverse_Mercator"],
    PARAMETER["latitude_of_origin",0],
    PARAMETER["central_meridian",9],
    PARAMETER["scale_factor",0.9996],
    PARAMETER["false_easting",500000],
    PARAMETER["false_northing",0],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]],
    AUTHORITY["EPSG","3044"]]

... and the elevation requests are successful now. Thanks a lot for your help here, @ajnisbet !

janusw commented 3 years ago

In hindsight, the problems reported here were mostly due to an incorrect setup of the data files (in particular, missing information about the projection / reference system).

The only things left to fix in the opentopodata project might be:

  1. Better error messages. IMHO both the server-side error (ERROR in api: Input is not a transformation) and the client-side error (Server error, please retry request) could be improved. In particular, 'retrying' on the client side will not help in this case. Maybe something like Server error: invalid data would be more suitable?
  2. Better documentation of the data setup and conversion process. I could contribute instructions for setting up the BKG data with a grid of 200m (publicly available, cf. #19) and a grid of 10m (commercially available, and in fact the one that I was struggling with here). @ajnisbet, would you be interested in including such instructions in your data documentation? And maybe add the 200m data to your public server?
ajnisbet commented 3 years ago

So the dataset has files dgm10_32920_5700_20.asc and utm32s.prj? For the .prj file to be recognised by gdal/opentopodata it needs to have the same filename as the raster. So a better fix would have been to rename the projection file to dgm10_32920_5700_20.prj. You can see that for the 200m dataset for example the files are named dgm200_tm32.prj and dgm200_tm32.asc.
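That renaming is easy to script across all the tiles. A minimal sketch (the tile names below are stand-ins for the real dataset):

```shell
# Give each raster tile its own copy of the shared projection file
# so GDAL picks it up. Tile names are stand-ins for the real data.
mkdir -p tiles
touch tiles/dgm10_32920_5700_20.asc tiles/dgm10_32940_5700_20.asc
printf 'PROJCS["ETRS_1989_UTM_Zone_32N",...]\n' > tiles/utm32s.prj
for f in tiles/*.asc; do
    cp tiles/utm32s.prj "${f%.asc}.prj"   # symlinks (ln -s) would work too
done
```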

As for your comments:

  1. Agreed. Added better error message for invalid tiled datasets, and improved wording of unhandled errors: https://github.com/ajnisbet/opentopodata/commit/d8227a431e4b031a15da1c71c4f3a95fa5f42502
  2. I want to make sure non-open datasets work on opentopodata, but don't want to have instructions for adding them. For the 200m, I'm curious why you used it instead of the higher-resolution eu-dem or srtm or aster? Do you find it to be more accurate?
janusw commented 3 years ago

> So the dataset has files dgm10_32920_5700_20.asc and utm32s.prj? For the .prj file to be recognised by gdal/opentopodata it needs to have the same filename as the raster. So a better fix would have been to rename the projection file to dgm10_32920_5700_20.prj.

Well, for this dataset there are 1025 asc files, which share a single prj file. In order to have matching names, I'd have to create 1025 copies of the prj file (or symlinks to it?). So that's not a particularly elegant solution.

Since I'm converting to tif anyway (for reasons of performance and storage size), using -a_srs in the conversion (as you proposed earlier) is a bit simpler actually.

As for your comments:

> 1. Agreed. Added better error message for invalid tiled datasets, and improved wording of unhandled errors: [d8227a4](https://github.com/ajnisbet/opentopodata/commit/d8227a431e4b031a15da1c71c4f3a95fa5f42502)

Great, thanks!

> 2. I want to make sure non-open datasets work on opentopodata, but don't want to have instructions for adding them. For the 200m, I'm curious why you used it instead of the higher-resolution eu-dem or srtm or aster? Do you find it to be more accurate?

My primary motivation for using the BKG-200m data was to have a precursor for going to BKG-10m, in order to check if and how it can be used with opentopodata (with publicly available data for reproducibility).

Regarding accuracy, I have not done a thorough analysis yet, but my first tests indicate that at least BKG-10m is somewhat more accurate than EU-DEM. It is usually a few meters below EU-DEM, and probably reflects the actual ground level a bit better (without obstacles on top).

BKG-200m is slightly different from BKG-10m, but probably still a bit more accurate than EU-DEM (at least vertically). Of course the inferior horizontal resolution can be a problem.