vdeparday opened this issue 6 years ago
This should be solved with a reharvest of data - I updated the metadata to reference FATHOM.
@tonio, could you do a pull of all the data?
While this is being done, I would like a dataset removed from the ones ThinkHazard pulls: hazard set ID 'CF-GLOBAL-GAR15'. We derived a set of higher-resolution regional datasets from it, but this global dataset overrides their use. I've changed its name in GeoNode and removed the hazard set ID from those layers, so ThinkHazard should not pick it up on a re-harvest, but as far as I understand, a dataset needs to be specifically removed for it not to remain in ThinkHazard.
Thank you
While harvesting https://www.geonode-gfdrrlab.org/api/documents/1072/?username=xxx&api_key=yyy @tonio got an error message "Sorry, this request could not be processed. Please try again later."
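For reference, the failing request is a plain GET against the GeoNode documents API with the credentials passed as query parameters. A minimal sketch of how that URL is built (the `username`/`api_key` values are placeholders, as in the issue):

```python
from urllib.parse import urlencode

# Build the GeoNode documents API URL that the harvester requests.
# Credential values are placeholders ("xxx"/"yyy"), as quoted in the issue.
base = "https://www.geonode-gfdrrlab.org/api/documents/1072/"
query = urlencode({"username": "xxx", "api_key": "yyy"})
url = f"{base}?{query}"
print(url)
# A GET on this URL returned the "Sorry, this request could not be
# processed. Please try again later." error page instead of JSON.
```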
The django container logs the following:
21/09/2018 09:14:10 Internal Server Error: /api/documents/723/
21/09/2018 09:14:10Traceback (most recent call last):
21/09/2018 09:14:10 File "/usr/local/lib/python2.7/dist-packages/tastypie/resources.py", line 221, in wrapper
21/09/2018 09:14:10 response = callback(request, *args, **kwargs)
21/09/2018 09:14:10 File "/usr/local/lib/python2.7/dist-packages/tastypie/resources.py", line 470, in dispatch_detail
21/09/2018 09:14:10 return self.dispatch('detail', request, **kwargs)
21/09/2018 09:14:10 File "/usr/local/lib/python2.7/dist-packages/tastypie/resources.py", line 493, in dispatch
21/09/2018 09:14:10 response = method(request, **kwargs)
21/09/2018 09:14:10 File "/usr/local/lib/python2.7/dist-packages/tastypie/resources.py", line 1376, in get_detail
21/09/2018 09:14:10 obj = self.cached_obj_get(bundle=basic_bundle, **self.remove_api_resource_names(kwargs))
21/09/2018 09:14:10 File "/usr/local/lib/python2.7/dist-packages/tastypie/resources.py", line 1195, in cached_obj_get
21/09/2018 09:14:10 cached_bundle = self.obj_get(bundle=bundle, **kwargs)
21/09/2018 09:14:10 File "/usr/local/lib/python2.7/dist-packages/tastypie/resources.py", line 2176, in obj_get
21/09/2018 09:14:10 applicable_filters = self.build_filters(filters=kwargs, ignore_bad_filters=True)
21/09/2018 09:14:10TypeError: build_filters() got an unexpected keyword argument 'ignore_bad_filters'
We're investigating the issue
Witnessed with build camptocamp/geonode_django:v2-20171123173713
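The `TypeError` above looks like a version mismatch inside tastypie: the calling code passes `ignore_bad_filters`, while the installed `build_filters` has an older signature that does not accept that keyword. A minimal sketch of the failure mode (not the actual tastypie source; the class and signature here are assumptions for illustration):

```python
# Sketch of the signature mismatch behind the traceback: the call site
# passes ignore_bad_filters, but the installed build_filters (older
# signature, assumed) does not accept that keyword argument.

class OldStyleResource:
    def build_filters(self, filters=None):  # older signature (assumed)
        return filters or {}

resource = OldStyleResource()
try:
    # Newer resource code calls build_filters with the extra keyword:
    resource.build_filters(filters={}, ignore_bad_filters=True)
except TypeError as exc:
    # Message names the unexpected keyword, as in the logged traceback.
    print(exc)
```

If that is the cause, aligning the tastypie version with the GeoNode code it serves (rather than patching call sites) would be the usual fix.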
Does this mean there was no harvesting done since 2017-11-23? Sounds weird.
That could well be the last harvest, as we've not added new data in the meantime
Urban Flood source: https://www.geonode-gfdrrlab.org/layers/hazard:adm2_uu_v5_4bit
River Flood source: https://www.geonode-gfdrrlab.org/layers/hazard:adm2_fu_v3_4bit
Both are private in GeoNode (correct). Both are not visible in TH; viewing them in the admin pages shows no layer found. They should be used in TH.
However, the layer used instead is UF-GLOBAL-SSBN, which is the same raw layer but has an incorrect name and does not show the data owner correctly (the data owner name is given as fathomglobal in the metadata).
We should only have one of these layers referenced by TH, and it should be displayed with UF-GLOBAL-FATHOM data set name.
Preference is to remove the four layers from TH entirely and reinstate only the two UF/FL-GLOBAL-FATHOM layers.
Now River Flood is based on Fathom 2019; Urban Flood is still SSBN 2016, because the new Fathom data does not include urban flood.
When trying to show the data source, it says "Contact None". Even though the data is not free for download, a clear source and metadata for the dataset should still be provided.