For CUSTOM I also get strange behavior:
# Imports needed to run this snippet (open_cube as in xcube_sh/cube.py in the
# tracebacks below; CubeConfig is assumed to come from xcube_sh.config):
from xcube_sh.config import CubeConfig
from xcube_sh.cube import open_cube

x1 = 1545577 # meters
y1 = 5761986 # meters
x2 = 1705367 # meters
y2 = 5857046 # meters
bbox = x1, y1, x2, y2
width = 512
spatial_res = (x2 - x1) / width
height = max(1, round((y2 - y1) / spatial_res))
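# With the values above: spatial_res = (1705367 - 1545577) / 512 ≈ 312.09 m,
# and height = round((5857046 - 5761986) / 312.09) = 305 pixels, i.e. the
# (305, 512) shape used for plotting further below.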
cube_config = CubeConfig(dataset_name='CUSTOM',
                         band_names=['RED', 'GREEN', 'BLUE'],
                         tile_size=[width, height],
                         crs='http://www.opengis.net/def/crs/EPSG/0/3857',
                         bbox=bbox,
                         time_range=['2018-01-01', '2019-01-01'],
                         time_period='7d',
                         spatial_res=spatial_res,
                         band_sample_types='UINT8',
                         collection_id='1a3ab057-3c51-447c-9f85-27d4b633b3f5')
cube = open_cube(cube_config)
cube
results in:
whereas when I leave out the time_period param:
cube_config = CubeConfig(dataset_name='CUSTOM',
                         band_names=['RED', 'GREEN', 'BLUE'],
                         tile_size=[width, height],
                         crs='http://www.opengis.net/def/crs/EPSG/0/3857',
                         bbox=bbox,
                         time_range=['2018-01-01', '2019-01-01'],
                         spatial_res=spatial_res,
                         band_sample_types='UINT8',
                         collection_id='1a3ab057-3c51-447c-9f85-27d4b633b3f5')
cube = open_cube(cube_config)
cube
I know that there is more than one time slice with values in that custom dataset, so omitting the time_period param returns a faulty cube.
Thanks @AliceBalfanz for testing.
Omitting time_period causes xcube-sh to do a SH Catalogue query to retrieve the individual observation times.
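For context, that query is essentially a STAC-style search against the SH Catalogue. A minimal sketch of what it amounts to (not xcube-sh's actual code; the token handling and the byoc-&lt;id&gt; collection naming are assumptions, the endpoint is the one from the 404 above):

import requests

# Assumption: a valid access token from the usual SH OAuth2 client-credentials flow.
access_token = "<your-access-token>"

response = requests.post(
    "https://services.sentinel-hub.com/api/v1/catalog/search",
    headers={"Authorization": f"Bearer {access_token}"},
    json={
        "collections": ["byoc-1a3ab057-3c51-447c-9f85-27d4b633b3f5"],
        "datetime": "2018-01-01T00:00:00Z/2019-01-01T00:00:00Z",
        "limit": 100,
    },
)
response.raise_for_status()
observation_times = [feature["properties"]["datetime"]
                     for feature in response.json().get("features", [])]
print(observation_times)  # empty if the collection has no time info in the Catalogue

If such a search returns no features (or no datetime properties), there is nothing to build the cube's time axis from, which matches the "Could not determine any valid time stamps" error further down.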
It seems the BYOC you are using has no associated time information in the Catalogue. I observed the same for my test data.
FYI @maximlamare
EDIT
Just saw that I used the same dataset for testing, namely 1a3ab057-3c51-447c-9f85-27d4b633b3f5, and maybe its catalogue entry is not correct.
SentinelHubError: 404 Client Error: Not Found for url: https://services.sentinel-hub.com/api/v1/catalog/search
This either means the dataset "DEM" has no entry in the SH Catalogue, or xcube-sh retrieves it incorrectly. I have no idea how to fix that; I believe we are lacking some detailed information about the Catalogue and dataset naming here.
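One way to narrow that down could be to ask the Catalogue what it actually exposes. A minimal sketch, assuming the Catalogue offers the standard STAC /collections listing next to the /search endpoint from the error above (token handling again assumed):

import requests

access_token = "<your-access-token>"  # assumed to come from the SH OAuth2 flow

# List the collections the SH Catalogue knows about and check whether
# the dataset in question ("DEM", or the byoc-<uuid> collection) shows up.
resp = requests.get(
    "https://services.sentinel-hub.com/api/v1/catalog/collections",
    headers={"Authorization": f"Bearer {access_token}"},
)
resp.raise_for_status()
collection_ids = [c["id"] for c in resp.json().get("collections", [])]
print(collection_ids)
print("DEM listed?", any("dem" in cid.lower() for cid in collection_ids))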
@forman I am communicating internally and will get back to you quickly.
https://github.com/dcs4cop/xcube-sh/pull/74#issuecomment-945888675
When I test it with the collection id newly provided by Anja, 3cca5a44-8f9b-4f58-bfef-39404307c74f, leaving out the time_period param leads to:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
/tmp/ipykernel_59980/1823333892.py in <module>
----> 1 cube = open_cube(cube_config)
2 cube
~/Desktop/projects/xcube-sh/xcube_sh/cube.py in open_cube(cube_config, observer, trace_store_calls, max_cache_size, sentinel_hub, **sh_kwargs)
54 elif sh_kwargs:
55 raise ValueError(f'unexpected keyword-arguments: {", ".join(sh_kwargs.keys())}')
---> 56 cube_store = SentinelHubChunkStore(sentinel_hub, cube_config, observer=observer,
57 trace_store_calls=trace_store_calls)
58 if max_cache_size:
~/Desktop/projects/xcube-sh/xcube_sh/chunkstore.py in __init__(self, sentinel_hub, cube_config, observer, trace_store_calls)
557 d['band_names'] = sentinel_hub.band_names(cube_config.dataset_name)
558 cube_config = CubeConfig.from_dict(d)
--> 559 super().__init__(cube_config,
560 observer=observer,
561 trace_store_calls=trace_store_calls)
~/Desktop/projects/xcube-sh/xcube_sh/chunkstore.py in __init__(self, cube_config, observer, trace_store_calls)
92
93 if not self._time_ranges:
---> 94 raise ValueError('Could not determine any valid time stamps')
95
96 width, height = self._cube_config.size
ValueError: Could not determine any valid time stamps
Now I get the same behavior as described here: https://github.com/dcs4cop/xcube-sh/pull/74#issuecomment-945888675
I have changed the collection ID to 3cca5a44-8f9b-4f58-bfef-39404307c74f, so it is not the same as in the code snippets in the link (because of this: https://github.com/dcs4cop/xcube-sh/pull/74#issuecomment-953001471).
Same for collection_id = 'byoc-538fdf6b-d391-4ea8-bba7-47f90026d5e2'.
However, there I even get a KeyError when I try to plot the 7d cube:
collection_id = 'byoc-538fdf6b-d391-4ea8-bba7-47f90026d5e2'
cube_config = CubeConfig(dataset_name='CUSTOM',
                         band_names=['RED', 'GREEN', 'BLUE'],
                         tile_size=[width, height],
                         crs='http://www.opengis.net/def/crs/EPSG/0/3857',
                         bbox=bbox,
                         time_range=['2018-01-01', '2019-01-01'],
                         time_period='7d',
                         spatial_res=spatial_res,
                         band_sample_types='UINT8',
                         collection_id=collection_id)
cube = open_cube(cube_config)
cube
import numpy as np
import xarray as xr  # needed for xr.DataArray below

cubew = cube.isel(time=22)
# (height, width, 3) RGB buffer; 305 x 512 matches the cube's spatial size
rgb_data = np.zeros((305, 512, 3), 'uint8')
rgb_data[:, :, 0] = cubew.RED.values
rgb_data[:, :, 1] = cubew.GREEN.values
rgb_data[:, :, 2] = cubew.BLUE.values
rgb_array = xr.DataArray(rgb_data, dims=('y', 'x', 'b'), coords=dict(x=cube.RED.x, y=cube.RED.y))
rgb_array.plot.imshow(rgb='b', figsize=(16, 10))
Traceback:
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/zarr/storage.py in __getitem__(self, key)
2162 with self._mutex:
-> 2163 value = self._values_cache[key]
2164 # cache hit if no KeyError is raised
KeyError: 'RED/22.0.0'
During handling of the above exception, another exception occurred:
HTTPError Traceback (most recent call last)
~/Desktop/projects/xcube-sh/xcube_sh/sentinelhub.py in maybe_raise_for_response(cls, response)
558 try:
--> 559 response.raise_for_status()
560 except requests.HTTPError as e:
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/requests/models.py in raise_for_status(self)
952 if http_error_msg:
--> 953 raise HTTPError(http_error_msg, response=self)
954
HTTPError: 400 Client Error: Bad Request for url: https://services.sentinel-hub.com/api/v1/process
The above exception was the direct cause of the following exception:
SentinelHubError Traceback (most recent call last)
/tmp/ipykernel_7498/90259528.py in <module>
1 rgb_data = np.zeros((305, 512, 3), 'uint8')
----> 2 rgb_data[:, :, 0] = cubew.RED.values
3 rgb_data[:, :, 1] = cubew.GREEN.values
4 rgb_data[:, :, 2] = cubew.BLUE.values
5
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/xarray/core/dataarray.py in values(self)
644 type does not support coercion like this (e.g. cupy).
645 """
--> 646 return self.variable.values
647
648 @values.setter
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/xarray/core/variable.py in values(self)
517 def values(self):
518 """The variable's data as a numpy.ndarray"""
--> 519 return _as_array_or_item(self._data)
520
521 @values.setter
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/xarray/core/variable.py in _as_array_or_item(data)
257 TODO: remove this (replace with np.asarray) once these issues are fixed
258 """
--> 259 data = np.asarray(data)
260 if data.ndim == 0:
261 if data.dtype.kind == "M":
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/numpy/core/_asarray.py in asarray(a, dtype, order, like)
100 return _asarray_with_like(a, dtype=dtype, order=order, like=like)
101
--> 102 return array(a, dtype, copy=False, order=order)
103
104
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/dask/array/core.py in __array__(self, dtype, **kwargs)
1532
1533 def __array__(self, dtype=None, **kwargs):
-> 1534 x = self.compute()
1535 if dtype and x.dtype != dtype:
1536 x = x.astype(dtype)
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/dask/base.py in compute(self, **kwargs)
286 dask.base.compute
287 """
--> 288 (result,) = compute(self, traverse=False, **kwargs)
289 return result
290
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/dask/base.py in compute(*args, **kwargs)
568 postcomputes.append(x.__dask_postcompute__())
569
--> 570 results = schedule(dsk, keys, **kwargs)
571 return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])
572
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/dask/threaded.py in get(dsk, result, cache, num_workers, pool, **kwargs)
77 pool = MultiprocessingPoolExecutor(pool)
78
---> 79 results = get_async(
80 pool.submit,
81 pool._max_workers,
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/dask/local.py in get_async(submit, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, chunksize, **kwargs)
505 _execute_task(task, data) # Re-execute locally
506 else:
--> 507 raise_exception(exc, tb)
508 res, worker_id = loads(res_info)
509 state["cache"][key] = res
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/dask/local.py in reraise(exc, tb)
313 if exc.__traceback__ is not tb:
314 raise exc.with_traceback(tb)
--> 315 raise exc
316
317
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/dask/local.py in execute_task(key, task_info, dumps, loads, get_id, pack_exception)
218 try:
219 task, data = loads(task_info)
--> 220 result = _execute_task(task, data)
221 id = get_id()
222 result = dumps((result, id))
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/dask/core.py in _execute_task(arg, cache, dsk)
117 # temporaries by their reference count and can execute certain
118 # operations in-place.
--> 119 return func(*(_execute_task(a, cache) for a in args))
120 elif not ishashable(arg):
121 return arg
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/dask/array/core.py in getter(a, b, asarray, lock)
107 # `is_arraylike` evaluates to `True` in that case.
108 if asarray and (not is_arraylike(c) or isinstance(c, np.matrix)):
--> 109 c = np.asarray(c)
110 finally:
111 if lock:
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/numpy/core/_asarray.py in asarray(a, dtype, order, like)
100 return _asarray_with_like(a, dtype=dtype, order=order, like=like)
101
--> 102 return array(a, dtype, copy=False, order=order)
103
104
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/xarray/core/indexing.py in __array__(self, dtype)
355
356 def __array__(self, dtype=None):
--> 357 return np.asarray(self.array, dtype=dtype)
358
359 def __getitem__(self, key):
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/numpy/core/_asarray.py in asarray(a, dtype, order, like)
100 return _asarray_with_like(a, dtype=dtype, order=order, like=like)
101
--> 102 return array(a, dtype, copy=False, order=order)
103
104
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/xarray/core/indexing.py in __array__(self, dtype)
519
520 def __array__(self, dtype=None):
--> 521 return np.asarray(self.array, dtype=dtype)
522
523 def __getitem__(self, key):
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/numpy/core/_asarray.py in asarray(a, dtype, order, like)
100 return _asarray_with_like(a, dtype=dtype, order=order, like=like)
101
--> 102 return array(a, dtype, copy=False, order=order)
103
104
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/xarray/core/indexing.py in __array__(self, dtype)
420 def __array__(self, dtype=None):
421 array = as_indexable(self.array)
--> 422 return np.asarray(array[self.key], dtype=None)
423
424 def transpose(self, order):
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/xarray/backends/zarr.py in __getitem__(self, key)
71 array = self.get_array()
72 if isinstance(key, indexing.BasicIndexer):
---> 73 return array[key.tuple]
74 elif isinstance(key, indexing.VectorizedIndexer):
75 return array.vindex[
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/zarr/core.py in __getitem__(self, selection)
660
661 fields, selection = pop_fields(selection)
--> 662 return self.get_basic_selection(selection, fields=fields)
663
664 def get_basic_selection(self, selection=Ellipsis, out=None, fields=None):
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/zarr/core.py in get_basic_selection(self, selection, out, fields)
785 fields=fields)
786 else:
--> 787 return self._get_basic_selection_nd(selection=selection, out=out,
788 fields=fields)
789
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/zarr/core.py in _get_basic_selection_nd(self, selection, out, fields)
828 indexer = BasicIndexer(selection, self)
829
--> 830 return self._get_selection(indexer=indexer, out=out, fields=fields)
831
832 def get_orthogonal_selection(self, selection, out=None, fields=None):
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/zarr/core.py in _get_selection(self, indexer, out, fields)
1118
1119 # load chunk selection into output array
-> 1120 self._chunk_getitem(chunk_coords, chunk_selection, out, out_selection,
1121 drop_axes=indexer.drop_axes, fields=fields)
1122 else:
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/zarr/core.py in _chunk_getitem(self, chunk_coords, chunk_selection, out, out_selection, drop_axes, fields)
1788 try:
1789 # obtain compressed data for chunk
-> 1790 cdata = self.chunk_store[ckey]
1791
1792 except KeyError:
~/miniconda3/envs/xcube-dev/lib/python3.9/site-packages/zarr/storage.py in __getitem__(self, key)
2169 except KeyError:
2170 # cache miss, retrieve value from the store
-> 2171 value = self._store[key]
2172 with self._mutex:
2173 self.misses += 1
~/Desktop/projects/xcube-sh/xcube_sh/chunkstore.py in __getitem__(self, key)
503 value = self._vfs[key]
504 if isinstance(value, tuple):
--> 505 return self._fetch_chunk(key, *value)
506 return value
507
~/Desktop/projects/xcube-sh/xcube_sh/chunkstore.py in _fetch_chunk(self, key, band_name, chunk_index)
414
415 if exception:
--> 416 raise exception
417
418 return chunk_data
~/Desktop/projects/xcube-sh/xcube_sh/chunkstore.py in _fetch_chunk(self, key, band_name, chunk_index)
395 try:
396 exception = None
--> 397 chunk_data = self.fetch_chunk(key,
398 band_name,
399 chunk_index,
~/Desktop/projects/xcube-sh/xcube_sh/chunkstore.py in fetch_chunk(self, key, band_name, chunk_index, bbox, time_range)
698 )
699
--> 700 response = self._sentinel_hub.get_data(
701 request,
702 mime_type='application/octet-stream'
~/Desktop/projects/xcube-sh/xcube_sh/sentinelhub.py in get_data(self, request, mime_type)
374
375 if self.error_policy == 'fail':
--> 376 SentinelHubError.maybe_raise_for_response(response)
377 elif self.error_policy == 'warn' and self.enable_warnings:
378 try:
~/Desktop/projects/xcube-sh/xcube_sh/sentinelhub.py in maybe_raise_for_response(cls, response)
567 except Exception:
568 pass
--> 569 raise SentinelHubError(f'{e}: {detail}' if detail else f'{e}',
570 response=response) from e
571
SentinelHubError: 400 Client Error: Bad Request for url: https://services.sentinel-hub.com/api/v1/process
@AliceBalfanz the dataset used for your testing seems not to have any timestamps attached. That's why it behaves like that.
This should fix #67. Requires some testing with real datasets.
Should fix #25 too.
EDIT
#25 is not yet fixed because we have no means to retrieve timestamps for a DEM. We need to hard-code special "DEM" treatment.
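Purely as an illustration of what that special treatment could look like (hypothetical code, not the actual xcube-sh chunk store; all names below are made up): for a time-invariant dataset such as "DEM", skip the Catalogue lookup and fall back to the requested time_range as a single slice instead of raising "Could not determine any valid time stamps".

from typing import List, Tuple
import pandas as pd

# Hypothetical: datasets that have no per-observation timestamps in the SH Catalogue.
TIME_INVARIANT_DATASETS = {'DEM'}

def determine_time_ranges(dataset_name: str,
                          time_range: Tuple[str, str],
                          catalogue_times: List[pd.Timestamp]
                          ) -> List[Tuple[pd.Timestamp, pd.Timestamp]]:
    """Hypothetical helper: derive the cube's time slices."""
    start, end = pd.Timestamp(time_range[0]), pd.Timestamp(time_range[1])
    if dataset_name in TIME_INVARIANT_DATASETS:
        # Static data such as a DEM has no observation times:
        # treat the whole requested range as one slice instead of failing.
        return [(start, end)]
    if not catalogue_times:
        raise ValueError('Could not determine any valid time stamps')
    return [(t, t) for t in catalogue_times if start <= t <= end]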