Closed lucapaganotti closed 1 year ago
The error refers to line 170-175ish - the syntax of the pq_masks
section has changed, see https://datacube-ows.readthedocs.io/en/latest/cfg_styling.html#bit-flag-masks-pq-masks Please note the disclaimer at the top of ows_example_cfg.py:
# The file was originally the only documentation for the configuration file format.
# Detailed and up-to-date formal documentation is now available and this file
# is no longer actively maintained and may contain errors or obsolete elements.
#
# https://datacube-ows.readthedocs.io/en/latest/configuration.html
In any case, neither of the styles you have defined for the co3 layer (style_rgb and style_rgb_cloudmask) will work, as both use the bands red, green and blue - but the co3 layer only has the bands c_o3_01, c_o3_02, c_o3_03 and c_o3_04.
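For a single-band layer like this, a colour-ramp style over one of the layer's actual bands is the usual approach. A minimal sketch only - the band name is taken from your layer definition, and the value range and colours are placeholders you will need to adjust:

```python
# Hypothetical single-band colour-ramp style for the co3 layer.
# c_o3_01 comes from your layer's band list; the value range and
# colours below are illustrative placeholders, not tested values.
style_o3_ramp = {
    "name": "o3_ramp",
    "title": "Ozone concentration",
    "abstract": "Colour ramp over the c_o3_01 band",
    "needed_bands": ["c_o3_01"],
    "index_function": {
        "function": "datacube_ows.band_utils.single_band",
        "mapped_bands": True,
        "kwargs": {"band": "c_o3_01"},
    },
    "color_ramp": [
        {"value": 0.0, "color": "#2b83ba"},
        {"value": 50.0, "color": "#ffffbf"},
        {"value": 100.0, "color": "#d7191c"},
    ],
}
```

See the styling documentation linked above for the full set of options.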
Thanks Paul, today I'll read the docs and try to define a suitable style.
Hi @SpacemanPaul ,
I redefined the styles associated with the co3 layer and left the pq_masks section commented out for the time being.
Restarting the server with flask gives me this:
flask run --host=0.0.0.0
[2022-11-17 16:58:41,155] [WARNING] Environment variable $AWS_DEFAULT_REGION not set. (This warning can be ignored if all data is stored locally.)
[2022-11-17 16:58:41,905] [WARNING] get_ranges failed for layer cO3: (psycopg2.errors.UndefinedTable) relation "wms.product_ranges" does not exist
LINE 3: FROM wms.product_ranges
^
[SQL:
SELECT *
FROM wms.product_ranges
WHERE id=%s]
[parameters: (1,)]
(Background on this error at: https://sqlalche.me/e/14/f405)
* Serving Flask app 'datacube_ows/ogc.py'
* Debug mode: off
[2022-11-17 16:58:41,911] [INFO] WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
* Running on all addresses (0.0.0.0)
* Running on http://127.0.0.1:5000
* Running on http://192.168.88.86:5000
[2022-11-17 16:58:41,912] [INFO] Press CTRL+C to quit
[2022-11-17 16:59:00,212] [INFO] 192.168.88.87 - - [17/Nov/2022 16:59:00] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 16:59:58,144] [INFO] 192.168.88.87 - - [17/Nov/2022 16:59:58] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 17:00:10,867] [INFO] 192.168.88.87 - - [17/Nov/2022 17:00:10] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 17:00:45,425] [INFO] 192.168.88.87 - - [17/Nov/2022 17:00:45] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 17:00:46,113] [INFO] 192.168.88.87 - - [17/Nov/2022 17:00:46] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 17:00:46,745] [INFO] 192.168.88.87 - - [17/Nov/2022 17:00:46] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 17:00:47,249] [INFO] 192.168.88.87 - - [17/Nov/2022 17:00:47] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 17:01:27,415] [INFO] 192.168.88.87 - - [17/Nov/2022 17:01:27] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
That is OK for the GetCapabilities requests, but it still displays a pair of warnings. I can ignore the first as my data are all local, but I do not know why my database is lacking the product_ranges relation; anyway, it seems to be only a warning.
I then tried a GetMap request this way
http://192.168.88.86:5000/wms?request=GetMap&service=WMS&version=1.3.0&crs=EPSG:32632&layers=cO3&bbox=457000,4943000,697000,5175000&width=800&height=600&format=image/png
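Equivalently, the same request assembled programmatically (host and parameter values copied from the URL above, using the standard WMS 1.3.0 KVP names):

```python
from urllib.parse import urlencode

# Same GetMap request as above, built with urlencode to avoid typos.
params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "crs": "EPSG:32632",
    "layers": "cO3",
    "bbox": "457000,4943000,697000,5175000",
    "width": "800",
    "height": "600",
    "format": "image/png",
}
url = "http://192.168.88.86:5000/wms?" + urlencode(params)
print(url)
```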
I was not able to find docs about how this request has to be formed and which KVPs it accepts, but by trial and error I finally got this XML response:
<ServiceExceptionReport xmlns="http://www.opengis.net/ogc" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="1.3.0" xsi:schemaLocation="http://www.opengis.net/ogc http://schemas.opengis.net/wms/1.3.0/exceptions_1_3_0.xsd">
<ServiceException> Unexpected server error: 'NoneType' object is not subscriptable </ServiceException>
<ServiceException>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/ogc.py, line 152 in ogc_svc_impl> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/utils.py, line 24 in log_wrapper> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms.py, line 29 in handle_wms> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/utils.py, line 24 in log_wrapper> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/data.py, line 404 in get_map> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 301 in __init__> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 177 in get_times> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 177 in <listcomp>> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 205 in parse_time_item> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 169 in get_times_for_product> ]]>
</ServiceException>
</ServiceExceptionReport>
saying that I'm getting:
Unexpected server error: 'NoneType' object is not subscriptable
I checked my datacube's product and dataset and reread the ows config file many times, but I cannot tell whether I'm missing some setup parameter's value.
I'm trying to index a netcdf file with 24 time values, a 61x59 x-y grid and a single variable (c_O3), this is the output of the ncdump command displaying the netcdf header information:
ncdump -h ../../data/netcdf/test_no_zeta.nc
netcdf test_no_zeta {
dimensions:
time = 24 ;
y = 59 ;
x = 61 ;
variables:
double time(time) ;
time:units = "hours since 1-1-1 00:00:0.0" ;
time:delta_t = "0000-00-00 01:00:00.00 +00:00" ;
double y(y) ;
y:units = "metre" ;
y:standard_name = "projection_y_coordinate" ;
y:long_name = "Northing" ;
double x(x) ;
x:long_name = "Easting" ;
x:standard_name = "projection_x_coordinate" ;
x:units = "metre" ;
float c_O3(time, y, x) ;
c_O3:coordinates = "x y" ;
c_O3:grid_mapping = "CRS" ;
c_O3:units = "ppb" ;
c_O3:_FillValue = -9.96921e+36f ;
c_O3:actual_range = 6.292308e-06f, 93.20821f ;
int CRS ;
CRS:grid_mapping_name = "transverse_mercator" ;
CRS:semi_major_axis = 6378137 ;
CRS:inverse_flattening = 298.257223563 ;
CRS:longitude_of_prime_meridian = 0. ;
CRS:latitude_of_projection_origin = 0. ;
CRS:longitude_of_central_meridian = 9. ;
CRS:scale_factor_at_central_meridian = 0.9996 ;
CRS:false_easting = 500000. ;
CRS:false_northing = 0. ;
CRS:unit = "metre" ;
CRS:crs_wkt = "PROJCS[\"WGS 84 / UTM zone 32N\",GEOGCS[\"WGS 84\",DATUM[\"WGS_1984\",SPHEROID[\"WGS 84\",6378137,298.257223563,AUTHORITY[\"EPSG\",\"7030\"]],AUTHORITY[\"EPSG\",\"6326\"]],PRIMEM[\"Greenwich\",0,AUTHORITY[\"EPSG\",\"8901\"]],UNIT[\"degree\",0.01745329251994328,AUTHORITY[\"EPSG\",\"9122\"]],AUTHORITY[\"EPSG\",\"4326\"]],UNIT[\"metre\",1,AUTHORITY[\"EPSG\",\"9001\"]],PROJECTION[\"Transverse_Mercator\"],PARAMETER[\"latitude_of_origin\",0],PARAMETER[\"central_meridian\",9],PARAMETER[\"scale_factor\",0.9996],PARAMETER[\"false_easting\",500000],PARAMETER[\"false_northing\",0],AUTHORITY[\"EPSG\",\"32632\"],AXIS[\"Easting\",EAST],AXIS[\"Northing\",NORTH]]" ;
CRS:spatial_ref = "PROJCS[\"WGS 84 / UTM zone 32N\",GEOGCS[\"WGS 84\",DATUM[\"WGS_1984\",SPHEROID[\"WGS 84\",6378137,298.257223563,AUTHORITY[\"EPSG\",\"7030\"]],AUTHORITY[\"EPSG\",\"6326\"]],PRIMEM[\"Greenwich\",0,AUTHORITY[\"EPSG\",\"8901\"]],UNIT[\"degree\",0.01745329251994328,AUTHORITY[\"EPSG\",\"9122\"]],AUTHORITY[\"EPSG\",\"4326\"]],UNIT[\"metre\",1,AUTHORITY[\"EPSG\",\"9001\"]],PROJECTION[\"Transverse_Mercator\"],PARAMETER[\"latitude_of_origin\",0],PARAMETER[\"central_meridian\",9],PARAMETER[\"scale_factor\",0.9996],PARAMETER[\"false_easting\",500000],PARAMETER[\"false_northing\",0],AUTHORITY[\"EPSG\",\"32632\"],AXIS[\"Easting\",EAST],AXIS[\"Northing\",NORTH]]" ;
CRS:GeoTransform = "4000.0, 0.0, 455000.0, 0.0, -4000.0, 5177000.0" ;
// global attributes:
:Conventions = "COARDS" ;
:lib_ver = 20000 ;
:creation_time = "11 4 2015 H 11.09.06.997 (system local time)" ;
:description = "" ;
:model = "FARM" ;
:NCO = "netCDF Operators version 4.7.5 (Homepage = http://nco.sf.net, Code = http://github.com/nco/nco)" ;
}
What am I missing?
I will then try a simpler use case with only one time value to see if I will finally view an image about my test data.
Thank you for any answer.
Hi all, the error at line 169 of the wms_utils.py file is on the last line of this Python function:
def get_times_for_product(product):
ranges = product.ranges
return ranges['times']
Can this be related to the missing wms.product_ranges relation in the database? If so, the warning I get about wms.product_ranges cannot be ignored and my database is not OK.
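The traceback fits this theory: if product.ranges ends up as None because get_ranges failed, then ranges['times'] raises exactly this error. A plain-Python illustration (not the actual OWS code path):

```python
# If get_ranges() fails (e.g. wms.product_ranges is missing),
# product.ranges can end up as None, and subscripting None raises
# the same TypeError seen in the ServiceException.
ranges = None
try:
    ranges["times"]
except TypeError as exc:
    error_message = str(exc)
print(error_message)  # 'NoneType' object is not subscriptable
```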
So I checked both the core and ows setup: the database tables, materialized views and so on had not been created. I had to set up my own user to have local access to postgres via UNIX socket; in case I need to use different user/password credentials, where do these credentials have to be stored for the system to use them? After that I re-initialised the datacube, added products and datasets, and the
datacube-ows-update --role myrole --schema
command created the needed tables. The datacube database is still without any relation - is this correct?
I have an agdc namespace where I have some relations. Is this correct?
The ows database contains the postgis views and three materialized views. Is this correct?
Starting the flask server for datacube-ows again now gives me other warnings. I can ignore the first, but the others are still range related:
$ flask run --host=0.0.0.0
[2022-11-24 14:29:17,923] [WARNING] Environment variable $AWS_DEFAULT_REGION not set. (This warning can be ignored if all data is stored locally.)
[2022-11-24 14:29:18,721] [WARNING] get_ranges failed for layer cO3: Null product range
[2022-11-24 14:29:18,723] [WARNING] get_ranges failed for layer cO3new: Null product range
* Serving Flask app 'datacube_ows/ogc.py'
* Debug mode: off
[2022-11-24 14:29:18,728] [INFO] WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
* Running on all addresses (0.0.0.0)
* Running on http://127.0.0.1:5000
* Running on http://192.168.88.86:5000
[2022-11-24 14:29:18,728] [INFO] Press CTRL+C to quit
Regarding the warnings saying "Null product range": where do I have to define product ranges? In the product metadata YAML file? In my ows_cfg.py?
The GetMap request is returning the same error in the get_times_for_product function at line 169 of wms_utils.py.
Thank you for any answer.
It looks like you have not run datacube-ows-update --schema against your database.
Please refer to the documentation here: https://datacube-ows.readthedocs.io/en/latest/database.html
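Roughly, the documented sequence is (the role name is yours; see the linked page for the details and options):

```shell
# One-off: create the wms schema, range tables and materialised views
datacube-ows-update --schema --role myrole

# Then populate/refresh the per-product ranges (run with no options)
datacube-ows-update
```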
Hi, sorry for the delay in answering. You're right, I ran the datacube-ows-update script like this:
datacube-ows-update --schema --role myrole
but it was run inside an initialization script and I missed the error logs.
The error I found was in the postgresql setup in my conda environment: the pg cluster was not able to start even though the postgresql service was active, because postgresql did not find the .s.PGSQL.5432 local UNIX socket, which had not been created. Furthermore, this UNIX socket was set up in /var/run/postgresql while the datacube-ows scripts seem to search for it in /tmp/. Changing the postgresql.conf and pg_hba.conf files accordingly did the trick: the pg cluster started, the UNIX socket file was created, and datacube-ows-update then completed with this log:
(odc) buck@odcdev:~/dev/odc/datacube-ows$ datacube-ows-update --schema --role buck
Single flag bands not in a list is deprecated. Please refer to the documentation for the new format (layer ls8_nbart_albers)
Could not parse layer (ls8_nbart_albers): Could not import python object: datacube_ows.ogc_utils.feature_info_url_template
Single flag bands not in a list is deprecated. Please refer to the documentation for the new format (layer ls8_level1_pds)
wcs section contains a 'default_bands' list. WCS default_bands list is no longer supported. Functionally, the default behaviour is now to return all available bands (as mandated by the WCS2.x spec).
wcs section contains a 'default_bands' list. WCS default_bands list is no longer supported. Functionally, the default behaviour is now to return all available bands (as mandated by the WCS2.x spec).
Single flag bands not in a list is deprecated. Please refer to the documentation for the new format (layer sentinel2_nrt)
wcs section contains a 'default_bands' list. WCS default_bands list is no longer supported. Functionally, the default behaviour is now to return all available bands (as mandated by the WCS2.x spec).
Could not parse layer (mangrove_cover): Required product names entry missing in named layer mangrove_cover
Could not load layer Level 1 USGS Landsat-8 Public Data Set: Could not find product ls8_level1_usgs in datacube for layer ls8_level1_pds
Could not load layer WOfS Summary: Could not find product wofs_summary in datacube for layer wofs_summary
Could not load layer Near Real-Time images from Sentinel-2 Satellites: Could not find product s2a_nrt_granule in datacube for layer sentinel2_nrt
Checking schema....
Creating or replacing WMS database schema...
Creating/replacing wms schema
Creating/replacing product ranges table
Creating/replacing sub-product ranges table
Creating/replacing multi-product ranges table
Granting usage on schema
Creating or replacing materialised views...
Installing Postgis extensions on public schema
Setting default timezone to UTC
Creating NEW TIME Materialised View (start of hard work)
Creating NEW SPACE Materialised View (Slowest step!)
Creating NEW combined SPACE-TIME Materialised View
Creating NEW Materialised View Index 1/4
Creating NEW Materialised View Index 2/4
Creating NEW Materialised View Index 3/4
Creating NEW Materialised View Index 4/4
Renaming old spacetime view (OWS down)
Renaming new view to space_time_view (OWS back up)
Dropping OLD spacetime view (and indexes)
Dropping OLD time view
Dropping OLD space view
Renaming NEW space_view
Renaming NEW time_view
Renaming new Materialised View Index 1/4
Renaming new Materialised View Index 2/4
Renaming new Materialised View Index 3/4
Renaming new Materialised View Index 4/4
Granting read permission to public
Done
So, apart from the warnings about the missing layers at the start and the WCS-related ones (I didn't activate WCS in my config file), it seems to have updated the ows postgresql views and tables.
Thanks for your help.
Now querying the WMS service in order to get a map about my data I'm facing another issue. The query I make is:
http://192.168.88.86:5000/wms?request=GetMap&crs=EPSG:32632&layers=cO3&bbox=457000,4943000,697000,5175000&width=800&height=600&format=png&service=WMS&version=1.3.0
the WMS answers with this exception:
<ServiceExceptionReport version="1.3.0" xsi:schemaLocation="http://www.opengis.net/ogc http://schemas.opengis.net/wms/1.3.0/exceptions_1_3_0.xsd">
<ServiceException>
Unexpected server error: 'NoneType' object is not subscriptable
</ServiceException>
<ServiceException>
<FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/ogc.py, line 152 in ogc_svc_impl>
<FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/utils.py, line 24 in log_wrapper>
<FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms.py, line 29 in handle_wms>
<FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/utils.py, line 24 in log_wrapper>
<FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/data.py, line 404 in get_map>
<FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 301 in __init__>
<FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 177 in get_times>
<FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 177 in <listcomp>>
<FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 205 in parse_time_item>
<FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 169 in get_times_for_product>
</ServiceException>
</ServiceExceptionReport>
which appears to be related to the way the time variable is stored/declared in my netcdf files, causing, I think, an undefined range.
Thanks again, have a nice day.
Also, please check the time_resolution for your layer in config. The documentation is here:
https://datacube-ows.readthedocs.io/en/latest/cfg_layers.html#time-resolution-time-resolution
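For reference, time_resolution sits at the top level of the layer definition in ows_cfg.py. A sketch only - the names below are from your config, and the value is a placeholder, so check the linked docs for the option matching hourly data:

```python
# Fragment of a layer definition in ows_cfg.py.
# "raw" is a placeholder value - consult the time_resolution section
# of the documentation for the option that matches sub-daily data.
layer = {
    "name": "cO3",
    "title": "Ozone",
    "product_name": "c_o3_no_z",
    "time_resolution": "raw",
    # ... rest of the layer definition ...
}
```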
If unsure run this SQL and post the results here.
select * from space_time_view stv
where stv.dataset_type_ref = (
select id
from agdc.dataset_type
where name='cO3');
(Substitute the product name as needed.)
Hi Paul, thank you for your answer.
There were many issues in my setup.
First of all, as dumb as I am, I was working on two different databases: one for datacube-core and another for datacube-ows. I had trivially copied the example setup I found in the documentation, where a datacube database is set up for datacube-core and the env var DATACUBE_DB_URL is then exported as "postgresql:///ows". When I changed it to DATACUBE_DB_URL="postgresql:///datacube" the wms schema was created.
I then ran datacube-ows-update, but for some reason the product_ranges table was still empty, so I ran python update_ranges.py directly and the product_ranges table was filled.
Then I had some inconsistencies in the metadata files, which I solved by trying different setups, and then some problems with the request I was making via browser.
The actual http request I make is:
At this point datacube-ows is not logging errors anymore, but I get an empty (blank white) image, so I'm investigating styles and CRSs in the ows_cfg.py file, and my data values.
The information that the flask command now logs is this:
(odc) @.***:~/dev/odc/datacube-ows$ flask run --host=0.0.0.0
geobox GeoBox(Geometry({'type': 'Polygon', 'coordinates': (((0.0, 0.0), (0.0, 0.0022996871267285134), (0.002299687127345975, 0.0022996871267285134), (0.002299687127345975, 0.0), (0.0, 0.0)),)}, CRS('EPSG:4326')))
[2022-12-20 10:48:53,350] [WARNING] Environment variable $AWS_DEFAULT_REGION not set. (This warning can be ignored if all data is stored locally.)
buck None 5432 ows agdc-1.8.8 True False
I've put some print statements in the relevant files when I got errors.
The long GeoBox output looks very strange to me; I don't understand where this polygon comes from ... but maybe it's created during WMS image creation.
Thanks again for your answer.
It's hard to say because I don't know where you added the print statement. What does the contents of the space_time_view for datasets in the product look like (SQL in my comment above)?
Hi Paul,
in my try-and-retry I now have 3 products:
datacube=# select id, name from agdc.dataset_type;
id | name
----+-----------------
1 | c_o3_no_z
2 | c_o3_pm
3 | sentinel_5p_no2
(3 rows)
I changed something in the metadata files for product and dataset, mainly the names. If necessary I can attach the metadata files and the netcdf ones I'm currently using.
So the query you suggested should be written this way for layer 2:
datacube=# select * from space_time_view stv
where stv.dataset_type_ref = (
select id
from agdc.dataset_type
where name='c_o3_pm');
                  id                  | dataset_type_ref | spatial_extent | temporal_extent
--------------------------------------+------------------+----------------+-----------------------------------------------------
 6a5b0bf9-df03-4115-b89e-7184c74fe66b |                2 | 0103000020E61000000100000005000000F69537F75B272740D9EA74378A5947400E903EA939F7264084BC32A4824E46403321489466EA2040C975BC14CD5146409FB11658E0DF204007CD9AE9135D4740F69537F75B272740D9EA74378A594740 | ["2015-04-10 17:54:01+02","2015-04-10 17:54:01+02"]
(1 row)
with this spatial extent:
datacube=# select st_astext(spatial_extent) from space_time_view stv
where stv.dataset_type_ref = (
select id
from agdc.dataset_type
where name='c_o3_pm');
 st_astext
-----------
 POLYGON((11.576873517547 46.6995305367361,11.482861794364 44.6133618591057,8.4578138673829 44.63907107546,8.43725848462128 46.7271701818336,11.576873517547 46.6995305367361))
(1 row)
and this way for layer 1
datacube=# select * from space_time_view stv
where stv.dataset_type_ref = (
select id
from agdc.dataset_type
where name='c_o3_no_z');
                  id                  | dataset_type_ref | spatial_extent | temporal_extent
--------------------------------------+------------------+----------------+-----------------------------------------------------
 94cf3426-6200-44c8-b1b2-f4e37e850791 |                1 | 0103000020E61000000100000005000000F69537F75B272740D9EA74378A5947400E903EA939F7264084BC32A4824E46403321489466EA2040C975BC14CD5146409FB11658E0DF204007CD9AE9135D4740F69537F75B272740D9EA74378A594740 | ["2015-04-10 04:00:00+02","2015-04-11 03:00:00+02"]
(1 row)
with this spatial extent:
datacube=# select st_astext(spatial_extent) from space_time_view stv
where stv.dataset_type_ref = (
select id
from agdc.dataset_type
where name='c_o3_no_z');
 st_astext
-----------
 POLYGON((11.576873517547 46.6995305367361,11.482861794364 44.6133618591057,8.4578138673829 44.63907107546,8.43725848462128 46.7271701818336,11.576873517547 46.6995305367361))
(1 row)
Both spatial and temporal extents seem to have valid values.
I put some simple print statements in three files (wms_utils.py, resource_limits.py and ogc_utils.py) and have now removed them. The one producing that output was in the resource_limits.py file:
print('geobox:', geobox)
at line 54 as the first line of function _standardise_geobox(...):
53 def _standardise_geobox(self, geobox: GeoBox) -> GeoBox:
54 print('geobox:', geobox)
55 if geobox.crs == 'EPSG:3857':
56 return geobox
57 bbox = geobox.extent.to_crs('EPSG:3857').boundingbox
58 return create_geobox(CRS('EPSG:3857'),
59 bbox.left, bbox.bottom,
60 bbox.right, bbox.top,
61 width=geobox.width, height=geobox.height
62 )
Thank you for your answer.
Have a nice day.
OK, so what's in wms.product_ranges then?
select * from wms.product_ranges where id = 2
If that comes back empty, you will need to run datacube-ows-update with no options. (Sorry - probably should have spotted that sooner.)
Hi Paul,
this is psql output for product ranges of product #2:
datacube=# select * from wms.product_ranges where id = 2;
 id |     lat_min      |     lat_max      |     lon_min      |     lon_max     |     dates      | bboxes
----+------------------+------------------+------------------+-----------------+----------------+--------
  2 | 44.6133618591057 | 46.7271701818336 | 8.43725848462128 | 11.576873517547 | ["2015-04-10"] | {"EPSG:3857": {"top": 5897654.449820093, "left": 939231.3181992678, "right": 1288731.6649514658, "bottom": 5560857.219441153}, "EPSG:4326": {"top": 46.7271701818336, "left": 8.43725848462128, "right": 11.576873517547, "bottom": 44.6133618591057}, "EPSG:32632": {"top": 5178071.156306333, "left": 455350.0773172115, "right": 704459.3313960176, "bottom": 4940088.327406288}}
(1 row)
Once the wms schema was created in the right database, I checked this table and got suitable results when querying for product ranges.
Thanks for your answer, have a good day.
Add &ows_stats=y to the end of your GetMap query. You should get a short json document back; post it here.
Hi @SpacemanPaul, thank you for your answer and the hint, which was very useful because it seems that no dataset is returned for my request. As you suggested, adding the ows_stats GET parameter gives me this raw JSON response:
{
"profile":
{
"query": 0.04235363006591797,
"count-datasets": 0.012956857681274414,
"write": 0.019739866256713867
},
"info":
{
"n_dates": 1,
"zoom_factor": 15.799040853901776,
"n_datasets": 0,
"zoom_level_base": 4.370864517607939,
"zoom_level_adjusted": 9.756455172359999,
"datasets": {
"Query bands {'c_pm', 'c_o3'} from products [Product(name='c_o3_pm', id_=2)]": []
},
"write_action": "No datasets: Write Empty"
}
}
from which I get that the query finds zero datasets ("n_datasets": 0) and an empty image is written.
I then checked the datacube database to browse for datasets:
datacube=# select id, metadata_type_ref, dataset_type_ref from agdc.dataset;
id | metadata_type_ref | dataset_type_ref
--------------------------------------+-------------------+------------------
94cf3426-6200-44c8-b1b2-f4e37e850791 | 1 | 1
6a5b0bf9-df03-4115-b89e-7184c74fe66b | 1 | 2
50000000-0000-0000-0000-202205051041 | 1 | 3
(3 rows)
the metadata column for dataset 6a5b0bf9-df03-4115-b89e-7184c74fe66b contains this json value:
{
"id": "6a5b0bf9-df03-4115-b89e-7184c74fe66b",
"crs": "epsg:32632",
"grids": {
"default": {
"lat": {
"type": "double-range",
"max_offset": [[232000, 5175000, "end"]],
"min_offset": [[232000, 4943000, "begin"]],
"BoundingBox": [457000, 4943000, 697000, 5175000],
"description": "Latitude range"
},
"lon": {
"type": "double-range",
"max_offset": [[240000, 697000,"end"]],
"min_offset": [[240000, 457000, "begin"]],
"description": "Longitude range"
},
"shape": [59, 61],
"transform": [4000.0, 0.0, 455000.0, 0.0, -4000.0, 5177000.0, 0.0, 0.0, 1.0],
"spatial_reference": "PROJCS[\"WGS 84 / UTM zone 32N\",GEOGCS[\"WGS 84\" ,DATUM[\"WGS_1984\",SPHEROID[\"WGS84\",6378137,298.257223563,AUTHORITY[\"EPSG\",\"7030\"]],AUTHORITY[\"EPSG\",\"6326\"]],PRIMEM[\"Greenwich\",0,AUTHORITY[\"EPSG\",\"8901\"]],UNIT[\"degree\",0.01745329251994328,AUTHORITY[\"EPSG\",\"9122\"]],AUTHORITY[\"EPSG\",\"4326\"]],UNIT[\"metre\",1,AUTHORITY[\"EPSG\",\"9001\"]],PROJECTION[\"Transverse_Mercator\"],PARAMETER[\"latitude_of_origin\",0],PARAMETER[\"central_meridian\",9],PARAMETER[\"scale_factor\",0.9996],PARAMETER[\"false_easting\",500000],PARAMETER[\"false_northing\",0],AUTHORITY[\"EPSG\",\"32632\"],AXIS[\"Easting\",EAST],AXIS[\"Northing\",NORTH]]"
}
},
"extent": {
"lat": {
"end": 46.72717018183362,
"begin": 44.61336185910571
},
"lon": {
"end": 11.57687351754701,
"begin": 8.437258484621278
}
},
"$schema": "https://schemas.opendatacube.org/dataset",
"lineage": {
"source_datasets":{}
},
"product": {"name": "c_o3_pm"},
"geometry": {
"type": "Polygon",
"coordinates": [[[697000.0, 5175000.0], [697000.0, 4943000.0], [457000.0, 4943000.0], [457000.0, 5175000.0], [697000.0, 5175000.0]]]
},
"properties": {
"datetime": "2015-04-10 15:54:01+00:00",
"eo:platform": "arpa",
"eo:instrument": "WAISALA",
"odc:file_format": "NetCDF",
"odc:processing_datetime": "2022-05-12T18:02:03.926659"
},
"grid_spatial": {
"projection": {
"valid_data": {
"type": "Polygon",
"coordinates": [[[697000.0, 5175000.0], [697000.0, 4943000.0], [457000.0, 4943000.0], [457000.0, 5175000.0], [697000.0, 5175000.0]]]
},
"geo_ref_points": {
"ll": {
"x": 455000.0,
"y": 4941000.0
},
"lr": {
"x": 699000.0,
"y": 4941000.0
},
"ul":{
"x": 455000.0,
"y": 5177000.0
},
"ur": {
"x": 699000.0,
"y": 5177000.0
}
},
"spatial_reference": "epsg:32632"
}
},
"measurements": {
"c_o3": {
"path": "file:///home/buck/dev/odc/data/netcdf/twovars_time_1.nc",
"layer": "c_O3"
},
"c_pm":{
"path": "file:///home/buck/dev/odc/data/netcdf/twovars_time_1.nc",
"layer": "c_PM"
}
}
}
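As a sanity check on my side, the geo_ref_points above are consistent with the grid transform and shape, assuming the affine ordering [px_w, 0, origin_x, 0, px_h, origin_y]:

```python
# Corner check from the dataset metadata above.
transform = [4000.0, 0.0, 455000.0, 0.0, -4000.0, 5177000.0]  # affine, row-major
nrows, ncols = 59, 61  # "shape": [59, 61] is (y, x)

px_w, origin_x = transform[0], transform[2]
px_h, origin_y = transform[4], transform[5]

ul = (origin_x, origin_y)                                 # upper-left corner
lr = (origin_x + ncols * px_w, origin_y + nrows * px_h)   # lower-right corner
print(ul)  # (455000.0, 5177000.0) - matches geo_ref_points["ul"]
print(lr)  # (699000.0, 4941000.0) - matches geo_ref_points["lr"]
```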
It seems that 3 datasets are present in the database. As per the wms request, the relevant dataset should be the second (6a5b0bf9-df03-4115-b89e-7184c74fe66b | 1 | 2), so I issued datacube dataset info
for dataset 6a5b0bf9-df03-4115-b89e-7184c74fe66b, which gave me this:
(odc) buck@odcdev:~/dev/odc/datacube-ows$ datacube dataset info 6a5b0bf9-df03-4115-b89e-7184c74fe66b
buck None 5432 datacube-dataset-info agdc-1.8.8 True False
id: 6a5b0bf9-df03-4115-b89e-7184c74fe66b
product: c_o3_pm
status: active
indexed: 2022-12-17 19:55:16.175248+01:00
locations:
- file:///home/buck/dev/odc/data/netcdf/twovars_dataset.odc-metadata.yaml
fields:
cloud_cover: null
creation_time: 2022-05-12 18:02:03.926659
dataset_maturity: null
format: NetCDF
instrument: WAISALA
label: null
lat: {begin: 44.61336185910571, end: 46.72717018183362}
lon: {begin: 8.437258484621278, end: 11.57687351754701}
platform: arpa
product_family: null
region_code: null
time: {begin: '2015-04-10T15:54:01+00:00', end: '2015-04-10T15:54:01+00:00'}
(odc) buck@odcdev:~/dev/odc/datacube-ows$
From this I see that the dataset is active, it points to the correct yaml file (I must then check whether this file is really correct ...), the bounding box seems OK, and the time extent (1 value) also seems to match the netcdf contents. I do not know whether some of the null values in this datacube command output are meaningful.
I guess that my dataset is not well indexed ... the datacube is not able to find what I think should be the data bands, or my ows_cfg.py does not define the bands correctly.
Thanks again for your answer, have a nice day.
OK: From your metadata:
"geo_ref_points": {
"ll": {
"x": 455000.0,
"y": 4941000.0
},
"lr": {
"x": 699000.0,
"y": 4941000.0
},
"ul":{
"x": 455000.0,
"y": 5177000.0
},
"ur": {
"x": 699000.0,
"y": 5177000.0
}
},
"spatial_reference": "epsg:32632"
So the coordinate ranges are:
X: 455000.0 - 699000.0,
Y: 4941000.0 - 5177000.0
Your WMS bbox query (from above) was:
bbox=5178071.156306333,%20455350.0773172115,%20704459.3313960176,%204940088.327406288
Removing the spaces (%20
) and expanding as minx, miny, maxx, maxy we have the query:
X: 5178071. - 704459.0
Y: 455350 - 4940088
Which does look like it should (just) overlap.
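For what it's worth, an extent overlap is easy to check mechanically. Here with the dataset's EPSG:32632 corner ranges and, for illustration, the bbox from the earlier GetMap request:

```python
def overlaps(a, b):
    """Axis-aligned bbox intersection test; boxes are (minx, miny, maxx, maxy)."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

dataset = (455000.0, 4941000.0, 699000.0, 5177000.0)  # from geo_ref_points
query = (457000.0, 4943000.0, 697000.0, 5175000.0)    # earlier GetMap bbox
print(overlaps(dataset, query))  # True
```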
But there's something going wrong with the time stamps.
Your metadata has:
"datetime": "2015-04-10 15:54:01+00:00",
But strangely you say space_time_view has:
["2015-04-10 17:54:01+02","2015-04-10 17:54:01+02"]
As far as I can see it should at least be ["2015-04-10 15:54:01+00:00", "2015-04-10 15:54:01+00:00"]
although that shouldn't stop it working.
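Those two renderings are in fact the same instant expressed in different time zones, which is why it shouldn't matter:

```python
from datetime import datetime

# space_time_view prints the offset as "+02"; normalised here to "+02:00"
# so that fromisoformat accepts it.
a = datetime.fromisoformat("2015-04-10 15:54:01+00:00")  # from the metadata
b = datetime.fromisoformat("2015-04-10 17:54:01+02:00")  # space_time_view value
print(a == b)  # True - same instant, different zone
```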
You still haven't advised what time_resolution configuration you are using for the layer - this may be the problem.
Hi Paul,
I'm sorry for the delay; I was busy with other projects. I think I'll get back to Open Data Cube next week and will check against your suggestions.
Have a nice day.
Hi @SpacemanPaul, I got back to opendatacube last week after a long stretch on other projects; I hope to have enough time to work on odc this month and next. I'm beginning to see something in datacube-ows: for the time being, I'm able to view a map created from a simple netcdf file containing a single variable.
Now I'll try with a netcdf file containing more variables.
Thanks for all your help.
Great to hear Luca. I'll close this ticket, if it's OK with you. Feel free to open another if you get stuck again.
Yes, I agree. Now I need to "clean" my environment so as to have a good starting point for further exploring the datacube services. Thanks again for your support.
Description
Getting error on starting up datacube ows
Steps to Reproduce
Setup a datacube-core
Setup a datacube-ows
Flask output is:
datacube-ows is complaining about a missing required config item "band" in the styling section for layer 'co3', but I have not found this item used for layer styling in the docs or in the configuration example file. I don't know how to change my config file.
Context (Environment)
datacube-ows version (datacube-ows --version):
ows_config.py file (link, sample code):
In the conda environment:
declare -x DATACUBE_OWS_CFG="datacube_ows.ows_cfg_example.ows_cfg"
Attached you will find my ows_cfg_example.py file, renamed to the text file ows_cfg_example.txt.
datacube product metadata (datacube product show product_name):
Thank you for any answer.
Best regards.