geopython / GeoHealthCheck

Service Status and QoS Checker for OGC Web Services
https://geohealthcheck.org
MIT License

Extending the WFS-Probe request to handle the count parameter. #469

Open nollpa opened 4 months ago

nollpa commented 4 months ago

max_count added to the PARAM_DEFS with a default of 1000.

nollpa commented 4 months ago

@justb4 I've pushed two new commits to my branch (maxfeatures instead of count, and default value 5). If this works, I will look into a possible implementation of supporting different WFS versions, as you mentioned.

justb4 commented 4 months ago

The unit test still fails, but I think/hope for a different reason: the max_count parameter has required: True. Maybe that makes the test fail, or the test data requires that parameter; see this data, which is filled in during the unit test:

https://github.com/geopython/GeoHealthCheck/blob/master/tests/data/resources.json#L196

Maybe adding max_count to the parameters will make the test succeed, but, in light of upgrades to existing installs with WFS GetFeature Probes, it would be better to be forward compatible. It would be very hard to solve with an Alembic/DB upgrade, as Probe parameters are stored in a free-text column...

Maybe first see what happens if required: False is set, and maybe add an extra test (I can help; it is just a change in the JSON above) with/without max_count.

nollpa commented 4 months ago

I ran the tests in my local environment. After my newest changes, the tests for the WFS resources pass, but I get an error in the LDPROXY FEATURES test. I get that error on the master branch too, so maybe it is a local problem? Does it pass in your environment?

The max_count parameter must be set in fixtures.json. Without it, the run raises an error for "missing key 'max_count'", no matter whether the parameter is marked required or not. This is caused by the mismatched keys during self.REQUEST_TEMPLATE.format(**request_parms).
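A minimal sketch (not GeoHealthCheck code; the template string below is made up for illustration) of why str.format raises a KeyError whenever a placeholder in the template has no matching key, independent of any required flag in PARAM_DEFS:

```python
# Hypothetical request template with a {max_count} placeholder, mimicking
# how a Probe's REQUEST_TEMPLATE is filled via str.format(**request_parms).
REQUEST_TEMPLATE = "service=WFS&request=GetFeature&maxFeatures={max_count}"

request_parms = {}  # 'max_count' absent, e.g. not present in fixtures.json

try:
    REQUEST_TEMPLATE.format(**request_parms)
except KeyError as e:
    # str.format knows nothing about required/optional; a missing key
    # for any placeholder always raises KeyError.
    print(f"missing key {e}")  # -> missing key 'max_count'

# Supplying the key makes formatting succeed:
print(REQUEST_TEMPLATE.format(max_count="5"))
# -> service=WFS&request=GetFeature&maxFeatures=5
```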

What is resources.json used for? I can't find where it is used.

justb4 commented 4 months ago

All unit tests (at least in the GitHub CI) now pass!

> I ran the tests in my local environment. After my newest changes, the tests for the WFS resources pass, but I get an error in the LDPROXY FEATURES test. I get that error on the master branch too, so maybe it is a local problem? Does it pass in your environment?

I would not worry about this. It passes in GH CI. The general problem is relying on external services for tests; on the other hand, we want to test against as many (in this case OGC API Features) implementations as possible.

> The max_count parameter must be set in fixtures.json. Without it, the run raises an error for "missing key 'max_count'", no matter whether the parameter is marked required or not. This is caused by the mismatched keys during self.REQUEST_TEMPLATE.format(**request_parms).

Hmm, required: False, I now see, is only used in a few Probes, like WMS for STYLES={style}, which is allowed to be empty. What I would expect with required: False and default: 5 is that the default is applied. What I suggest, as a bit of an experiment with the default, is to also add a value: field. That value, which can be overridden in the UI Probe form or from values in the DB (or the .json test file), will then be filled in by self.REQUEST_TEMPLATE.format(**request_parms), I expect. We use value for fixed-value fields like service={service} to keep templates reusable. So:

        'max_count': {
            'type': 'string',
            'description': 'Maximum amount of features to select',
            'default': '5',
            'value': '5',
            'required': False,
            'range': None
        },

Alternatively, a bit more work, is to adapt expand_params() and use the default to add/fill in a value. But first try the value: attribute. Then I would also add tests with and without max_count.
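To illustrate the intent, here is a hypothetical sketch of such default filling. It is not the actual expand_params() implementation; the fill_request_parms() helper and template are made up. The idea: for each entry in PARAM_DEFS, a stored Probe parameter wins, otherwise value:, otherwise default:, so formatting never hits a missing key:

```python
# PARAM_DEFS entry as proposed above for the WFS GetFeature Probe.
PARAM_DEFS = {
    'max_count': {
        'type': 'string',
        'description': 'Maximum amount of features to select',
        'default': '5',
        'value': '5',
        'required': False,
        'range': None
    },
}

def fill_request_parms(stored_parms, param_defs):
    """Hypothetical helper: merge stored Probe parameters with the
    'value'/'default' fallbacks from PARAM_DEFS."""
    parms = {}
    for name, pdef in param_defs.items():
        if name in stored_parms:          # value from DB / UI form wins
            parms[name] = stored_parms[name]
        elif pdef.get('value') is not None:   # fixed/experimental value
            parms[name] = pdef['value']
        elif pdef.get('default') is not None:  # last resort: the default
            parms[name] = pdef['default']
    return parms

# Made-up template fragment; with no stored parameter the fallback applies.
template = "maxFeatures={max_count}"
print(template.format(**fill_request_parms({}, PARAM_DEFS)))
# -> maxFeatures=5
```

With a scheme like this, older fixtures or DB rows that lack max_count would still format cleanly, which is the forward-compatibility concern raised above.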

> What is resources.json used for? I can't find where it is used.

fixtures.json and resources.json are in effect 'fixtures' used to populate a DB for subsequent test execution. These are maybe more white-box than unit tests, but this provides a very quick way to maintain tests without writing explicit Python code.

But I now see that only fixtures.json is used, and that resources.json, which is almost identical, is indeed used nowhere; maybe a dangling renaming issue. Ignore resources.json; we may delete it. test_resources.py.testRunResoures() will basically run all tests in the DB, then store and check the results. Very compact.

Focus only on fixtures.json for now. I suggest adding two new tests in fixtures.json without max_count.

Thanks for your patience!

tomkralidis commented 3 months ago

Hi all: any update on this issue?