uw-cryo / coincident

Search and analysis of STV Precursor Coincident Datasets
https://coincident.readthedocs.io
MIT License

Pixi Run Test (and Lint) WSL Error #11

Closed: Jack-Hayes closed this issue 3 weeks ago

Jack-Hayes commented 1 month ago

10/17/2024: Performed after running the previous setup commands from the README.

    pixi shell --environment dev
    pre-commit install
    pixi run test

The test run fails with:

    FAILED tests/test_search.py::test_polygon_out_of_bounds - AttributeError: 'Point' object has no attribute 'exterior'

Platform: Linux (WSL on Windows 11), Python 3.12.7, pixi 0.33.0

<details>
<summary>Show code</summary>

```
(coincident:dev) (base) jehayes@DESKTOP-L8QL6EI:~/coincident$ pixi run test
✨ Pixi task (test in dev): pytest -o markers=network -m 'not network' --cov --cov-report=xml --cov-report=term
=========================== test session starts ===========================
platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0
rootdir: /home/jehayes/coincident
configfile: pyproject.toml
testpaths: tests
plugins: cov-5.0.0, time-machine-2.15.0
collected 22 items / 11 deselected / 11 selected

tests/test_datasets.py ...                                          [ 27%]
tests/test_overlaps.py ...                                          [ 54%]
tests/test_package.py .                                             [ 63%]
tests/test_search.py ...F                                           [100%]

================================ FAILURES =================================
________________________ test_polygon_out_of_bounds _______________________

    def test_polygon_out_of_bounds():
        feature_coll = {
            "type": "FeatureCollection",
            "features": [
                {
                    "id": "0",
                    "properties": {"col1": "name1"},
                    "type": "Feature",
                    "geometry": {"type": "Point", "coordinates": (1.0, 2.0)},
                }
            ],
        }
        aoi = gpd.GeoDataFrame.from_features(feature_coll, crs="EPSG:4326")
        # with pytest.warns(UserWarning, match="Requested search polygon not within"):
        with pytest.raises(ValueError, match="Requested search polygon not within"):
>           m.search.search(dataset="3dep", intersects=aoi)

aoi          =       geometry   col1
               0  POINT (1 2)  name1
feature_coll = {'features': [{'geometry': {'coordinates': (1.0, 2.0), 'type': 'Point'}, 'id': '0', 'properties': {'col1': 'name1'}, 'type': 'Feature'}], 'type': 'FeatureCollection'}

tests/test_search.py:61:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

dataset = ThreeDEP(alias='3dep', has_stac_api=False, collections=[], search='https://prd-tnm.s3.amazonaws.com/StagedProducts/Elevation/metadata/WESM.gpkg', start='2000-12-01', end=None, type='lidar', provider='usgs', stac_kwargs={'limit': 1000})
intersects =       geometry   col1
             0  POINT (1 2)  name1
datetime = None, kwargs = {}, search_start = None, search_end = None, shapely_geometry =

    def search(
        dataset: str | Dataset,
        intersects: gpd.GeoDataFrame | gpd.GeoSeries | None = None,
        datetime: str | list[str] | None = None,
        # NOTE: change to explicitly stac_kwargs? not sure of proper kwargs type (Optional[dict[str, Any]]?)
        **kwargs: Any,
    ) -> gpd.GeoDataFrame:
        """
        Perform a search for geospatial data using STAC or non-STAC APIs for a single dataset.

        Parameters
        ----------
        dataset : str or Dataset
            The dataset to search. Can be a string alias or a Dataset object.
        intersects : gpd.GeoDataFrame
            A GeoDataFrame containing a single row, or GeoSeries containing a geometry to restrict the search area.
        datetime : str or None, optional
            The datetime range for the search in ISO 8601 format. If None, no datetime filter is applied.
        **kwargs : Any
            Additional keyword arguments to pass to the search functions.
            see https://pystac-client.readthedocs.io/en/latest/api.html#item-search

        Returns
        -------
        gpd.GeoDataFrame
            A GeoDataFrame containing the search results.

        Raises
        ------
        ValueError
            If the provided dataset alias is not supported.
        """
        # Validate Dataset
        if isinstance(dataset, str):
            try:
                dataset = _alias_to_Dataset[dataset]
            except KeyError as e:
                message = (
                    f"{dataset} is not a supported dataset: {_alias_to_Dataset.keys()}"
                )
                raise ValueError(message) from e

        # Validate Datetimes
        _validate_temporal_bounds(dataset, datetime)
        if datetime is not None:
            start, end = _pystac_client._format_datetime(datetime).split("/")
            search_start = (
                gpd.pd.to_datetime(start).tz_localize(None) if start != ".." else None
            )
            search_end = gpd.pd.to_datetime(end).tz_localize(None) if end != ".." else None
        else:
            search_start = None  # or gpd.pd.Timestamp(dataset.start) ?
            search_end = None  # or gpd.pd.Timestamp.today()?

        # Validate Intersects
        if intersects is not None:
            _validate_spatial_bounds(intersects)
            # NOTE: not very robust, explode() demotes MultiPolygons to single Polygon (seems many GeoJSONs have this)
            # ANd 'exterior' not available for Multipolygons, just
            shapely_geometry = intersects.geometry.explode().iloc[0]
            if (
>               not shapely_geometry.exterior.is_ccw
            ):  # Apparently NASA CMR enforces polygon CCW order
E           AttributeError: 'Point' object has no attribute 'exterior'

dataset      = ThreeDEP(alias='3dep', has_stac_api=False, collections=[], search='https://prd-tnm.s3.amazonaws.com/StagedProducts/Elevation/metadata/WESM.gpkg', start='2000-12-01', end=None, type='lidar', provider='usgs', stac_kwargs={'limit': 1000})
datetime     = None
intersects   =       geometry   col1
               0  POINT (1 2)  name1
kwargs       = {}
search_end   = None
search_start = None
shapely_geometry =

src/coincident/search/main.py:80: AttributeError

---------- coverage: platform linux, python 3.12.7-final-0 -----------
Name                                            Stmts   Miss  Cover
-------------------------------------------------------------------
src/coincident/__init__.py                          4      0   100%
src/coincident/_version.py                         10      1    90%
src/coincident/datasets/__init__.py                 7      0   100%
src/coincident/datasets/csda.py                    13      0   100%
src/coincident/datasets/general.py                 14      0   100%
src/coincident/datasets/maxar.py                   48     16    67%
src/coincident/datasets/nasa.py                    22      0   100%
src/coincident/datasets/planetary_computer.py      24      0   100%
src/coincident/datasets/usgs.py                    12      0   100%
src/coincident/overlaps/__init__.py                33      6    82%
src/coincident/search/__init__.py                   4      0   100%
src/coincident/search/main.py                      73     42    42%
src/coincident/search/stac.py                      27     14    48%
src/coincident/search/wesm.py                      51     35    31%
-------------------------------------------------------------------
TOTAL                                             342    114    67%
Coverage XML written to file coverage.xml

========================= short test summary info =========================
FAILED tests/test_search.py::test_polygon_out_of_bounds - AttributeError: 'Point' object has no attribute 'exterior'
============= 1 failed, 10 passed, 11 deselected in 5.50s =============
```

</details>
Jack-Hayes commented 1 month ago

lint: command not found

`pixi run lint` output:

```
(coincident:dev) (base) jehayes@DESKTOP-L8QL6EI:~/coincident$ pixi run lint
lint: command not found
Available tasks: docs networktest precommit pylint test
```

pipx: command not found

`pixi run pylint` output:

```
(coincident:dev) (base) jehayes@DESKTOP-L8QL6EI:~/coincident$ pixi run pylint
✨ Pixi task (pylint in dev): pipx run nox -s pylint
pipx: command not found
Available tasks: docs networktest precommit pylint test
```
Jack-Hayes commented 1 month ago

`pixi run docs` ran without error.

scottyhq commented 1 month ago

pipx: command not found

I think we can remove a few tools to simplify dev/CI. The package template has pylint running separately from the other checks because it's recommended not to run it as a pre-commit hook, but I think that advice is mainly aimed at large repos. Also, pixi tasks effectively take the place of nox... so I'm going to try to remove some things! https://pylint.readthedocs.io/en/stable/user_guide/installation/pre-commit-integration.html#pre-commit-integration
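For illustration only, a rough sketch of how the lint step could be wired as plain pixi tasks in `pyproject.toml` instead of going through pipx/nox (the table name follows pixi's pyproject.toml convention; task names and commands here are hypothetical, not the repo's actual configuration):

```toml
# Hypothetical excerpt: dev-feature pixi tasks calling tools directly,
# removing the pipx/nox indirection that fails when pipx is not installed
[tool.pixi.feature.dev.tasks]
test = "pytest -o markers=network -m 'not network' --cov --cov-report=xml --cov-report=term"
lint = "pre-commit run --all-files"   # would make `pixi run lint` a real task
pylint = "pylint src/coincident"      # direct invocation, no nox session needed
```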

scottyhq commented 3 weeks ago

Fixed by #4 ... although it might be good to recommend WSL over alternatives in a contributing guide.