opendatacube / datacube-core

Open Data Cube analyses continental scale Earth Observation data through time
http://www.opendatacube.org
Apache License 2.0

Trouble running Integration Tests within `conda` environment and Trouble Building Documentation #1329

Open lucapaganotti opened 2 years ago

lucapaganotti commented 2 years ago

Expected behaviour

No test failing

Actual behaviour

at the beginning:

 ./check-code.sh integration_tests
+ '[' integration_tests == --with-docker ']'
+ '[' no '!=' yes ']'
+ pycodestyle tests integration_tests examples --max-line-length 120
+ pylint -j 2 --reports no datacube
************* Module datacube.drivers.rio._reader
datacube/drivers/rio/_reader.py:72:26: I1101: Module 'rasterio.crs' has no 'CRS' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module datacube.utils.geometry._warp
datacube/utils/geometry/_warp.py:14:11: I1101: Module 'rasterio.crs' has no 'CRS' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
...

... and at the end:

===================================================== slowest 5 durations ======================================================
69.19s call     integration_tests/test_config_tool.py::test_add_example_dataset_types[datacube-US/Pacific]
30.37s setup    integration_tests/test_index_datasets_search.py::test_index_datasets_search_light[US/Pacific-datacube]
26.67s setup    integration_tests/index/test_index_data.py::test_get_dataset[datacube-UTC]
25.81s setup    integration_tests/index/test_search_legacy.py::test_search_by_product[US/Pacific-datacube]
21.28s setup    integration_tests/index/test_search_eo3.py::test_count_by_product_searches_eo3[datacube-US/Pacific]
=================================================== short test summary info ====================================================
SKIPPED [2] integration_tests/test_3d.py:26: could not import 'dcio_example.xarray_3d': No module named 'dcio_example'
SKIPPED [1] ../../../anaconda3/envs/odc/lib/python3.8/site-packages/_pytest/doctest.py:452: all tests skipped by +SKIP option
XFAIL tests/test_geometry.py::test_lonalt_bounds_more_than_180
  Bounds computation for large geometries in safe mode is broken
XFAIL tests/test_utils_docs.py::test_merge_with_nan
  Merging dictionaries with content of NaN doesn't work currently
ERROR integration_tests/test_cli_output.py::test_cli_product_subcommand[experimental-US/Pacific] - sqlalchemy.exc.Operational...
ERROR integration_tests/test_cli_output.py::test_cli_product_subcommand[experimental-UTC] - sqlalchemy.exc.OperationalError: ...
ERROR integration_tests/test_cli_output.py::test_cli_metadata_subcommand[experimental-US/Pacific] - sqlalchemy.exc.Operationa...
ERROR integration_tests/test_cli_output.py::test_cli_metadata_subcommand[experimental-UTC] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/test_cli_output.py::test_cli_dataset_subcommand[experimental-US/Pacific] - sqlalchemy.exc.Operational...
ERROR integration_tests/test_cli_output.py::test_cli_dataset_subcommand[experimental-UTC] - sqlalchemy.exc.OperationalError: ...
ERROR integration_tests/test_cli_output.py::test_readd_and_update_metadata_product_dataset_command[experimental-US/Pacific]
ERROR integration_tests/test_cli_output.py::test_readd_and_update_metadata_product_dataset_command[experimental-UTC] - sqlalc...
ERROR integration_tests/test_config_tool.py::test_add_example_dataset_types[experimental-US/Pacific] - sqlalchemy.exc.Operati...
ERROR integration_tests/test_config_tool.py::test_add_example_dataset_types[experimental-UTC] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/test_config_tool.py::test_error_returned_on_invalid[experimental-US/Pacific] - sqlalchemy.exc.Operati...
ERROR integration_tests/test_config_tool.py::test_error_returned_on_invalid[experimental-UTC] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/test_config_tool.py::test_config_check[experimental-US/Pacific] - sqlalchemy.exc.OperationalError: (p...
ERROR integration_tests/test_config_tool.py::test_config_check[experimental-UTC] - sqlalchemy.exc.OperationalError: (psycopg2...
ERROR integration_tests/test_config_tool.py::test_list_users_does_not_fail[experimental-US/Pacific] - sqlalchemy.exc.Operatio...
ERROR integration_tests/test_config_tool.py::test_list_users_does_not_fail[experimental-UTC] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/test_config_tool.py::test_db_init_noop[experimental-US/Pacific] - sqlalchemy.exc.OperationalError: (p...
ERROR integration_tests/test_config_tool.py::test_db_init_noop[experimental-UTC] - sqlalchemy.exc.OperationalError: (psycopg2...
ERROR integration_tests/test_config_tool.py::test_db_init[experimental-US/Pacific] - sqlalchemy.exc.OperationalError: (psycop...
ERROR integration_tests/test_config_tool.py::test_db_init[experimental-UTC] - sqlalchemy.exc.OperationalError: (psycopg2.Oper...
ERROR integration_tests/test_config_tool.py::test_add_no_such_product[experimental-US/Pacific] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/test_config_tool.py::test_add_no_such_product[experimental-UTC] - sqlalchemy.exc.OperationalError: (p...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user0-US/Pacific] - sqlalchemy.exc.Opera...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user0-UTC] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user1-US/Pacific] - sqlalchemy.exc.Opera...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user1-UTC] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user2-US/Pacific] - sqlalchemy.exc.Opera...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user2-UTC] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user3-US/Pacific] - sqlalchemy.exc.Opera...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user3-UTC] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/test_model.py::test_crs_parse[experimental-US/Pacific] - sqlalchemy.exc.OperationalError: (psycopg2.O...
ERROR integration_tests/test_model.py::test_crs_parse[experimental-UTC] - sqlalchemy.exc.OperationalError: (psycopg2.Operatio...
ERROR integration_tests/test_validate_ingestion.py::test_invalid_ingestor_config[experimental-US/Pacific] - sqlalchemy.exc.Op...
ERROR integration_tests/test_validate_ingestion.py::test_invalid_ingestor_config[experimental-UTC] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_config_docs.py::test_field_expression_unchanged_postgis[US/Pacific-experimental] - sqlalch...
ERROR integration_tests/index/test_config_docs.py::test_field_expression_unchanged_postgis[UTC-experimental] - sqlalchemy.exc...
ERROR integration_tests/index/test_config_docs.py::test_idempotent_add_dataset_type[experimental-US/Pacific] - sqlalchemy.exc...
ERROR integration_tests/index/test_config_docs.py::test_idempotent_add_dataset_type[experimental-UTC] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_config_docs.py::test_update_dataset[experimental-US/Pacific] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/index/test_config_docs.py::test_update_dataset[experimental-UTC] - sqlalchemy.exc.OperationalError: (...
ERROR integration_tests/index/test_config_docs.py::test_product_update_cli[experimental-US/Pacific] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_config_docs.py::test_product_update_cli[experimental-UTC] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_config_docs.py::test_update_metadata_type[experimental-US/Pacific] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_config_docs.py::test_update_metadata_type[experimental-UTC] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_config_docs.py::test_filter_types_by_fields[experimental-US/Pacific] - sqlalchemy.exc.Oper...
ERROR integration_tests/index/test_config_docs.py::test_filter_types_by_fields[experimental-UTC] - sqlalchemy.exc.Operational...
ERROR integration_tests/index/test_config_docs.py::test_filter_types_by_search[experimental-US/Pacific] - sqlalchemy.exc.Oper...
ERROR integration_tests/index/test_config_docs.py::test_filter_types_by_search[experimental-UTC] - sqlalchemy.exc.Operational...
ERROR integration_tests/index/test_index_data.py::test_archive_datasets[experimental-US/Pacific] - sqlalchemy.exc.Operational...
ERROR integration_tests/index/test_index_data.py::test_archive_datasets[experimental-UTC] - sqlalchemy.exc.OperationalError: ...
ERROR integration_tests/index/test_index_data.py::test_purge_datasets[experimental-US/Pacific] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_index_data.py::test_purge_datasets[experimental-UTC] - sqlalchemy.exc.OperationalError: (p...
ERROR integration_tests/index/test_index_data.py::test_purge_datasets_cli[experimental-US/Pacific] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_index_data.py::test_purge_datasets_cli[experimental-UTC] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/index/test_index_data.py::test_purge_all_datasets_cli[experimental-US/Pacific] - sqlalchemy.exc.Opera...
ERROR integration_tests/index/test_index_data.py::test_purge_all_datasets_cli[experimental-UTC] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/index/test_index_data.py::test_index_duplicate_dataset[experimental-US/Pacific] - sqlalchemy.exc.Oper...
ERROR integration_tests/index/test_index_data.py::test_index_duplicate_dataset[experimental-UTC] - sqlalchemy.exc.Operational...
ERROR integration_tests/index/test_index_data.py::test_has_dataset[experimental-US/Pacific] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/index/test_index_data.py::test_has_dataset[experimental-UTC] - sqlalchemy.exc.OperationalError: (psyc...
ERROR integration_tests/index/test_index_data.py::test_get_dataset[experimental-US/Pacific] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/index/test_index_data.py::test_get_dataset[experimental-UTC] - sqlalchemy.exc.OperationalError: (psyc...
ERROR integration_tests/index/test_index_data.py::test_transactions_api_ctx_mgr[experimental-US/Pacific] - sqlalchemy.exc.Ope...
ERROR integration_tests/index/test_index_data.py::test_transactions_api_ctx_mgr[experimental-UTC] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_index_data.py::test_transactions_api_manual[experimental-US/Pacific] - sqlalchemy.exc.Oper...
ERROR integration_tests/index/test_index_data.py::test_transactions_api_manual[experimental-UTC] - sqlalchemy.exc.Operational...
ERROR integration_tests/index/test_index_data.py::test_transactions_api_hybrid[experimental-US/Pacific] - sqlalchemy.exc.Oper...
ERROR integration_tests/index/test_index_data.py::test_transactions_api_hybrid[experimental-UTC] - sqlalchemy.exc.Operational...
ERROR integration_tests/index/test_index_data.py::test_get_missing_things[experimental-US/Pacific] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_index_data.py::test_get_missing_things[experimental-UTC] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/index/test_pluggable_indexes.py::test_with_standard_index[experimental-US/Pacific] - sqlalchemy.exc.O...
ERROR integration_tests/index/test_pluggable_indexes.py::test_with_standard_index[experimental-UTC] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_pluggable_indexes.py::test_system_init[experimental-US/Pacific] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_pluggable_indexes.py::test_system_init[experimental-UTC] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/index/test_postgis_index.py::test_create_spatial_index[US/Pacific-experimental] - sqlalchemy.exc.Oper...
ERROR integration_tests/index/test_postgis_index.py::test_create_spatial_index[UTC-experimental] - sqlalchemy.exc.Operational...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_index_maintain[US/Pacific-experimental] - sqlalchemy.exc.Op...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_index_maintain[UTC-experimental] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_index_populate[US/Pacific-experimental] - sqlalchemy.exc.Op...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_index_populate[UTC-experimental] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_index_crs_validity[US/Pacific-experimental] - sqlalchemy.ex...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_index_crs_validity[UTC-experimental] - sqlalchemy.exc.Opera...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_extent[US/Pacific-experimental] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_extent[UTC-experimental] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_search[US/Pacific-experimental] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_search[UTC-experimental] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/index/test_search_eo3.py::test_search_dataset_equals_eo3[experimental-US/Pacific] - sqlalchemy.exc.Op...
ERROR integration_tests/index/test_search_eo3.py::test_search_dataset_equals_eo3[experimental-UTC] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_search_eo3.py::test_search_dataset_by_metadata_eo3[experimental-US/Pacific] - sqlalchemy.e...
ERROR integration_tests/index/test_search_eo3.py::test_search_dataset_by_metadata_eo3[experimental-UTC] - sqlalchemy.exc.Oper...
ERROR integration_tests/index/test_search_eo3.py::test_search_day_eo3[experimental-US/Pacific] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_search_eo3.py::test_search_day_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError: (p...
ERROR integration_tests/index/test_search_eo3.py::test_search_dataset_ranges_eo3[experimental-US/Pacific] - sqlalchemy.exc.Op...
ERROR integration_tests/index/test_search_eo3.py::test_search_dataset_ranges_eo3[experimental-UTC] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_search_eo3.py::test_search_globally_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_search_eo3.py::test_search_globally_eo3[experimental-UTC] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_search_eo3.py::test_search_by_product_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_search_eo3.py::test_search_by_product_eo3[experimental-UTC] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_search_eo3.py::test_search_limit_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operational...
ERROR integration_tests/index/test_search_eo3.py::test_search_limit_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError: ...
ERROR integration_tests/index/test_search_eo3.py::test_search_or_expressions_eo3[experimental-US/Pacific] - sqlalchemy.exc.Op...
ERROR integration_tests/index/test_search_eo3.py::test_search_or_expressions_eo3[experimental-UTC] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_search_eo3.py::test_search_returning_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operati...
ERROR integration_tests/index/test_search_eo3.py::test_search_returning_eo3[experimental-UTC] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/index/test_search_eo3.py::test_search_returning_rows_eo3[experimental-US/Pacific] - sqlalchemy.exc.Op...
ERROR integration_tests/index/test_search_eo3.py::test_search_returning_rows_eo3[experimental-UTC] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_search_eo3.py::test_searches_only_type_eo3[experimental-US/Pacific] - sqlalchemy.exc.Opera...
ERROR integration_tests/index/test_search_eo3.py::test_searches_only_type_eo3[experimental-UTC] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/index/test_search_eo3.py::test_search_special_fields_eo3[experimental-US/Pacific] - sqlalchemy.exc.Op...
ERROR integration_tests/index/test_search_eo3.py::test_search_special_fields_eo3[experimental-UTC] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_search_eo3.py::test_search_by_uri_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_search_eo3.py::test_search_by_uri_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/index/test_search_eo3.py::test_search_conflicting_types[experimental-US/Pacific] - sqlalchemy.exc.Ope...
ERROR integration_tests/index/test_search_eo3.py::test_search_conflicting_types[experimental-UTC] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_search_eo3.py::test_fetch_all_of_md_type[experimental-US/Pacific] - sqlalchemy.exc.Operati...
ERROR integration_tests/index/test_search_eo3.py::test_fetch_all_of_md_type[experimental-UTC] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/index/test_search_eo3.py::test_count_searches[experimental-US/Pacific] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_search_eo3.py::test_count_searches[experimental-UTC] - sqlalchemy.exc.OperationalError: (p...
ERROR integration_tests/index/test_search_eo3.py::test_count_by_product_searches_eo3[experimental-US/Pacific] - sqlalchemy.ex...
ERROR integration_tests/index/test_search_eo3.py::test_count_by_product_searches_eo3[experimental-UTC] - sqlalchemy.exc.Opera...
ERROR integration_tests/index/test_search_eo3.py::test_count_time_groups[experimental-US/Pacific] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_search_eo3.py::test_count_time_groups[experimental-UTC] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/index/test_search_eo3.py::test_count_time_groups_cli[experimental-US/Pacific] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_search_eo3.py::test_count_time_groups_cli[experimental-UTC] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_search_eo3.py::test_search_cli_basic[experimental-US/Pacific] - sqlalchemy.exc.Operational...
ERROR integration_tests/index/test_search_eo3.py::test_search_cli_basic[experimental-UTC] - sqlalchemy.exc.OperationalError: ...
ERROR integration_tests/index/test_search_eo3.py::test_cli_info_eo3[experimental-US/Pacific] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_search_eo3.py::test_cli_info_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError: (psy...
ERROR integration_tests/index/test_search_eo3.py::test_find_duplicates_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_search_eo3.py::test_find_duplicates_eo3[experimental-UTC] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_search_eo3.py::test_csv_search_via_cli_eo3[experimental-US/Pacific] - sqlalchemy.exc.Opera...
ERROR integration_tests/index/test_search_eo3.py::test_csv_search_via_cli_eo3[experimental-UTC] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/index/test_search_eo3.py::test_csv_structure_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_search_eo3.py::test_csv_structure_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/index/test_search_eo3.py::test_query_dataset_multi_product_eo3[experimental-US/Pacific] - sqlalchemy....
ERROR integration_tests/index/test_search_eo3.py::test_query_dataset_multi_product_eo3[experimental-UTC] - sqlalchemy.exc.Ope...
ERROR integration_tests/index/test_search_legacy.py::test_search_dataset_equals[experimental-US/Pacific] - sqlalchemy.exc.Ope...
ERROR integration_tests/index/test_search_legacy.py::test_search_dataset_equals[experimental-UTC] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_search_legacy.py::test_search_dataset_by_metadata[experimental-US/Pacific] - sqlalchemy.ex...
ERROR integration_tests/index/test_search_legacy.py::test_search_dataset_by_metadata[experimental-UTC] - sqlalchemy.exc.Opera...
ERROR integration_tests/index/test_search_legacy.py::test_search_day[experimental-US/Pacific] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/index/test_search_legacy.py::test_search_day[experimental-UTC] - sqlalchemy.exc.OperationalError: (ps...
ERROR integration_tests/index/test_search_legacy.py::test_search_globally[experimental-US/Pacific] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_search_legacy.py::test_search_globally[experimental-UTC] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/index/test_search_legacy.py::test_search_limit[experimental-US/Pacific] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/index/test_search_legacy.py::test_search_limit[experimental-UTC] - sqlalchemy.exc.OperationalError: (...
ERROR integration_tests/index/test_search_legacy.py::test_search_returning_rows[experimental-US/Pacific] - sqlalchemy.exc.Ope...
ERROR integration_tests/index/test_search_legacy.py::test_search_returning_rows[experimental-UTC] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_search_legacy.py::test_searches_only_type[experimental-US/Pacific] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_search_legacy.py::test_searches_only_type[experimental-UTC] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_search_legacy.py::test_count_by_product_searches[experimental-US/Pacific] - sqlalchemy.exc...
ERROR integration_tests/index/test_search_legacy.py::test_count_by_product_searches[experimental-UTC] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_search_legacy.py::test_cli_info[experimental-US/Pacific] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/index/test_search_legacy.py::test_cli_info[experimental-UTC] - sqlalchemy.exc.OperationalError: (psyc...
ERROR integration_tests/index/test_search_legacy.py::test_cli_missing_info[experimental-US/Pacific] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_search_legacy.py::test_cli_missing_info[experimental-UTC] - sqlalchemy.exc.OperationalErro...
FAILED tests/api/test_grid_workflow.py::test_gridworkflow_with_time_depth - AssertionError
FAILED tests/api/test_virtual.py::test_aggregate - ValueError: time already exists as coordinate or variable name.
FAILED integration_tests/test_end_to_end.py::test_end_to_end[US/Pacific-datacube] - AssertionError
FAILED integration_tests/test_end_to_end.py::test_end_to_end[UTC-datacube] - AssertionError
================== 4 failed, 733 passed, 3 skipped, 2 xfailed, 18 warnings, 156 errors in 1652.79s (0:27:32) ===================

After make html I get:

$ make html
Running Sphinx v4.5.0
['/home/buck/dev/odc/datacube-core/docs', '/home/buck/dev/odc/datacube-core', '/home/buck/anaconda3/envs/odc/bin', '/home/buck/anaconda3/envs/odc/lib/python38.zip', '/home/buck/anaconda3/envs/odc/lib/python3.8', '/home/buck/anaconda3/envs/odc/lib/python3.8/lib-dynload', '/home/buck/anaconda3/envs/odc/lib/python3.8/site-packages']
making output directory... done

Exception occurred:
  File "/home/buck/anaconda3/envs/odc/lib/python3.8/site-packages/sphinx/theming.py", line 199, in load_external_theme
    theme_entry_points = entry_points(group='sphinx.html_themes')
TypeError: entry_points() got an unexpected keyword argument 'group'
The full traceback has been saved in /tmp/sphinx-err-6bdd1gn0.log, if you want to report the issue to the developers.
Please also report this if it was a user error, so that a better error message can be provided next time.
A bug report can be filed in the tracker at <https://github.com/sphinx-doc/sphinx/issues>. Thanks!
make: *** [Makefile:34: html] Error 2
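As far as I can tell, the TypeError comes from the entry-point lookup Sphinx does for HTML themes. A minimal sketch of a version-tolerant version of that lookup (the group name sphinx.html_themes is taken from the traceback above; the fallback branch is my assumption about the older API):

```python
# Sketch: the `group` keyword for entry_points() only exists in the
# stdlib from Python 3.10, and in recent versions of the
# importlib-metadata backport; an old backport raises exactly the
# TypeError shown above.
def html_theme_entry_points():
    from importlib.metadata import entry_points
    try:
        return entry_points(group="sphinx.html_themes")
    except TypeError:
        # Pre-3.10 style: entry_points() returns a mapping of group -> list.
        return entry_points().get("sphinx.html_themes", [])

themes = list(html_theme_entry_points())
```

If that is right, upgrading the importlib-metadata package inside the conda environment (pip install -U importlib-metadata) looks like the likely fix, though I have not verified it here.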

Steps to reproduce the behaviour

Follow the setup instructions at https://datacube-core.readthedocs.io/en/latest/installation/setup/ubuntu.html

Environment information

Thank you

SpacemanPaul commented 2 years ago

Re integration tests - did you set up a Postgres database and configure it as per the instructions? It looks a lot like you don't have a working database.

But it also looks like I need to take a good look at the conda environment and those instructions, as they are still targeting Ubuntu 20.04.

SpacemanPaul commented 2 years ago

It seems to be something to do with the way conda repackages postgres.

You can work around it by editing integration_tests/agdcintegration.conf: set db_hostname to localhost, and set db_user and db_password to the Postgres username and password of your test database.

The documentation definitely needs updating.

lucapaganotti commented 2 years ago

Hi,

after changing integration_tests/agdcintegration.conf as suggested by @SpacemanPaul, I copied the file to my home folder as .datacube_integration.conf; here is the file:

[datacube]
db_hostname: localhost
db_database: agdcintegration
db_username: buck
db_password: password
index_driver: default

[experimental]
db_hostname:
db_database: odcintegration
index_driver: postgis

[no_such_driver_env]
index_driver: no_such_driver

[null_driver]
index_driver: null

[local_memory]
index_driver: memory

postgres is working I can login directly from the command line:

buck@odcdev:~/dev/odc/datacube-core$ psql -d agdcintegration
psql (12.12 (Ubuntu 12.12-0ubuntu0.20.04.1))
Type "help" for help.

agdcintegration=# 

but this does not work once I activate the conda environment:

(odc_env) buck@odcdev:~/dev/odc/datacube-core$ psql -d agdcintegration
psql: error: connection to server on socket "/tmp/.s.PGSQL.5432" failed: No such file or directory
    Is the server running locally and accepting connections on that socket?
(odc_env) buck@odcdev:~/dev/odc/datacube-core$ 

I can log in to Postgres if I invoke psql this way:

(odc_env) buck@odcdev:~/dev/odc/datacube-core$ psql -d agdcintegration -h localhost
Password for user buck: 
psql (14.5, server 12.12 (Ubuntu 12.12-0ubuntu0.20.04.1))
SSL connection (protocol: TLSv1.3, cipher: TLS_AES_256_GCM_SHA384, bits: 256, compression: off)
Type "help" for help.

agdcintegration=#

but I am prompted for the user's password on the command line to log in.

I restarted the whole datacube-core installation from scratch, this time on a new Ubuntu 20.04 virtual machine (not the latest Ubuntu LTS), following all the steps. Still no luck, but the errors have changed:

(odc_env) buck@odcdev:~/dev/odc/datacube-core$ ./check-code.sh integration_tests
+ '[' integration_tests == --with-docker ']'
+ '[' no '!=' yes ']'
+ pycodestyle tests integration_tests examples --max-line-length 120
+ pylint -j 2 --reports no datacube
************* Module datacube.drivers.rio._reader
datacube/drivers/rio/_reader.py:72:26: I1101: Module 'rasterio.crs' has no 'CRS' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module datacube.utils.geometry._warp
datacube/utils/geometry/_warp.py:14:11: I1101: Module 'rasterio.crs' has no 'CRS' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)

--------------------------------------------------------------------
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

+ pytest -r a --cov datacube --doctest-ignore-import-errors --durations=5 datacube tests integration_tests
ImportError while loading conftest '/home/buck/dev/odc/datacube-core/tests/conftest.py'.
tests/conftest.py:17: in <module>
    from datacube import Datacube
datacube/__init__.py:29: in <module>
    from .api import Datacube
datacube/api/__init__.py:9: in <module>
    from .core import Datacube, TerminateCurrentLoad
datacube/api/core.py:16: in <module>
    from datacube.storage import reproject_and_fuse, BandInfo
datacube/storage/__init__.py:11: in <module>
    from ..drivers.datasource import (
datacube/drivers/__init__.py:9: in <module>
    from .indexes import index_driver_by_name, index_drivers
datacube/drivers/indexes.py:9: in <module>
    from ..index.abstract import AbstractIndexDriver
datacube/index/__init__.py:9: in <module>
    from ._api import index_connect
datacube/index/_api.py:12: in <module>
    from datacube.index.abstract import AbstractIndex
datacube/index/abstract.py:18: in <module>
    from datacube.index.fields import Field
datacube/index/fields.py:13: in <module>
    from datacube.model import Range
datacube/model/__init__.py:20: in <module>
    from datacube.utils import geometry, without_lineage_sources, parse_time, cached_property, uri_to_local_path, \
datacube/utils/geometry/__init__.py:8: in <module>
    from ._base import (
datacube/utils/geometry/_base.py:19: in <module>
    import rasterio                    # type: ignore[import]
../../../anaconda3/envs/odc_env/lib/python3.8/site-packages/rasterio/__init__.py:28: in <module>
    from rasterio._version import gdal_version, get_geos_version, get_proj_version
E   ImportError: libLerc.so.4: cannot open shared object file: No such file or directory
(odc_env) buck@odcdev:~/dev/odc/datacube-core$ 

So, what about the Module 'rasterio.crs' has no 'CRS' member, but source is unavailable message?

And where do I find libLerc? It's not listed among the Ubuntu 20.04 packages.

I'm sorry for the noise, but datacube-core does not install and set up correctly even on a fresh Ubuntu 20.04 LTS box.

SpacemanPaul commented 1 year ago

  1. Ignore Module 'rasterio.crs' has no 'CRS' member, but source is unavailable; it is just a warning from the code checker.

  2. Yes, Postgres in a conda install is weird. If you specify a hostname it seems to work OK.

  3. I have seen the libLerc thing; it appears to be an issue with the conda package for lerc. The following will fix it (if you have set up your conda environment per the instructions):

     cd ~/anaconda3/envs/odc_env/lib
     ln -s libLerc.so libLerc.so.4

  4. You may have better luck installing via pip into a virtualenv on Ubuntu 22.04.

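The mechanics of the fix in point 3 can be sketched in a throwaway directory, so they are visible without touching a real conda env (in practice the directory is ~/anaconda3/envs/odc_env/lib, as above):

```python
# Recreate the libLerc workaround in a scratch directory: give the
# dynamic loader the ".so.4" soname it is asking for, pointing at the
# library file that conda actually ships.
import tempfile
from pathlib import Path

lib = Path(tempfile.mkdtemp())
(lib / "libLerc.so").touch()                     # the file conda ships
(lib / "libLerc.so.4").symlink_to("libLerc.so")  # the soname GDAL/rasterio wants
assert (lib / "libLerc.so.4").resolve() == (lib / "libLerc.so").resolve()
```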
lucapaganotti commented 1 year ago

Hi Paul,

thank you for your answer, I'll check the proposed solution. In the meantime I gave up on the Ubuntu 22.04 box and deleted it, and created a fresh Ubuntu 20.04 box instead. After the initial setup I checked out the stable branch from GitHub (not the development one), and then followed the instructions on GitHub, NOT the ones on readthedocs. Doing this, the tests run, but with a lot of errors (more than 100). I did not expect so many errors; at least the tests run now.

Thank you again.

-- lp


Kirill888 commented 1 year ago

@lucapaganotti conda-based environments assume that postgres uses the /tmp/ folder for unix domain socket connections to the database. On Ubuntu, postgres uses /var/run/postgresql/ instead. When installing from pip, the db libraries are compiled locally and so carry the Ubuntu defaults; with conda, system libraries are not used, so you need to change that default via config:

db_hostname=/var/run/postgresql

if using environment variable as config

export DATACUBE_DB_URL='postgresql:///datacube?host=/var/run/postgresql'

Another option is to set the PGHOST environment variable, which is used by configs that leave the hostname at its default:

export PGHOST=/var/run/postgresql

⬆️ @SpacemanPaul @pindge @caitlinadams this should really be in the docs: conda environments are very common, and connecting via localhost is probably a bit slower, requires setting up passwords for db users, and has some other security implications.
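For what it's worth, the socket directory in that DATACUBE_DB_URL travels as an ordinary `host` query parameter rather than a network hostname, which a short stdlib sketch confirms:

```python
# Sketch: how a unix-socket directory rides in a libpq-style URL.
# An empty network host plus a 'host' query parameter pointing at a
# directory means "connect via the unix socket in that directory".
from urllib.parse import urlsplit, parse_qs

url = "postgresql:///datacube?host=/var/run/postgresql"
parts = urlsplit(url)

print(parts.hostname)                      # None -> no TCP host at all
print(parts.path.lstrip("/"))              # database name: datacube
print(parse_qs(parts.query)["host"][0])    # socket dir: /var/run/postgresql
```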

SpacemanPaul commented 1 year ago

@Kirill888 see @omad 's extensive comments on #1258

lucapaganotti commented 1 year ago

Hi Paul,

I've set up the postgresql connection via .datacube.conf and .datacube_integration.conf in my home folder. Anyway, I'm getting errors during the check-code.sh tests that require a local socket connection to postgres. I'll quote only the last:


integration_tests/index/test_search_legacy.py::test_cli_missing_info[experimental-UTC]
- sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server on socket "/tmp/.s.PGSQL.5432" failed: No such file or directory

Why is this code trying to connect to postgresql via a local socket if I've defined host, database, username and password in the config file? Is there another place where the postgresql connection has to be defined?

The symbolic link to libLerc.so in the specific environment is working, thanks.

To tell the truth, I started with a 22.04 LTS Ubuntu image, but I had the same problems and gave up on it, falling back to 20.04.

Thanks again for your help.


lucapaganotti commented 1 year ago

Hi Kirill,

I've tried changing the db_hostname value in ~/.datacube_integration.conf as you suggested, but I get the same connection errors; the code is still trying to use a local socket, for example:

integration_tests/index/test_search_legacy.py::test_cli_missing_info[experimental-UTC]
- sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server on socket "/tmp/.s.PGSQL.5432" failed: No such file or directory

The postgresql connection is defined the same way in ~/.datacube_integration.conf and ~/.datacube.conf as

# .datacube_integration.conf
[datacube]
db_hostname: localhost
db_database: agdcintegration
db_username: myuser
db_password: mypassword
index_driver: default

I'm sorry, I did not change the [experimental] section of the config file; it was left as it was (I didn't find anything telling me to do so, and I thought only the [datacube] section was meaningful ... my mistake ...). Also, looking at the error trace, it seems that the "experimental" tests are the ones failing. I then replicated the same setup for the [experimental] section:

...
[experimental]
db_hostname: localhost
db_database: agdcintegration
db_username: buck
db_password: password
index_driver: postgis
...
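(As a quick sanity check, both sections can be parsed with Python's stdlib configparser; a sketch with the file contents inlined — datacube's own config loader may apply further defaults on top:)

```python
# Sketch: verify both config sections parse and carry the expected keys,
# using only the stdlib (file contents inlined for illustration).
import configparser

text = """\
[datacube]
db_hostname: localhost
db_database: agdcintegration
index_driver: default

[experimental]
db_hostname: localhost
db_database: agdcintegration
index_driver: postgis
"""

cfg = configparser.ConfigParser()
cfg.read_string(text)
for section in ("datacube", "experimental"):
    print(section, cfg[section]["db_hostname"], cfg[section]["index_driver"])
```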

and now, running the tests again, I'm still having some failures, but not as many as before and, above all, no errors. They are:

...
tests/api/test_grid_workflow.py ..F
...
tests/api/test_virtual.py .............F....
...
integration_tests/test_end_to_end.py FF
...
integration_tests/test_full_ingestion.py ..F.
...

with this summary:

...

TOTAL 13598 1104 92%

========================================================================== slowest 5 durations

66.34s call integration_tests/test_config_tool.py::test_add_example_dataset_types[datacube-US/Pacific]
22.24s setup integration_tests/index/test_search_legacy.py::test_search_returning[US/Pacific-datacube]
18.71s setup integration_tests/index/test_search_eo3.py::test_search_returning_eo3[datacube-US/Pacific]
18.25s setup integration_tests/test_full_ingestion.py::test_process_all_ingest_jobs[US/Pacific-datacube]
17.87s setup integration_tests/index/test_search_legacy.py::test_search_returning_rows[datacube-US/Pacific]

short test summary info

SKIPPED [2] integration_tests/test_3d.py:26: could not import 'dcio_example.xarray_3d': No module named 'dcio_example'
SKIPPED [1] ../../../anaconda3/envs/odc/lib/python3.8/site-packages/_pytest/doctest.py:455: all tests skipped by +SKIP option
XFAIL tests/test_geometry.py::test_lonalt_bounds_more_than_180 - Bounds computation for large geometries in safe mode is broken
XFAIL tests/test_utils_docs.py::test_merge_with_nan - Merging dictionaries with content of NaN doesn't work currently
FAILED tests/api/test_grid_workflow.py::test_gridworkflow_with_time_depth - AssertionError
FAILED tests/api/test_virtual.py::test_aggregate - ValueError: time already exists as coordinate or variable name.
FAILED integration_tests/test_end_to_end.py::test_end_to_end[US/Pacific-datacube]

I don't know if it is OK to go on; I have:

- one module, 'dcio_example', missing
- computation over large geometries broken in safe mode (what does this mean?)
- 3 AssertionErrors

The datacube seems to be running ok

.datacube.conf

db_hostname=localhost
db_database=datacube
db_username=myuser
db_password=mypassword
index_driver=default

I've prepared a product and a dataset metadata file and datacube ingested them.

I would like more information about how to write metadata files in EO3. I wrote my own based on some examples, but I would like detailed documentation, if possible, on what can be done with EO3.

I would then like to view my data with datacube-ows.

Thanks for all the help you gave me.

Have a nice day.





SpacemanPaul commented 1 year ago

Hi Luca, I think you are safe to proceed from here. It's going to get increasingly difficult to debug the remaining issues. The instructions you are following are in desperate need of a full rewrite, and we are in the early stages of overhauling our whole testing framework to make it easier to run the tests.

EO3 is also not as well documented as it could be. You may want to have a look at the opendatacube/eodatasets repository, which includes a tool for testing the validity of EO3 documents. If you are sourcing data from a provider that uses a STAC API, there are tools in the odc-apps-dc-tools package that allow indexing into an ODC instance directly from a STAC API endpoint, with automatic conversion from STAC to EO3.

permezel commented 1 year ago

I am attempting to follow the setup instructions in a Parallels-based VM running Ubuntu 22.04.1 LTS (Jammy Jellyfish). I ran into the issue of psql failing:

dap@odc ~/proj/datacube-core % psql -d agdcintegration                                      
psql: could not connect to server: No such file or directory
    Is the server running locally and accepting
    connections on Unix domain socket "/tmp/.s.PGSQL.5432"?

The reason for this is that psql has been installed into the conda env path. Explicitly using the system-wide one works. I suspect one could locate the config file for the conda-env version and fix the directory for the socket.

dap@odc ~/proj/datacube-core % /usr/bin/psql -d agdcintegration                                      
psql (14.5 (Ubuntu 14.5-0ubuntu0.22.04.1))
Type "help" for help.

agdcintegration=# \q

It appears that the version conda installed does not look for any config file:

Here is the system-wide one (strace extract):

dap@odc ~/proj/datacube-core % strace /usr/bin/psql -d agdcintegration 2>&1 | grep postgresql.conf
newfstatat(AT_FDCWD, "/etc/postgresql/14/./postgresql.conf", 0xaaaae27634a8, 0) = -1 ENOENT (No such file or directory)
newfstatat(AT_FDCWD, "/etc/postgresql/14/main/postgresql.conf", {st_mode=S_IFREG|0644, st_size=29054, ...}, 0) = 0
newfstatat(AT_FDCWD, "/etc/postgresql/14/main/postgresql.conf", {st_mode=S_IFREG|0644, st_size=29054, ...}, 0) = 0
newfstatat(AT_FDCWD, "/etc/postgresql/14/main/postgresql.conf", {st_mode=S_IFREG|0644, st_size=29054, ...}, 0) = 0
openat(AT_FDCWD, "/etc/postgresql/14/main/postgresql.conf", O_RDONLY|O_CLOEXEC) = 4
newfstatat(AT_FDCWD, "/etc/postgresql/14/main/postgresql.conf", {st_mode=S_IFREG|0644, st_size=29054, ...}, 0) = 0
newfstatat(AT_FDCWD, "/etc/postgresql/14/main/postgresql.conf", {st_mode=S_IFREG|0644, st_size=29054, ...}, 0) = 0
openat(AT_FDCWD, "/etc/postgresql/14/main/postgresql.conf", O_RDONLY|O_CLOEXEC) = 4
newfstatat(AT_FDCWD, "/etc/postgresql/14/main/postgresql.conf", {st_mode=S_IFREG|0644, st_size=29054, ...}, 0) = 0
openat(AT_FDCWD, "/etc/postgresql/14/main/postgresql.conf", O_RDONLY|O_CLOEXEC) = 4
dap@odc ~/proj/datacube-core % strace psql -d agdcintegration 2>&1 | grep postgresql.conf 
dap@odc ~/proj/datacube-core % 
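The shadowing itself is plain PATH ordering: conda activate prepends the env's bin directory, so its psql is found first. A throwaway sketch with stub scripts (hypothetical paths, not the real binaries) shows the effect:

```shell
# Demonstrate conda-style PATH shadowing with stub scripts.
workdir=$(mktemp -d)
mkdir -p "$workdir/conda-env/bin" "$workdir/usr/bin"
printf '#!/bin/sh\necho conda-psql\n'  > "$workdir/conda-env/bin/psql"
printf '#!/bin/sh\necho system-psql\n' > "$workdir/usr/bin/psql"
chmod +x "$workdir/conda-env/bin/psql" "$workdir/usr/bin/psql"

# With the env's bin first (as after 'conda activate'), its psql is found:
PATH="$workdir/conda-env/bin:$workdir/usr/bin" psql    # prints: conda-psql
# Reversing the order restores the system binary:
PATH="$workdir/usr/bin:$workdir/conda-env/bin" psql    # prints: system-psql
```

This is why invoking /usr/bin/psql by absolute path bypasses the problem.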
permezel commented 1 year ago

So ignoring the psql issue for now, I run into the next issue for me:

dap@odc ~/proj/datacube-core % ./check-code.sh integration_tests                         
+ '[' integration_tests == --with-docker ']'
+ '[' no '!=' yes ']'
+ pycodestyle tests integration_tests examples --max-line-length 120
./check-code.sh: line 25: pycodestyle: command not found
dap@odc ~/proj/datacube-core % 

This is solved trivially, but becomes a doco issue, perhaps?

dap@odc ~/proj/datacube-core % conda install pycodestyle
Collecting package metadata (current_repodata.json): done
Solving environment: done

## Package Plan ##

  environment location: /home/dap/anaconda3/envs/odc

  added / updated specs:
    - pycodestyle

The following NEW packages will be INSTALLED:

  pycodestyle        pkgs/main/noarch::pycodestyle-2.8.0-pyhd3eb1b0_0 

Proceed ([y]/n)? y

Downloading and Extracting Packages

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
dap@odc ~/proj/datacube-core % 

But then I find I need pylint also:

dap@odc ~/proj/datacube-core % ./check-code.sh integration_tests                         
+ '[' integration_tests == --with-docker ']'
+ '[' no '!=' yes ']'
+ pycodestyle tests integration_tests examples --max-line-length 120
+ pylint -j 2 --reports no datacube
./check-code.sh: line 26: pylint: command not found
dap@odc ~/proj/datacube-core % 
permezel commented 1 year ago

Once pylint is installed, I can proceed to the next hurdle:

dap@odc ~/proj/datacube-core % ./check-code.sh integration_tests
+ '[' integration_tests == --with-docker ']'
+ '[' no '!=' yes ']'
+ pycodestyle tests integration_tests examples --max-line-length 120
+ pylint -j 2 --reports no datacube
************* Module datacube.drivers.rio._reader
datacube/drivers/rio/_reader.py:72:26: I1101: Module 'rasterio.crs' has no 'CRS' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module datacube.utils.geometry._warp
datacube/utils/geometry/_warp.py:14:11: I1101: Module 'rasterio.crs' has no 'CRS' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)

------------------------------------
Your code has been rated at 10.00/10

+ pytest -r a --cov datacube --doctest-ignore-import-errors --durations=5 datacube tests integration_tests
ImportError while loading conftest '/home/dap/proj/datacube-core/integration_tests/conftest.py'.
integration_tests/conftest.py:21: in <module>
    import datacube.scripts.cli_app
datacube/scripts/cli_app.py:17: in <module>
    import datacube.scripts.system    # noqa: F401
datacube/scripts/system.py:13: in <module>
    from datacube.drivers.postgres._connections import IndexSetupError
datacube/drivers/postgres/__init__.py:11: in <module>
    from ._connections import PostgresDb
datacube/drivers/postgres/_connections.py:30: in <module>
    from . import _api
datacube/drivers/postgres/_api.py:31: in <module>
    from ._fields import parse_fields, Expression, PgField, PgExpression  # noqa: F401
datacube/drivers/postgres/_fields.py:14: in <module>
    from psycopg2.extras import NumericRange, DateTimeTZRange
../../anaconda3/envs/odc/lib/python3.10/site-packages/psycopg2/__init__.py:51: in <module>
    from psycopg2._psycopg import (                     # noqa
E   ImportError: /home/dap/anaconda3/envs/odc/lib/python3.10/site-packages/psycopg2/_psycopg.cpython-310-aarch64-linux-gnu.so: undefined symbol: PQencryptPasswordConn
dap@odc ~/proj/datacube-core % 
permezel commented 1 year ago

According to this ancient, closed issue, the libpq version being used is less than 10.x:

dap@odc ~/proj/datacube-core % which python
/home/dap/anaconda3/envs/odc/bin/python
dap@odc ~/proj/datacube-core % python
Python 3.10.8 | packaged by conda-forge | (main, Nov 22 2022, 08:13:45) [GCC 10.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import ctypes
>>> libpq = ctypes.cdll.LoadLibrary("libpq.so")
>>> print(libpq.PQlibVersion())
90606
>>>
>>> from psycopg2._psycopg import PQencryptPasswordConn
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/dap/anaconda3/envs/odc/lib/python3.10/site-packages/psycopg2/__init__.py", line 51, in <module>
    from psycopg2._psycopg import (                     # noqa
ImportError: /home/dap/anaconda3/envs/odc/lib/python3.10/site-packages/psycopg2/_psycopg.cpython-310-aarch64-linux-gnu.so: undefined symbol: PQencryptPasswordConn
>>> 

This seems to be a show stopper.
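For reference, the integer returned by PQlibVersion() encodes the release number: before PostgreSQL 10 it is major*10000 + minor*100 + patch; from 10 onwards it is major*10000 + minor. So 90606 above is libpq 9.6.6, which predates the PostgreSQL 10 addition of PQencryptPasswordConn. A small decoding sketch:

```python
# Sketch: decode libpq's PQlibVersion() integer into a release string.
def decode_pq_version(v: int) -> str:
    major = v // 10000
    if major >= 10:                            # 10+: major*10000 + minor
        return f"{major}.{v % 10000}"
    minor, patch = (v // 100) % 100, v % 100   # pre-10: major.minor.patch
    return f"{major}.{minor}.{patch}"

print(decode_pq_version(90606))   # prints: 9.6.6  (no PQencryptPasswordConn)
print(decode_pq_version(120009))  # prints: 12.9
```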

permezel commented 1 year ago

I did some more poking and found that there are multiple versions of libpq installed, some including and some omitting the desired symbol. Sigh. I am beginning to once again dislike conda.

dap@odc ~/proj/datacube-core % F=$(find ~ -print | grep libpq.so) 

(also getting annoyed at zsh: witness the required $(echo $F))

dap@odc ~/proj/datacube-core % for f in $(echo $F); do echo "=-= $f"; nm $f | grep PQenc; done
=-= /home/dap/anaconda3/lib/libpq.so.5.12
000000000000c684 T PQencryptPassword
000000000000c714 T PQencryptPasswordConn
=-= /home/dap/anaconda3/lib/libpq.so.5
000000000000c684 T PQencryptPassword
000000000000c714 T PQencryptPasswordConn
=-= /home/dap/anaconda3/lib/libpq.so
000000000000c684 T PQencryptPassword
000000000000c714 T PQencryptPasswordConn
=-= /home/dap/anaconda3/envs/odc/lib/libpq.so.5.14
0000000000025460 T PQencryptPassword
00000000000254f0 T PQencryptPasswordConn
=-= /home/dap/anaconda3/envs/odc/lib/libpq.so.5
000000000000b520 T PQencryptPassword
=-= /home/dap/anaconda3/envs/odc/lib/libpq.so.5.9
000000000000b520 T PQencryptPassword
=-= /home/dap/anaconda3/envs/odc/lib/libpq.so
000000000000b520 T PQencryptPassword
=-= /home/dap/anaconda3/pkgs/postgresql-9.6.6-he3421e9_1/lib/libpq.so.5
000000000000b520 T PQencryptPassword
=-= /home/dap/anaconda3/pkgs/postgresql-9.6.6-he3421e9_1/lib/libpq.so.5.9
000000000000b520 T PQencryptPassword
=-= /home/dap/anaconda3/pkgs/postgresql-9.6.6-he3421e9_1/lib/libpq.so
000000000000b520 T PQencryptPassword
=-= /home/dap/anaconda3/pkgs/libpq-14.5-h0f47c37_2/lib/libpq.so.5.14
0000000000025460 T PQencryptPassword
00000000000254f0 T PQencryptPasswordConn
=-= /home/dap/anaconda3/pkgs/libpq-14.5-h0f47c37_2/lib/libpq.so.5
0000000000025460 T PQencryptPassword
00000000000254f0 T PQencryptPasswordConn
=-= /home/dap/anaconda3/pkgs/libpq-14.5-h0f47c37_2/lib/libpq.so
0000000000025460 T PQencryptPassword
00000000000254f0 T PQencryptPasswordConn
=-= /home/dap/anaconda3/pkgs/libpq-12.9-h140f9b7_3/lib/libpq.so.5.12
000000000000c684 T PQencryptPassword
000000000000c714 T PQencryptPasswordConn
=-= /home/dap/anaconda3/pkgs/libpq-12.9-h140f9b7_3/lib/libpq.so.5
000000000000c684 T PQencryptPassword
000000000000c714 T PQencryptPasswordConn
=-= /home/dap/anaconda3/pkgs/libpq-12.9-h140f9b7_3/lib/libpq.so
000000000000c684 T PQencryptPassword
000000000000c714 T PQencryptPasswordConn
dap@odc ~/proj/datacube-core % strace python my.py 2>&1 | grep libpq              
read(3, "libpq.PQlibVersion())\n", 4096) = 22
read(3, "import ctypes\nlibpq = ctypes.cdl"..., 87) = 86
read(3, "import ctypes\nlibpq = ctypes.cdl"..., 4096) = 86
read(3, "import ctypes\nlibpq = ctypes.cdl"..., 4096) = 86
openat(AT_FDCWD, "/home/dap/anaconda3/lib/python3.9/lib-dynload/../../libpq.so", O_RDONLY|O_CLOEXEC) = 3
dap@odc ~/proj/datacube-core % 

So, given that I am running in a conda env built with python=3.10, running Python 3.10, and with a plethora or two of possibly suitable libraries available, it still reaches back into the non-env Python 3.9 install to pull out the wrong library.

Hmm.

permezel commented 1 year ago

If I create the conda env with python=3.9 I run into different issues.

+ conda run -n odc ./check-code.sh integration_tests                                                                                

--------------------------------------------------------------------                                                                
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)                                                                

+ '[' integration_tests == --with-docker ']'
+ '[' no '!=' yes ']'
+ pycodestyle tests integration_tests examples --max-line-length 120
+ pylint -j 2 --reports no datacube
+ pytest -r a --cov datacube --doctest-ignore-import-errors --durations=5 datacube tests integration_tests
ImportError while loading conftest '/home/dap/proj/datacube-core/integration_tests/conftest.py'.
integration_tests/conftest.py:24: in <module>
    from datacube.drivers.postgis import _core as pgis_core
datacube/drivers/postgis/__init__.py:11: in <module>
    from ._connections import PostGisDb
datacube/drivers/postgis/_connections.py:31: in <module>
    from . import _api
datacube/drivers/postgis/_api.py:37: in <module>
    from ._spatial import geom_alchemy
datacube/drivers/postgis/_spatial.py:15: in <module>
    from geoalchemy2 import Geometry
E   ModuleNotFoundError: No module named 'geoalchemy2'

ERROR conda.cli.main_run:execute(47): `conda run ./check-code.sh integration_tests` failed. (See above for error)

However, the prior issue has been resolved:

dap@odc ~/proj/datacube-core % cat my.py
import ctypes
libpq = ctypes.cdll.LoadLibrary("libpq.so")
print(libpq.PQlibVersion())
dap@odc ~/proj/datacube-core % python ./my.py
120009

Adding in the missing module:


dap@odc ~/proj/datacube-core % conda install geoalchemy2
Collecting package metadata (current_repodata.json): done
Solving environment: done

## Package Plan ##

  environment location: /home/dap/anaconda3/envs/odc

  added / updated specs:
    - geoalchemy2

The following NEW packages will be INSTALLED:

  geoalchemy2        pkgs/main/noarch::geoalchemy2-0.9.2-pyhd3eb1b0_0 

Proceed ([y]/n)? y

Downloading and Extracting Packages

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
dap@odc ~/proj/datacube-core % ./check-code.sh 
+ '[' '' == --with-docker ']'
+ '[' no '!=' yes ']'
+ pycodestyle tests integration_tests examples --max-line-length 120
+ pylint -j 2 --reports no datacube

--------------------------------------------------------------------
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

+ pytest -r a --cov datacube --doctest-ignore-import-errors --durations=5 datacube tests
======================================================= test session starts ========================================================
platform linux -- Python 3.9.15, pytest-7.1.2, pluggy-1.0.0
rootdir: /home/dap/proj/datacube-core, configfile: pytest.ini
plugins: cov-3.0.0, anyio-3.5.0, hypothesis-6.29.3
collected 476 items / 1 error / 1 skipped                                                                                          

============================================================== ERRORS ==============================================================
____________________________________________ ERROR collecting tests/test_utils_dask.py _____________________________________________
ImportError while importing test module '/home/dap/proj/datacube-core/tests/test_utils_dask.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../../anaconda3/envs/odc/lib/python3.9/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/test_utils_dask.py:6: in <module>
    import moto
E   ModuleNotFoundError: No module named 'moto'
========================================================= warnings summary =========================================================
../../anaconda3/envs/odc/lib/python3.9/site-packages/botocore/httpsession.py:41
  /home/dap/anaconda3/envs/odc/lib/python3.9/site-packages/botocore/httpsession.py:41: DeprecationWarning: 'urllib3.contrib.pyopenssl' module is deprecated and will be removed in a future release of urllib3 2.x. Read more in this issue: https://github.com/urllib3/urllib3/issues/2680
    from urllib3.contrib.pyopenssl import orig_util_SSLContext as SSLContext

datacube/utils/geometry/_base.py:153
datacube/utils/geometry/_base.py:153
  /home/dap/proj/datacube-core/datacube/utils/geometry/_base.py:153: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
    DEFAULT_WKT_VERSION = (WktVersion.WKT1_GDAL if LooseVersion(rasterio.__gdal_version__) < LooseVersion("3.0.0")

<frozen importlib._bootstrap>:228
  <frozen importlib._bootstrap>:228: RuntimeWarning: numpy.ndarray size changed, may indicate binary incompatibility. Expected 16 from C header, got 88 from PyObject

datacube/storage/masking.py:9
  /home/dap/proj/datacube-core/datacube/storage/masking.py:9: DeprecationWarning: datacube.storage.masking has moved to datacube.utils.masking
    warnings.warn("datacube.storage.masking has moved to datacube.utils.masking",

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

---------- coverage: platform linux, python 3.9.15-final-0 -----------
Name                                         Stmts   Miss  Cover
----------------------------------------------------------------
datacube/__init__.py                             8      0   100%
datacube/__main__.py                             0      0   100%
datacube/api/__init__.py                         4      0   100%
datacube/api/core.py                           384    330    14%
datacube/api/grid_workflow.py                  137    100    27%
datacube/api/query.py                          213    175    18%
datacube/config.py                             126     98    22%
datacube/drivers/__init__.py                     5      0   100%
datacube/drivers/_tools.py                      14      9    36%
datacube/drivers/_types.py                      46      0   100%
datacube/drivers/datasource.py                  30      0   100%
datacube/drivers/driver_cache.py                29     25    14%
datacube/drivers/indexes.py                     24     13    46%
datacube/drivers/netcdf/__init__.py              4      0   100%
datacube/drivers/netcdf/_safestrings.py         41     22    46%
datacube/drivers/netcdf/_write.py               55     46    16%
datacube/drivers/netcdf/driver.py               36     14    61%
datacube/drivers/netcdf/writer.py              168    137    18%
datacube/drivers/postgis/__init__.py             4      0   100%
datacube/drivers/postgis/_api.py               396    297    25%
datacube/drivers/postgis/_connections.py       132     74    44%
datacube/drivers/postgis/_core.py               99     74    25%
datacube/drivers/postgis/_fields.py            266    141    47%
datacube/drivers/postgis/_schema.py            107      1    99%
datacube/drivers/postgis/_spatial.py            88     58    34%
datacube/drivers/postgis/sql.py                 55     12    78%
datacube/drivers/postgres/__init__.py            4      0   100%
datacube/drivers/postgres/_api.py              302    208    31%
datacube/drivers/postgres/_connections.py      105     56    47%
datacube/drivers/postgres/_core.py             107     82    23%
datacube/drivers/postgres/_dynamic.py           64     52    19%
datacube/drivers/postgres/_fields.py           268    140    48%
datacube/drivers/postgres/_schema.py            14      0   100%
datacube/drivers/postgres/sql.py                55     12    78%
datacube/drivers/readers.py                     41     26    37%
datacube/drivers/rio/__init__.py                 1      0   100%
datacube/drivers/rio/_reader.py                134     84    37%
datacube/drivers/writers.py                     20     10    50%
datacube/execution/__init__.py                   0      0   100%
datacube/execution/worker.py                    30     20    33%
datacube/executor.py                           169    144    15%
datacube/helpers.py                             19     14    26%
datacube/index/__init__.py                       6      0   100%
datacube/index/_api.py                          14      8    43%
datacube/index/abstract.py                     378    140    63%
datacube/index/eo3.py                          101     83    18%
datacube/index/exceptions.py                    10      2    80%
datacube/index/fields.py                        31     16    48%
datacube/index/hl.py                           158    136    14%
datacube/index/memory/__init__.py                1      0   100%
datacube/index/memory/_datasets.py             475    409    14%
datacube/index/memory/_fields.py                11      5    55%
datacube/index/memory/_metadata_types.py        69     46    33%
datacube/index/memory/_products.py             107     85    21%
datacube/index/memory/_users.py                 38     27    29%
datacube/index/memory/index.py                  67     25    63%
datacube/index/null/__init__.py                  1      0   100%
datacube/index/null/_datasets.py                63     24    62%
datacube/index/null/_metadata_types.py          16      4    75%
datacube/index/null/_products.py                23      7    70%
datacube/index/null/_users.py                   10      2    80%
datacube/index/null/index.py                    62     22    65%
datacube/index/postgis/__init__.py               0      0   100%
datacube/index/postgis/_datasets.py            377    310    18%
datacube/index/postgis/_metadata_types.py       82     58    29%
datacube/index/postgis/_products.py            126    100    21%
datacube/index/postgis/_transaction.py          27     12    56%
datacube/index/postgis/_users.py                21     11    48%
datacube/index/postgis/index.py                103     52    50%
datacube/index/postgres/__init__.py              0      0   100%
datacube/index/postgres/_datasets.py           366    300    18%
datacube/index/postgres/_metadata_types.py      82     58    29%
datacube/index/postgres/_products.py           123     97    21%
datacube/index/postgres/_transaction.py         27     12    56%
datacube/index/postgres/_users.py               21     11    48%
datacube/index/postgres/index.py                90     45    50%
datacube/model/__init__.py                     515    321    38%
datacube/model/_base.py                          6      3    50%
datacube/model/fields.py                        83     56    33%
datacube/model/utils.py                        164    131    20%
datacube/scripts/__init__.py                     0      0   100%
datacube/scripts/cli_app.py                      8      0   100%
datacube/scripts/dataset.py                    358    259    28%
datacube/scripts/ingest.py                     265    209    21%
datacube/scripts/metadata.py                    95     60    37%
datacube/scripts/product.py                    131     86    34%
datacube/scripts/search_tool.py                 75     34    55%
datacube/scripts/system.py                      54     30    44%
datacube/scripts/user.py                        62     20    68%
datacube/storage/__init__.py                     5      0   100%
datacube/storage/_base.py                       56     44    21%
datacube/storage/_hdf5.py                        2      0   100%
datacube/storage/_load.py                       86     64    26%
datacube/storage/_read.py                      127    114    10%
datacube/storage/_rio.py                       143     95    34%
datacube/storage/masking.py                      3      0   100%
datacube/testutils/__init__.py                 208    169    19%
datacube/testutils/geom.py                      66     50    24%
datacube/testutils/io.py                       204    174    15%
datacube/testutils/iodriver.py                  31     19    39%
datacube/testutils/threads.py                   15      8    47%
datacube/ui/__init__.py                          5      0   100%
datacube/ui/click.py                           163     91    44%
datacube/ui/common.py                           52     42    19%
datacube/ui/expression.py                       46     24    48%
datacube/ui/task_app.py                        159    116    27%
datacube/utils/__init__.py                      10      0   100%
datacube/utils/_misc.py                          7      2    71%
datacube/utils/aws/__init__.py                 180    144    20%
datacube/utils/changes.py                       75     50    33%
datacube/utils/cog.py                          104     84    19%
datacube/utils/dask.py                          93     68    27%
datacube/utils/dates.py                         69     43    38%
datacube/utils/documents.py                    280    156    44%
datacube/utils/generic.py                       39     29    26%
datacube/utils/geometry/__init__.py              5      0   100%
datacube/utils/geometry/_base.py               765    496    35%
datacube/utils/geometry/_warp.py                47     33    30%
datacube/utils/geometry/gbox.py                109     79    28%
datacube/utils/geometry/tools.py               269    233    13%
datacube/utils/io.py                            31     24    23%
datacube/utils/masking.py                      118     98    17%
datacube/utils/math.py                         116     96    17%
datacube/utils/py.py                            33     19    42%
datacube/utils/rio/__init__.py                   3      0   100%
datacube/utils/rio/_rio.py                      65     47    28%
datacube/utils/serialise.py                     44     25    43%
datacube/utils/uris.py                         108     79    27%
datacube/utils/xarray_geoextensions.py         102     84    18%
datacube/virtual/__init__.py                    89     64    28%
datacube/virtual/catalog.py                     37     20    46%
datacube/virtual/expr.py                        47     19    60%
datacube/virtual/impl.py                       449    326    27%
datacube/virtual/transformations.py            213    166    22%
datacube/virtual/utils.py                       31     23    26%
----------------------------------------------------------------
TOTAL                                        13615   9307    32%

===================================================== short test summary info ======================================================
SKIPPED [1] ../../anaconda3/envs/odc/lib/python3.9/site-packages/_pytest/doctest.py:548: unable to import module PosixPath('/home/dap/proj/datacube-core/tests/test_utils_dask.py')
ERROR tests/test_utils_dask.py
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
============================================= 1 skipped, 5 warnings, 1 error in 2.90s ==============================================
dap@odc ~/proj/datacube-core % 
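The collection error in `tests/test_utils_dask.py` above is typically a missing test dependency; here it was `moto` (the AWS mocking library the dask/S3 tests import), as the next comment confirms. Assuming `pip` is used inside the same conda environment, a minimal fix is:

```shell
# moto provides the mocked AWS services that tests/test_utils_dask.py imports;
# without it, pytest cannot even collect the module
pip install moto
```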
permezel commented 1 year ago

After adding `moto` to the environment, I get:

dap@odc ~/proj/datacube-core % ./check-code.sh   
+ '[' '' == --with-docker ']'
+ '[' no '!=' yes ']'
+ pycodestyle tests integration_tests examples --max-line-length 120
+ pylint -j 2 --reports no datacube

--------------------------------------------------------------------
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

+ pytest -r a --cov datacube --doctest-ignore-import-errors --durations=5 datacube tests
======================================================= test session starts ========================================================
platform linux -- Python 3.9.15, pytest-7.1.2, pluggy-1.0.0
rootdir: /home/dap/proj/datacube-core, configfile: pytest.ini
plugins: cov-3.0.0, anyio-3.5.0, hypothesis-6.29.3
collected 489 items                                                                                                                

datacube/api/query.py .                                                                                                      [  0%]
datacube/drivers/postgis/_connections.py .                                                                                   [  0%]
datacube/drivers/postgis/_core.py ...                                                                                        [  1%]
datacube/drivers/postgis/_fields.py ..                                                                                       [  1%]
datacube/drivers/postgres/_connections.py .                                                                                  [  1%]
datacube/drivers/postgres/_core.py ...                                                                                       [  2%]
datacube/drivers/postgres/_dynamic.py .                                                                                      [  2%]
datacube/drivers/postgres/_fields.py ..                                                                                      [  2%]
datacube/model/__init__.py ..                                                                                                [  3%]
datacube/ui/click.py .                                                                                                       [  3%]
datacube/utils/masking.py s.                                                                                                 [  3%]
datacube/utils/math.py .                                                                                                     [  4%]
datacube/utils/geometry/_base.py .......                                                                                     [  5%]
tests/test_3d.py ..                                                                                                          [  5%]
tests/test_concurrent_executor.py ..                                                                                         [  6%]
tests/test_config.py .......                                                                                                 [  7%]
tests/test_driver.py ..FF......                                                                                              [  9%]
tests/test_dynamic_db_passwd.py .                                                                                            [ 10%]
tests/test_eo3.py .......                                                                                                    [ 11%]
tests/test_gbox_ops.py ..                                                                                                    [ 11%]
tests/test_geometry.py ....F..............................................F................x                                 [ 25%]
tests/test_load_data.py ........                                                                                             [ 27%]
tests/test_metadata_fields.py .....                                                                                          [ 28%]
tests/test_model.py ...............                                                                                          [ 31%]
tests/test_testutils.py ....                                                                                                 [ 32%]
tests/test_utils_aws.py ............                                                                                         [ 34%]
tests/test_utils_changes.py ..                                                                                               [ 35%]
tests/test_utils_cog.py .............                                                                                        [ 38%]
tests/test_utils_dask.py .............                                                                                       [ 40%]
tests/test_utils_dates.py ....                                                                                               [ 41%]
tests/test_utils_docs.py ...........................................E......x........                                         [ 53%]
tests/test_utils_generic.py ...                                                                                              [ 54%]
tests/test_utils_other.py ..............................................                                                     [ 63%]
tests/test_utils_rio.py ......                                                                                               [ 64%]
tests/test_warp.py ...                                                                                                       [ 65%]
tests/test_xarray_extension.py .......                                                                                       [ 66%]
tests/api/test_core.py ....                                                                                                  [ 67%]
tests/api/test_grid_workflow.py ..F                                                                                          [ 68%]
tests/api/test_masking.py ..........                                                                                         [ 70%]
tests/api/test_query.py ..............................                                                                       [ 76%]
tests/api/test_virtual.py .............F....                                                                                 [ 80%]
tests/drivers/test_rio_reader.py ........                                                                                    [ 81%]
tests/index/test_api_index_dataset.py ...                                                                                    [ 82%]
tests/index/test_fields.py ....                                                                                              [ 83%]
tests/index/test_hl_index.py .                                                                                               [ 83%]
tests/index/test_query.py .                                                                                                  [ 83%]
tests/index/test_validate_dataset_type.py ....................                                                               [ 87%]
tests/scripts/test_search_tool.py ..                                                                                         [ 88%]
tests/storage/test_base.py ...                                                                                               [ 88%]
tests/storage/test_netcdfwriter.py ...........                                                                               [ 91%]
tests/storage/test_storage.py .................                                                                              [ 94%]
tests/storage/test_storage_load.py ..                                                                                        [ 94%]
tests/storage/test_storage_read.py ......                                                                                    [ 96%]
tests/ui/test_common.py ..E                                                                                                  [ 96%]
tests/ui/test_expression_parsing.py .........                                                                                [ 98%]
tests/ui/test_task_app.py .......                                                                                            [100%]

============================================================== ERRORS ==============================================================
____________________________________________ ERROR at setup of test_read_docs_from_http ____________________________________________
file /home/dap/proj/datacube-core/tests/test_utils_docs.py, line 200
  def test_read_docs_from_http(sample_document_files, httpserver):
E       fixture 'httpserver' not found
>       available fixtures: anyio_backend, anyio_backend_name, anyio_backend_options, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, cov, dask_client, data_folder, doctest_namespace, eo3_dataset_doc, eo3_dataset_file, eo3_dataset_s2, eo3_metadata, eo3_metadata_file, eo_dataset_doc, eo_dataset_file, example_gdal_path, example_netcdf_path, monkeypatch, no_cover, no_crs_gdal_path, non_geo_dataset_doc, non_geo_dataset_file, odc_style_xr_dataset, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, sample_document_files, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpnetcdf_filename, without_aws_env, workdir
>       use 'pytest --fixtures [testpath]' for help on them.

/home/dap/proj/datacube-core/tests/test_utils_docs.py:200
____________________________________________ ERROR at setup of test_ui_path_doc_stream _____________________________________________
file /home/dap/proj/datacube-core/tests/ui/test_common.py, line 90
  def test_ui_path_doc_stream(httpserver):
E       fixture 'httpserver' not found
>       available fixtures: anyio_backend, anyio_backend_name, anyio_backend_options, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, cov, dask_client, data_folder, doctest_namespace, eo3_dataset_doc, eo3_dataset_file, eo3_dataset_s2, eo3_metadata, eo3_metadata_file, eo_dataset_doc, eo_dataset_file, example_gdal_path, example_netcdf_path, monkeypatch, no_cover, no_crs_gdal_path, non_geo_dataset_doc, non_geo_dataset_file, odc_style_xr_dataset, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpnetcdf_filename, without_aws_env, workdir
>       use 'pytest --fixtures [testpath]' for help on them.

/home/dap/proj/datacube-core/tests/ui/test_common.py:90
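Both fixture errors point to the same missing plugin: the `httpserver` fixture comes from `pytest-httpserver`, which does not appear in the list of available fixtures above. A likely fix (again assuming `pip` inside the same conda environment) is:

```shell
# pytest-httpserver supplies the `httpserver` fixture used by
# test_read_docs_from_http and test_ui_path_doc_stream
pip install pytest-httpserver
# confirm the fixture is now registered
pytest --fixtures | grep httpserver
```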
============================================================= FAILURES =============================================================
_______________________________________________________ test_writer_drivers ________________________________________________________

    def test_writer_drivers():
        available_drivers = writer_drivers()
>       assert 'netcdf' in available_drivers
E       AssertionError: assert 'netcdf' in []

tests/test_driver.py:44: AssertionError
-------------------------------------------------------- Captured log call ---------------------------------------------------------
WARNING  datacube.drivers.driver_cache:driver_cache.py:38 Failed to resolve driver datacube.plugins.io.write::netcdf
WARNING  datacube.drivers.driver_cache:driver_cache.py:39 Error was: VersionConflict(rasterio 1.2.10 (/home/dap/anaconda3/envs/odc/lib/python3.9/site-packages), Requirement.parse('rasterio>=1.3.2'))
________________________________________________________ test_index_drivers ________________________________________________________

    def test_index_drivers():
        available_drivers = index_drivers()
        assert 'default' in available_drivers
>       assert 'null' in available_drivers
E       AssertionError: assert 'null' in ['default', 'postgres']

tests/test_driver.py:51: AssertionError
-------------------------------------------------------- Captured log call ---------------------------------------------------------
WARNING  datacube.drivers.driver_cache:driver_cache.py:38 Failed to resolve driver datacube.plugins.index::default
WARNING  datacube.drivers.driver_cache:driver_cache.py:39 Error was: VersionConflict(rasterio 1.2.10 (/home/dap/anaconda3/envs/odc/lib/python3.9/site-packages), Requirement.parse('rasterio>=1.3.2'))
WARNING  datacube.drivers.driver_cache:driver_cache.py:38 Failed to resolve driver datacube.plugins.index::memory
WARNING  datacube.drivers.driver_cache:driver_cache.py:39 Error was: VersionConflict(rasterio 1.2.10 (/home/dap/anaconda3/envs/odc/lib/python3.9/site-packages), Requirement.parse('rasterio>=1.3.2'))
WARNING  datacube.drivers.driver_cache:driver_cache.py:38 Failed to resolve driver datacube.plugins.index::null
WARNING  datacube.drivers.driver_cache:driver_cache.py:39 Error was: VersionConflict(rasterio 1.2.10 (/home/dap/anaconda3/envs/odc/lib/python3.9/site-packages), Requirement.parse('rasterio>=1.3.2'))
WARNING  datacube.drivers.driver_cache:driver_cache.py:38 Failed to resolve driver datacube.plugins.index::postgis
WARNING  datacube.drivers.driver_cache:driver_cache.py:39 Error was: VersionConflict(rasterio 1.2.10 (/home/dap/anaconda3/envs/odc/lib/python3.9/site-packages), Requirement.parse('rasterio>=1.3.2'))
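All of these driver-resolution warnings report the same root cause: the environment has rasterio 1.2.10 while datacube's entry points require `rasterio>=1.3.2`, so `pkg_resources` raises `VersionConflict` and the drivers (including `netcdf` and `null`) never load. Upgrading rasterio in the conda environment should clear both `test_driver.py` failures:

```shell
# upgrade rasterio to satisfy datacube's rasterio>=1.3.2 requirement
conda install -c conda-forge 'rasterio>=1.3.2'
```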
_____________________________________________________________ test_ops _____________________________________________________________

    def test_ops():
        box1 = geometry.box(10, 10, 30, 30, crs=epsg4326)
        box2 = geometry.box(20, 10, 40, 30, crs=epsg4326)
        box3 = geometry.box(20, 10, 40, 30, crs=epsg4326)
        box4 = geometry.box(40, 10, 60, 30, crs=epsg4326)
        no_box = None

        assert box1 != box2
        assert box2 == box3
        assert box3 != no_box

        union1 = box1.union(box2)
        assert union1.area == 600.0

        with pytest.raises(geometry.CRSMismatchError):
            box1.union(box2.to_crs(epsg3857))

        inter1 = box1.intersection(box2)
        assert bool(inter1)
        assert inter1.area == 200.0

        inter2 = box1.intersection(box4)
        assert not bool(inter2)
        assert inter2.is_empty
        # assert not inter2.is_valid  TODO: what's going on here?

        diff1 = box1.difference(box2)
        assert diff1.area == 200.0

        symdiff1 = box1.symmetric_difference(box2)
        assert symdiff1.area == 400.0

        # test segmented
        line = geometry.line([(0, 0), (0, 5), (10, 5)], epsg4326)
        line2 = line.segmented(2)
        assert line.crs is line2.crs
        assert line.length == line2.length
        assert len(line.coords) < len(line2.coords)
        poly = geometry.polygon([(0, 0), (0, 5), (10, 5)], epsg4326)
        poly2 = poly.segmented(2)
        assert poly.crs is poly2.crs
        assert poly.length == poly2.length
>       assert poly.area == poly2.area
E       assert 25.0 == 25.000000000000004
E        +  where 25.0 = Geometry(POLYGON ((0 0, 0 5, 10 5, 0 0)), EPSG:4326).area
E        +  and   25.000000000000004 = Geometry(POLYGON ((0 0, 0 5, 2 5, 4 5, 6 5, 8 5, 10 5, 8.211145618000169 4.105572809000084, 6.422291236000336 3.211145...4000506 2.316718427000253, 2.844582472000673 1.422291236000337, 1.055728090000841 0.5278640450004207, 0 0)), EPSG:4326).area

tests/test_geometry.py:232: AssertionError
_________________________________________________________ test_crs_compat __________________________________________________________

    def test_crs_compat():
        import rasterio.crs

        crs = CRS("epsg:3577")
        assert crs.epsg == 3577
        crs2 = CRS(crs)
        assert crs.epsg == crs2.epsg

        crs_rio = rasterio.crs.CRS(init='epsg:3577')
>       assert CRS(crs_rio).epsg == 3577
E       assert None == 3577
E        +  where None = CRS('PROJCS["GDA94 / Australian Albers",GEOGCS["GDA94",DATUM["Geocentric_Datum_of_Australia_1994",SPHEROID["GRS 1980",...ng",0],UNIT["metre",1,AUTHORITY["EPSG","9001"]],AXIS["Easting",EAST],AXIS["Northing",NORTH],AUTHORITY["EPSG","3577"]]').epsg
E        +    where CRS('PROJCS["GDA94 / Australian Albers",GEOGCS["GDA94",DATUM["Geocentric_Datum_of_Australia_1994",SPHEROID["GRS 1980",...ng",0],UNIT["metre",1,AUTHORITY["EPSG","9001"]],AXIS["Easting",EAST],AXIS["Northing",NORTH],AUTHORITY["EPSG","3577"]]') = CRS(CRS.from_epsg(3577))

tests/test_geometry.py:1454: AssertionError
________________________________________________ test_gridworkflow_with_time_depth _________________________________________________

    def test_gridworkflow_with_time_depth():
        """Test GridWorkflow with time series.
        Also test `Tile` methods `split` and `split_by_time`
        """
        fakecrs = geometry.CRS("EPSG:4326")

        grid = 100  # spatial frequency in crs units
        pixel = 10  # square pixel linear dimension in crs units
        # if cell(0,0) has lower left corner at grid origin,
        # and cell indices increase toward upper right,
        # then this will be cell(1,-2).
        gridspec = GridSpec(
            crs=fakecrs, tile_size=(grid, grid), resolution=(-pixel, pixel)
        )  # e.g. product gridspec

        def make_fake_datasets(num_datasets):
            start_time = datetime.datetime(2001, 2, 15)
            delta = datetime.timedelta(days=16)
            for i in range(num_datasets):
                fakedataset = MagicMock()
                fakedataset.extent = geometry.box(
                    left=grid, bottom=-grid, right=2 * grid, top=-2 * grid, crs=fakecrs
                )
                fakedataset.center_time = start_time + (delta * i)
                yield fakedataset

        fakeindex = PickableMock()
        fakeindex.datasets.get_field_names.return_value = ["time"]  # permit query on time
        fakeindex.datasets.search_eager.return_value = list(make_fake_datasets(100))

        # ------ test with time dimension ----

        gw = GridWorkflow(fakeindex, gridspec)
        query = dict(product="fake_product_name")

        cells = gw.list_cells(**query)
        for cell_index, cell in cells.items():

            #  test Tile.split()
>           for label, tile in cell.split("time"):

tests/api/test_grid_workflow.py:277: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
datacube/api/grid_workflow.py:97: in split
    yield self.sources[dim].values[i], self[tuple(indexer)]
datacube/api/grid_workflow.py:79: in __getitem__
    sources = _fast_slice(self.sources, chunk[:len(self.sources.shape)])
datacube/api/grid_workflow.py:27: in _fast_slice
    return xarray.DataArray(variable, coords=coords, fastpath=True)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <[RecursionError('maximum recursion depth exceeded') raised in repr()] DataArray object at 0xffff28718820>
data = <xarray.Variable (time: 1)>
array([(<MagicMock id='281470553854256'>,)], dtype=object)
coords = OrderedDict([('time', <xarray.Variable (time: 1)>
array(['2001-02-15T00:00:00.000000000'], dtype='datetime64[ns]')
Attributes:
    units:    seconds since 1970-01-01 00:00:00)])
dims = None, name = None, attrs = None, indexes = None, fastpath = True

    def __init__(
        self,
        data: Any = dtypes.NA,
        coords: Sequence[Sequence[Any] | pd.Index | DataArray]
        | Mapping[Any, Any]
        | None = None,
        dims: Hashable | Sequence[Hashable] | None = None,
        name: Hashable = None,
        attrs: Mapping = None,
        # internal parameters
        indexes: dict[Hashable, Index] = None,
        fastpath: bool = False,
    ) -> None:
        if fastpath:
            variable = data
            assert dims is None
            assert attrs is None
>           assert indexes is not None
E           AssertionError

../../anaconda3/envs/odc/lib/python3.9/site-packages/xarray/core/dataarray.py:390: AssertionError
__________________________________________________________ test_aggregate __________________________________________________________

dc = <MagicMock id='281471359713136'>
query = {'lat': (-35.2, -35.21), 'lon': (149.0, 149.01), 'time': ('2014-01-01', '2014-03-01')}
catalog = <datacube.virtual.catalog.Catalog object at 0xffff75608190>

    def test_aggregate(dc, query, catalog):
        aggr = catalog['mean_blue']

        measurements = aggr.output_measurements({product.name: product
                                                 for product in dc.index.products.get_all()})
        assert 'blue' in measurements

        with mock.patch('datacube.virtual.impl.Datacube') as mock_datacube, warnings.catch_warnings():
            warnings.simplefilter("ignore")
            mock_datacube.load_data = load_data
            mock_datacube.group_datasets = group_datasets
>           data = aggr.load(dc, **query)

tests/api/test_virtual.py:492: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
datacube/virtual/impl.py:310: in load
    return self.fetch(grouped, **query)
datacube/virtual/impl.py:549: in fetch
    result = xarray.concat(groups, dim=dim).assign_attrs(**select_unique([g.attrs for g in groups]))
../../anaconda3/envs/odc/lib/python3.9/site-packages/xarray/core/concat.py:243: in concat
    return _dataset_concat(
../../anaconda3/envs/odc/lib/python3.9/site-packages/xarray/core/concat.py:485: in _dataset_concat
    datasets = [cast(T_Dataset, ds.expand_dims(dim)) for ds in datasets]
../../anaconda3/envs/odc/lib/python3.9/site-packages/xarray/core/concat.py:485: in <listcomp>
    datasets = [cast(T_Dataset, ds.expand_dims(dim)) for ds in datasets]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <xarray.Dataset>
Dimensions:      (y: 50, x: 43)
Coordinates:
  * y            (y) float64 -3.947e+06 -3.947e+06 ... -...nan nan nan nan nan nan ... nan nan nan nan nan
Attributes:
    crs:           EPSG:3577
    grid_mapping:  spatial_ref
dim = {'time': 1}, axis = [0], dim_kwargs = {}, d = 'time'

    def expand_dims(
        self,
        dim: None | Hashable | Sequence[Hashable] | Mapping[Any, Any] = None,
        axis: None | int | Sequence[int] = None,
        **dim_kwargs: Any,
    ) -> Dataset:
        """Return a new object with an additional axis (or axes) inserted at
        the corresponding position in the array shape.  The new object is a
        view into the underlying array, not a copy.

        If dim is already a scalar coordinate, it will be promoted to a 1D
        coordinate consisting of a single value.

        Parameters
        ----------
        dim : hashable, sequence of hashable, mapping, or None
            Dimensions to include on the new variable. If provided as hashable
            or sequence of hashable, then dimensions are inserted with length
            1. If provided as a mapping, then the keys are the new dimensions
            and the values are either integers (giving the length of the new
            dimensions) or array-like (giving the coordinates of the new
            dimensions).
        axis : int, sequence of int, or None, default: None
            Axis position(s) where new axis is to be inserted (position(s) on
            the result array). If a sequence of integers is passed,
            multiple axes are inserted. In this case, dim arguments should be
            same length list. If axis=None is passed, all the axes will be
            inserted to the start of the result array.
        **dim_kwargs : int or sequence or ndarray
            The keywords are arbitrary dimensions being inserted and the values
            are either the lengths of the new dims (if int is given), or their
            coordinates. Note, this is an alternative to passing a dict to the
            dim kwarg and will only be used if dim is None.

        Returns
        -------
        expanded : Dataset
            This object, but with additional dimension(s).

        See Also
        --------
        DataArray.expand_dims
        """
        if dim is None:
            pass
        elif isinstance(dim, Mapping):
            # We're later going to modify dim in place; don't tamper with
            # the input
            dim = dict(dim)
        elif isinstance(dim, int):
            raise TypeError(
                "dim should be hashable or sequence of hashables or mapping"
            )
        elif isinstance(dim, str) or not isinstance(dim, Sequence):
            dim = {dim: 1}
        elif isinstance(dim, Sequence):
            if len(dim) != len(set(dim)):
                raise ValueError("dims should not contain duplicate values.")
            dim = {d: 1 for d in dim}

        dim = either_dict_or_kwargs(dim, dim_kwargs, "expand_dims")
        assert isinstance(dim, MutableMapping)

        if axis is None:
            axis = list(range(len(dim)))
        elif not isinstance(axis, Sequence):
            axis = [axis]

        if len(dim) != len(axis):
            raise ValueError("lengths of dim and axis should be identical.")
        for d in dim:
            if d in self.dims:
                raise ValueError(f"Dimension {d} already exists.")
            if d in self._variables and not utils.is_scalar(self._variables[d]):
>               raise ValueError(
                    "{dim} already exists as coordinate or"
                    " variable name.".format(dim=d)
                )
E               ValueError: time already exists as coordinate or variable name.

../../anaconda3/envs/odc/lib/python3.9/site-packages/xarray/core/dataset.py:3915: ValueError
========================================================= warnings summary =========================================================
../../anaconda3/envs/odc/lib/python3.9/site-packages/botocore/httpsession.py:41
  /home/dap/anaconda3/envs/odc/lib/python3.9/site-packages/botocore/httpsession.py:41: DeprecationWarning: 'urllib3.contrib.pyopenssl' module is deprecated and will be removed in a future release of urllib3 2.x. Read more in this issue: https://github.com/urllib3/urllib3/issues/2680
    from urllib3.contrib.pyopenssl import orig_util_SSLContext as SSLContext

datacube/utils/geometry/_base.py:153
datacube/utils/geometry/_base.py:153
  /home/dap/proj/datacube-core/datacube/utils/geometry/_base.py:153: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
    DEFAULT_WKT_VERSION = (WktVersion.WKT1_GDAL if LooseVersion(rasterio.__gdal_version__) < LooseVersion("3.0.0")

<frozen importlib._bootstrap>:228
  <frozen importlib._bootstrap>:228: RuntimeWarning: numpy.ndarray size changed, may indicate binary incompatibility. Expected 16 from C header, got 88 from PyObject

datacube/storage/masking.py:9
  /home/dap/proj/datacube-core/datacube/storage/masking.py:9: DeprecationWarning: datacube.storage.masking has moved to datacube.utils.masking
    warnings.warn("datacube.storage.masking has moved to datacube.utils.masking",

tests/api/test_grid_workflow.py::test_gridworkflow
  /home/dap/proj/datacube-core/tests/api/test_grid_workflow.py:199: DeprecationWarning: `np.int` is a deprecated alias for the builtin `int`. To silence this warning, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    measurement = dict(nodata=0, dtype=numpy.int)

tests/api/test_virtual.py::test_select_transform
  /home/dap/proj/datacube-core/datacube/virtual/transformations.py:304: DeprecationWarning: the `select` transform is deprecated, please use `expressions` instead
    warnings.warn("the `select` transform is deprecated, please use `expressions` instead",

tests/api/test_virtual.py::test_rename_transform
  /home/dap/proj/datacube-core/datacube/virtual/transformations.py:254: DeprecationWarning: the `rename` transform is deprecated, please use `expressions` instead
    warnings.warn("the `rename` transform is deprecated, please use `expressions` instead",

tests/api/test_virtual.py::test_to_float_transform
tests/api/test_virtual.py::test_aggregate
  /home/dap/proj/datacube-core/datacube/virtual/transformations.py:196: DeprecationWarning: the `to_float` transform is deprecated, please use `expressions` instead
    warnings.warn("the `to_float` transform is deprecated, please use `expressions` instead",

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

---------- coverage: platform linux, python 3.9.15-final-0 -----------
Name                                         Stmts   Miss  Cover
----------------------------------------------------------------
datacube/__init__.py                             8      0   100%
datacube/__main__.py                             0      0   100%
datacube/api/__init__.py                         4      0   100%
datacube/api/core.py                           384    126    67%
datacube/api/grid_workflow.py                  137     10    93%
datacube/api/query.py                          213     19    91%
datacube/config.py                             126      6    95%
datacube/drivers/__init__.py                     5      0   100%
datacube/drivers/_tools.py                      14      0   100%
datacube/drivers/_types.py                      46      0   100%
datacube/drivers/datasource.py                  30      0   100%
datacube/drivers/driver_cache.py                29     10    66%
datacube/drivers/indexes.py                     24      0   100%
datacube/drivers/netcdf/__init__.py              4      0   100%
datacube/drivers/netcdf/_safestrings.py         41      2    95%
datacube/drivers/netcdf/_write.py               55      0   100%
datacube/drivers/netcdf/driver.py               36     11    69%
datacube/drivers/netcdf/writer.py              168     14    92%
datacube/drivers/postgis/__init__.py             4      0   100%
datacube/drivers/postgis/_api.py               396    297    25%
datacube/drivers/postgis/_connections.py       132     67    49%
datacube/drivers/postgis/_core.py               99     66    33%
datacube/drivers/postgis/_fields.py            266    134    50%
datacube/drivers/postgis/_schema.py            107      1    99%
datacube/drivers/postgis/_spatial.py            88     58    34%
datacube/drivers/postgis/sql.py                 55     12    78%
datacube/drivers/postgres/__init__.py            4      0   100%
datacube/drivers/postgres/_api.py              302    197    35%
datacube/drivers/postgres/_connections.py      105     35    67%
datacube/drivers/postgres/_core.py             107     74    31%
datacube/drivers/postgres/_dynamic.py           64     51    20%
datacube/drivers/postgres/_fields.py           268     77    71%
datacube/drivers/postgres/_schema.py            14      0   100%
datacube/drivers/postgres/sql.py                55     12    78%
datacube/drivers/readers.py                     41      7    83%
datacube/drivers/rio/__init__.py                 1      0   100%
datacube/drivers/rio/_reader.py                134      0   100%
datacube/drivers/writers.py                     20      5    75%
datacube/execution/__init__.py                   0      0   100%
datacube/execution/worker.py                    30     20    33%
datacube/executor.py                           169     59    65%
datacube/helpers.py                             19     14    26%
datacube/index/__init__.py                       6      0   100%
datacube/index/_api.py                          14      8    43%
datacube/index/abstract.py                     378    140    63%
datacube/index/eo3.py                          101      1    99%
datacube/index/exceptions.py                    10      2    80%
datacube/index/fields.py                        31      7    77%
datacube/index/hl.py                           158     85    46%
datacube/index/memory/__init__.py                1      0   100%
datacube/index/memory/_datasets.py             475    409    14%
datacube/index/memory/_fields.py                11      5    55%
datacube/index/memory/_metadata_types.py        69     46    33%
datacube/index/memory/_products.py             107     85    21%
datacube/index/memory/_users.py                 38     27    29%
datacube/index/memory/index.py                  67     25    63%
datacube/index/null/__init__.py                  1      0   100%
datacube/index/null/_datasets.py                63     24    62%
datacube/index/null/_metadata_types.py          16      4    75%
datacube/index/null/_products.py                23      7    70%
datacube/index/null/_users.py                   10      2    80%
datacube/index/null/index.py                    62     22    65%
datacube/index/postgis/__init__.py               0      0   100%
datacube/index/postgis/_datasets.py            377    310    18%
datacube/index/postgis/_metadata_types.py       82     58    29%
datacube/index/postgis/_products.py            126    100    21%
datacube/index/postgis/_transaction.py          27     12    56%
datacube/index/postgis/_users.py                21     11    48%
datacube/index/postgis/index.py                103     52    50%
datacube/index/postgres/__init__.py              0      0   100%
datacube/index/postgres/_datasets.py           366    264    28%
datacube/index/postgres/_metadata_types.py      82     58    29%
datacube/index/postgres/_products.py           123     97    21%
datacube/index/postgres/_transaction.py         27     10    63%
datacube/index/postgres/_users.py               21     11    48%
datacube/index/postgres/index.py                90     41    54%
datacube/model/__init__.py                     515     50    90%
datacube/model/_base.py                          6      0   100%
datacube/model/fields.py                        83      4    95%
datacube/model/utils.py                        164     37    77%
datacube/scripts/__init__.py                     0      0   100%
datacube/scripts/cli_app.py                      8      0   100%
datacube/scripts/dataset.py                    358    259    28%
datacube/scripts/ingest.py                     265    209    21%
datacube/scripts/metadata.py                    95     60    37%
datacube/scripts/product.py                    131     86    34%
datacube/scripts/search_tool.py                 75     10    87%
datacube/scripts/system.py                      54     30    44%
datacube/scripts/user.py                        62     20    68%
datacube/storage/__init__.py                     5      0   100%
datacube/storage/_base.py                       56      0   100%
datacube/storage/_hdf5.py                        2      0   100%
datacube/storage/_load.py                       86      0   100%
datacube/storage/_read.py                      127      3    98%
datacube/storage/_rio.py                       143     15    90%
datacube/storage/masking.py                      3      0   100%
datacube/testutils/__init__.py                 208      9    96%
datacube/testutils/geom.py                      66      0   100%
datacube/testutils/io.py                       204      7    97%
datacube/testutils/iodriver.py                  31      0   100%
datacube/testutils/threads.py                   15      0   100%
datacube/ui/__init__.py                          5      0   100%
datacube/ui/click.py                           163     91    44%
datacube/ui/common.py                           52     20    62%
datacube/ui/expression.py                       46     10    78%
datacube/ui/task_app.py                        159     30    81%
datacube/utils/__init__.py                      10      0   100%
datacube/utils/_misc.py                          7      0   100%
datacube/utils/aws/__init__.py                 180      0   100%
datacube/utils/changes.py                       75      1    99%
datacube/utils/cog.py                          104      0   100%
datacube/utils/dask.py                          93      0   100%
datacube/utils/dates.py                         69      8    88%
datacube/utils/documents.py                    280     12    96%
datacube/utils/generic.py                       39      0   100%
datacube/utils/geometry/__init__.py              5      0   100%
datacube/utils/geometry/_base.py               765     13    98%
datacube/utils/geometry/_warp.py                47      0   100%
datacube/utils/geometry/gbox.py                109      1    99%
datacube/utils/geometry/tools.py               269      0   100%
datacube/utils/io.py                            31      1    97%
datacube/utils/masking.py                      118      0   100%
datacube/utils/math.py                         116      0   100%
datacube/utils/py.py                            33      1    97%
datacube/utils/rio/__init__.py                   3      0   100%
datacube/utils/rio/_rio.py                      65      0   100%
datacube/utils/serialise.py                     44      9    80%
datacube/utils/uris.py                         108      5    95%
datacube/utils/xarray_geoextensions.py         102      0   100%
datacube/virtual/__init__.py                    89     12    87%
datacube/virtual/catalog.py                     37     11    70%
datacube/virtual/expr.py                        47      3    94%
datacube/virtual/impl.py                       449     72    84%
datacube/virtual/transformations.py            213     41    81%
datacube/virtual/utils.py                       31      4    87%
----------------------------------------------------------------
TOTAL                                        13615   4451    67%

======================================================= slowest 5 durations ========================================================
4.15s call     tests/test_utils_aws.py::test_s3_basics
2.04s call     tests/test_concurrent_executor.py::test_concurrent_executor
1.33s call     tests/test_utils_dask.py::test_pmap
1.21s call     tests/test_utils_dask.py::test_compute_tasks
0.87s call     tests/test_utils_dask.py::test_start_local_dask_dashboard_link
===================================================== short test summary info ======================================================
SKIPPED [1] ../../anaconda3/envs/odc/lib/python3.9/site-packages/_pytest/doctest.py:452: all tests skipped by +SKIP option
XFAIL tests/test_geometry.py::test_lonalt_bounds_more_than_180
  Bounds computation for large geometries in safe mode is broken
XFAIL tests/test_utils_docs.py::test_merge_with_nan
  Merging dictionaries with content of NaN doesn't work currently
ERROR tests/test_utils_docs.py::test_read_docs_from_http
ERROR tests/ui/test_common.py::test_ui_path_doc_stream
FAILED tests/test_driver.py::test_writer_drivers - AssertionError: assert 'netcdf' in []
FAILED tests/test_driver.py::test_index_drivers - AssertionError: assert 'null' in ['default', 'postgres']
FAILED tests/test_geometry.py::test_ops - assert 25.0 == 25.000000000000004
FAILED tests/test_geometry.py::test_crs_compat - assert None == 3577
FAILED tests/api/test_grid_workflow.py::test_gridworkflow_with_time_depth - AssertionError
FAILED tests/api/test_virtual.py::test_aggregate - ValueError: time already exists as coordinate or variable name.
=========================== 6 failed, 478 passed, 1 skipped, 2 xfailed, 10 warnings, 2 errors in 19.92s ============================

Just to confirm that Postgres itself is up and the agdcintegration database accepts connections:

~/proj/datacube-core % psql -d agdcintegration                             
psql (14.5 (Ubuntu 14.5-0ubuntu0.22.04.1))
Type "help" for help.

agdcintegration=# 

Enough for now.
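Before writing those two driver failures off: `assert 'netcdf' in []` and `assert 'null' in ['default', 'postgres']` smell like missing entry-point registrations rather than broken code, since datacube discovers its drivers via setuptools entry points. A quick stdlib check (the group names are my reading of datacube-core's setup.py, so treat them as an assumption) shows what the active env actually registered:

```python
# Sketch of a diagnosis: list the driver entry points visible in this env.
# Empty/short lists usually mean the checkout was never installed into the
# env (e.g. no `pip install -e .`), so its entry points are invisible.
import importlib.metadata as md

def drivers_in(group):
    try:
        eps = md.entry_points(group=group)      # Python >= 3.10
    except TypeError:
        eps = md.entry_points().get(group, [])  # Python 3.8 / 3.9
    return sorted(ep.name for ep in eps)

for group in ("datacube.plugins.index", "datacube.plugins.io.write"):
    print(group, "->", drivers_in(group))
```

If the lists come back empty or missing names like `null`/`memory`, running `pip install -e .` inside the conda env from the datacube-core checkout should re-register them.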

permezel commented 1 year ago

OK, one more go. I created a python=3.8 env and tried again.

set -x
PYTHON=${1-3.8}
ENV=odc_${PYTHON}
# start from a clean env each run
conda env remove -n ${ENV} --yes
conda config --append channels conda-forge
conda update -n base --yes -c defaults conda
conda create --name ${ENV} --yes python=${PYTHON} datacube
# extras that check-code.sh needs beyond the datacube runtime deps
conda install -n ${ENV} --yes pycodestyle
conda install -n ${ENV} --yes pylint
conda install -n ${ENV} --yes jupyter matplotlib scipy pytest-cov hypothesis
conda install -n ${ENV} --yes geoalchemy2 moto
cat ~/.datacube_integration.conf
cd ~/proj/datacube-core
conda run -n ${ENV} ./check-code.sh integration_tests
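One thing I'm not sure my conf covers: as far as I can tell, the `experimental` parametrisations run against a second, postgis-backed database, so `~/.datacube_integration.conf` needs its own section for it alongside `[datacube]`. A sketch of what I believe it should look like (database names and the socket path are my local guesses, not gospel):

```
[datacube]
db_hostname: /var/run/postgresql
db_database: agdcintegration

[experimental]
db_hostname: /var/run/postgresql
db_database: odcintegration
index_driver: postgis
```

Without the second section (and the second database existing), the `experimental` tests would presumably fail to connect.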

The results are somewhat better, but not pretty:

---------- coverage: platform linux, python 3.8.15-final-0 -----------
Name                                         Stmts   Miss  Cover
----------------------------------------------------------------
datacube/__init__.py                             8      0   100%
datacube/__main__.py                             0      0   100%
datacube/api/__init__.py                         4      0   100%
datacube/api/core.py                           384    107    72%
datacube/api/grid_workflow.py                  137     10    93%
datacube/api/query.py                          213     18    92%
datacube/config.py                             126      3    98%
datacube/drivers/__init__.py                     5      0   100%
datacube/drivers/_tools.py                      14      0   100%
datacube/drivers/_types.py                      46      0   100%
datacube/drivers/datasource.py                  30      0   100%
datacube/drivers/driver_cache.py                29     10    66%
datacube/drivers/indexes.py                     24      0   100%
datacube/drivers/netcdf/__init__.py              4      0   100%
datacube/drivers/netcdf/_safestrings.py         41      2    95%
datacube/drivers/netcdf/_write.py               55      0   100%
datacube/drivers/netcdf/driver.py               36     11    69%
datacube/drivers/netcdf/writer.py              168     14    92%
datacube/drivers/postgis/__init__.py             4      0   100%
datacube/drivers/postgis/_api.py               396    288    27%
datacube/drivers/postgis/_connections.py       132     56    58%
datacube/drivers/postgis/_core.py               99     65    34%
datacube/drivers/postgis/_fields.py            266    134    50%
datacube/drivers/postgis/_schema.py            107      1    99%
datacube/drivers/postgis/_spatial.py            88     58    34%
datacube/drivers/postgis/sql.py                 55     12    78%
datacube/drivers/postgres/__init__.py            4      0   100%
datacube/drivers/postgres/_api.py              302     10    97%
datacube/drivers/postgres/_connections.py      105     12    89%
datacube/drivers/postgres/_core.py             107      8    93%
datacube/drivers/postgres/_dynamic.py           64      7    89%
datacube/drivers/postgres/_fields.py           268     25    91%
datacube/drivers/postgres/_schema.py            14      0   100%
datacube/drivers/postgres/sql.py                55      1    98%
datacube/drivers/readers.py                     41      7    83%
datacube/drivers/rio/__init__.py                 1      0   100%
datacube/drivers/rio/_reader.py                134      0   100%
datacube/drivers/writers.py                     20      3    85%
datacube/execution/__init__.py                   0      0   100%
datacube/execution/worker.py                    30     20    33%
datacube/executor.py                           169     59    65%
datacube/helpers.py                             19     14    26%
datacube/index/__init__.py                       6      0   100%
datacube/index/_api.py                          14      1    93%
datacube/index/abstract.py                     378     64    83%
datacube/index/eo3.py                          101      1    99%
datacube/index/exceptions.py                    10      0   100%
datacube/index/fields.py                        31      1    97%
datacube/index/hl.py                           158      1    99%
datacube/index/memory/__init__.py                1      0   100%
datacube/index/memory/_datasets.py             475    409    14%
datacube/index/memory/_fields.py                11      5    55%
datacube/index/memory/_metadata_types.py        69     46    33%
datacube/index/memory/_products.py             107     85    21%
datacube/index/memory/_users.py                 38     27    29%
datacube/index/memory/index.py                  67     25    63%
datacube/index/null/__init__.py                  1      0   100%
datacube/index/null/_datasets.py                63     24    62%
datacube/index/null/_metadata_types.py          16      4    75%
datacube/index/null/_products.py                23      7    70%
datacube/index/null/_users.py                   10      2    80%
datacube/index/null/index.py                    62     22    65%
datacube/index/postgis/__init__.py               0      0   100%
datacube/index/postgis/_datasets.py            377    310    18%
datacube/index/postgis/_metadata_types.py       82     58    29%
datacube/index/postgis/_products.py            126    100    21%
datacube/index/postgis/_transaction.py          27     12    56%
datacube/index/postgis/_users.py                21     11    48%
datacube/index/postgis/index.py                103     52    50%
datacube/index/postgres/__init__.py              0      0   100%
datacube/index/postgres/_datasets.py           366     20    95%
datacube/index/postgres/_metadata_types.py      82      3    96%
datacube/index/postgres/_products.py           123      6    95%
datacube/index/postgres/_transaction.py         27      0   100%
datacube/index/postgres/_users.py               21      0   100%
datacube/index/postgres/index.py                90      1    99%
datacube/model/__init__.py                     515     38    93%
datacube/model/_base.py                          6      0   100%
datacube/model/fields.py                        83      4    95%
datacube/model/utils.py                        164     34    79%
datacube/scripts/__init__.py                     0      0   100%
datacube/scripts/cli_app.py                      8      0   100%
datacube/scripts/dataset.py                    358     38    89%
datacube/scripts/ingest.py                     265    188    29%
datacube/scripts/metadata.py                    95     20    79%
datacube/scripts/product.py                    131     24    82%
datacube/scripts/search_tool.py                 75      2    97%
datacube/scripts/system.py                      54      6    89%
datacube/scripts/user.py                        62      7    89%
datacube/storage/__init__.py                     5      0   100%
datacube/storage/_base.py                       56      0   100%
datacube/storage/_hdf5.py                        2      0   100%
datacube/storage/_load.py                       86      0   100%
datacube/storage/_read.py                      127      3    98%
datacube/storage/_rio.py                       143     15    90%
datacube/storage/masking.py                      3      0   100%
datacube/testutils/__init__.py                 208      4    98%
datacube/testutils/geom.py                      66      0   100%
datacube/testutils/io.py                       204      7    97%
datacube/testutils/iodriver.py                  31      0   100%
datacube/testutils/threads.py                   15      0   100%
datacube/ui/__init__.py                          5      0   100%
datacube/ui/click.py                           163     26    84%
datacube/ui/common.py                           52      1    98%
datacube/ui/expression.py                       46     10    78%
datacube/ui/task_app.py                        159     30    81%
datacube/utils/__init__.py                      10      0   100%
datacube/utils/_misc.py                          7      0   100%
datacube/utils/aws/__init__.py                 180      0   100%
datacube/utils/changes.py                       75      1    99%
datacube/utils/cog.py                          104      0   100%
datacube/utils/dask.py                          93      0   100%
datacube/utils/dates.py                         69      8    88%
datacube/utils/documents.py                    280      5    98%
datacube/utils/generic.py                       39      0   100%
datacube/utils/geometry/__init__.py              5      0   100%
datacube/utils/geometry/_base.py               765     13    98%
datacube/utils/geometry/_warp.py                47      0   100%
datacube/utils/geometry/gbox.py                109      1    99%
datacube/utils/geometry/tools.py               269      0   100%
datacube/utils/io.py                            31      1    97%
datacube/utils/masking.py                      118      0   100%
datacube/utils/math.py                         116      0   100%
datacube/utils/py.py                            33      0   100%
datacube/utils/rio/__init__.py                   3      0   100%
datacube/utils/rio/_rio.py                      65      0   100%
datacube/utils/serialise.py                     44      0   100%
datacube/utils/uris.py                         108      5    95%
datacube/utils/xarray_geoextensions.py         102      0   100%
datacube/virtual/__init__.py                    89     12    87%
datacube/virtual/catalog.py                     37     11    70%
datacube/virtual/expr.py                        47      3    94%
datacube/virtual/impl.py                       449     72    84%
datacube/virtual/transformations.py            213     41    81%
datacube/virtual/utils.py                       31      4    87%
----------------------------------------------------------------
TOTAL                                        13615   2886    79%

======================================================= slowest 5 durations ========================================================
12.37s call     integration_tests/test_config_tool.py::test_add_example_dataset_types[datacube-US/Pacific]
4.16s call     tests/test_utils_aws.py::test_s3_basics
2.05s call     tests/test_concurrent_executor.py::test_concurrent_executor
1.50s call     tests/test_utils_dask.py::test_pmap
1.24s call     tests/test_utils_dask.py::test_compute_tasks
===================================================== short test summary info ======================================================
SKIPPED [2] integration_tests/test_3d.py:26: could not import 'dcio_example.xarray_3d': No module named 'dcio_example'
SKIPPED [1] ../../anaconda3/envs/odc_3.8/lib/python3.8/site-packages/_pytest/doctest.py:452: all tests skipped by +SKIP option
XFAIL tests/test_geometry.py::test_lonalt_bounds_more_than_180
  Bounds computation for large geometries in safe mode is broken
XFAIL tests/test_utils_docs.py::test_merge_with_nan
  Merging dictionaries with content of NaN doesn't work currently
ERROR tests/test_utils_docs.py::test_read_docs_from_http
ERROR tests/ui/test_common.py::test_ui_path_doc_stream
ERROR integration_tests/test_cli_output.py::test_cli_product_subcommand[experimental-US/Pacific] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/test_cli_output.py::test_cli_product_subcommand[experimental-UTC] - sqlalchemy.exc.OperationalError: (psy...
ERROR integration_tests/test_cli_output.py::test_cli_metadata_subcommand[experimental-US/Pacific] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/test_cli_output.py::test_cli_metadata_subcommand[experimental-UTC] - sqlalchemy.exc.OperationalError: (ps...
ERROR integration_tests/test_cli_output.py::test_cli_dataset_subcommand[experimental-US/Pacific] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/test_cli_output.py::test_cli_dataset_subcommand[experimental-UTC] - sqlalchemy.exc.OperationalError: (psy...
ERROR integration_tests/test_cli_output.py::test_readd_and_update_metadata_product_dataset_command[experimental-US/Pacific] - sql...
ERROR integration_tests/test_cli_output.py::test_readd_and_update_metadata_product_dataset_command[experimental-UTC] - sqlalchemy...
ERROR integration_tests/test_config_tool.py::test_add_example_dataset_types[experimental-US/Pacific] - sqlalchemy.exc.Operational...
ERROR integration_tests/test_config_tool.py::test_add_example_dataset_types[experimental-UTC] - sqlalchemy.exc.OperationalError: ...
ERROR integration_tests/test_config_tool.py::test_error_returned_on_invalid[experimental-US/Pacific] - sqlalchemy.exc.Operational...
ERROR integration_tests/test_config_tool.py::test_error_returned_on_invalid[experimental-UTC] - sqlalchemy.exc.OperationalError: ...
ERROR integration_tests/test_config_tool.py::test_config_check[experimental-US/Pacific] - sqlalchemy.exc.OperationalError: (psyco...
ERROR integration_tests/test_config_tool.py::test_config_check[experimental-UTC] - sqlalchemy.exc.OperationalError: (psycopg2.Ope...
ERROR integration_tests/test_config_tool.py::test_list_users_does_not_fail[experimental-US/Pacific] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/test_config_tool.py::test_list_users_does_not_fail[experimental-UTC] - sqlalchemy.exc.OperationalError: (...
ERROR integration_tests/test_config_tool.py::test_db_init_noop[experimental-US/Pacific] - sqlalchemy.exc.OperationalError: (psyco...
ERROR integration_tests/test_config_tool.py::test_db_init_noop[experimental-UTC] - sqlalchemy.exc.OperationalError: (psycopg2.Ope...
ERROR integration_tests/test_config_tool.py::test_db_init[experimental-US/Pacific] - sqlalchemy.exc.OperationalError: (psycopg2.O...
ERROR integration_tests/test_config_tool.py::test_db_init[experimental-UTC] - sqlalchemy.exc.OperationalError: (psycopg2.Operatio...
ERROR integration_tests/test_config_tool.py::test_add_no_such_product[experimental-US/Pacific] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/test_config_tool.py::test_add_no_such_product[experimental-UTC] - sqlalchemy.exc.OperationalError: (psyco...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user0-US/Pacific] - sqlalchemy.exc.Operation...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user0-UTC] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user1-US/Pacific] - sqlalchemy.exc.Operation...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user1-UTC] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user2-US/Pacific] - sqlalchemy.exc.Operation...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user2-UTC] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user3-US/Pacific] - sqlalchemy.exc.Operation...
ERROR integration_tests/test_config_tool.py::test_user_creation[experimental-example_user3-UTC] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/test_dataset_add.py::test_dataset_add_http[US/Pacific-datacube]
ERROR integration_tests/test_dataset_add.py::test_dataset_add_http[UTC-datacube]
ERROR integration_tests/test_model.py::test_crs_parse[experimental-US/Pacific] - sqlalchemy.exc.OperationalError: (psycopg2.Opera...
ERROR integration_tests/test_model.py::test_crs_parse[experimental-UTC] - sqlalchemy.exc.OperationalError: (psycopg2.OperationalE...
ERROR integration_tests/test_validate_ingestion.py::test_invalid_ingestor_config[experimental-US/Pacific] - sqlalchemy.exc.Operat...
ERROR integration_tests/test_validate_ingestion.py::test_invalid_ingestor_config[experimental-UTC] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_config_docs.py::test_idempotent_add_dataset_type[experimental-US/Pacific] - sqlalchemy.exc.Ope...
ERROR integration_tests/index/test_config_docs.py::test_idempotent_add_dataset_type[experimental-UTC] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_config_docs.py::test_update_dataset[experimental-US/Pacific] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/index/test_config_docs.py::test_update_dataset[experimental-UTC] - sqlalchemy.exc.OperationalError: (psyc...
ERROR integration_tests/index/test_config_docs.py::test_product_update_cli[experimental-US/Pacific] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/index/test_config_docs.py::test_product_update_cli[experimental-UTC] - sqlalchemy.exc.OperationalError: (...
ERROR integration_tests/index/test_config_docs.py::test_update_metadata_type[experimental-US/Pacific] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_config_docs.py::test_update_metadata_type[experimental-UTC] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/index/test_config_docs.py::test_filter_types_by_fields[experimental-US/Pacific] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_config_docs.py::test_filter_types_by_fields[experimental-UTC] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_config_docs.py::test_filter_types_by_search[experimental-US/Pacific] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_config_docs.py::test_filter_types_by_search[experimental-UTC] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_index_data.py::test_archive_datasets[experimental-US/Pacific] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_index_data.py::test_archive_datasets[experimental-UTC] - sqlalchemy.exc.OperationalError: (psy...
ERROR integration_tests/index/test_index_data.py::test_purge_datasets[experimental-US/Pacific] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/index/test_index_data.py::test_purge_datasets[experimental-UTC] - sqlalchemy.exc.OperationalError: (psyco...
ERROR integration_tests/index/test_index_data.py::test_purge_datasets_cli[experimental-US/Pacific] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_index_data.py::test_purge_datasets_cli[experimental-UTC] - sqlalchemy.exc.OperationalError: (p...
ERROR integration_tests/index/test_index_data.py::test_purge_all_datasets_cli[experimental-US/Pacific] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_index_data.py::test_purge_all_datasets_cli[experimental-UTC] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/index/test_index_data.py::test_index_duplicate_dataset[experimental-US/Pacific] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_index_data.py::test_index_duplicate_dataset[experimental-UTC] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_index_data.py::test_has_dataset[experimental-US/Pacific] - sqlalchemy.exc.OperationalError: (p...
ERROR integration_tests/index/test_index_data.py::test_has_dataset[experimental-UTC] - sqlalchemy.exc.OperationalError: (psycopg2...
ERROR integration_tests/index/test_index_data.py::test_get_dataset[experimental-US/Pacific] - sqlalchemy.exc.OperationalError: (p...
ERROR integration_tests/index/test_index_data.py::test_get_dataset[experimental-UTC] - sqlalchemy.exc.OperationalError: (psycopg2...
ERROR integration_tests/index/test_index_data.py::test_transactions_api_ctx_mgr[experimental-US/Pacific] - sqlalchemy.exc.Operati...
ERROR integration_tests/index/test_index_data.py::test_transactions_api_ctx_mgr[experimental-UTC] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/index/test_index_data.py::test_transactions_api_manual[experimental-US/Pacific] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_index_data.py::test_transactions_api_manual[experimental-UTC] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_index_data.py::test_transactions_api_hybrid[experimental-US/Pacific] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_index_data.py::test_transactions_api_hybrid[experimental-UTC] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_index_data.py::test_get_missing_things[experimental-US/Pacific] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_index_data.py::test_get_missing_things[experimental-UTC] - sqlalchemy.exc.OperationalError: (p...
ERROR integration_tests/index/test_memory_index.py::test_mem_user_resource - RuntimeError: No index driver found for 'memory'. 2 ...
ERROR integration_tests/index/test_memory_index.py::test_mem_metadatatype_resource - RuntimeError: No index driver found for 'mem...
ERROR integration_tests/index/test_memory_index.py::test_mem_product_resource - RuntimeError: No index driver found for 'memory'....
ERROR integration_tests/index/test_memory_index.py::test_mem_dataset_add_eo3 - RuntimeError: No index driver found for 'memory'. ...
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_lineage - RuntimeError: No index driver found for 'memory'. 2 ava...
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_search_dups - RuntimeError: No index driver found for 'memory'. 2...
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_locations - RuntimeError: No index driver found for 'memory'. 2 a...
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_updates - RuntimeError: No index driver found for 'memory'. 2 ava...
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_expand_periods - RuntimeError: No index driver found for 'memory'...
ERROR integration_tests/index/test_memory_index.py::test_mem_prod_time_bounds - RuntimeError: No index driver found for 'memory'....
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_archive_purge - RuntimeError: No index driver found for 'memory'....
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_search_and_count - RuntimeError: No index driver found for 'memor...
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_search_and_count_by_product - RuntimeError: No index driver found...
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_search_returning - RuntimeError: No index driver found for 'memor...
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_search_summary - RuntimeError: No index driver found for 'memory'...
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_search_returning_datasets_light - RuntimeError: No index driver f...
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_search_by_metadata - RuntimeError: No index driver found for 'mem...
ERROR integration_tests/index/test_memory_index.py::test_mem_ds_count_product_through_time - RuntimeError: No index driver found ...
ERROR integration_tests/index/test_memory_index.py::test_memory_dataset_add - RuntimeError: No index driver found for 'memory'. 2...
ERROR integration_tests/index/test_memory_index.py::test_mem_transactions - RuntimeError: No index driver found for 'memory'. 2 a...
ERROR integration_tests/index/test_pluggable_indexes.py::test_with_standard_index[experimental-US/Pacific] - sqlalchemy.exc.Opera...
ERROR integration_tests/index/test_pluggable_indexes.py::test_with_standard_index[experimental-UTC] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/index/test_pluggable_indexes.py::test_system_init[experimental-US/Pacific] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_pluggable_indexes.py::test_system_init[experimental-UTC] - sqlalchemy.exc.OperationalError: (p...
ERROR integration_tests/index/test_postgis_index.py::test_create_spatial_index[US/Pacific-experimental] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_postgis_index.py::test_create_spatial_index[UTC-experimental] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_index_maintain[US/Pacific-experimental] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_index_maintain[UTC-experimental] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_index_populate[US/Pacific-experimental] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_index_populate[UTC-experimental] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_index_crs_validity[US/Pacific-experimental] - sqlalchemy.exc.Op...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_index_crs_validity[UTC-experimental] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_extent[US/Pacific-experimental] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_extent[UTC-experimental] - sqlalchemy.exc.OperationalError: (ps...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_search[US/Pacific-experimental] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/index/test_postgis_index.py::test_spatial_search[UTC-experimental] - sqlalchemy.exc.OperationalError: (ps...
ERROR integration_tests/index/test_search_eo3.py::test_search_by_metadata[experimental-US/Pacific] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_search_eo3.py::test_search_by_metadata[experimental-UTC] - sqlalchemy.exc.OperationalError: (p...
ERROR integration_tests/index/test_search_eo3.py::test_search_dataset_equals_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_search_eo3.py::test_search_dataset_equals_eo3[experimental-UTC] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_search_eo3.py::test_search_dataset_by_metadata_eo3[experimental-US/Pacific] - sqlalchemy.exc.O...
ERROR integration_tests/index/test_search_eo3.py::test_search_dataset_by_metadata_eo3[experimental-UTC] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_search_eo3.py::test_search_day_eo3[experimental-US/Pacific] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/index/test_search_eo3.py::test_search_day_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError: (psyco...
ERROR integration_tests/index/test_search_eo3.py::test_search_dataset_ranges_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_search_eo3.py::test_search_dataset_ranges_eo3[experimental-UTC] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_search_eo3.py::test_zero_width_range_search[experimental-US/Pacific] - sqlalchemy.exc.Operatio...
ERROR integration_tests/index/test_search_eo3.py::test_zero_width_range_search[experimental-UTC] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_search_eo3.py::test_search_globally_eo3[experimental-US/Pacific] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/index/test_search_eo3.py::test_search_globally_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError: (...
ERROR integration_tests/index/test_search_eo3.py::test_search_by_product_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_search_eo3.py::test_search_by_product_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/index/test_search_eo3.py::test_search_limit_eo3[experimental-US/Pacific] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_search_eo3.py::test_search_limit_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError: (psy...
ERROR integration_tests/index/test_search_eo3.py::test_search_or_expressions_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_search_eo3.py::test_search_or_expressions_eo3[experimental-UTC] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_search_eo3.py::test_search_returning_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operational...
ERROR integration_tests/index/test_search_eo3.py::test_search_returning_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError: ...
ERROR integration_tests/index/test_search_eo3.py::test_search_returning_rows_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_search_eo3.py::test_search_returning_rows_eo3[experimental-UTC] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_search_eo3.py::test_searches_only_type_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_search_eo3.py::test_searches_only_type_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/index/test_search_eo3.py::test_search_special_fields_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operat...
ERROR integration_tests/index/test_search_eo3.py::test_search_special_fields_eo3[experimental-UTC] - sqlalchemy.exc.OperationalEr...
ERROR integration_tests/index/test_search_eo3.py::test_search_by_uri_eo3[experimental-US/Pacific] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/index/test_search_eo3.py::test_search_by_uri_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError: (ps...
ERROR integration_tests/index/test_search_eo3.py::test_search_conflicting_types[experimental-US/Pacific] - sqlalchemy.exc.Operati...
ERROR integration_tests/index/test_search_eo3.py::test_search_conflicting_types[experimental-UTC] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/index/test_search_eo3.py::test_fetch_all_of_md_type[experimental-US/Pacific] - sqlalchemy.exc.Operational...
ERROR integration_tests/index/test_search_eo3.py::test_fetch_all_of_md_type[experimental-UTC] - sqlalchemy.exc.OperationalError: ...
ERROR integration_tests/index/test_search_eo3.py::test_count_searches[experimental-US/Pacific] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/index/test_search_eo3.py::test_count_searches[experimental-UTC] - sqlalchemy.exc.OperationalError: (psyco...
ERROR integration_tests/index/test_search_eo3.py::test_count_by_product_searches_eo3[experimental-US/Pacific] - sqlalchemy.exc.Op...
ERROR integration_tests/index/test_search_eo3.py::test_count_by_product_searches_eo3[experimental-UTC] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_search_eo3.py::test_count_time_groups[experimental-US/Pacific] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/index/test_search_eo3.py::test_count_time_groups[experimental-UTC] - sqlalchemy.exc.OperationalError: (ps...
ERROR integration_tests/index/test_search_eo3.py::test_count_time_groups_cli[experimental-US/Pacific] - sqlalchemy.exc.Operationa...
ERROR integration_tests/index/test_search_eo3.py::test_count_time_groups_cli[experimental-UTC] - sqlalchemy.exc.OperationalError:...
ERROR integration_tests/index/test_search_eo3.py::test_search_cli_basic[experimental-US/Pacific] - sqlalchemy.exc.OperationalErro...
ERROR integration_tests/index/test_search_eo3.py::test_search_cli_basic[experimental-UTC] - sqlalchemy.exc.OperationalError: (psy...
ERROR integration_tests/index/test_search_eo3.py::test_cli_info_eo3[experimental-US/Pacific] - sqlalchemy.exc.OperationalError: (...
ERROR integration_tests/index/test_search_eo3.py::test_cli_info_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError: (psycopg...
ERROR integration_tests/index/test_search_eo3.py::test_find_duplicates_eo3[experimental-US/Pacific] - sqlalchemy.exc.OperationalE...
ERROR integration_tests/index/test_search_eo3.py::test_find_duplicates_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError: (...
ERROR integration_tests/index/test_search_eo3.py::test_csv_search_via_cli_eo3[experimental-US/Pacific] - sqlalchemy.exc.Operation...
ERROR integration_tests/index/test_search_eo3.py::test_csv_search_via_cli_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError...
ERROR integration_tests/index/test_search_eo3.py::test_csv_structure_eo3[experimental-US/Pacific] - sqlalchemy.exc.OperationalErr...
ERROR integration_tests/index/test_search_eo3.py::test_csv_structure_eo3[experimental-UTC] - sqlalchemy.exc.OperationalError: (ps...
ERROR integration_tests/index/test_search_eo3.py::test_query_dataset_multi_product_eo3[experimental-US/Pacific] - sqlalchemy.exc....
ERROR integration_tests/index/test_search_eo3.py::test_query_dataset_multi_product_eo3[experimental-UTC] - sqlalchemy.exc.Operati...
FAILED tests/test_driver.py::test_writer_drivers - AssertionError: assert 'netcdf' in []
FAILED tests/test_driver.py::test_index_drivers - AssertionError: assert 'null' in ['default', 'postgres']
FAILED tests/test_geometry.py::test_ops - assert 25.0 == 25.000000000000004
FAILED tests/test_geometry.py::test_crs_compat - assert None == 3577
FAILED tests/api/test_grid_workflow.py::test_gridworkflow_with_time_depth - AssertionError
FAILED tests/api/test_virtual.py::test_aggregate - ValueError: time already exists as coordinate or variable name.
FAILED integration_tests/test_double_ingestion.py::test_double_ingestion[US/Pacific-datacube] - AssertionError: Error for ['inges...
FAILED integration_tests/test_double_ingestion.py::test_double_ingestion[UTC-datacube] - AssertionError: Error for ['ingest', '--...
FAILED integration_tests/test_end_to_end.py::test_end_to_end[US/Pacific-datacube] - AssertionError: Error for ['-v', 'ingest', '-...
FAILED integration_tests/test_end_to_end.py::test_end_to_end[UTC-datacube] - AssertionError: Error for ['-v', 'ingest', '-c', '/t...
FAILED integration_tests/test_full_ingestion.py::test_full_ingestion[US/Pacific-datacube] - AssertionError: Error for ['ingest', ...
FAILED integration_tests/test_full_ingestion.py::test_full_ingestion[UTC-datacube] - AssertionError: Error for ['ingest', '--conf...
FAILED integration_tests/test_full_ingestion.py::test_process_all_ingest_jobs[US/Pacific-datacube] - AssertionError: Error for ['...
FAILED integration_tests/test_full_ingestion.py::test_process_all_ingest_jobs[UTC-datacube] - AssertionError: Error for ['ingest'...
FAILED integration_tests/test_index_out_of_bound.py::test_index_out_of_bound_error[US/Pacific-datacube] - AssertionError: Error f...
FAILED integration_tests/test_index_out_of_bound.py::test_index_out_of_bound_error[UTC-datacube] - AssertionError: Error for ['in...
FAILED integration_tests/test_validate_ingestion.py::test_invalid_ingestor_config[datacube-US/Pacific] - AssertionError: assert '...
FAILED integration_tests/test_validate_ingestion.py::test_invalid_ingestor_config[datacube-UTC] - AssertionError: assert 'No such...
FAILED integration_tests/index/test_memory_index.py::test_init_memory - AssertionError: assert 'memory' in {'default': <datacube....
FAILED integration_tests/index/test_null_index.py::test_init_null - AssertionError: assert 'null' in {'default': <datacube.index....
FAILED integration_tests/index/test_null_index.py::test_null_user_resource - RuntimeError: No index driver found for 'null'. 2 av...
FAILED integration_tests/index/test_null_index.py::test_null_metadata_types_resource - RuntimeError: No index driver found for 'n...
FAILED integration_tests/index/test_null_index.py::test_null_product_resource - RuntimeError: No index driver found for 'null'. 2...
FAILED integration_tests/index/test_null_index.py::test_null_dataset_resource - RuntimeError: No index driver found for 'null'. 2...
FAILED integration_tests/index/test_null_index.py::test_null_transactions - RuntimeError: No index driver found for 'null'. 2 ava...
FAILED integration_tests/index/test_search_eo3.py::test_cli_info_eo3[datacube-US/Pacific] - AssertionError: assert '    lat: {beg...
FAILED integration_tests/index/test_search_eo3.py::test_cli_info_eo3[datacube-UTC] - AssertionError: assert '    lat: {begin: -38...
==================== 27 failed, 690 passed, 3 skipped, 2 xfailed, 21 warnings, 162 errors in 158.71s (0:02:38) =====================
dap@odc ~/proj/datacube-core %  
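Many of the errors above read `No index driver found for 'memory'. 2 available` (with only `default` and `postgres` registered), which usually means the installed `datacube` package predates the memory/null drivers — e.g. a stale install in the conda env rather than the checked-out source. A quick way to check what your environment actually registers is to list the entry points. This is a hedged sketch: it assumes the `datacube.plugins.index` entry-point group, which is the group datacube-core uses for index drivers (verify against your installed version's `setup.py`), and the helper name `registered_index_drivers` is mine.

```python
from importlib.metadata import entry_points

def registered_index_drivers(group: str = "datacube.plugins.index"):
    """Return the names of index drivers registered under the given
    entry-point group. The group name is an assumption based on
    datacube-core's setup.py; adjust if your version differs."""
    eps = entry_points()
    try:
        selected = eps.select(group=group)   # Python >= 3.10
    except AttributeError:
        selected = eps.get(group, [])        # Python 3.8 / 3.9 dict API
    return sorted(ep.name for ep in selected)

print(registered_index_drivers())
```

If `memory` and `null` are missing from the printed list even though the source tree contains them, reinstalling the checkout into the active conda env (for example with an editable install) should refresh the entry-point registrations.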

And yes, according to git, I am up to date:

dap@odc ~/proj/datacube-core % git status
On branch develop
Your branch is up to date with 'origin/develop'.

Untracked files:
  (use "git add <file>..." to include in what will be committed)
    h.txt
    my.py
    redo.sh

nothing added to commit but untracked files present (use "git add" to track)
dap@odc ~/proj/datacube-core % conda info

     active environment : odc_3.8
    active env location : /home/dap/anaconda3/envs/odc_3.8
            shell level : 1
       user config file : /home/dap/.condarc
 populated config files : /home/dap/.condarc
          conda version : 22.11.1
    conda-build version : 3.22.0
         python version : 3.9.13.final.0
       virtual packages : __archspec=1=aarch64
                          __glibc=2.35=0
                          __linux=5.15.0=0
                          __unix=0=0
       base environment : /home/dap/anaconda3  (writable)
      conda av data dir : /home/dap/anaconda3/etc/conda
  conda av metadata url : None
           channel URLs : https://repo.anaconda.com/pkgs/main/linux-aarch64
                          https://repo.anaconda.com/pkgs/main/noarch
                          https://repo.anaconda.com/pkgs/r/linux-aarch64
                          https://repo.anaconda.com/pkgs/r/noarch
                          https://conda.anaconda.org/conda-forge/linux-aarch64
                          https://conda.anaconda.org/conda-forge/noarch
          package cache : /home/dap/anaconda3/pkgs
                          /home/dap/.conda/pkgs
       envs directories : /home/dap/anaconda3/envs
                          /home/dap/.conda/envs
               platform : linux-aarch64
             user-agent : conda/22.11.1 requests/2.28.1 CPython/3.9.13 Linux/5.15.0-56-generic ubuntu/22.04.1 glibc/2.35
                UID:GID : 1027:1027
             netrc file : None
           offline mode : False

dap@odc ~/proj/datacube-core % 
sathwikreddy56 commented 1 year ago

Any update on this?

stale[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

lucapaganotti commented 1 year ago

Hi Paul, I'm sorry, I had to put my odc activities on hold for quite a while. I should be able to resume them this Thursday if nothing else comes up. I'm very sorry for the delay; I'll get back to odc soon.

Have a good day, and thanks for your help so far.

Ariana-B commented 1 year ago

Hi @lucapaganotti @permezel @sathwikreddy56, I recently updated the Ubuntu setup documentation, which should hopefully help resolve the issues mentioned here. Let me know if you're able to get things running properly with the new instructions.