opendatacube / datacube-core

Open Data Cube analyses continental scale Earth Observation data through time
http://www.opendatacube.org
Apache License 2.0

Multi-document yamls are inconvenient to use in ODC's cli tools #984

Open jeremyh opened 4 years ago

jeremyh commented 4 years ago

I tried to update a set of products recently that were stored in a multi-document yaml:

The only way I know to fix this is to split up the file temporarily and add them separately.
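That temporary split can be sketched with PyYAML (the document contents below are made up for illustration; the real files would be products from DEA or the NCI):

```python
import yaml

# A made-up multi-document product file, standing in for the real one.
multi_doc = """\
name: product_a
---
name: product_b
"""

# Split into individual documents so each can be fed to
# `datacube product add` or `datacube product update` on its own.
docs = list(yaml.safe_load_all(multi_doc))
parts = [yaml.safe_dump(doc) for doc in docs]
```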

I wish I could tell it: "update ODC to match my whole file" without caring about whether they're updates or additions.

Examples of multi-doc usage are in DEA's metadata-types and products, and some datasets on the NCI.

All document commands have this inconvenience:

 datacube product add    
 datacube product update    

 datacube dataset add    
 datacube dataset update    

 datacube metadata add      
 datacube metadata update  

(sourced from #983, cc @omad )

Kirill888 commented 4 years ago

I think a more proper solution is to support:

  1. Selecting one part of a multi-part document for both add/update
  2. Treating each document as a separate invocation, with an option to continue on failure
  3. Never just dying with an exception in user-facing CLIs
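Points 2 and 3 could look something like this sketch. The `add_one` callable is a placeholder for whatever actually adds or updates a single document, not the real ODC implementation:

```python
import yaml


def apply_all(multi_doc_text, add_one, continue_on_failure=True):
    """Apply each document of a multi-document yaml as a separate
    invocation, collecting failures instead of dying with an exception."""
    failures = []
    for i, doc in enumerate(yaml.safe_load_all(multi_doc_text)):
        try:
            add_one(doc)
        except Exception as e:
            # Record the failure and (optionally) carry on with the rest.
            failures.append((i, str(e)))
            if not continue_on_failure:
                break
    return failures
```

The caller can then report the collected failures at the end rather than aborting on the first bad document.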
Kirill888 commented 4 years ago

sorry wrong button press

Kirill888 commented 4 years ago

So you'd like "update or add" functionality? Or "sync to this state if safe"? Should it be:

I think the middle three are redundant and hence confusing. I would probably go with the last one, but that requires changing the CLI API in non-backward-compatible ways.

omad commented 4 years ago

On YAMLs:

Looking elsewhere for inspiration:

WRT backward compatibility, we need to get over the fear of breaking things, so long as we do it in a sane way.

The obvious course here is to add a new command, sync or apply, in one release; mark add and update as deprecated; and then remove them in a future release.

Crazy Damien idea that should not be seen

Let's implement an ODC Kubernetes Operator, which you interact with to update an ODC Database via Kubernetes tools

jeremyh commented 4 years ago

I like apply.

I wouldn’t rush to remove the old ones as there’s lots of bash scripts, beginners-guides, power point presentations and tutorials that have them.

I do like the idea of supporting different record types in one file, like @omad’s example of Kubernetes. I’ve seen many people tripped up by calling the wrong command (datacube product add some-dataset.yaml etc.), and not understanding why it failed. Datacube could just detect the type itself. One simple way to do this would be adding a $schema field to product/metadata-type docs, similar to eo3 datasets.
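A sketch of what $schema-based dispatch could look like. The schema URLs below are illustrative assumptions (only eo3 datasets carry a $schema today; product and metadata-type identifiers would have to be invented):

```python
# Hypothetical mapping from a document's $schema field to the
# command that should handle it, so a single `apply` could route
# each document in a multi-doc file itself.
SCHEMA_KINDS = {
    "https://schemas.opendatacube.org/dataset": "dataset",
    "https://schemas.opendatacube.org/product": "product",
    "https://schemas.opendatacube.org/metadata-type": "metadata",
}


def kind_from_schema(doc):
    """Return which kind of document this is, or 'unknown'."""
    return SCHEMA_KINDS.get(doc.get("$schema"), "unknown")
```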

jeremyh commented 4 years ago

Another option is not to remove the old ones entirely, but just make them aliases for the new command. In effect, retiring the more niche options/flags that they take. Beginner tutorials usually say datacube product add without any options.

(and I could imagine it occasionally still making a bash script clearer to write datacube product add somefile.yaml than datacube apply somefile.yaml if you have several separate steps. But that’s not a strong preference...)

Kirill888 commented 4 years ago

+1 for apply. Not sure about datacube apply vs datacube product|metadata|dataset apply. I do agree that having metadata and product in one yaml would be nice. Skipping non-product documents when calling datacube product apply is what I would expect, but we don't have a type field, so there is no telling whether a document is meant to be a metadata, product or dataset definition.

Kirill888 commented 4 years ago

We can of course implement guess_document_type(doc: Dict[str, Any]) -> Enum[Metadata|Product|Dataset|Unknown], but I'm not sure how that would look exactly, or how robust it would be.
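One possible shape for that heuristic, guessing from keys that are characteristic of each document kind. The key choices here are assumptions about what typical ODC documents contain, and would need validating against real-world files:

```python
from enum import Enum
from typing import Any, Dict


class DocKind(Enum):
    METADATA = "metadata"
    PRODUCT = "product"
    DATASET = "dataset"
    UNKNOWN = "unknown"


def guess_document_type(doc: Dict[str, Any]) -> DocKind:
    # Metadata-type documents describe dataset search fields.
    if "search_fields" in doc.get("dataset", {}):
        return DocKind.METADATA
    # Product definitions declare measurements and a metadata_type.
    if "measurements" in doc and "metadata_type" in doc:
        return DocKind.PRODUCT
    # eo3 dataset documents carry an id plus grids/lineage sections.
    if "id" in doc and ("grids" in doc or "lineage" in doc):
        return DocKind.DATASET
    return DocKind.UNKNOWN
```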

mpaget commented 4 years ago

For my 2 cents: I think datacube product|metadata|dataset apply is the right target. The concept of separate product and dataset yamls is still valid (fairly entrenched and not broken). Mixing two types into one doc (and then untangling them) seems unnecessarily duplicative and complicated. Less sure about the use cases for metadata and product types in one yaml, though.

Whereas datacube product|metadata|dataset apply just requires the add|update logic, possibly via an "if in DB: update, else: add" switch.
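A sketch of that switch, with a hypothetical index object (the `get`/`add`/`update` methods below stand in for the real index API and are not ODC's actual signatures):

```python
def apply_product(index, definition):
    """Add the product definition if it's new, otherwise update it."""
    existing = index.get(definition["name"])
    if existing is None:
        index.add(definition)
        return "added"
    index.update(definition)
    return "updated"
```

Returning what happened lets the CLI report "added" vs "updated" per document, which keeps the output informative even when callers don't care about the distinction.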

With regard to (a potential case for) mixing dataset and product types in one yaml: following my eo3 adventures the other week, I wrote an Assembler that ties the product.yaml name and measurements to the corresponding dataset.yaml elements, with metadata:name and note_measurements being driven by the product.yaml. In our test cases so far, this ties the dataset.yaml to the right product.yaml effectively (and easily) enough, which may be a practical (if intermediary) step towards closer ties between dataset and product, rather than putting them in the same multi-doc yaml.

BTW: very happy to share the Assembler - adapted from eodataset3's - as I think it could add value to eo3 uptake in the ODC community. But that's a topic for another thread - Rob may speak to it in the next SC.

Kirill888 commented 4 years ago

@mpaget mixing dataset and product definitions in one yaml file is not something we are proposing. I can see it being useful in a very few cases, something like a DEM stored in one giant file, so it would be a one-product, one-dataset sort of deal. Also, this would not change the auto-matching logic at all: one would still need to make sure that dataset and product definitions are "compatible", so metadata.product.name is still a thing you'll need to set even if everything is stored in one file (not recommended, except maybe in the case of single-dataset products?).

stale[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
