Closed — jggautier closed this issue 9 months ago.
For the record - nope, you cannot remove these CAFE* blocks' fields in the UI ☹️, since some of them are required. We will need to resort to some hackery to resolve this. I need to be super careful about it, so it may take some appreciable time.
Looking into it now.
[edit: n/m!]
@jggautier Could you please confirm that these are the versions of the blocks that are currently installed in production? (Jan. 3 commits) -
https://github.com/IQSS/dataverse.harvard.edu/blob/f4a79a733a9d5b816080cf463914e8743a761fff/metadatablocks/customCAFEDataLocation.tsv and https://github.com/IQSS/dataverse.harvard.edu/blob/f4a79a733a9d5b816080cf463914e8743a761fff/metadatablocks/customCAFEDataSources.tsv
Not super crucial; I just want to replicate the prod. setup on my own dev. box 1:1 to test the delete queries.
successfully removed "Location"... "Sources" next...
OK, all the existing field values and fields have been erased, and both custom blocks uninstalled, like they were never there. I will install the new versions later tonight (so that I don't have to restart Solr during the day). If you have a sec., please take a look at the 2 published datasets in question, just to confirm that there's nothing visibly wrong with them after I had to mess with the metadata in the database. That's just in case; I'm fairly positive they should be ok. 🤞
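For anyone curious later, the "hackery" amounts to deleting the block's data in foreign-key-safe order: field values first, then the fields, then the field-type definitions, then the block row itself. The exact queries run against production weren't posted here; this is a minimal sketch assuming Dataverse's usual table names (`datasetfieldvalue`, `datasetfield`, `datasetfieldtype`, `metadatablock`):

```python
# Sketch of the deletion order for uninstalling a custom metadata block.
# Table/column names are assumed from Dataverse's schema; verify against
# your installation before running anything like this.

def removal_statements(block_name: str) -> list[str]:
    """Return DELETE statements in foreign-key-safe order for one block."""
    return [
        # 1. Field values reference dataset fields.
        "DELETE FROM datasetfieldvalue WHERE datasetfield_id IN "
        "(SELECT f.id FROM datasetfield f "
        "JOIN datasetfieldtype t ON f.datasetfieldtype_id = t.id "
        "JOIN metadatablock b ON t.metadatablock_id = b.id "
        f"WHERE b.name = '{block_name}');",
        # 2. Dataset fields reference field types.
        "DELETE FROM datasetfield WHERE datasetfieldtype_id IN "
        "(SELECT t.id FROM datasetfieldtype t "
        "JOIN metadatablock b ON t.metadatablock_id = b.id "
        f"WHERE b.name = '{block_name}');",
        # 3. Field types reference the block.
        "DELETE FROM datasetfieldtype WHERE metadatablock_id = "
        f"(SELECT id FROM metadatablock WHERE name = '{block_name}');",
        # 4. Finally, the block row itself.
        f"DELETE FROM metadatablock WHERE name = '{block_name}';",
    ]

for stmt in removal_statements("customCAFEDataLocation"):
    print(stmt)
```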
(no, I didn't get to installing the blocks last night, but will do shortly)
Thanks. Sorry I didn't get to help look at those Jan. 3 commits yesterday. Got caught up in other stuff.
I checked those two published datasets. They're editable and I don't see any traces of the new blocks' fields in the forms.
I do see the metadata in the JSON exports, like https://dataverse.harvard.edu/api/datasets/export?exporter=dataverse_json&persistentId=doi:10.7910/DVN/Y1WNU7. Pointing that out just in case.
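One quick way to check which blocks still appear in a cached export is to list the keys under `datasetVersion.metadataBlocks` in the dataverse_json output. A sketch, using an abbreviated, made-up payload in the dataverse_json shape (the real export has many more fields):

```python
import json

def blocks_in_export(export: dict) -> list[str]:
    """Return the metadata block names present in a dataverse_json export."""
    return sorted(export["datasetVersion"]["metadataBlocks"].keys())

# Abbreviated, illustrative export payload (not real dataset metadata).
sample = json.loads("""
{
  "datasetVersion": {
    "metadataBlocks": {
      "citation": {"fields": []},
      "customCAFEDataLocation": {"fields": []}
    }
  }
}
""")

print(blocks_in_export(sample))  # ['citation', 'customCAFEDataLocation']
```

If a `customCAFE*` key shows up here but the fields are gone from the edit forms, it's just the stale cached export, which gets regenerated on republish.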
Correct, I didn't bother re-exporting the 2 datasets. They will be automatically re-exported when they are re-published.
The 2 blocks have been installed (again). Please review/double-check that these are the correct versions.
I just reviewed them and they're the correct versions. Thanks!
@jggautier Can we close it, or do you want to keep it open until they enter all the metadata they need?
Ah, yes we can close it. I'll do that now. Wasn't sure if there was anything else we needed to do for re-adding the metadata blocks.
We can track the remaining tasks, mostly about those two published datasets you edited, in our email thread with the collection admin.
This GitHub issue is being used to track progress of the creation of a metadata block or metadata blocks I'm helping design for a Dataverse collection that the BUSPH-HSPH Climate Change and Health Research Coordinating Center (CAFE) will be managing on Harvard Dataverse. Their unpublished collection is at https://dataverse.harvard.edu/dataverse/cafe.
In this repo at https://github.com/IQSS/dataverse.harvard.edu/tree/master/metadatablocks, I've added the .tsv and .properties files that define the metadata fields, and I'll continue updating those files as the CAFE folks review and improve the metadata fields.
This screenshot shows the metadata block we're planning to add, as of 2023-11-07, so that depositors can describe the geospatial data:
This screenshot shows the metadata block we're planning to add, as of 2023-11-07, so that depositors can describe the source datasets of the dataset being deposited: