philippconzett opened 1 year ago
Did the CSV or XLSX file fail ingestion?
We have turned off tabular file ingestion due to previous challenges.
We've been having the same issue (on 5.14 on our pilot server). We tested further and reproduced it with multiple datasets, so it does not appear to depend on a specific file or dataset. At least that's the way it seems, even when the errors imply otherwise.
I was not able to reproduce this at first on v5.14 or v6.0 (under test) using a single browser and a single tab. I was able to reproduce it using multiple tabs in the same browser: I open the draft view in two tabs (so the publish button is enabled in both), modify the draft further in only one tab, and then attempt to publish in the first tab, which has a stale read:
Is this what you're seeing? There may be other ways to reproduce it: two different users/browsers, etc. If this isn't your scenario, can you try to identify the steps to reproduce it, perhaps with a new/clean test dataset?
When I try the same steps but instead publish a minor version, I still get an error, but perhaps a more understandable one, an optimistic lock:
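For context, this is roughly how an optimistic lock catches a stale read. The sketch below is a hypothetical in-memory analog, not Dataverse or JPA code: each "tab" snapshots a version counter when it loads the draft (like a JPA `@Version` field), and a save only succeeds if the stored version still matches the snapshot.

```java
// Hypothetical sketch (not Dataverse code) of optimistic-lock stale-read detection.
public class OptimisticLockSketch {
    static class Record {
        long version = 1;   // analogous to a JPA @Version column
        String keywords = "";
    }

    static class StaleReadException extends RuntimeException {
        StaleReadException(String msg) { super(msg); }
    }

    // Commit an edit made against a snapshot taken at readVersion.
    static void commit(Record stored, long readVersion, String newKeywords) {
        if (stored.version != readVersion) {
            throw new StaleReadException(
                "Row was changed by another transaction (OptimisticLockException analog)");
        }
        stored.keywords = newKeywords;
        stored.version++;   // bump so other stale snapshots fail
    }

    public static void main(String[] args) {
        Record dataset = new Record();
        long tab1 = dataset.version;  // tab 1 loads the draft
        long tab2 = dataset.version;  // tab 2 loads the same draft

        commit(dataset, tab2, "keyword-from-tab-2");      // tab 2 saves first
        try {
            commit(dataset, tab1, "keyword-from-tab-1");  // tab 1 is now stale
        } catch (StaleReadException e) {
            System.out.println("stale: " + e.getMessage());
        }
        System.out.println("final=" + dataset.keywords + " v" + dataset.version);
    }
}
```

The second commit fails because tab 2 already bumped the version, which matches the two-tabs reproduction above.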
Thanks, @kcondon The error message we got was slightly different. It said
Cannot merge an Entity that has been removed
See above.
I see, @philippconzett , thanks. Does the stale read scenario provide a plausible explanation in your case, whether from multiple tabs, browsers, or people editing? Since we do not yet know all the situations in which this can occur, a workaround, if it is a stale read, would be to refresh the draft page view from which you want to publish.
I just added another keyword and tried to update the dataset, but again got the error message below. I can't see that multiple tabs or browsers were involved.
Can you try refreshing that dataset draft view and publishing without incrementing the version? Just making sure that isn't the issue.
OK, will ask Dev for more info. Not sure what that removed entity is, or how it is detected as removed unless it was not completely removed?
OK, this seems like a bug, if not the same as the stale read. Someone from Development needs to investigate to learn more.
Thanks, the error message is from trying to publish without incrementing version. Publishing with incrementing version works.
On our Dataverse installation in Stuttgart (version 5.12.1) we have the same problem:
Error – Dataset Version Update failed. Changes are still in the DRAFT version.
- edu.harvard.iq.dataverse.engine.command.exception.CommandException:
- Command edu.harvard.iq.dataverse.engine.command.impl.CuratePublishedDatasetVersionCommand@78f1fadf failed:
- Cannot merge an Entity that has been removed: edu.harvard.iq.dvn.core.study.FileMetadata[id=321341]
- If you believe this is an error, please contact [DaRUS Support Team](https://darus.uni-stuttgart.de/dataset.xhtml?persistentId=doi%3A10.18419%2Fdarus-2816#) for assistance.
Like @philippconzett, we only made small changes to the citation metadata (description and related publication) before trying to publish without incrementing the version. No change to any file metadata (which is what the error message mentions). However, we had a prior version of the dataset that changed files; our Versions tab looks like this:
Maybe Dataverse compares metadata of a file that did not exist in the very first version of the dataset?
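For background on the error text itself: in JPA, calling `merge()` on an entity that has been removed in the same persistence context is an error, and EclipseLink (which Dataverse uses) reports it with exactly this "Cannot merge an entity that has been removed" wording. The sketch below is a hypothetical in-memory analog, not EclipseLink code, showing the failure pattern: a FileMetadata row is deleted, and a stale object reference to it is merged afterwards.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Hypothetical analog (not EclipseLink/Dataverse code) of merging a removed entity.
public class MergeRemovedSketch {
    static class Context {
        Map<Long, String> store = new HashMap<>();
        Set<Long> removed = new HashSet<>();

        void remove(long id) { store.remove(id); removed.add(id); }

        void merge(long id, String value) {
            if (removed.contains(id)) {
                throw new IllegalArgumentException(
                    "Cannot merge an Entity that has been removed: FileMetadata[id=" + id + "]");
            }
            store.put(id, value);
        }
    }

    public static void main(String[] args) {
        Context em = new Context();
        em.store.put(321341L, "data.tab metadata");
        em.remove(321341L);               // the draft's FileMetadata row is deleted
        try {
            em.merge(321341L, "edited");  // a stale reference is merged afterwards
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

If CuratePublishedDatasetVersionCommand deletes draft FileMetadata rows and then merges an object graph that still points at one, it would produce this exception; that is a guess at the mechanism, not a confirmed diagnosis.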
We have a similar problem with several datasets (15) on our installation dataverse.cirad.fr (version 5.2). We only made a minor change to the citation metadata (Kind of Data). I haven't found any common factor that would block publication without incrementing the version. In our case, versioning doesn't seem to be the issue (some datasets have only one version, others have several).
What steps does it take to reproduce the issue?
Error – Dataset Version Update failed. Changes are still in the DRAFT version.
- edu.harvard.iq.dataverse.engine.command.exception.CommandException:
- Command edu.harvard.iq.dataverse.engine.command.impl.CuratePublishedDatasetVersionCommand@33eaee58 failed:
- Cannot merge an Entity that has been removed: edu.harvard.iq.dvn.core.study.FileMetadata[id=209002]
- If you believe this is an error, please contact DataverseNO Support for assistance.
Which page(s) does it occur on? On the dataset landing page of this TROLLing dataset: https://doi.org/10.18710/XKDBLF.
What happens? I get the error message pasted above.
To whom does it occur (all users, curators, superusers)? Superusers.
What did you expect to happen? The dataset to be updated without a new version being issued.
Which version of Dataverse are you using? 5.13
Any related open or closed issues to this bug report? No, but the issue was discussed in the Dataverse Google Group (https://groups.google.com/u/1/g/dataverse-community/c/6smbEFyq_Ok), and it turns out to be a bug in the Dataverse software.