hubmapconsortium / ingest-ui

HuBMAP Data Ingest Portal
https://ingest.hubmapconsortium.org
MIT License

Bulk Metadata Upload Error Capture Upgrades #1417

Closed: BirdMachine closed this issue 3 months ago

BirdMachine commented 4 months ago

The responses from the API have changed somewhat from before, and the error content is wrapped differently in some cases. We need to re-align the UI with what the API now returns.

shirey commented 3 months ago

@BirdMachine (FYI @yuanzhou) The attached file, when uploaded as Block metadata (remove the .txt extension, which was added so the file could be attached here), contains one error: an invalid value in the source_storage_duration_unit column. Based on the JSON returned from the validator (see the response JSON pasted below), the message to the user should be something like:

[image: expected rendering of the error message]

But currently it is rendered like:

[image: current rendering]

Response JSON

{
  "code": 406,
  "description": [
    {
      "column": "source_storage_duration_unit",
      "error": "On row 1, column \"source_storage_duration_unit\", value \"ms\" fails because of error \"notStandardTerm\". Example: minute",
      "row": 1
    }
  ],
  "name": "Unacceptable Metadata"
}
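For reference, a minimal TypeScript sketch (not the actual ingest-ui code; the type and function names are hypothetical) of how the description array in a response shaped like the one above could be mapped to per-row messages for the user:

interface ValidatorError {
  row: number;
  column: string;
  error: string;
}

interface ValidatorResponse {
  code: number;
  name: string;
  description: ValidatorError[];
}

// Each entry in `description` already carries a human-readable sentence in its
// `error` field (including the row and column), so the user-facing message can
// simply surface those strings rather than the raw JSON.
function formatValidatorErrors(response: ValidatorResponse): string[] {
  return response.description.map((e) => e.error);
}

With the response above, this yields a single message: On row 1, column "source_storage_duration_unit", value "ms" fails because of error "notStandardTerm". Example: minute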

sample-block-bad-storage_source_duration_unit.tsv.txt

shirey commented 3 months ago

@BirdMachine Also, the links to the example .tsv files for Block, Section and Suspension point to the example contributors.tsv file instead of the block, section and suspension examples.

BirdMachine commented 3 months ago

sample-block-bad-storage_source_duration_unit.tsv.txt

So regarding this situation, this is how pre-attachment validation currently behaves on Prod (assuming Test is an accurate reference). The point of this task was to handle the attachment responses, not the initial CEDAR validation responses. Have you had a chance to check the response formatting for an uploaded CSV that passes validation but fails (fully or partially) at the attachment step? I'll create a new ticket for updating the CEDAR validation responses as well. It would also be worth coordinating with someone on the API side to gather full documentation/reference material for every possible error formatting and wrapping variation.
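As a starting point for that coordination, here is a minimal sketch (hypothetical, not the ingest-ui implementation) of defensively unwrapping the error payload. The only format confirmed in this thread is a description array of {row, column, error} objects from the validator; the plain-string and nested cases below are assumptions about other wrappings that would need to be confirmed with the API team:

type ErrorDetail = { row?: number; column?: string; error?: string };

// Collect displayable message strings from whatever shape the error payload takes.
function extractErrorMessages(payload: unknown): string[] {
  // Assumed case: the error arrives as a bare string.
  if (typeof payload === 'string') {
    return [payload];
  }
  // Confirmed case: an array of {row, column, error} objects (see the JSON above).
  if (Array.isArray(payload)) {
    return payload.map((item) =>
      typeof item === 'string'
        ? item
        : (item as ErrorDetail).error ?? JSON.stringify(item)
    );
  }
  // Assumed case: the details are nested under a `description` key.
  if (payload !== null && typeof payload === 'object' && 'description' in payload) {
    return extractErrorMessages((payload as { description?: unknown }).description);
  }
  return ['Unexpected error format from the API'];
}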