Closed fedorov closed 4 years ago
It is interesting that this happens for some other but not all of the subjects from that collection.
It works fine for this one: https://dev-viewer.canceridc.dev/projects/idc-dev-etl/locations/us/datasets/pre-mvp-temp/dicomStores/cross-collection-temp/study/1.3.6.1.4.1.14519.5.2.1.1706.4009.279236404491517275639164431409
Also note when this happens, it is impossible to download the study (or maybe it is a separate regression).
@pieper any ideas?
Or is this due to the proxy?
I haven't looked at this in detail. Do we need it fixed before the Wednesday meeting?
Considering the earlier experience, I don't think it is feasible or desirable to fix this before the Wednesday meeting (since other things could get broken, and I would not know what to expect to work during the demo).
I am just testing things as I prepare, and populate issues that I think should be fixed, if possible, sometime after the Wednesday meeting.
Adding @wlongabaugh to the thread, since it is not clear if the issue is actually relevant to the download functionality.
To reproduce the CORS part, put this into the JS console:
ohif.app.commandsManager.runCommand("downloadAndZip", {listOfUIDs: [window.location.href.split("/").pop()]})
OK, having looked at the logs, we see an early 404, which is actually the initial 404 that is part of the OHIF load with a non-empty path. No problem. Next, the viewer starts looping, which is handled cleanly by the proxy, though it is worth noting that this can quickly exhaust your daily quota if you don't shut it down. Finally, the downloadAndZip command appears to be triggering a 500 error from the proxy. The CORS problem happens because the proxy is not catching program exceptions and then wrapping the 500 response with the needed CORS headers. So the problem is not CORS, but the fact that the proxy is somehow getting confused by downloadAndZip. @fedorov and @pieper, I suggest you point OHIF at Google Healthcare directly to debug while I try to track down the 500.
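To illustrate the fix being described (not the proxy's actual code, which I haven't seen): the handler would need to set the CORS headers before running the proxied request, so that even an unhandled exception still reaches the browser as a 500 with CORS headers rather than an opaque CORS failure. The names `withCors` and the wildcard origin are illustrative assumptions.

```javascript
// Hypothetical sketch: wrap a request handler so unhandled exceptions
// still produce a 500 response that carries CORS headers. Without this,
// the browser reports a CORS error and hides the underlying 500.
function withCors(handler) {
  return function (req, res) {
    // Attach CORS headers up front, so every code path (success or
    // exception) includes them. '*' is a placeholder origin policy.
    res.setHeader('Access-Control-Allow-Origin', '*');
    res.setHeader('Access-Control-Allow-Headers', 'Authorization, Content-Type');
    try {
      handler(req, res);
    } catch (err) {
      // Exception from the proxied request: surface it as a plain 500
      // instead of letting the server drop the CORS headers.
      res.statusCode = 500;
      res.end('proxy error: ' + err.message);
    }
  };
}
```

With a wrapper like this, the console at least shows the real 500 and its body instead of a misleading CORS message.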
@pieper is there anything weird going on with the downloadAndZip command server side? I note that those requests are taking ~14 seconds before returning the 500, and appear to be hitting the Google backend normally.
@wlongabaugh - the downloadAndZip command can't reuse the data it has already downloaded (because it gets the metadata as JSON and the pixels as frames), so it does re-access the study. You would expect to see a whole set of WADO requests for the DICOM part10 files. I doubt downloadAndZip knows what to do with a 500 or a 429 code, so it probably just fails.
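If the client really does just fail on a 429, one mitigation (a sketch only; `doRequest` stands in for whatever dicomweb-client call downloadAndZip actually makes, and the retry/backoff parameters are made up) would be a small retry wrapper around each part10 fetch:

```javascript
// Hypothetical sketch: retry a WADO request on 429 (quota exhausted)
// with a linearly growing delay, and pass any other response through
// unchanged. `doRequest` is an async function returning a response-like
// object with a `status` field.
async function fetchWithRetry(doRequest, { retries = 3, delayMs = 1000 } = {}) {
  for (let attempt = 0; ; attempt++) {
    const response = await doRequest();
    if (response.status !== 429 || attempt >= retries) {
      // Success, a non-retryable error (e.g. the 500 seen here), or
      // retries exhausted: hand the response back to the caller.
      return response;
    }
    // Back off before retrying, growing the delay with each attempt.
    await new Promise((resolve) => setTimeout(resolve, delayMs * (attempt + 1)));
  }
}
```

This would only help with quota hiccups; the 500 from the proxy would still surface immediately, which is probably the right behavior while it is being debugged.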
Got it. But does the command use any weirdo DICOMweb calls that might cause the data coming back from Google to be different than usual? The proxy appears to be getting back a normal response from the Google API, but then throws a 500 before it is done. I will instrument the code after Wednesday, but I am wondering how these requests may differ from "standard" requests, if they do at all.
It works fine for this one: https://dev-viewer.canceridc.dev/projects/idc-dev-etl/locations/us/datasets/pre-mvp-temp/dicomStores/cross-collection-temp/study/1.3.6.1.4.1.14519.5.2.1.1706.4009.279236404491517275639164431409
It doesn't. I think there is still some race condition, because the same study does not work consistently. Here's the error from the console:
If I drag into the viewport series 1, things break. If I show series 3, and then try to look at series 1, things also break.
I believe PR #1877 will fix this looping issue as well as #1797. But we will need to investigate the actual error itself.
Bug Report
Describe the Bug
There is a looping error "RangeError: Offset is outside the bounds of the DataView" when loading an RTSTRUCT, see here:
https://dev-viewer.canceridc.dev/projects/idc-dev-etl/locations/us/datasets/pre-mvp-temp/dicomStores/cross-collection-temp/study/1.3.6.1.4.1.14519.5.2.1.1706.4009.421312468402475769388366275116
This is subject TCGA-CV-6433 from the TCGA-HNSC collection.