Open · alexvuta opened this issue 4 years ago
Hey, thanks for your interest in the S3 plugin :D
The issues you're having are related to the internals of Orthanc itself, so I'd advise you to submit a ticket here: https://bitbucket.org/sjodogne/orthanc/issues
The reason is that the S3 code is called in a loop that originates in the Orthanc core code, or even in the plugin code, e.g. https://hg.orthanc-server.com/orthanc-dicomweb/file/tip/Plugin/WadoRs.cpp around line 277, where the plugin calls Orthanc's REST API to get the files one by one in a shameless loop.
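For illustration, here is a simplified sketch of that pattern (not the actual WadoRs.cpp code), using the Orthanc plugin SDK's `OrthancPluginRestApiGet`; the helper name `AnswerStudyMetadata` and the way the instance list is obtained are hypothetical:

```cpp
// Simplified illustration of the pattern described above (not the actual
// WadoRs.cpp code): the DICOMweb plugin walks over every instance of a
// study and asks Orthanc for each file through the internal REST API.
// With the S3 storage plugin, every iteration becomes one AWS download.
#include <orthanc/OrthancCPlugin.h>

#include <string>
#include <vector>

static OrthancPluginContext* context_ = NULL;  // set in OrthancPluginInitialize()

// Hypothetical helper: answers a WADO-RS /metadata request for one study
static void AnswerStudyMetadata(const std::vector<std::string>& instanceIds)
{
  for (size_t i = 0; i < instanceIds.size(); i++)
  {
    OrthancPluginMemoryBuffer buffer;
    std::string uri = "/instances/" + instanceIds[i] + "/file";

    // One storage read per instance: for a ~900-instance study this means
    // ~900 sequential S3 downloads before /metadata can be answered
    if (OrthancPluginRestApiGet(context_, &buffer, uri.c_str()) ==
        OrthancPluginErrorCode_Success)
    {
      // ... parse the DICOM file and extract its metadata here ...
      OrthancPluginFreeMemoryBuffer(context_, &buffer);
    }
  }
}
```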
On the side of the S3 plugin itself, you could think of adding a cache layer - but first I'd check whether such a thing already exists in Orthanc itself; AFAIR it has one with a default of 2 items (although it's not used consistently). A sketch of the idea is below.
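To make the idea concrete, here is a minimal sketch of such a cache layer, assuming it would be keyed by the attachment UUID that the storage plugin receives; the class name, the capacity, the LRU eviction policy, and the absence of locking are all placeholder choices, not anything the plugin actually ships:

```cpp
// Minimal sketch of the cache-layer idea: an LRU map from attachment
// UUID to file content, sitting in front of the S3 download.
#include <list>
#include <map>
#include <string>
#include <utility>

class StorageCache
{
private:
  typedef std::list<std::string> LruList;  // most recently used at the front
  typedef std::map<std::string, std::pair<LruList::iterator, std::string> > Index;

  size_t capacity_;
  LruList lru_;
  Index index_;

public:
  explicit StorageCache(size_t capacity) : capacity_(capacity)
  {
  }

  // Returns true and fills "content" on a hit, refreshing the entry's age
  bool Get(std::string& content, const std::string& uuid)
  {
    Index::iterator found = index_.find(uuid);
    if (found == index_.end())
    {
      return false;  // miss: the caller falls back to the S3 download
    }

    lru_.erase(found->second.first);  // move the entry to the front
    lru_.push_front(uuid);
    found->second.first = lru_.begin();
    content = found->second.second;
    return true;
  }

  // Stores a freshly downloaded file, evicting the least recently used one
  void Put(const std::string& uuid, const std::string& content)
  {
    if (index_.find(uuid) != index_.end())
    {
      return;  // already cached
    }

    if (index_.size() >= capacity_ && !lru_.empty())
    {
      index_.erase(lru_.back());
      lru_.pop_back();
    }

    lru_.push_front(uuid);
    index_[uuid] = std::make_pair(lru_.begin(), content);
  }
};
```

The S3 read path would then call `Get()` first and only hit AWS on a miss, storing the result with `Put()` afterwards, so that a loop like the one above stops re-downloading the same instances.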
Hello,
I have configured a Docker Orthanc image, using the S3 plugin to store DICOM files on AWS. Recently I configured the OHIF viewer, which uses the DICOMweb plugin to fetch studies via the REST API, and I have serious problems with large studies.
For example, when I try to access a study that has ~900 instances, I get a timeout on the /metadata endpoint. The Orthanc logs show that the S3 plugin is fetching each instance from AWS.
I tried the same study using the Google Healthcare API for storage and it works very well. Are there any optimizations that can be made to the S3 plugin to avoid timeouts on metadata? It seems that Orthanc keeps JSON files containing the metadata for each instance.
Also, I would like to know if it is possible to store the metadata in PostgreSQL and the DICOM files on S3, to avoid storing the metadata as local JSON files.
Thanks, have a nice day :+1: