Closed: nexmann closed this issue 1 year ago
Hello @nexmann,
Thank you for your question! You are absolutely right: "converting the DICOMs to binary data it removes some of the information somehow". The information transferred over the wire lacks the FileMetaInformation header that is required to form a valid DICOM Part 10 file. Instead of trying to save the information you receive in your SCP handler directly to a file, you should call the Dataset class's toFile function. That function adds the required FileMetaInformation bytes before "denaturalizing" the dataset.
Thank you for your swift answer!
Okay. The toFile() method saves it to disk, right? If I'm looking to convert the received DICOM into an ArrayBuffer that I can stream to the cloud or read with dcmjs's readFile(), how do I do that?
Thank you @nexmann for highlighting that this functionality is missing from the Dataset class.
Until it is implemented, you can create your own function by taking the toFile function and stripping out the file-saving part, like the following:
const dcmjs = require('dcmjs');
const dcmjsDimse = require('dcmjs-dimse');

const { DicomDict, DicomMetaDictionary } = dcmjs.data;
const { Dataset, Implementation } = dcmjsDimse;
const { StorageClass } = dcmjsDimse.constants;

/**
 * Saves a dataset to a DICOM P10 array buffer.
 * @method
 * @param {Dataset} dataset - The dataset to save.
 * @param {Object} [nameMap] - Additional DICOM tags to recognize when denaturalizing the
 * dataset. Can be used to support writing private fields/tags.
 * @param {Object} [writeOptions] - The write options to pass through to `DicomDict.write()`.
 * @returns {ArrayBuffer} DICOM P10 array buffer.
 */
function toPart10ArrayBuffer(dataset, nameMap, writeOptions) {
  const elements = {
    _meta: {
      FileMetaInformationVersion: new Uint8Array([0, 1]).buffer,
      MediaStorageSOPClassUID:
        dataset.getElement('SOPClassUID') || StorageClass.SecondaryCaptureImageStorage,
      MediaStorageSOPInstanceUID:
        dataset.getElement('SOPInstanceUID') || Dataset.generateDerivedUid(),
      TransferSyntaxUID: dataset.getTransferSyntaxUid(),
      ImplementationClassUID: Implementation.getImplementationClassUid(),
      ImplementationVersionName: Implementation.getImplementationVersion(),
    },
    ...dataset.getElements(),
  };
  const denaturalizedMetaHeader = DicomMetaDictionary.denaturalizeDataset(elements._meta);
  const dicomDict = new DicomDict(denaturalizedMetaHeader);
  dicomDict.dict = nameMap
    ? DicomMetaDictionary.denaturalizeDataset(elements, {
        ...DicomMetaDictionary.nameMap,
        ...nameMap,
      })
    : DicomMetaDictionary.denaturalizeDataset(elements);

  return dicomDict.write(writeOptions);
}
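A note on using the result: `dicomDict.write()` returns a plain ArrayBuffer, while most Node streaming and file APIs expect a Buffer. A minimal sketch of the conversion (the helper name `arrayBufferToBuffer` is my own, not part of either library):

```javascript
// Wrap an ArrayBuffer (for example, the result of toPart10ArrayBuffer above)
// in a Node Buffer. Buffer.from(arrayBuffer) creates a view over the same
// memory without copying the bytes.
function arrayBufferToBuffer(arrayBuffer) {
  return Buffer.from(arrayBuffer);
}

// Example: a 4-byte ArrayBuffer containing the ASCII characters "DICM".
const ab = new Uint8Array([0x44, 0x49, 0x43, 0x4d]).buffer;
const buf = arrayBufferToBuffer(ab);
console.log(buf.toString('ascii')); // prints "DICM"
```

The resulting Buffer can be passed to `fs.writeFileSync`, piped into an upload stream, and so on.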
Excellent, thank you, it works well. I'm still having some problems with GSPS DICOM files, but that seems to be due to corrupt DICOM files and to me not fully understanding Node Buffer vs. JavaScript ArrayBuffer types; it doesn't have anything to do with your library.
Thanks again.
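For anyone hitting the same Node Buffer vs. JavaScript ArrayBuffer confusion: a common pitfall is that `buf.buffer` on a Node Buffer can point at a larger, pooled ArrayBuffer, so you cannot hand `buf.buffer` to a DICOM parser as-is. A short sketch (the helper name `bufferToArrayBuffer` is my own):

```javascript
// Extract the exact bytes of a Node Buffer as a standalone ArrayBuffer.
// Small Buffers are allocated from a shared internal pool, so buf.buffer
// may be much larger than the Buffer itself; slice out only the region
// the Buffer actually covers.
function bufferToArrayBuffer(buf) {
  return buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength);
}

const buf = Buffer.from('DICM', 'ascii');
const ab = bufferToArrayBuffer(buf);
console.log(ab.byteLength); // prints 4
```

Note that `slice` here copies the bytes, which is exactly what you want when detaching from the shared pool.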
Hey, I have built an SCP server to receive DICOMs and allowed JPEG-compressed transfer syntaxes. It works fine. However, when I try to convert the received DICOMs to ArrayBuffers to read them with the dcmjs library, I get the following error:
It seems that when converting the DICOMs to binary data it removes some of the information somehow. If I save the ArrayBuffer from getDenaturalizedDataset() to disk, I am able to open it in my Weasis DICOM viewer, but Weasis is also extremely lenient about accepting corrupt DICOM files. If I run dciodvfy on the DICOM file I save to disk, before and after sending and saving with dcmjs-dimse, I see that it introduces two new errors:
Furthermore, it seems like it changes the TransferSyntaxUID to 1.2.840.10008.1.2.1 (Explicit VR Little Endian) when saving it, or maybe I am doing something wrong. The DICOM file I am sending has TransferSyntaxUID 1.2.840.10008.1.2.4.70 (JPEG Lossless).
So this change is probably why I have trouble reading it with dcmjs.
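To verify which transfer syntax actually ends up in a saved file, you can read the TransferSyntaxUID element (0002,0010) straight out of the Part 10 header; the File Meta Information group is always encoded as Explicit VR Little Endian regardless of the dataset's own transfer syntax. A minimal sketch (this parser is my own illustration, not part of dcmjs or dcmjs-dimse, and assumes a well-formed Part 10 file):

```javascript
// Read the TransferSyntaxUID (0002,0010) from a DICOM Part 10 ArrayBuffer.
function readTransferSyntaxUid(arrayBuffer) {
  const bytes = new Uint8Array(arrayBuffer);
  const view = new DataView(arrayBuffer);

  // Part 10 layout: a 128-byte preamble followed by the "DICM" magic word.
  const magic = String.fromCharCode(...bytes.slice(128, 132));
  if (magic !== 'DICM') {
    throw new Error('Not a DICOM Part 10 file');
  }

  let offset = 132;
  while (offset + 8 <= bytes.length) {
    const group = view.getUint16(offset, true);
    if (group !== 0x0002) break; // left the File Meta Information group
    const element = view.getUint16(offset + 2, true);
    const vr = String.fromCharCode(bytes[offset + 4], bytes[offset + 5]);

    // OB/OW/OF/SQ/UT/UN use a 2-byte reserved field plus a 4-byte length;
    // all other VRs use a 2-byte length.
    let length;
    let valueOffset;
    if (['OB', 'OW', 'OF', 'SQ', 'UT', 'UN'].includes(vr)) {
      length = view.getUint32(offset + 8, true);
      valueOffset = offset + 12;
    } else {
      length = view.getUint16(offset + 6, true);
      valueOffset = offset + 8;
    }

    if (element === 0x0010) {
      // UI values may be null-padded to an even length; strip the padding.
      return String.fromCharCode(...bytes.slice(valueOffset, valueOffset + length))
        .replace(/\0+$/, '');
    }
    offset = valueOffset + length;
  }
  return null;
}
```

Running this over the file before and after the round trip would show directly whether the JPEG Lossless transfer syntax (1.2.840.10008.1.2.4.70) is being replaced by Explicit VR Little Endian (1.2.840.10008.1.2.1).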