icon-lab / SynDiff

Official PyTorch implementation of SynDiff described in the paper (https://arxiv.org/abs/2207.08208).

Using the SynDiff_Sample_Data and the pretrained network to output T2 from T1 #25

Open carlouic opened 1 year ago

carlouic commented 1 year ago

Hello, thanks for your amazing work and for sharing it with all of us. I have a couple of questions. First, when I try to load the T1.mat and T2.mat files, I get an error like this in MATLAB:

```
Error using load
Unknown text on line number 1 of ASCII file path/to/T2/T2.mat "HDF ".
Error in uiimport/runImportdata (line 470)
    datastruct = load('-ascii', fileAbsolutePath);
Error in uiimport/gatherFilePreviewData (line 438)
    [datastruct, textDelimiter, headerLines] = runImportdata(fileAbsolutePath, type);
Error in uiimport (line 260)
    gatherFilePreviewData(fileAbsolutePath);
```

This is with MATLAB's basic load/import function. Should I be loading the files a different way?
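The `"HDF "` token in the error message may indicate that the sample `.mat` files are MATLAB v7.3 files, which are HDF5 containers; MATLAB's ASCII import path cannot parse those, but they can be opened as ordinary HDF5 files. A minimal sketch in Python using h5py, where the file name `T2_demo.mat` and the dataset name `slices` are stand-ins, not the repository's actual layout:

```python
import h5py
import numpy as np

# Build a small stand-in file in v7.3 style (an HDF5 container).
# 'T2_demo.mat' and the dataset name 'slices' are assumptions.
with h5py.File('T2_demo.mat', 'w') as f:
    f.create_dataset('slices', data=np.zeros((1, 8, 8, 4), dtype=np.float32))

# Reading: a v7.3 .mat file opens like any other HDF5 file.
with h5py.File('T2_demo.mat', 'r') as f:
    vol = np.array(f['slices'])

print(vol.shape)  # (1, 8, 8, 4)
```

In MATLAB itself, calling `load('T2.mat')` directly (rather than going through the import wizard) should also handle v7.3 files.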

Second, I downloaded the pretrained network to perform T2 synthesis from T1 and ran this command:

```
python test.py --image_size 256 --exp exp_syndiff --num_channels 2 --num_channels_dae 64 --ch_mult 1 1 2 2 4 4 --num_timesteps 4 --num_res_blocks 2 --batch_size 1 --embedding_type positional --z_emb_dim 256 --contrast1 T1 --contrast2 T2 --which_epoch 50 --gpu_chose 0 --input_path /input/path/for/data --output_path /output/for/results
```

and the result is amazing. How can I try the pretrained model on some of my own data? I am converting my NIfTI file to .mat like this:

```python
import nibabel as nib
import scipy.io as sio

nii = nib.load('filename.nii')
data = nii.get_fdata()
slices = data.reshape((1, data.shape[0], data.shape[1], data.shape[2]))
sio.savemat('slices.mat', {'slices': slices})
```

but when I use the resulting file in the testing stage, loading it through the LoadDataset() function fails with:

```
OSError: Unable to open file (file signature not found)
```

However, when I open this file in MATLAB, it works and I can see the image.
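A likely explanation (an assumption, not confirmed against the repository's loader): `scipy.io.savemat` writes MATLAB v5 format by default, which MATLAB opens fine but h5py rejects with exactly this "file signature not found" error, since h5py only reads HDF5-based (v7.3-style) files. A sketch of writing the array through h5py instead, with the file name `slices.mat` and dataset name `slices` taken from the snippet above:

```python
import h5py
import numpy as np

# Hypothetical volume standing in for the reshaped NIfTI data.
slices = np.random.rand(1, 256, 256, 20).astype(np.float32)

# scipy.io.savemat writes MATLAB v5 files by default, which h5py cannot
# open. Writing through h5py produces an HDF5 container equivalent to a
# v7.3 .mat file.
with h5py.File('slices.mat', 'w') as f:
    f.create_dataset('slices', data=slices)

# Verify that h5py can read it back, as an HDF5-based loader would.
with h5py.File('slices.mat', 'r') as f:
    assert f['slices'].shape == (1, 256, 256, 20)
```

Alternatively, `sio.savemat(..., format='7.3')` is not supported by SciPy, so h5py (or MATLAB's `save -v7.3`) is the usual route to v7.3-style files from Python.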

Thanks in advance! Best regards.