jcreinhold / synthtorch

deep neural network-based image translation/synthesis

nn-train patchsize error #17

Closed mcbrs1a closed 5 years ago

mcbrs1a commented 5 years ago

Strange: I am using 3D images with a patch size of [64, 64]. The reason for this is that in your example you seem to use a single dimension for the patch size, i.e. --patch-size 64, which I assume is for 2D images. When I attempt a 3D patch size (with -dm 3 and 3D images) I get the below:

synthtorch.exec.nn_train - ERROR - too many values to unpack (expected 2)

With a patch size of 64 64 I get:

(start): ModuleList( (0): Sequential( (0): ReplicationPad3d((1, 1, 1, 1, 1, 1)) (1): Conv3d(1, 32, kernel_size=(3, 3, 3), stride=(1, 1, 1), bias=False) (2): InstanceNorm3d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=False) (3): ReLU(inplace) ...

Any suggestions appreciated.

jcreinhold commented 5 years ago

If you want a 64x64x64 patch, then set dim to 3 and set the value of the patch_size key to either [64,64,64] or 64 (where the single number is outside of a list). The dim key controls the dimensionality of the convolutions, e.g., 3 for 3D convolutions, 2 for 2D convolutions. If only 64 is provided, outside of a list, and dim equals 3, then the patch will be assumed to be isotropic, i.e., 64x64x64. If only 64 were provided and dim were equal to 2, then the patch size would be 64x64. The list format for the patch size allows you to use non-isotropic patch sizes, e.g., 128x64x8 or whatever.
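For concreteness, a rough sketch of the equivalent CLI invocations, using only the flags that appear in this thread (-dm, -ps); the remaining required arguments are elided as "...":

```
# 3D convolutions with an isotropic 64^3 patch:
nn-train ... -dm 3 -ps 64 64 64

# 3D convolutions with a non-isotropic patch, e.g. 128x64x8:
nn-train ... -dm 3 -ps 128 64 8
```

In config.json the corresponding keys are dim and patch_size; there a bare integer such as 64 is expanded to an isotropic patch as described above.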

NIfTI images are always assumed to be 3D in synthtorch, but you can set dim to 2 and extract 2D patches if desired, which is why you need to specify these options explicitly in the config file.
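So 2D training on (inherently 3D) NIfTI inputs would look something like the sketch below, assuming the 2D patch transform takes both in-plane dimensions:

```
# 2D convolutions with 64x64 patches extracted from the 3D NIfTI
# volumes (other required arguments elided):
nn-train ... -dm 2 -ps 64 64
```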

The explanation of the patch size in the documentation answers this question; in the future, you can either look at the online documentation or use nn-train -h for this reference material.

If you run into a problem related to this post that is not covered in the documentation, please re-open the issue next time so there is continuity (this post is a continuation of #16). Please provide the full error message. Also, please provide the version number (or, even better, the commit hash) of your synthtorch installation, and provide your modified config.json file.

Best of luck.

mcbrs1a commented 5 years ago

Thanks. I am, however, using a NIfTI image, and as you state it should be assumed to be 3D, so I do not set -dm. I then get the error:

nn_train - ERROR - 5D tensors expect 6 values for padding

If I set -dm to 3, I get:

ERROR - CUDA out of memory. Tried to allocate 720.00 MiB (GPU 0; 15.90 GiB total capacity; 14.76 GiB already allocated; 367.88 MiB free; 40.60 MiB cached)

I want to use 3D convolutions and 3D images; any help appreciated.

jcreinhold commented 5 years ago

NIfTI files are assumed to be 3D, but you still need to specify the dimensionality of the convolutions in the network. If you wanted 3D convolutions, then setting -dm 3 was correct. You simply ran out of memory on your GPU, which is not a problem with the package. Try making the batch size and/or patch size smaller; you will need to experiment to see what works on your GPU.
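As a rough guide (a sketch reusing only the flags from this thread, other required arguments elided), shrinking the patch is usually the quickest fix, since halving each dimension of a 3D patch cuts the activation memory per patch by roughly a factor of eight:

```
# If -dm 3 -ps 64 64 64 overflows GPU memory, try, e.g.:
nn-train ... -dm 3 -ps 32 32 32
```

The batch size can be reduced analogously via its corresponding option or config key.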

mcbrs1a commented 5 years ago

Thanks, appreciate that. We had our IT personnel look at this, and we appear to have more than enough memory on the GPU; we are using 2 GPUs but still get the issue. I use the flag "multi-gpu" as a standalone option, i.e. -multi-gpu; adding "True" after it throws an error. Any ideas how to resolve the memory issue without using patches? I have previously run some tests with patches and get a patchwork-looking output when using nn-predict; that, however, was when operating in 2D. I can't get it to work in 3D.

mcbrs1a commented 5 years ago

Also, I am still having the patch size problem when attempting 3D.

You recommended using one value for an isotropic patch, so I have -id 3 -dm 3 -ps 50 and I get the error "not enough values to unpack (expected 2, got 1)", i.e. here:

File ".../niftidataset/transforms.py", line 184, in __call__
    hh, ww = self.output_size
ValueError: not enough values to unpack (expected 2, got 1)

From looking into transforms.py, it appears to be calling a 2D patch function. Shouldn't this be 3D, since the image I am feeding in is NIfTI and I have -id 3?

Is this a problem in the NIfTI header or with synthtorch? Any suggestions as to what I may need to alter in the header, if this is the root of the problem?

When I put three values in as a list, as you suggested, it doesn't recognize the patch size and states: invalid int value [50,50,50].

Any suggestions?

jcreinhold commented 5 years ago

You may have two GPUs with a large amount of memory, but you are using settings in synthtorch that cause memory overflows. You will need to figure out settings that work on your machine through experimentation.

The --multi-gpu option is a flag that does not require True or anything else after it if you use the command-line interface (CLI). That may be the cause of your error, but without the error report I cannot tell.

Use -ps 50 50 50 if you want a 50^3 patch from the CLI; there you need to specify all dimensions explicitly. With the config file, you can use 50 as a value on its own.
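Putting those two points together, a rough sketch (other required arguments elided):

```
# Correct: --multi-gpu is a bare flag, and all three patch
# dimensions are spelled out on the CLI:
nn-train ... -dm 3 -ps 50 50 50 --multi-gpu

# Incorrect: the stray "True" is likely what the argument
# parser is rejecting:
nn-train ... --multi-gpu True
```

In config.json, by contrast, "patch_size": 50 on its own would be expanded to 50x50x50 when dim is 3.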

Careful reading of the documentation would resolve most of the issues you are running into. Note that synthtorch is a research package that I mostly use for myself and put up because I thought it might be useful to other people.

If you run into a new, specific bug or see something in the documentation that is blatantly misleading or incorrect, then please open a new issue, filling in the provided templates as completely as possible. If you do end up submitting a new issue, I'll need you to follow my previous requests regarding posting the config file or the full CLI command, the version number, and as much other contextual information as possible.

Unfortunately, I won't be able to respond to this support thread anymore due to time constraints.

Best of luck.