Closed yingmuzhi closed 1 year ago
Hi, thanks for your interest. The attention module is activated by default. You can deactivate it by providing the --no_attention flag, which removes the attention module from the neural network and the attention maps from the loss function (this is explained in the README). The best results were observed with the attention module activated.
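For readers unfamiliar with how such a flag is typically wired, here is a minimal sketch using Python's argparse. This is an illustration only: the actual VS_Seg entry point, argument names, and model-building code may differ; only the `--no_attention` flag name comes from the reply above.

```python
import argparse

def build_parser():
    # Hypothetical training-option parser; real VS_Seg arguments may differ.
    parser = argparse.ArgumentParser(description="Training options (sketch)")
    parser.add_argument(
        "--no_attention",
        action="store_true",
        help="Disable the attention module and the attention-map loss term",
    )
    return parser

# Example invocation with the flag provided.
args = build_parser().parse_args(["--no_attention"])

# Downstream code would branch on this when constructing the network and loss.
use_attention = not args.no_attention
print(use_attention)
```

With `--no_attention` passed, `use_attention` is `False`, so the network would be built without the attention branch and the loss without the attention-map term; omitting the flag leaves attention enabled by default, matching the behavior described above.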
Hi there. I appreciate the paper
Wang, G. et al. Automatic Segmentation of Vestibular Schwannoma from T2-Weighted MRI by Deep Spatial Attention with Hardness-Weighted Loss, MICCAI, pp 264-272, 2019.
and I am interested in Unet2d5_spvPA. But I found that this project, VS_Seg, seems to be missing the attention module. Why is that? Is there any improvement from disabling this module? I would also like to see the network with the attention module enabled. Thanks :)