torch / torch7

http://torch.ch

VolumetricCrossEntropyCriterion #973

Open arthitag opened 7 years ago

arthitag commented 7 years ago

Hi,

I am trying to use VolumetricCrossEntropyCriterion. My input to the loss layer is a tensor of size [batch_size x num_classes x T x H x W] and my target is of size [batch_size x T x H x W]; I have verified both shapes by printing them to the terminal. I have pasted the error message I get below. I also tried reshaping the target to the same size as the input, but that only caused training to stop with a segmentation fault and no stack trace.
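For reference, here is a minimal sketch (made-up sizes, not my real training code) of how I understand the criterion is meant to be called: the input holds per-class scores of size N x C x T x H x W and the target holds integer class labels of size N x T x H x W, which I believe should lie in the range [1, C].

```lua
-- Minimal sketch with made-up sizes (not my actual code):
-- input  : N x C x T x H x W per-class scores
-- target : N x T x H x W integer labels, assumed to be in [1, C]
require 'cunn'
require 'cudnn'

local N, C, T, H, W = 2, 4, 8, 16, 16

local crit   = cudnn.VolumetricCrossEntropyCriterion():cuda()
local input  = torch.CudaTensor(N, C, T, H, W):normal()
local target = torch.CudaTensor(N, T, H, W):random(1, C)  -- labels 1..C

local loss = crit:forward(input, target)
print('loss:', loss)
```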

I get the following error:

.../torch/extra/cunn/lib/THCUNN/SpatialClassNLLCriterion.cu:38: void cunn_SpatialClassNLLCriterion_updateOutput_kernel(T *, T *, T *, long *, T *, int, int, int, int, int) [with T = float, AccumT = float]: block: [0,0,0], thread: [735,0,0] Assertion t >= 0 && t < n_classes failed.
THCudaCheck FAIL file=/tmp/luarocks_cutorch-scm-1-107/cutorch/lib/THC/generic/THCStorage.c line=32 error=59 : device-side assert triggered
..../torch/install/bin/luajit: cuda runtime error (59) : device-side assert triggered at /tmp/luarocks_cutorch-scm-1-107/cutorch/lib/THC/generic/THCStorage.c:32
stack traceback:
  [C]: at 0x7f1337236190
  [C]: in function '__index'
  ...ch/install/share/lua/5.1/nn/SpatialClassNLLCriterion.lua:51: in function 'updateOutput'
  ...all/share/lua/5.1/cudnn/SpatialCrossEntropyCriterion.lua:37: in function 'updateOutput'
  .../share/lua/5.1/cudnn/VolumetricCrossEntropyCriterion.lua:46: in function 'forward'
  train.lua:65: in function 'opfunc'
  /scratch0/arthita/torch/install/share/lua/5.1/optim/sgd.lua:44: in function 'sgd'
  train.lua:74: in function 'train'
  train.lua:82: in main chunk
  [C]: in function 'dofile'
  run.lua:98: in main chunk
  [C]: in function 'dofile'
  ...hita/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:150: in main chunk
  [C]: at 0x00406580
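In case it is relevant, my understanding (an assumption on my part, not a confirmed fix) is that the assert t >= 0 && t < n_classes fires when some target values fall outside [1, num_classes], for example 0-indexed labels or a stray void/ignore label. This is the sanity check I plan to run on the CPU before the forward pass; 'target' and 'num_classes' below are placeholders for my actual variables.

```lua
-- Rough sanity check on the labels before calling forward.
-- 'target' and 'num_classes' are placeholders for my actual variables.
-- Assumption: the device-side assert means a label is outside [1, num_classes].
local t = target:double()
print('target min/max:', t:min(), t:max())
assert(t:min() >= 1 and t:max() <= num_classes,
       'target labels must lie in [1, num_classes]')
```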

Can someone please help?

Thank you

1LOVESJohnny commented 6 years ago

Hi, have you solved this problem? If so, please share the solution. Thanks a lot!