Closed: Jeavy closed this issue 6 years ago
The assertion checks that your `-multigpu_strategy` matches the number of GPUs: splitting the network at k layers yields k + 1 chunks, and the number of chunks must equal the number of GPUs. I don't have multiple GPUs in a single machine myself, but if I understand this correctly, for 2 GPUs your strategy should contain a single split point.
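To make the splits-vs-GPUs relationship concrete, here is a minimal Python sketch of the check described above. The function name and the comma-separated strategy format are illustrative assumptions, not the actual `neural_style.lua` code:

```python
# Sketch (not the real Lua source) of the kind of check behind the
# "neural_style.lua:393: assertion failed!" error: splitting a network
# at k layer indices produces k + 1 chunks, one chunk per GPU, so a
# strategy for N GPUs must list exactly N - 1 split points.
def check_multigpu_strategy(gpus, strategy):
    splits = [s for s in strategy.split(",") if s] if strategy else []
    assert len(splits) == len(gpus) - 1, "assertion failed!"
    return len(splits) + 1  # number of chunks == number of GPUs

# For 2 GPUs, a single split point passes the check:
print(check_multigpu_strategy([0, 1], "3"))  # -> 2
```

An empty strategy (zero splits) with 2 GPUs produces one chunk for two devices, which is exactly the mismatch the assertion rejects.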
You were right, @htoyryla, I just had to put a single parameter there. Thanks a lot.
Hi, first of all, thank you for this nice work. I have a server with 2 GPUs, and when I try to run on both of them I get this error:
```
... torch/install/bin/luajit: neural_style.lua:393: assertion failed!
stack traceback:
  [C]: in function 'assert'
  neural_style.lua:393: in function 'setup_multi_gpu'
  neural_style.lua:151: in function 'main'
  neural_style.lua:601: in main chunk
  [C]: in function 'dofile'
  ...rmar/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:150: in main chunk
  [C]: at 0x004064f0
```
I tried different parameters for `-multigpu_strategy`, and also tried leaving it empty, but it doesn't work. Any help would be greatly appreciated.
Thanks.