Islanna / DynamicReLU

Implementation of Dynamic ReLU in PyTorch

conv3d #10

Open walynlee opened 3 years ago

walynlee commented 3 years ago

Hello, can I ask how to make this work with conv_type='3d'? I mean the line 'theta = torch.mean(theta, axis=-1)', which is only applied when conv_type == '2d'. Does anybody know? Thanks a lot.
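
A minimal sketch of what a '3d' branch might look like, assuming the quoted line belongs to the step that pools the input's spatial dimensions down to a per-channel vector before the FC layers (for '1d' input (B, C, L) one mean, for '2d' input (B, C, H, W) two means). The function name `pool_theta` and the '3d' layout (B, C, D, H, W) are my assumptions, not the repo's API:

```python
import torch

# Sketch only (not part of the repo): extending the spatial pooling of the
# coefficient branch from '2d' to a hypothetical '3d' mode.
# Assumed layouts: '1d' -> (B, C, L), '2d' -> (B, C, H, W), '3d' -> (B, C, D, H, W).
def pool_theta(x, conv_type):
    theta = torch.mean(x, dim=-1)           # average over the last spatial dim
    if conv_type == '2d':
        theta = torch.mean(theta, dim=-1)   # average over H as well -> (B, C)
    elif conv_type == '3d':
        theta = torch.mean(theta, dim=-1)   # average over H
        theta = torch.mean(theta, dim=-1)   # average over D -> (B, C)
    return theta

x3d = torch.randn(2, 16, 8, 32, 32)          # (B, C, D, H, W)
print(pool_theta(x3d, '3d').shape)           # torch.Size([2, 16])
```

The three means could equivalently be written as `x.mean(dim=(2, 3, 4))`. Note that any per-location parts of the forward pass (e.g. broadcasting the coefficients back over the spatial dims) would presumably also need a matching 3D branch.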

walynlee commented 3 years ago

When I put DyReLUA in my model, I get this error:

```
Traceback (most recent call last):
  File ".\main.py", line 119, in <module>
    main()
  File ".\main.py", line 111, in main
    train_funcation = train_eval
  File "D:\walyn\CTPelvic1K_project\utils\util.py", line 156, in train
    loss.backward()
  File "D:\Anaconda\envs\rtx3090\lib\site-packages\torch\tensor.py", line 221, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "D:\Anaconda\envs\rtx3090\lib\site-packages\torch\autograd\__init__.py", line 132, in backward
    allow_unreachable=True)  # allow_unreachable flag
RuntimeError: ones needs to be contiguous
```

I don't know how to fix it. Would somebody know, please?
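
One workaround sometimes used for contiguity errors in backward is to call `.contiguous()` on tensors produced by `permute` before they feed into later ops, since `permute` returns a non-contiguous view. Whether this resolves this particular error is an assumption; the shapes and the permute/max pattern below are only illustrative, not the repo's exact forward code:

```python
import torch

# Hypothetical sketch of the .contiguous() workaround; shapes are made up.
x = torch.randn(2, 16, 32, 32, requires_grad=True)   # (B, C, H, W)
relu_coefs = torch.randn(2, 16, 4)                   # illustrative coefficients

x_perm = x.permute(2, 3, 0, 1).contiguous()          # force a contiguous copy of the view
output = x_perm.unsqueeze(-1) * relu_coefs[:, :, :2] + relu_coefs[:, :, 2:]
result = torch.max(output, dim=-1)[0].permute(2, 3, 0, 1)
result.sum().backward()                              # backward runs on the contiguous copy
```

If that doesn't help, it may be worth checking whether the error persists on a newer PyTorch release.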