detectRecog / CCPD

[ECCV 2018] CCPD: a diverse and well-annotated dataset for license plate detection and recognition
MIT License

train error #107

Open Zoinye opened 6 months ago

Zoinye commented 6 months ago

hey! I have two problems, and my torch version is 1.12. The error seemingly comes from `fps_pred, y_pred = model(x)`.

  1. The first one is a division by zero. I tried using `loss.item()` to replace `loss.data[0]`, but it did not work. The whole error is:

```
Traceback (most recent call last):
  File "D:\project\python2\CCPD-master\rpnet\rpnet.py", line 422, in <module>
    model_conv = train_model(model_conv, criterion, optimizer_conv, num_epochs=epochs)
  File "D:\project\python2\CCPD-master\rpnet\rpnet.py", line 412, in train_model
    print('%s %s %s\n' % (epoch, sum(lossAver) / (len(lossAver)), time() - start))
ZeroDivisionError: division by zero
```
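
The division fails because `lossAver` is still empty when the epoch summary is printed at rpnet.py line 412, i.e. no batch ever appended a loss (switching to `loss.item()` is the right replacement for `loss.data[0]` on recent PyTorch, but it only matters once batches actually run). A minimal guard, sketched here as a hypothetical `print_epoch_summary` helper rather than the repo's actual code, avoids the crash and points at the real problem:

```python
from time import time

def print_epoch_summary(epoch, lossAver, start):
    # Hypothetical helper sketching a guard for the print at rpnet.py:412,
    # which divides by len(lossAver) and crashes when the list is empty.
    if lossAver:
        print('%s %s %s\n' % (epoch, sum(lossAver) / len(lossAver), time() - start))
    else:
        print('epoch %s: no batch produced a loss -- check that the dataloader found images' % epoch)
```

If the list stays empty, the dataloader most likely yielded nothing, for example because the training image path is wrong or the filenames do not follow the CCPD annotation format, which is worth checking before the eval error below.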

  2. The other is `RuntimeError: adaptive_max_pool2d(): Expected input to have non-zero size for non-batch dimensions, but input has sizes [1, 64, 0, 0] with dimension 2 being empty`. The two spatial dimensions are zero and I don't know how to solve it.

The whole error is:

```
Traceback (most recent call last):
  File "D:\project\python2\CCPD-master\rpnet\rpnet.py", line 422, in <module>
    model_conv = train_model(model_conv, criterion, optimizer_conv, num_epochs=epochs)
  File "D:\project\python2\CCPD-master\rpnet\rpnet.py", line 414, in train_model
    count, correct, error, precision, avgTime = eval(model, testDirs)
  File "D:\project\python2\CCPD-master\rpnet\rpnet.py", line 334, in eval
    fps_pred, y_pred = model(x)
  File "E:\app\anaconda3\envs\pytorch1\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "E:\app\anaconda3\envs\pytorch1\lib\site-packages\torch\nn\parallel\data_parallel.py", line 166, in forward
    return self.module(*inputs[0], **kwargs[0])
  File "E:\app\anaconda3\envs\pytorch1\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "D:\project\python2\CCPD-master\rpnet\rpnet.py", line 267, in forward
    roi1 = roi_pooling_ims(_x1, boxNew.mm(p1), size=(16, 8))
  File "D:\project\python2\CCPD-master\rpnet\roi_pooling.py", line 74, in roi_pooling_ims
    output.append(F.adaptive_max_pool2d(im, size))
  File "E:\app\anaconda3\envs\pytorch1\lib\site-packages\torch\_jit_internal.py", line 423, in fn
    return if_false(*args, **kwargs)
  File "E:\app\anaconda3\envs\pytorch1\lib\site-packages\torch\nn\functional.py", line 1129, in _adaptive_max_pool2d
    return adaptive_max_pool2d_with_indices(input, output_size)[0]
  File "E:\app\anaconda3\envs\pytorch1\lib\site-packages\torch\nn\functional.py", line 1121, in adaptive_max_pool2d_with_indices
    return torch._C._nn.adaptive_max_pool2d(input, output_size)
RuntimeError: adaptive_max_pool2d(): Expected input to have non-zero size for non-batch dimensions, but input has sizes [1, 64, 0, 0] with dimension 2 being empty
```
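
The `[1, 64, 0, 0]` input means the region cropped inside `roi_pooling_ims` has zero height and width, i.e. the box produced by `boxNew.mm(p1)` is degenerate on the feature map; this typically happens during eval when the box-regression output falls outside the image. A defensive clamp before pooling, sketched below as a hypothetical `safe_roi_pool` helper (the actual slicing in `roi_pooling.py` may look different), forces every crop to span at least one pixel so `adaptive_max_pool2d` never receives an empty tensor:

```python
import torch.nn.functional as F

def safe_roi_pool(feature_map, roi, size=(16, 8)):
    # Hypothetical helper: clamp an [x1, y1, x2, y2] box to the feature map
    # and force it to span at least one pixel before adaptive max pooling.
    _, _, h, w = feature_map.shape      # feature_map is (1, C, H, W)
    x1, y1, x2, y2 = roi.long().tolist()
    x1 = max(0, min(x1, w - 1))
    y1 = max(0, min(y1, h - 1))
    x2 = max(x1 + 1, min(x2, w))        # width  >= 1
    y2 = max(y1 + 1, min(y2, h))        # height >= 1
    crop = feature_map[..., y1:y2, x1:x2]
    return F.adaptive_max_pool2d(crop, size)
```

Clamping only keeps eval from crashing; if every predicted box is degenerate, it is worth checking why the localization branch outputs invalid coordinates (for example, the pretrained detection module weights were not loaded before training rpnet).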

@detectRecog @shizhenbo