wzmsltw / BSN-boundary-sensitive-network.pytorch

Code for our paper: "BSN: Boundary Sensitive Network for Temporal Action Proposal Generation"

Error: Sizes of tensors must match except in dimension 0. Got 1000 and 723 in dimension 1 #4

Closed · LvJC closed this issue 5 years ago

LvJC commented 5 years ago

It works well until I run this step:

python main.py --module PEM --mode inference

Here's the output:

PEM inference start
validation subset video numbers: 4728
Traceback (most recent call last):
  File "main.py", line 298, in <module>
    main(opt)
  File "main.py", line 276, in main
    BSN_inference_PEM(opt)
  File "main.py", line 221, in BSN_inference_PEM
    for idx,(video_feature,video_xmin,video_xmax,video_xmin_score,video_xmax_score) in enumerate(test_loader):
  File "/lvjc/envs/anaconda2/lib/python2.7/site-packages/torch/utils/data/dataloader.py", line 637, in __next__
    return self._process_next_batch(batch)
  File "/lvjc/envs/anaconda2/lib/python2.7/site-packages/torch/utils/data/dataloader.py", line 658, in _process_next_batch
    raise batch.exc_type(batch.exc_msg)
RuntimeError: Traceback (most recent call last):
  File "/lvjc/envs/anaconda2/lib/python2.7/site-packages/torch/utils/data/dataloader.py", line 138, in _worker_loop
    samples = collate_fn([dataset[i] for i in batch_indices])
  File "/lvjc/envs/anaconda2/lib/python2.7/site-packages/torch/utils/data/dataloader.py", line 232, in default_collate
    return [default_collate(samples) for samples in transposed]
  File "/lvjc/envs/anaconda2/lib/python2.7/site-packages/torch/utils/data/dataloader.py", line 209, in default_collate
    return torch.stack(batch, 0, out=out)
RuntimeError: invalid argument 0: Sizes of tensors must match except in dimension 0. Got 1000 and 723 in dimension 1
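
From the trace, the failure happens inside PyTorch's default_collate: it stacks the per-video feature tensors of a batch with torch.stack, so every video in the batch has to yield the same number of proposals. A minimal sketch that reproduces the same kind of error (the feature dimension of 32 is only an assumed example, not necessarily the value used in this repo):

```python
import torch

# Two videos collated into one batch: the first yields 1000 candidate
# proposals, the second only 723. default_collate stacks them along a new
# batch dimension, which requires all remaining dimensions to match.
a = torch.zeros(1000, 32)  # (num_proposals, feature_dim) -- shapes assumed
b = torch.zeros(723, 32)
batch = torch.stack([a, b], dim=0)  # RuntimeError: sizes of tensors must match
```
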
wzmsltw commented 5 years ago

Since I have no device for experiments at the moment, I cannot re-run the code. Have you changed any configuration in opts.py?

wzmsltw commented 5 years ago

Hi @LvJC, please follow the command in bsn.sh:

python main.py --module PEM --mode inference --pem_batch_size 1

LvJC commented 5 years ago

> Hi @LvJC, please follow the command in bsn.sh: python main.py --module PEM --mode inference --pem_batch_size 1

Yep, I figured it out. There is another problem: pem_top_K_inference is set to 1000, but many PEM validation videos have fewer than 1000 proposal features, so changing it to 100 makes everything work.
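
A rough sketch of the clamp I have in mind (the variable and option names below are illustrative, not the repo's actual code): never take more proposals per video than the video actually provides.

```python
# Illustrative only: cap the number of proposals read per video, so a video
# with fewer than pem_top_K_inference proposals does not silently produce a
# shorter feature tensor than expected.
k = min(opt["pem_top_K_inference"], video_feature.shape[0])
video_feature = video_feature[:k]
video_xmin = video_xmin[:k]
video_xmax = video_xmax[:k]
```

Even with this cap, different videos still end up with different numbers of proposals, so --pem_batch_size 1 is still needed unless the batch is padded to a common length.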

dreamedrainbow commented 4 years ago

> Hi @LvJC, please follow the command in bsn.sh: python main.py --module PEM --mode inference --pem_batch_size 1

> Yep, I figured it out. There is another problem: pem_top_K_inference is set to 1000, but many PEM validation videos have fewer than 1000 proposal features, so changing it to 100 makes everything work.

Hi, I ran into the same error when reproducing the source code, and I also noticed the proposal lengths you mention. But how can you compute AR@1000 when the top-K number of proposals is only 100? Any suggestions would be appreciated.