Open Willy-NotSpe opened 2 years ago
Hi Rojin,
I didn't set up data_path.py, because I'm not training the model on my own data; I'm just using the pre-trained model. Could that cause this problem? Thanks for your reply.
best regards
Rojin @.***> wrote on Friday, March 25, 2022, at 5:56 PM:
Hi,
did you set up data_path.py?
best
— Reply to this email directly, or view it on GitHub: https://github.com/PeterL1n/BackgroundMattingV2/issues/172#issuecomment-1078844911
No, you are right, you don't need to set up data_path.py. Can you upload your input image and the related background?
Sorry, I can't provide the data generated from the video, because the video was obtained from a private organization.
Do you use inference_images?
Yes. What I don't understand is why, since the two model types go through the same initial steps, "mattingbase" works but the other one does not.
Hi, I've got the same problem here. Any solutions yet?
I got the following error message after feeding in the src and bgr images. Please help me, thanks.
Traceback (most recent call last):
  File "./Matting/inference_images.py", line 131, in <module>
    pha, fgr, _, _, err, ref = model(src, bgr)
  File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/app/Matting/model/model.py", line 189, in forward
    pha, fgr, ref_sm = self.refiner(src, bgr, pha_sm, fgr_sm, err_sm, hid_sm)
  File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/app/Matting/model/refiner.py", line 95, in forward
    ref = self.select_refinement_regions(err)
  File "/app/Matting/model/refiner.py", line 175, in select_refinement_regions
    idx = err.topk(self.sample_pixels // 16, dim=1, sorted=False).indices
RuntimeError: invalid argument 5: k not in range for dimension at /pytorch/aten/src/THC/generic/THCTensorTopK.cu:26
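For what it's worth, this `topk` error fires when `k` is larger than the size of the dimension being sampled. In `select_refinement_regions`, the coarse error map is flattened and the top `sample_pixels // 16` pixels are selected, so if the input image is small relative to the refinement sample count, the flattened map has fewer pixels than `k` and `topk` raises exactly this `RuntimeError`. A minimal sketch reproducing the situation (the tensor sizes here are made up for illustration, and the `min()` clamp is a workaround I'm suggesting, not code from the repo):

```python
import torch

# Coarse error map flattened to (batch, H*W), as refiner.py does.
# Hypothetical small input: only 2000 coarse pixels are available.
err = torch.rand(1, 2000)

sample_pixels = 80_000      # a typical refinement sample count
k = sample_pixels // 16     # 5000 > 2000 -> would raise
                            # "RuntimeError: ... k not in range for dimension"

# Workaround: clamp k to the number of available pixels before topk.
k = min(k, err.shape[1])
idx = err.topk(k, dim=1, sorted=False).indices
print(idx.shape)            # torch.Size([1, 2000])
```

So if you hit this with small images, try passing a smaller refinement sample count on the command line (the inference scripts expose it as something like `--model-refine-sample-pixels`; check `inference_images.py --help` for the exact flag name), or use a larger input resolution.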