Open jhaggle opened 11 months ago
@jhaggle Hey, did you solve it?
Hi all, check the input shape the SAM model expects for its masks and images. That should solve your problem; I ran into the same issue and kept the size at 256, because the SAM mask decoder takes and outputs masks of size (256, 256). Correct me if I am wrong.
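If that is the cause, one workaround (a minimal sketch, not from the notebook itself) is to downsample the 512x512 ground-truth patches to 256x256 before computing the loss, so they match the decoder's low-res mask output. The tensors below are stand-ins for the real batch:

```python
import torch
import torch.nn.functional as F

# Stand-in for SAM's low-res mask output: (batch, 1, 256, 256)
pred_masks = torch.rand(2, 1, 256, 256)

# Stand-in for 512x512 ground-truth patches: (batch, 1, 512, 512)
gt_masks = torch.randint(0, 2, (2, 1, 512, 512)).float()

# Resize ground truth to the prediction's spatial size.
# mode="nearest" keeps binary masks binary (no interpolated gray values).
gt_resized = F.interpolate(gt_masks, size=pred_masks.shape[-2:], mode="nearest")

assert gt_resized.shape == pred_masks.shape  # now (2, 1, 256, 256)
```

The alternative direction, upsampling the predictions to 512x512 instead, also removes the shape mismatch, but resizing the ground truth down is cheaper and matches the resolution the decoder was trained at.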
When I change patch_size from 256 to 512 and the step size from 256 to 512, I get this error:
"Error: AssertionError: ground truth has different shape (torch.Size([2, 1, 512, 512])) from input (torch.Size([2, 1, 256, 256]))"
In this notebook:
https://github.com/bnsreenu/python_for_microscopists/blob/master/331_fine_tune_SAM_mito.ipynb
Is 256 hardcoded somewhere, or why does this happen?