Hello!
I am trying to run the model on my own images, which have dimensions of 428x760. It seems that when images are not the original sample image size (480x640), there is an issue with tensor sizes. I have a feeling this might be caused by upsampling or encoding/decoding in the network. Any suggestions on how I can work around this or correct it in the code?
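For reference, here is the kind of workaround I was considering: padding the input so its height and width are multiples of the network's total downsampling factor, then cropping the output back. This is just a sketch, assuming a PyTorch model and a downsampling factor of 32 (common for encoder-decoder architectures); the `model` call and the factor of 32 are my assumptions, not taken from this repo.

```python
import torch
import torch.nn.functional as F

def pad_to_multiple(img, multiple=32):
    """Pad a (N, C, H, W) tensor so H and W are divisible by `multiple`."""
    _, _, h, w = img.shape
    pad_h = (multiple - h % multiple) % multiple
    pad_w = (multiple - w % multiple) % multiple
    # Pad on the right/bottom; reflect padding avoids hard black borders.
    img = F.pad(img, (0, pad_w, 0, pad_h), mode="reflect")
    return img, (h, w)

# Hypothetical usage -- `model` stands in for whatever network is being run.
img = torch.randn(1, 3, 428, 760)          # my image size
padded, (h, w) = pad_to_multiple(img, 32)  # 428x760 -> 448x768
out = model(padded)                        # encoder/decoder shapes now line up
out = out[..., :h, :w]                     # crop back to the original 428x760
```

Would something like this be safe here, or is there a better place to handle it inside the network itself?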
Thanks!