osmr / imgclsmob

Sandbox for training deep learning networks
MIT License

demo_pt.py failed on ICNet #87

Closed: platvlad closed this issue 3 years ago

platvlad commented 3 years ago

Hi! I am trying to run the demo script examples/demo_pt.py on the ICNet model icnet_resnetd50b_cityscapes for PyTorch:

cd examples
python demo_pt.py --model icnet_resnetd50b_cityscapes --image test.jpg --num-gpus 1

I am getting an error:

File "/home/oem/repo/libs/ML/venv/lib/python3.8/site-packages/torch/nn/functional.py", line 3151, in interpolate
    return torch._C._nn.upsample_bilinear2d(input, output_size, align_corners, scale_factors)
TypeError: upsample_bilinear2d(): argument 'output_size' must be tuple of ints, but found element of type float at pos 1

As I see in the InterpolationBlock code, calc_out_size returns a tuple of floats, since its elements are multiplied by the float self.scale_factor, which is 0.5 for some ICNet blocks. This causes a tuple of floats to be passed to the torch interpolate function here. My PyTorch version is 1.7.1.
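
A minimal sketch of the same failure outside the model code (the 0.5 scale factor matches the ICNet blocks mentioned above; the int cast is only an illustration of a possible workaround, not necessarily how pytorchcv should fix it):

import torch
import torch.nn.functional as F

x = torch.rand(1, 3, 96, 96)
scale_factor = 0.5

# Multiplying integer spatial dims by a float scale factor gives floats.
out_size = (x.shape[2] * scale_factor, x.shape[3] * scale_factor)  # (48.0, 48.0)

# PyTorch 1.7.1 rejects a float output_size with exactly the reported TypeError:
# F.interpolate(x, size=out_size, mode="bilinear", align_corners=True)

# Casting the computed size to int avoids the error.
out_size = tuple(int(s) for s in out_size)
y = F.interpolate(x, size=out_size, mode="bilinear", align_corners=True)
print(y.shape)  # torch.Size([1, 3, 48, 48])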

Is this a bug in the PyTorch implementation of ICNet?

osmr commented 3 years ago

Nope.

osmr commented 3 years ago

You are trying to use a script designed to showcase classification networks with the standard ImageNet-1K pipeline on a segmentation model.
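
The mismatch is in what the two kinds of networks return; roughly (the 480x480 spatial size here is only for illustration, and 19 is the usual Cityscapes class count):

import torch

# An ImageNet-1K classifier, which demo_pt.py expects, returns one logit vector per image.
cls_logits = torch.rand(1, 1000)
label = cls_logits.argmax(dim=1)        # a single class index per image

# A Cityscapes segmentation net returns per-pixel class logits instead.
seg_logits = torch.rand(1, 19, 480, 480)
mask = seg_logits.argmax(dim=1)         # class index per pixel, shape (1, 480, 480)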

platvlad commented 3 years ago

@osmr Thank you for your reply.

I am modifying the script by deleting everything after the model is run on the image data (starting from this line).

So I just load, crop, and normalize the image and run the model on it. I'm not trying to parse the model's output as a classification result, but I still get this error.
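
Concretely, the preprocessing I keep is the standard ImageNet-style resize/crop/normalize; a rough sketch (the exact constants used by demo_pt.py may differ):

import torch
from PIL import Image
from torchvision import transforms

# Standard ImageNet-style preprocessing; demo_pt.py's exact values may differ.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
img = preprocess(Image.open("test.jpg").convert("RGB")).unsqueeze(0).cuda()
# Running the ICNet model on this 224x224 batch raises the same TypeError.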

In other words, I also have this error with the following code:


import torch
from pytorchcv.model_provider import get_model as ptcv_get_model

# Load the pretrained Cityscapes ICNet and feed it a random 224x224 batch.
net = ptcv_get_model('icnet_resnetd50b_cityscapes', pretrained=True).eval().cuda()
x = torch.rand((1, 3, 224, 224), dtype=torch.float).cuda()
net(x)

platvlad commented 3 years ago

@osmr Thank you! I have another error now, for which I opened a new issue #88.