isl-org / MiDaS

Code for robust monocular depth estimation described in "Ranftl et al., Towards Robust Monocular Depth Estimation: Mixing Datasets for Zero-shot Cross-dataset Transfer, TPAMI 2022"
MIT License

AttributeError – revert to timm 0.6.13 #215

Closed · WorldofDepth closed 1 year ago

WorldofDepth commented 1 year ago

I am using MiDaS v3.1 via Google Colab, and it worked fine a few days ago, but now gives the following error in response to "!python run.py --model_type dpt_beit_large_512 --input_path input --output_path output":

Start processing
  Processing input/8-s.png (1/9)
    Input resized to 512x608 before entering the encoder
Traceback (most recent call last):
  File "/content/MiDaS/run.py", line 276, in <module>
    run(args.input_path, args.output_path, args.model_weights, args.model_type, args.optimize, args.side, args.height,
  File "/content/MiDaS/run.py", line 154, in run
    prediction = process(device, model, model_type, image, (net_w, net_h), original_image_rgb.shape[1::-1],
  File "/content/MiDaS/run.py", line 61, in process
    prediction = model.forward(sample)
  File "/content/MiDaS/midas/dpt_depth.py", line 166, in forward
    return super().forward(x).squeeze(dim=1)
  File "/content/MiDaS/midas/dpt_depth.py", line 114, in forward
    layers = self.forward_transformer(self.pretrained, x)
  File "/content/MiDaS/midas/backbones/beit.py", line 15, in forward_beit
    return forward_adapted_unflatten(pretrained, x, "forward_features")
  File "/content/MiDaS/midas/backbones/utils.py", line 86, in forward_adapted_unflatten
    exec(f"glob = pretrained.model.{function_name}(x)")
  File "<string>", line 1, in <module>
  File "/content/MiDaS/midas/backbones/beit.py", line 125, in beit_forward_features
    x = blk(x, resolution, shared_rel_pos_bias=rel_pos_bias)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/content/MiDaS/midas/backbones/beit.py", line 102, in block_forward
    x = x + self.drop_path(self.gamma_1 * self.attn(self.norm1(x), resolution,
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1614, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'Block' object has no attribute 'drop_path'. Did you mean: 'drop_path1'?
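
The error message itself hints at drop_path1, so I suspect the timm version that Colab now installs has changed. A quick way to check which timm release is in the runtime (just the standard version attribute, nothing MiDaS-specific):

    import timm
    print(timm.__version__)  # prints the installed timm release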

Does anyone know how this can be fixed? Thank you.

Jukka-Sun commented 1 year ago

Did you find a solution?

WorldofDepth commented 1 year ago

No—I wish! Any idea?

I tried reverting Python 3.10.11 to 3.8.10, but it seems the torch module is missing in the latter.

Gasuhu commented 1 year ago

You just need to use an older version of timm. This version worked for me: !pip install timm==0.6.13
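
For a Colab cell that matches the command from the original post, the sequence is roughly (just a sketch; the paths and model type are taken from the original post):

    !pip install timm==0.6.13   # pin timm; newer releases appear to rename Block.drop_path (see the error above)
    !python run.py --model_type dpt_beit_large_512 --input_path input --output_path output

If you would rather stay on a newer timm, midas/backbones/beit.py would need to be adapted to the renamed attributes (the traceback points at drop_path1), but pinning the version is the quickest fix.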