bwconrad / flexivit

PyTorch reimplementation of FlexiViT: One Model for All Patch Sizes
MIT License

AttributeError: 'FlexiVisionTransformer' object has no attribute '_backward_hooks' #1

Closed. striver123456 closed this issue 1 year ago.

striver123456 commented 1 year ago

```
Traceback (most recent call last):
  File "xx.py", line 7, in <module>
    preds = net(img)
  File "/home/dell/anaconda3/envs/fvit/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1192, in _call_impl
    if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
  File "/home/dell/anaconda3/envs/fvit/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1270, in __getattr__
    type(self).__name__, name))
AttributeError: 'FlexiVisionTransformer' object has no attribute '_backward_hooks'
```

striver123456 commented 1 year ago

How can I solve this? Thanks!

bwconrad commented 1 year ago

I am not able to replicate this error on my end. Could you give me the script that is causing this error?
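
For reference, a minimal script along these lines should exercise the same call path as your traceback (the model variant, import path, and input size here are placeholder assumptions, not taken from your report):

```python
import torch

# Placeholder repro sketch: assumes flexivit_tiny is importable from the
# installed flexivit_pytorch package and that a 240x240 input is valid.
from flexivit_pytorch import flexivit_tiny

net = flexivit_tiny()
img = torch.randn(1, 3, 240, 240)  # dummy image batch
preds = net(img)
print(preds.shape)
```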

striver123456 commented 1 year ago

Thank you for your reply. My initial error was as follows:

```
Traceback (most recent call last):
  File "xx.py", line 4, in <module>
    net = flexivit_tiny()
  File "/media/dell/A0BA0D9FBA0D72D8/fvit/flexivit_pytorch/flexivit.py", line 168, in flexivit_tiny
    return FlexiVisionTransformer(embed_dim=192, depth=12, num_heads=3, **kwargs)
  File "/media/dell/A0BA0D9FBA0D72D8/fvit/flexivit_pytorch/flexivit.py", line 125, in __init__
    block_fn,  # type:ignore
TypeError: __init__() takes from 1 to 24 positional arguments but 25 were given
```

There may be an error in the arguments passed to super().__init__() in flexivit.py.
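
A quick way to inspect the mismatch (assuming FlexiVisionTransformer subclasses timm's VisionTransformer, which the super().__init__() call in flexivit.py suggests) is to print how many parameters the installed timm accepts:

```python
# Diagnostic sketch: count the parameters of timm's VisionTransformer.__init__
# in the currently installed timm. If it is fewer than the positional arguments
# flexivit.py forwards, the TypeError above is expected.
import inspect

import timm
from timm.models.vision_transformer import VisionTransformer

print(timm.__version__)
print(len(inspect.signature(VisionTransformer.__init__).parameters))
```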

bwconrad commented 1 year ago

What version of the timm library are you using? It should be at least 0.8.15.dev0. Having an older version will cause the positional arguments error.
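
A quick way to check the installed version, with an upgrade command as a comment (the pre-release flag below is an assumption based on the .dev0 version string above):

```python
# Verify the installed timm version before constructing the model.
# If it is older than 0.8.15.dev0, upgrade with something like:
#   pip install --pre --upgrade timm
import timm

print(timm.__version__)  # should be >= 0.8.15.dev0
```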