microsoft / DynamicHead

MIT License

Defining Spatial-aware Attention Layer error #13

Closed Aliweka2020 closed 2 years ago

Aliweka2020 commented 2 years ago

When running this line of code:

```python
spatial_output = spatial_layer(scale_output)
```

I got this error:

```
~\AppData\Local\Continuum\anaconda3\lib\site-packages\torch\nn\modules\module.py in _call_impl(self, *input, **kwargs)
    725             result = self._slow_forward(*input, **kwargs)
    726         else:
--> 727             result = self.forward(*input, **kwargs)
    728         for hook in itertools.chain(
    729                 _global_forward_hooks.values(),

TypeError: forward() takes 3 positional arguments but 4 were given
```
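For context, this kind of `TypeError` arises when a module is invoked with more positional arguments than its `forward()` accepts; `nn.Module.__call__` forwards all arguments on to `forward()`. A minimal torch-free sketch (the `SpatialLayer` class here is a hypothetical stand-in, not the actual DynamicHead layer):

```python
# Minimal sketch of the failure mode: the wrapper's __call__ passes every
# argument through to forward(), so an extra argument at the call site
# surfaces as "forward() takes N positional arguments but N+1 were given".
class SpatialLayer:  # hypothetical stand-in for the spatial-aware attention layer
    def forward(self, x, offset):
        # self + x + offset = 3 positional arguments
        return x

    def __call__(self, *args, **kwargs):
        # mimics nn.Module._call_impl dispatching to forward()
        return self.forward(*args, **kwargs)


layer = SpatialLayer()
layer("feat", "off")  # OK: 3 positional args including self

try:
    layer("feat", "off", "mask")  # 4 positional args -> TypeError
except TypeError as e:
    print(e)
```

In newer torch/torchvision releases the relevant layer signatures accept the extra argument, which is consistent with the version upgrade mentioned below resolving the issue.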

vdavid70619 commented 2 years ago

Thanks for your interest! The code has been released now; please check it out.

Aliweka2020 commented 2 years ago

Upgrading the torch version from 1.7.0 to 1.10.0 solved the problem.