sshaoshuai / MTR

MTR: Motion Transformer with Global Intention Localization and Local Movement Refinement, NeurIPS 2022.
Apache License 2.0

testing question #80

Open Sally0330 opened 9 months ago

Sally0330 commented 9 months ago

Hello. After installing requirements.txt and running setup.py, I get the following error when running train.py. Could you please take a look? Thank you.

Traceback (most recent call last):
  File "F:\MTR-master\tools\train.py", line 23, in <module>
    from mtr.models import model as model_utils
  File "F:\MTR-master\tools\..\mtr\models\model.py", line 15, in <module>
    from mtr.models.context_encoder import build_context_encoder
  File "F:\MTR-master\tools\..\mtr\models\context_encoder\__init__.py", line 7, in <module>
    from .mtr_encoder import MTREncoder
  File "F:\MTR-master\tools\..\mtr\models\context_encoder\mtr_encoder.py", line 12, in <module>
    from mtr.models.utils.transformer import transformer_encoder_layer, position_encoding_utils
  File "F:\MTR-master\tools\..\mtr\models\utils\transformer\transformer_encoder_layer.py", line 16, in <module>
    from .multi_head_attention_local import MultiheadAttentionLocal
  File "F:\MTR-master\tools\..\mtr\models\utils\transformer\multi_head_attention_local.py", line 21, in <module>
    from mtr.ops import attention
  File "F:\MTR-master\tools\..\mtr\ops\attention\__init__.py", line 6, in <module>
    import mtr.ops.attention.attention_utils_v2 as attention_utils_v2
  File "F:\MTR-master\tools\..\mtr\ops\attention\attention_utils_v2.py", line 9, in <module>
    from . import attention_cuda
ImportError: cannot import name 'attention_cuda' from partially initialized module 'mtr.ops.attention' (most likely due to a circular import) (F:\MTR-master\tools\..\mtr\ops\attention\__init__.py)

1832850085 commented 5 months ago

You need to run setup.py first (python setup.py develop); this builds the attention_cuda extension that the import above is failing to find.
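As a quick way to confirm the build worked, here is a minimal sketch, assuming the repo layout shown in the traceback and that the mtr package is importable after python setup.py develop:

# Sanity check after running `python setup.py develop`.
# If the CUDA op was compiled, this import succeeds; otherwise you get the
# same ImportError as above, because no attention_cuda binary exists yet
# inside mtr/ops/attention/.
from mtr.ops.attention import attention_cuda  # compiled extension module
print("attention_cuda loaded from:", attention_cuda.__file__)

If the import still fails after building, check the output of setup.py for compiler or CUDA errors rather than the Python traceback itself.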