hila-chefer / Transformer-MM-Explainability

[ICCV 2021 Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method to visualize any Transformer-based network, including examples for DETR and VQA.

ImportError: No module named lxmert.lxmert.src.tasks #23

Closed: CaffreyR closed this issue 2 years ago

CaffreyR commented 2 years ago

Hi @hila-chefer, when I tried to run the lxmert script, I encountered an import error (`ImportError: No module named lxmert.lxmert.src.tasks`). Thanks!

[screenshot of the ImportError traceback]
hila-chefer commented 2 years ago

Hi @CaffreyR, thanks for your interest!

I can’t reproduce the issue; the folder does exist in the repository. Did you clone the entire project to your local server?

It could also be an issue with your PYTHONPATH environment variable, though from the screenshot your invocation should have handled it.
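For anyone hitting the same error, here is a minimal diagnostic sketch (not from the thread; the clone path is an assumption) that checks the `lxmert/lxmert/src/tasks` folder exists in your checkout and puts the repository root on `sys.path` so that `lxmert.lxmert.src.tasks` imports can resolve:

```python
import os
import sys

# Assumed clone location of the repo; adjust to your own checkout.
REPO_ROOT = os.path.abspath("/path/to/Transformer-MM-Explainability")

# The subfolder implied by the failing import -- confirm it exists in your clone.
tasks_dir = os.path.join(REPO_ROOT, "lxmert", "lxmert", "src", "tasks")
print("tasks folder present:", os.path.isdir(tasks_dir))

# Make the repo root importable for the current process so that
# `lxmert.lxmert.src.tasks.*` imports can resolve.
if REPO_ROOT not in sys.path:
    sys.path.insert(0, REPO_ROOT)
```

Alternatively, running the script from the repository root, or exporting `PYTHONPATH` to point at it, should achieve the same effect.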

hila-chefer commented 2 years ago

@CaffreyR, closing due to inactivity. Please reopen if necessary.