hila-chefer / Transformer-MM-Explainability

[ICCV 2021 Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method for visualizing any Transformer-based network. Includes examples for DETR and VQA.
MIT License

ssl error #38

Open fzb408 opened 8 months ago

fzb408 commented 8 months ago

self.frcnn_cfg = Config.from_pretrained("unc-nlp/frcnn-vg-finetuned") raised an SSL error while the model config was being downloaded. How can I solve this? Looking forward to your answer, thanks!
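A common cause of SSL errors during `from_pretrained` downloads is an HTTP stack that cannot find a valid CA bundle (often behind a corporate proxy). A minimal sketch of one frequently suggested workaround, assuming the environment's `requests`/`curl` libraries honor the standard `REQUESTS_CA_BUNDLE`/`CURL_CA_BUNDLE` variables and that `certifi` is installed (it ships with `requests`); whether this resolves a specific proxy setup is not guaranteed:

```python
# Hypothetical workaround sketch: point the HTTP stack at certifi's CA
# bundle before retrying the failing download. These env vars are read
# by requests and curl respectively; set them before the download call.
import os
import certifi

ca_bundle = certifi.where()  # path to certifi's cacert.pem
os.environ["REQUESTS_CA_BUNDLE"] = ca_bundle
os.environ["CURL_CA_BUNDLE"] = ca_bundle

# Then retry the call that failed, e.g.:
# frcnn_cfg = Config.from_pretrained("unc-nlp/frcnn-vg-finetuned")
```

If the network blocks the Hugging Face Hub entirely, another option is to download the model files on another machine and pass a local directory path to `from_pretrained` instead of the model id.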