DaveTJones opened this issue 10 months ago (status: Open)
Hi there 👋 You can create a custom OnnxConfig in optimum to expose the attention matrices as additional ONNX outputs, as sketched below. See here for more information: https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models
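Something along these lines should work. This is a rough sketch adapted from the Whisper example in that guide; the `MarianOnnxConfig` base class, the `text2text-generation` task name, the placeholder checkpoint id, and the dynamic-axis names are assumptions you may need to adjust for your setup:

```python
# Sketch: export a MarianMT checkpoint to ONNX with attention tensors as extra outputs,
# adapted from the custom-export example in the optimum documentation.
from typing import Dict

from transformers import AutoConfig

from optimum.exporters.onnx import main_export
from optimum.exporters.onnx.base import ConfigBehavior
from optimum.exporters.onnx.model_configs import MarianOnnxConfig


class MarianOnnxConfigWithAttentions(MarianOnnxConfig):
    @property
    def outputs(self) -> Dict[str, Dict[int, str]]:
        common_outputs = super().outputs
        # Add one attention tensor per layer as an extra ONNX output,
        # mirroring the Whisper example from the usage guide.
        if self._behavior is ConfigBehavior.ENCODER:
            for i in range(self._config.encoder_layers):
                common_outputs[f"encoder_attentions.{i}"] = {0: "batch_size"}
        elif self._behavior is ConfigBehavior.DECODER:
            for i in range(self._config.decoder_layers):
                common_outputs[f"decoder_attentions.{i}"] = {0: "batch_size"}
            for i in range(self._config.decoder_layers):
                common_outputs[f"cross_attentions.{i}"] = {0: "batch_size"}
        return common_outputs


model_id = "Helsinki-NLP/opus-mt-en-de"  # placeholder checkpoint, swap in your own
config = AutoConfig.from_pretrained(model_id)
onnx_config = MarianOnnxConfigWithAttentions(config=config, task="text2text-generation")

custom_onnx_configs = {
    "encoder_model": onnx_config.with_behavior("encoder"),
    "decoder_model": onnx_config.with_behavior("decoder", use_past=False),
    "decoder_with_past_model": onnx_config.with_behavior("decoder", use_past=True),
}

main_export(
    model_id,
    output="marian_onnx_with_attentions",
    task="text2text-generation",
    custom_onnx_configs=custom_onnx_configs,
    no_post_process=True,
    # Make the PyTorch model return attentions during tracing so the
    # extra outputs declared above actually get values.
    model_kwargs={"output_attentions": True},
)
```

You can then load the exported model from the `marian_onnx_with_attentions` directory and read the extra outputs from the ONNX Runtime session.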
Brilliant, thank you!
Question
Hey, I've been trying to access the attentions output by the MarianMT model like so (please excuse the unorthodox config argument; tidying up is next on my to-do list):
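(The original snippet isn't shown here; the following is a simplified, hypothetical reconstruction of that kind of call, with a placeholder checkpoint id and the config tweak standing in for the "unorthodox config argument" mentioned above.)

```python
# Hypothetical sketch of the failing call, not the poster's exact code.
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForSeq2SeqLM

model_id = "Helsinki-NLP/opus-mt-en-de"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = ORTModelForSeq2SeqLM.from_pretrained(model_id, export=True)
model.config.output_attentions = True  # the "unorthodox config argument"

inputs = tokenizer("Hello world", return_tensors="pt")
outputs = model.generate(
    **inputs,
    output_attentions=True,
    return_dict_in_generate=True,
)
print(outputs.cross_attentions)
```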
I'm then getting the following error when I run the code:
Error:
output_attentions is true, but the model did not produce cross-attentions. This is most likely because the model was not exported with output_attentions=True.
I've looked around but haven't been able to find out what is meant by the reference to exporting the model. How would I go about fixing this?