Closed by ra-MANUJ-an 1 year ago
@calpt @AmirAktify
cc: @JoPfeiff, sorry for bothering you.
Hey @ra-MANUJ-an, adapter-transformers does support encoder-decoder models that can be used for summarization, e.g. T5 or BART. You can find a full list of supported model architectures at https://docs.adapterhub.ml/model_overview.html. adapter-transformers also adjusts many of HuggingFace's example scripts for training adapters (including summarization). You can find these here. Hope this helps.
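For anyone landing here later, a minimal sketch of what this looks like in code, assuming adapter-transformers is installed (so the model classes expose the adapter methods); the checkpoint and adapter name below are only illustrative:

```python
# Sketch: training a summarization adapter on BART with adapter-transformers.
from transformers import AutoTokenizer, BartForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# With adapter-transformers installed, model classes gain adapter methods.
model.add_adapter("summarization")          # add a new, randomly initialized adapter
model.train_adapter("summarization")        # freeze the base model, train only the adapter
model.set_active_adapters("summarization")  # activate it for forward passes

# ... run your usual training loop / Trainer here ...

model.save_adapter("./summarization_adapter", "summarization")
```

The adapter versions of the example scripts essentially wrap steps like these around the standard training loop.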
Hi @calpt, thanks for the response! I'm currently looking at the repository. For the summarisation task, if I want to train and evaluate the model on some dataset, running "run_summarization.py" alone should be enough to get results, right?
And is this script also suitable for long-document summarisation?
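On the long-document point, one thing worth keeping in mind: standard BART and T5 checkpoints have a fixed maximum input length (1024 tokens for BART), so the tokenizer simply truncates anything longer. A quick illustration, assuming the same facebook/bart-base checkpoint as in the sketch above:

```python
# Illustrative only: standard BART truncates long inputs, which is why
# long-context models such as Longformer/LED come up for long-document summarization.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
print(tokenizer.model_max_length)  # 1024 for BART

long_document = "word " * 5000
encoded = tokenizer(long_document, truncation=True, max_length=tokenizer.model_max_length)
print(len(encoded["input_ids"]))   # capped at 1024; tokens beyond that are dropped
```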
Hey, any updates on Longformer adapter support?
There doesn't seem to be support for Longformer, unfortunately. Perhaps you can request it in a separate issue.
This issue has been automatically marked as stale because it has been without activity for 90 days. This issue will be closed in 14 days unless you comment or remove the stale label.
This issue was closed because it was stale for 14 days without any activity.
Hi, does AdapterHub provide support for encoder-decoder models such as T5 or Longformer, so that the summarization task can be performed with adapters?