adapter-hub / adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning
https://docs.adapterhub.ml
Apache License 2.0

adapter support for summarisation task #442

Closed ra-MANUJ-an closed 1 year ago

ra-MANUJ-an commented 2 years ago

Hi, does AdapterHub provide support for encoder-decoder models such as T5 or Longformer, so that the summarization task can be performed with adapters?

ra-MANUJ-an commented 2 years ago

@calpt @AmirAktify

ra-MANUJ-an commented 2 years ago

cc: @JoPfeiff, sorry for bothering

calpt commented 1 year ago

Hey @ra-MANUJ-an, adapter-transformers does support encoder-decoder models that can be used for summarization, e.g. T5 or BART. You can find a full list of supported model architectures at https://docs.adapterhub.ml/model_overview.html. adapter-transformers also adapts many of HuggingFace's example scripts for training adapters (including summarization). You can find these here. Hope this helps.
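
For reference, here is a minimal sketch of what attaching a summarization adapter to an encoder-decoder model looks like with the adapter-transformers package (which replaces the stock transformers import). The checkpoint `t5-small` and the adapter name "summarization" are only placeholders, not anything prescribed by the library:

```python
# Minimal sketch: add and activate a summarization adapter on T5
# using adapter-transformers (its classes are imported as `transformers`).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "t5-small"  # placeholder; any supported encoder-decoder checkpoint (T5, BART, ...)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Add a new adapter, freeze the pre-trained weights, and make the adapter active,
# so only the adapter parameters receive gradients during training.
model.add_adapter("summarization")
model.train_adapter("summarization")
model.set_active_adapters("summarization")

# The model can now be trained as usual, e.g. with the adapter-enabled
# summarization example script or a seq2seq training loop.
```
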

ra-MANUJ-an commented 1 year ago

Hi @calpt, thanks for the response! I'm currently looking at the repository. For the summarisation task, if I want to train and evaluate the model on some dataset, using only "run_summarization.py" should give the results, right?

And is this code compatible with the long-document summarisation task?
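
As context for the question above, a hedged sketch of the inference side once a summarization adapter has been trained and saved (the path `./output/summarization` is a hypothetical location where the training run stored the adapter; `load_adapter` and `set_active_adapters` are the adapter-transformers methods for reloading it):

```python
# Sketch: reload a previously trained summarization adapter and generate a summary.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# Placeholder path: wherever the training run saved the adapter weights.
adapter_name = model.load_adapter("./output/summarization")
model.set_active_adapters(adapter_name)

# T5 expects a task prefix for summarization.
text = "summarize: " + "Your input document goes here."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_length=60, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```
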

gabinguo commented 1 year ago

Hey, any updates on Longformer adapter support?

ra-MANUJ-an commented 1 year ago

There doesn't seem to be support for Longformer, unfortunately. Perhaps you can request it in a separate issue.

adapter-hub-bert commented 1 year ago

This issue has been automatically marked as stale because it has been without activity for 90 days. This issue will be closed in 14 days unless you comment or remove the stale label.

adapter-hub-bert commented 1 year ago

This issue was closed because it was stale for 14 days without any activity.