Closed — hchoi-moveworks closed this issue 2 years ago
This definitely works, and we are currently working on adding this type of modularity (similar to the different prediction heads). This is exactly what we did in our paper "UNKs Everywhere", to be presented at EMNLP 2021. It should be available (at least in a branch) in the coming weeks.
Does AdapterHub support models with custom embeddings? That is, let us imagine a model where
With this as the base model, would AdapterHub work? If adding adapters only affects the attention blocks of the base model, I believe this should work, but I wanted to confirm, as there may be internal details that I do not know.
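To make the intuition behind the question concrete, here is a minimal NumPy sketch of a bottleneck adapter (down-projection, nonlinearity, up-projection, residual connection). It is not AdapterHub's actual implementation, and all names (`custom_embeddings`, `adapter`, `W_down`, `W_up`) are illustrative; the point is only that the adapter operates on hidden states downstream of the embedding lookup, so a custom embedding table is never touched by the adapter weights.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden, bottleneck = 16, 4

# A custom (e.g. externally pretrained) embedding table -- the adapter
# never reads or writes these weights.
custom_embeddings = rng.normal(size=(100, hidden))

# Bottleneck adapter parameters: down-project, ReLU, up-project.
W_down = rng.normal(scale=0.1, size=(hidden, bottleneck))
W_up = np.zeros((bottleneck, hidden))  # zero init: adapter starts as a no-op

def adapter(h):
    # Residual adapter applied to hidden states, not to the embedding table.
    return h + np.maximum(h @ W_down, 0.0) @ W_up

token_ids = np.array([3, 14, 15])
h = custom_embeddings[token_ids]  # embedding lookup uses the custom table
out = adapter(h)

# With W_up zero-initialised, the adapter is an exact identity at the start,
# so the base model's behaviour is unchanged before adapter training.
assert np.allclose(out, h)
```

With a zero-initialised up-projection the adapter output equals its input, which is how bottleneck adapters are commonly initialised so that inserting them does not perturb the pretrained model before fine-tuning.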