xenova / transformers.js

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
https://huggingface.co/docs/transformers.js
Apache License 2.0

Is it possible to use adapters from the hub? #514

Open vabatta opened 8 months ago

vabatta commented 8 months ago

Question

Hi, would it be possible to use adapters on top of a model using the js library?

xenova commented 8 months ago

Hi there! 👋 Do you have any specific examples/models in mind? 😇

vabatta commented 8 months ago

Yes, I trained a few adapters on distilbert-base-multilingual-cased with a token classification head. So I have the adapter weights and head, which I can successfully load and use with the adapters library, and I would like to package up the model as a whole to use in transformers.js (in the ONNX runtime).

xenova commented 8 months ago

In that case, you have two options:

  1. Since the transformers library supports distilbert models for token-classification, you can save the model as a single DistilBertForTokenClassification model and upload it to the HF hub. You can then run the conversion script, and it should work with transformers.js.
  2. If you would like to keep the two models separate, you can load the base model first, load the second model with ONNX Runtime (ORT), and then pass the inputs between them yourself. I think it would also be great to support loading custom ONNX models (even those not exported with optimum or compatible with transformers.js).

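For LoRA-style adapters, option 1 works because the low-rank update can be folded into the base weights once, offline, leaving a plain model to export. A minimal numpy sketch of that idea (illustrative only — toy shapes and variable names, not the transformers or adapters library API):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2                         # hidden size, adapter rank (toy values)
W = rng.normal(size=(d, d))         # frozen base weight
A = rng.normal(size=(r, d))         # adapter down-projection
B = rng.normal(size=(d, r))         # adapter up-projection
scale = 0.5                         # scaling factor (alpha / r in LoRA)

x = rng.normal(size=(d,))

# Adapter applied at runtime: base path plus low-rank update
y_runtime = W @ x + scale * (B @ (A @ x))

# Merged once, offline: fold the update into W, then export the plain model
W_merged = W + scale * (B @ A)
y_merged = W_merged @ x

assert np.allclose(y_runtime, y_merged)
```

After a merge like this, the saved checkpoint is architecturally identical to a stock DistilBertForTokenClassification, so the usual conversion script applies unchanged.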
vabatta commented 8 months ago

Okay, I'll have a look at how to save it that way, since so far I was unable to save the full model with the adapter and head so that it could later be loaded as a whole.