xenova / transformers.js

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
https://huggingface.co/docs/transformers.js
Apache License 2.0

[Question] Can I work with Peft models through the API? #305

Status: Open · opened by chrisfel-dev 11 months ago

chrisfel-dev commented 11 months ago

Let's say I have the following code in Python. How would I translate it to JS?

import torch
from peft import PeftModel, PeftConfig
from transformers import AutoModelForCausalLM, AutoTokenizer

peft_model_id = "samwit/bloom-7b1-lora-tagger"
config = PeftConfig.from_pretrained(peft_model_id)

# Load the base model in 8-bit, plus its matching tokenizer
model = AutoModelForCausalLM.from_pretrained(
    config.base_model_name_or_path,
    return_dict=True,
    load_in_8bit=True,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)

# Load the LoRA adapter on top of the base model
model = PeftModel.from_pretrained(model, peft_model_id)
xenova commented 11 months ago

Hi there 👋 We don't yet support running peft models, unfortunately, but this might be an interesting avenue to explore in the future. @fxmarty might be able to provide some insight about loading and using adapters in optimum/ONNX.
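In the meantime, one possible workaround (a sketch, not an officially supported path) is to merge the LoRA weights into the base model in Python using peft's `merge_and_unload()`, export the merged checkpoint to ONNX with optimum, and then load that like any other model from transformers.js. The output directory names below are made up for illustration:

```python
# Sketch: merge a LoRA adapter into its base model so the result can be
# exported to ONNX. Assumes peft, transformers, and optimum are installed;
# the model IDs are the ones from the question above.
from peft import PeftModel, PeftConfig
from transformers import AutoModelForCausalLM

peft_model_id = "samwit/bloom-7b1-lora-tagger"
config = PeftConfig.from_pretrained(peft_model_id)

# Load the base model in full precision (8-bit weights can't be merged).
base = AutoModelForCausalLM.from_pretrained(config.base_model_name_or_path)
model = PeftModel.from_pretrained(base, peft_model_id)

# Fold the LoRA deltas into the base weights and drop the adapter wrappers.
merged = model.merge_and_unload()
merged.save_pretrained("./merged-model")  # hypothetical output path

# The merged checkpoint could then be exported to ONNX, e.g. with:
#   optimum-cli export onnx --model ./merged-model ./merged-onnx
# and the exported model loaded from transformers.js as usual.
```

Note this bakes the adapter in permanently: you lose the ability to hot-swap adapters at runtime, which is what true peft support in transformers.js would add.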

peft.js anyone? 👀