Open Mrs-Hudson opened 5 months ago
Hello @Mrs-Hudson
Yes, you can load the Hugging Face model without the GoLLIE source code this way:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("HiTZ/GoLLIE-7B")
# bfloat16 halves memory use relative to float32
model = AutoModelForCausalLM.from_pretrained(
    "HiTZ/GoLLIE-7B", trust_remote_code=True, torch_dtype=torch.bfloat16
)
model.to("cuda")
```
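Once the model is loaded, inference uses the standard `transformers` generation API. GoLLIE is prompted with code-style guidelines (task schemas written as Python classes, followed by the input text and an unfinished `result =` line that the model completes). The snippet below is only an illustrative sketch: the `Person` class, its docstring, and the example text are made up for demonstration, and the notebook's real prompt template differs; the `generate`/`decode` calls themselves are standard `transformers` usage. The model part is guarded so the prompt-building runs even without a GPU.

```python
# Illustrative GoLLIE-style prompt: a made-up guideline class, an example
# text, and an unfinished `result =` line for the model to complete.
# This is NOT the notebook's exact template, just a sketch of the format.
prompt = '''# The following lines describe the task definition
@dataclass
class Person:
    """Refers to a named human being mentioned in the text."""
    span: str  # The name of the person as it appears in the text

# This is the text to analyze
text = "Marie Curie won the Nobel Prize twice."

# The annotation instances that take place in the text above are listed here
result ='''

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

if torch.cuda.is_available():
    tokenizer = AutoTokenizer.from_pretrained("HiTZ/GoLLIE-7B")
    model = AutoModelForCausalLM.from_pretrained(
        "HiTZ/GoLLIE-7B", trust_remote_code=True, torch_dtype=torch.bfloat16
    ).to("cuda")

    inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
    # Greedy decoding is usually enough for extraction-style outputs.
    output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)
    # Decode only the newly generated tokens, i.e. the completion of `result`.
    completion = tokenizer.decode(
        output_ids[0, inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
    print(completion)
```

The model should complete the `result` list with instances of the guideline classes (e.g. `[Person(span="Marie Curie")]`), which the notebook then parses back into Python objects.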
Hi,
I want to run this notebook. Is it possible to do so with the Hugging Face model instead of using the `local_model` method from the GoLLIE repo's source code?