hitz-zentroa / GoLLIE

Guideline following Large Language Model for Information Extraction
https://hitz-zentroa.github.io/GoLLIE/
Apache License 2.0

Custom Task NER with Huggingface #13

Open Mrs-Hudson opened 5 months ago

Mrs-Hudson commented 5 months ago

Hi

I want to run this notebook. Is it possible to do so with the Hugging Face model instead of using the load_model method from the GoLLIE repo's source code?

ikergarcia1996 commented 5 months ago

Hello @Mrs-Hudson

Yes, you can load the Hugging Face model without the GoLLIE source code like this:

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("HiTZ/GoLLIE-7B")
# trust_remote_code loads the custom modeling code shipped with the checkpoint;
# bfloat16 halves memory usage compared to fp32.
model = AutoModelForCausalLM.from_pretrained("HiTZ/GoLLIE-7B", trust_remote_code=True, torch_dtype=torch.bfloat16)
model.to("cuda")
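For reference, generation could then look something like the sketch below (not from the thread). It continues from the snippet above, and the prompt string is only a placeholder: it should be replaced with a GoLLIE guideline-style prompt such as the one built in the notebook.

# Minimal generation sketch, assuming `model` and `tokenizer` from the snippet above.
# The prompt is a placeholder; use a GoLLIE guideline prompt (class definitions,
# the input text, and the unfinished `result = ` line) as produced in the notebook.
prompt = "..."  # placeholder GoLLIE-style prompt

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))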