Open · JaimeFon opened 2 weeks ago
System Info

transformers version: 4.44.2

Who can help?

No response

Information

Tasks

An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction

I ran this code several times successfully, but then I got an error, and from that point on I keep getting it. This is the simple code I am running:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

torch.random.manual_seed(0)

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3.5-mini-instruct",
    device_map="cuda",
    torch_dtype="auto",
    trust_remote_code=True,
)

Expected behavior

I expected the model to initialize so I can use the tokenizer later in the code.

cc @Rocketknight1

Hi @JaimeFon, I've tried running this several times and can't reproduce the error. Can you paste the entire error message?
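Since the report doesn't include the full traceback the maintainer asked for, here is a minimal sketch of one way to capture it. `try_load` and `repro` are illustrative helper names introduced for this sketch (not part of the transformers API); the `from_pretrained` arguments are copied from the reproduction above, and the snippet assumes a CUDA device and network access to the Hub, since that is what the report uses.

```python
import traceback


def try_load(repro):
    """Run a reproduction callable.

    Returns True on success; on any exception, prints the full
    traceback (the text to paste into the issue) and returns False.
    """
    try:
        repro()
        return True
    except Exception:
        traceback.print_exc()
        return False


def repro():
    # Reproduction from the issue, unchanged.
    import torch
    from transformers import AutoModelForCausalLM

    torch.random.manual_seed(0)
    AutoModelForCausalLM.from_pretrained(
        "microsoft/Phi-3.5-mini-instruct",
        device_map="cuda",
        torch_dtype="auto",
        trust_remote_code=True,
    )


ok = try_load(repro)
print("reproduction succeeded:", ok)
```

Running this and pasting the printed traceback (plus the output of `transformers-cli env`) into the thread would give the maintainer what they need to diagnose the failure.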