huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Phi-3: maximum recursion depth exceeded when executing AutoModelForCausalLM.from_pretrained #33328

Open JaimeFon opened 2 weeks ago

JaimeFon commented 2 weeks ago

System Info

Copy-and-paste the text below in your GitHub issue and FILL OUT the two last points.

Who can help?

No response

Information

Tasks

Reproduction

I ran this code several times successfully, but then I got the error, and from that point on I keep getting it. This is the simple code I am running:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

torch.random.manual_seed(0)

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3.5-mini-instruct",
    device_map="cuda",
    torch_dtype="auto",
    trust_remote_code=True,
)
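When reporting a `RecursionError` like this, the repeating frames at the end of the traceback are what identify the cycle. A minimal sketch of catching the error and printing only the traceback tail, using a hypothetical `deep_call` function that stands in for the failing `from_pretrained` call:

```python
import traceback

def deep_call(n):
    # Hypothetical: recurses until Python's recursion limit is hit,
    # mimicking the unbounded recursion seen during from_pretrained.
    return deep_call(n + 1)

try:
    deep_call(0)
except RecursionError as err:
    # With deep recursion, the last few frames usually repeat and
    # show exactly which calls are cycling.
    tail = traceback.format_exc(limit=5)
    print("Caught:", err)
    print(tail)
```

Pasting that tail (or, better, the whole traceback) into the issue lets maintainers see which module is recursing.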

Expected behavior

I expected the model to initialize so I can use it later in the code.

LysandreJik commented 2 weeks ago

cc @Rocketknight1

Rocketknight1 commented 2 weeks ago

Hi @JaimeFon, I've tried running this several times and can't reproduce the error. Can you paste me the entire error message?