bibibabibo26 opened 5 months ago
https://huggingface.co/shilongz/debug This is an already-merged checkpoint, originally used for the demo.
Thanks!
https://huggingface.co/shilongz/debug This is an already-merged checkpoint, originally used for the demo.

Hello, when I run app.py, it fails while loading the tokenizer with RecursionError: maximum recursion depth exceeded in comparison. Merging the weight files hits the same problem: delta_tokenizer = AutoTokenizer.from_pretrained(delta_path) also raises RecursionError. Which transformers version does the project environment use, or could something else be causing the error?
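In case it helps others hitting the same RecursionError, below is a small diagnostic sketch of two things commonly checked in LLaVA-style repos: the installed transformers version, and whether forcing the slow (SentencePiece) tokenizer avoids the recursion. This assumes the error comes from a transformers/tokenizer version mismatch; it is not the project's confirmed fix, and delta_path is a placeholder.

import transformers
from transformers import AutoTokenizer

# Compare against the version pinned in the repo's requirements.
print(transformers.__version__)

delta_path = 'path/to/roi-delta'  # placeholder: directory or hub id of the delta checkpoint

# Forcing the slow SentencePiece tokenizer sometimes sidesteps recursion issues seen with
# mismatched fast-tokenizer versions; this is an assumption, not a verified fix.
delta_tokenizer = AutoTokenizer.from_pretrained(delta_path, use_fast=False)
print(delta_tokenizer)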
def build_model(self, model_name):
    ######################## base model define ########################
    print('Start loading model...')
    disable_torch_init()
    model_name = os.path.expanduser(model_name)
    self.tokenizer = AutoTokenizer.from_pretrained(model_name)
    from gpt4roi.models.spi_llava import SPILlavaMPTForCausalLM

    # TODO add detector for normal conversation
    self.model = SPILlavaMPTForCausalLM.from_pretrained(
        model_name,
        low_cpu_mem_usage=True,
        torch_dtype=torch.float16,
        use_cache=True,
    ).cuda()
Hello, when I merge llama-base with your roi-delta weights I get an error. Could you tell me what causes it and how to fix it?

RecursionError: maximum recursion depth exceeded while calling a Python object
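For context on what the merge step does (and where the tokenizer load that raises the RecursionError sits), here is a minimal sketch of LLaVA-style delta merging: load the base LLaMA weights and the released delta, add them tensor by tensor, and save the result. The paths, the add-only merge rule, and the use of AutoModelForCausalLM are assumptions for illustration; the project's own merge script should be treated as authoritative.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_path = 'path/to/llama-7b'      # placeholder: original LLaMA base weights
delta_path = 'path/to/roi-delta'    # placeholder: released delta weights
target_path = 'path/to/merged'      # placeholder: output directory for the merged model

# This tokenizer load is the line that reportedly raises RecursionError;
# use_fast=False and a matching transformers version are assumptions, not a confirmed fix.
delta_tokenizer = AutoTokenizer.from_pretrained(delta_path, use_fast=False)

base = AutoModelForCausalLM.from_pretrained(
    base_path, torch_dtype=torch.float16, low_cpu_mem_usage=True)
delta = AutoModelForCausalLM.from_pretrained(
    delta_path, torch_dtype=torch.float16, low_cpu_mem_usage=True)

# Add the base weights into every delta parameter with a matching shape;
# a real merge script also handles embeddings whose vocab size grew in the delta.
base_state = base.state_dict()
for name, param in delta.state_dict().items():
    if name in base_state and param.shape == base_state[name].shape:
        param.data += base_state[name]

delta.save_pretrained(target_path)
delta_tokenizer.save_pretrained(target_path)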