(venv) bw@bw-X570-GAMING-X:~/Python-3.8.9/huatuoGPT/HuatuoGPT$ python3.8 -m huatuo_cli_demo_stream --model-name models/
Traceback (most recent call last):
File "/usr/local/lib/python3.8/runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/local/lib/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/home/bw/Python-3.8.9/huatuoGPT/HuatuoGPT/huatuo_cli_demo_stream.py", line 160, in
main(args)
File "/home/bw/Python-3.8.9/huatuoGPT/HuatuoGPT/huatuo_cli_demo_stream.py", line 117, in main
model, tokenizer = load_model(args.model_name, args.device, args.num_gpus)
File "/home/bw/Python-3.8.9/huatuoGPT/HuatuoGPT/huatuo_cli_demo_stream.py", line 27, in load_model
tokenizer = AutoTokenizer.from_pretrained(model_name, padding_side="right", use_fast=True)
File "/home/bw/Python-3.8.9/venv/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 724, in from_pretrained
raise ValueError(
ValueError: Tokenizer class BaiChuanTokenizer does not exist or is not currently imported.
Hello, I tried running it and hit the error above. How can I fix this? Thanks!
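For context, the failing call is the `AutoTokenizer.from_pretrained(...)` in `load_model`. My guess (just an assumption on my part, not something the README confirms) is that `BaiChuanTokenizer` is a custom class shipped inside the model directory, so `transformers` can only import it when remote/custom code is allowed. A sketch of the change I have in mind:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumption: models/ contains the Baichuan tokenizer code (e.g. tokenization_baichuan.py)
# next to the weights, so trust_remote_code=True lets transformers import the custom
# BaiChuanTokenizer class instead of raising ValueError.
model_name = "models/"

tokenizer = AutoTokenizer.from_pretrained(
    model_name,
    padding_side="right",
    use_fast=True,            # as in huatuo_cli_demo_stream.py; use_fast=False may be needed
    trust_remote_code=True,   # allow loading the custom tokenizer class from the model dir
)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)
```

Is this the right direction, or is the issue with the model files themselves?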