When I run cli_demo.py, I get an AttributeError saying the tokenizer is missing an attribute:
(base) root@hzhb:/data/fuzi.mingcha-main/src# python3 cli_demo.py --url_lucene_task1 "pylucene address deployed for statute retrieval" --url_lucene_task2 "pylucene address deployed for similar-case retrieval"
Loading model
Traceback (most recent call last):
  File "cli_demo.py", line 17, in <module>
    tokenizer = AutoTokenizer.from_pretrained("/data/fuzi-mingcha-v1_0", trust_remote_code=True)
  File "/usr/local/lib/python3.8/dist-packages/transformers/models/auto/tokenization_auto.py", line 755, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/transformers/tokenization_utils_base.py", line 2024, in from_pretrained
    return cls._from_pretrained(
  File "/usr/local/lib/python3.8/dist-packages/transformers/tokenization_utils_base.py", line 2256, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/root/.cache/huggingface/modules/transformers_modules/fuzi-mingcha-v1_0/tokenization_chatglm.py", line 196, in __init__
    super().__init__(
  File "/usr/local/lib/python3.8/dist-packages/transformers/tokenization_utils.py", line 367, in __init__
    self._add_tokens(
  File "/usr/local/lib/python3.8/dist-packages/transformers/tokenization_utils.py", line 467, in _add_tokens
    current_vocab = self.get_vocab().copy()
  File "/root/.cache/huggingface/modules/transformers_modules/fuzi-mingcha-v1_0/tokenization_chatglm.py", line 248, in get_vocab
    vocab = {self._convert_id_to_token(i): i for i in range(self.vocab_size)}
  File "/root/.cache/huggingface/modules/transformers_modules/fuzi-mingcha-v1_0/tokenization_chatglm.py", line 244, in vocab_size
    return self.sp_tokenizer.num_tokens
AttributeError: 'ChatGLMTokenizer' object has no attribute 'sp_tokenizer'
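
This traceback matches a known incompatibility between ChatGLM-style remote-code tokenizers and newer transformers releases: starting around transformers 4.34, `PreTrainedTokenizer.__init__` calls `self._add_tokens()`, which calls `self.get_vocab()`, before the ChatGLM subclass has assigned `self.sp_tokenizer` (its `__init__` sets it only after `super().__init__()` returns), hence the AttributeError. A common workaround is to pin an older transformers before loading the model. The sketch below is a minimal check under that assumption; the exact pin (4.33.2) comes from similar ChatGLM reports, not from this repo's documentation:

```python
# Minimal sketch of the workaround: pin transformers below 4.34, where
# PreTrainedTokenizer.__init__ began calling _add_tokens()/get_vocab()
# before ChatGLM's tokenizer assigns self.sp_tokenizer.
# Assumed pin (not confirmed by this repo):
#
#     pip install "transformers==4.33.2"
#
import transformers
from packaging import version  # installed as a transformers dependency
from transformers import AutoTokenizer

# Guard against the known-bad range before loading the remote-code tokenizer.
assert version.parse(transformers.__version__) < version.parse("4.34"), (
    f"transformers {transformers.__version__} triggers the sp_tokenizer "
    "AttributeError; install a pre-4.34 release first."
)

tokenizer = AutoTokenizer.from_pretrained(
    "/data/fuzi-mingcha-v1_0",  # local checkpoint path from the traceback
    trust_remote_code=True,
)
print(len(tokenizer))  # loads cleanly instead of raising AttributeError
```

Alternatively, if upgrading transformers is required, the checkpoint's own tokenization_chatglm.py can be edited so that `self.sp_tokenizer` is assigned before `super().__init__()` is called, which is how later ChatGLM checkpoints resolved the same init-order problem.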