Closed skpig closed 2 years ago
Hey @skpig, I think the model fnlp/bart-base-chinese is broken, sadly. We would need to ping the author here to notify them. Soon this will be possible directly on the Hub :heart: cc @julien-c
Thanks for your reply, @patrickvonplaten.
Environment info
transformers version: 4.17.0
Who can help
@patrickvonplaten This issue is similar to https://github.com/huggingface/transformers/issues/9328, but I'm not sure. If adding a merges.txt file can fix the error, maybe the model hub should update many models / notify their creators.
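Until the repo is fixed, a possible interim workaround is to load the vocabulary with the BERT tokenizer instead. This is a sketch under the assumption that the checkpoint ships a BERT-style vocab.txt (as Chinese BART checkpoints typically do), which `BartTokenizer` cannot read but `BertTokenizer` can:

```python
from transformers import BertTokenizer, AutoModel

# Assumption: fnlp/bart-base-chinese provides a BERT-style vocab.txt
# rather than the vocab.json + merges.txt pair that BartTokenizer
# (a BPE tokenizer) requires, so we bypass AutoTokenizer's mapping
# from the Bart architecture to BartTokenizer.
tokenizer = BertTokenizer.from_pretrained("fnlp/bart-base-chinese")
model = AutoModel.from_pretrained("fnlp/bart-base-chinese")
```

Whether this produces correct tokenization for the model still needs to be confirmed by the checkpoint's author.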
Information
Model I am using (Bert, XLNet ...): Bart
The problem arises when using:

```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("fnlp/bart-base-chinese")
model = AutoModel.from_pretrained("fnlp/bart-base-chinese")
```
```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/huangbz/.conda/envs/NLP/lib/python3.6/site-packages/transformers/models/auto/tokenization_auto.py", line 546, in from_pretrained
    return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/huangbz/.conda/envs/NLP/lib/python3.6/site-packages/transformers/tokenization_utils_base.py", line 1795, in from_pretrained
    **kwargs,
  File "/home/huangbz/.conda/envs/NLP/lib/python3.6/site-packages/transformers/tokenization_utils_base.py", line 1819, in _from_pretrained
    **(copy.deepcopy(kwargs)),
  File "/home/huangbz/.conda/envs/NLP/lib/python3.6/site-packages/transformers/tokenization_utils_base.py", line 1923, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/huangbz/.conda/envs/NLP/lib/python3.6/site-packages/transformers/models/bart/tokenization_bart.py", line 220, in __init__
    with open(vocab_file, encoding="utf-8") as vocab_handle:
TypeError: expected str, bytes or os.PathLike object, not NoneType
```
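The last frame makes the failure mode clear: no vocab.json is resolved for this checkpoint, so `BartTokenizer.__init__` receives `vocab_file=None` and `open(None)` raises. A minimal sketch of that mechanism, independent of transformers (the helper name `load_vocab` is illustrative, not the library's):

```python
# Minimal reproduction of the final traceback frame: the tokenizer's
# __init__ gets vocab_file=None because the expected vocab.json is
# missing from the repo, and open() rejects None with a TypeError.
def load_vocab(vocab_file):
    with open(vocab_file, encoding="utf-8") as vocab_handle:
        return vocab_handle.read()

try:
    load_vocab(None)  # what effectively happens inside BartTokenizer here
except TypeError as exc:
    print(exc)  # expected str, bytes or os.PathLike object, not NoneType
```

So the error is not a bug in BartTokenizer itself but a mismatch between the files the tokenizer class expects and the files the model repo provides.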