Wonder1905 opened 1 month ago
It's not in the list here: https://github.com/microsoft/onnxruntime-extensions/blob/ca433cbea706e7c1782df25391f877e28b887d61/onnxruntime_extensions/_hf_cvt.py#L183, so it hasn't been supported yet.
But in the Hugging Face code repo, Qwen2Tokenizer looks very similar to the existing GPT2Tokenizer, so if it's urgent you can add an item to that list yourself, or wait for our PR later.
Hi, I'm trying to export my tokenizer, and I followed this short guide: Guide. Now, using:

```python
tokenizer = AutoTokenizer.from_pretrained(onnx_path, use_fast=False)
onnx_tokenizer = OrtPyFunction(gen_processing_models(tokenizer, pre_kwargs={})[0])
```

But I'm getting:
What are my options from this point? Thanks!