Closed dopc closed 4 months ago
Hey, hope you are well.
I would like to use Cohere's HF tokenizers and read the `model_max_length` value from their `tokenizer_config.json` file. Is there any way to do that?
```python
from tokenizers import Tokenizer

tokenizer = Tokenizer.from_pretrained(identifier="Cohere/Cohere-embed-multilingual-light-v3.0")
# Neither attribute exists on tokenizers.Tokenizer:
# tokenizer.max_len
# tokenizer.model_max_length
```
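One workaround is to fetch and parse the config file directly with `huggingface_hub`. A minimal sketch, assuming the repo ships a `tokenizer_config.json` that actually sets a `model_max_length` key:

```python
import json

from huggingface_hub import hf_hub_download

# Download only tokenizer_config.json (cached locally after the first call).
config_path = hf_hub_download(
    repo_id="Cohere/Cohere-embed-multilingual-light-v3.0",
    filename="tokenizer_config.json",
)

with open(config_path) as f:
    config = json.load(f)

# The key may be absent in some repos, hence .get() rather than indexing.
print(config.get("model_max_length"))
```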
I found that the method below gives access to it:
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Cohere/Cohere-embed-multilingual-light-v3.0")
```
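Continuing from the snippet above, the value is then a plain attribute read. Note that when the config does not set the key, `transformers` falls back to a very large sentinel integer rather than raising:

```python
# Populated from tokenizer_config.json when the tokenizer is loaded.
print(tokenizer.model_max_length)
```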