Open pradeepdev-1995 opened 3 months ago
Does tiktoken support meta/llama2-70b? I want to find the token count of a prompt before passing it to a meta/llama2-70b model. How can I do this with tiktoken?
```python
import tiktoken
enc = tiktoken.get_encoding("meta/llama2-70b")
```
is this possible?
Meta's models are tokenized differently from OpenAI's. tiktoken doesn't support non-OpenAI models.