openai / tiktoken

tiktoken is a fast BPE tokeniser for use with OpenAI's models.
MIT License
11.03k stars · 748 forks

How to find the token count of a prompt using meta/llama2-70b model #274

Open pradeepdev-1995 opened 3 months ago

pradeepdev-1995 commented 3 months ago

Does tiktoken support meta/llama2-70b? I want to find the token count of a prompt before passing it to a meta/llama2-70b model. How can I do this with tiktoken?

import tiktoken
enc = tiktoken.get_encoding("meta/llama2-70b")

is this possible?

jmehnle commented 1 week ago

Meta's models are tokenized differently from OpenAI's (Llama 2 uses its own SentencePiece-based tokenizer). tiktoken doesn't support non-OpenAI models.