TomLucidor opened 6 days ago
Agreed, I'll look into it now, thanks for the feedback! Some thoughts: I'm currently working on weekly-updated pricing charts, though that would require an internet connection. The rate at which new tokenizers appear is alarming to me, though, so I may just add a custom option where you can input the model and its pricing directly, and maybe ship the top few models as presets. Would love to hear your thoughts, and again thanks for the feedback!
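For the custom option, I'm picturing something like the sketch below: a small preset table plus a way for the user to register their own model and prices. The model names and per-million-token prices here are only placeholders, not the numbers the weekly update would actually ship.

```python
from dataclasses import dataclass


@dataclass
class ModelPricing:
    """Price per million tokens, in USD."""
    name: str
    input_per_mtok: float
    output_per_mtok: float


# Placeholder presets; real values would come from the weekly pricing update.
PRESETS = {
    "gpt-4o": ModelPricing("gpt-4o", 2.50, 10.00),
    "claude-sonnet": ModelPricing("claude-sonnet", 3.00, 15.00),
}


def add_custom_model(name: str, input_per_mtok: float, output_per_mtok: float) -> ModelPricing:
    """Register a model that isn't in the presets."""
    pricing = ModelPricing(name, input_per_mtok, output_per_mtok)
    PRESETS[name] = pricing
    return pricing


def estimate_cost(pricing: ModelPricing, input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of a request from token counts."""
    return (input_tokens * pricing.input_per_mtok
            + output_tokens * pricing.output_per_mtok) / 1_000_000
```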
Currently, LLM providers like Anthropic, Google, and Mistral are competitive with OpenAI, and services like OpenRouter flatten the price table for interoperability between providers. Could these be integrated into the Tokenizer as well?
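OpenRouter publishes a unified model list with pricing at its `/api/v1/models` endpoint, so a flattened price table could in principle be pulled from there. A rough sketch, assuming that endpoint and a `pricing` object with per-token USD prices as strings (adjust if the schema differs):

```python
import requests


def fetch_openrouter_prices() -> dict[str, dict[str, float]]:
    """Pull a flattened price table from OpenRouter's public model list."""
    resp = requests.get("https://openrouter.ai/api/v1/models", timeout=10)
    resp.raise_for_status()
    table = {}
    for model in resp.json().get("data", []):
        pricing = model.get("pricing", {})
        table[model["id"]] = {
            # Prices are assumed to be USD per token, encoded as strings.
            "prompt_per_token": float(pricing.get("prompt", 0)),
            "completion_per_token": float(pricing.get("completion", 0)),
        }
    return table
```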
Bonus: support Ollama and SLMs, and instead of price, report token count and velocity (relative to local/eGPU performance).
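For the Ollama side, the local API already exposes enough to derive velocity: the final `/api/generate` response includes `eval_count` and `eval_duration`, so tokens per second can be computed without any pricing at all. A rough sketch against a local Ollama instance (the model name is just an example):

```python
import requests


def measure_ollama_velocity(model: str = "llama3.2",
                            prompt: str = "Explain tokenization in one paragraph.") -> dict:
    """Run one non-streaming generation and report token count and tokens/sec.

    Uses the eval_count / eval_duration (nanoseconds) fields that Ollama
    returns in its /api/generate response.
    """
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    data = resp.json()
    eval_count = data.get("eval_count", 0)             # output tokens generated
    eval_seconds = data.get("eval_duration", 0) / 1e9  # duration is in nanoseconds
    return {
        "output_tokens": eval_count,
        "tokens_per_second": eval_count / eval_seconds if eval_seconds else 0.0,
    }
```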