dzhng / llm-api

Fully typed & consistent chat APIs for OpenAI, Anthropic, Groq, and Azure chat models, for browser, edge, and Node environments.
https://www.npmjs.com/package/llm-api
MIT License

Could the tiktoken dependency be optional? #6

Closed · carlos-alberto closed this 4 months ago

carlos-alberto commented 9 months ago

We're deploying to Cloudflare, and tiktoken is a large library. We don't particularly need the token error check, so a way to make that dependency optional would be great.
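One common pattern for this kind of request is to lazy-load the tokenizer and fall back to a cheap estimate when it isn't installed, so bundlers for size-constrained targets like Cloudflare Workers never pull it in. A minimal sketch of the idea, assuming the package name `tiktoken` and a rough 4-characters-per-token fallback (this is not llm-api's actual implementation):

```typescript
// Sketch: load the tokenizer lazily so it becomes an optional dependency.
// The package name and fallback heuristic below are assumptions for
// illustration, not llm-api's real internals.
type TokenCounter = (text: string) => number;

async function getTokenCounter(): Promise<TokenCounter> {
  // Non-literal specifier keeps the module out of static bundling/type
  // resolution; resolution happens (and may fail) only at runtime.
  const modName = 'tiktoken'; // assumed package name
  try {
    const { encoding_for_model } = await import(modName);
    const enc = encoding_for_model('gpt-3.5-turbo');
    return (text) => enc.encode(text).length;
  } catch {
    // Tokenizer not installed: estimate ~4 characters per token.
    return (text) => Math.ceil(text.length / 4);
  }
}
```

With this shape, tiktoken could live in `peerDependencies` (or `optionalDependencies`), and the token-limit check would simply degrade to an estimate when the package is absent.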

dzhng commented 4 months ago

sorry but it's a bit tricky to remove and I don't have the time right now to implement & properly test. I do think this check is getting less relevant over time as large-context models become prevalent, so I'll rework it at some point, but going to close this for now.