lmstudio-ai / lms

👾 LM Studio CLI
https://lms.dev
MIT License

Feature Request: count tokens before calling '/v1/chat/completions' #98

Open · GPTLocalhost opened this issue 3 weeks ago

GPTLocalhost commented 3 weeks ago

Recently, we integrated Microsoft Word with LM Studio through a local Word Add-in; you can view a demo here. We're planning to add a feature that counts tokens before calling '/v1/chat/completions', so users can see how many tokens remain available for inference. Our question is: can LM Studio count the prompt's tokens before '/v1/chat/completions' is called? Thank you for any advice.
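
For context, here is a rough sketch of the flow we have in mind. The base URL, model identifier, and the countTokens helper are placeholders; obtaining that count from LM Studio is exactly what we are asking about.

```ts
// Hypothetical flow: count the prompt's tokens first, then send the request.
// LM Studio's OpenAI-compatible server listens on http://localhost:1234 by default.
const BASE_URL = "http://localhost:1234/v1";

async function chatWithBudget(prompt: string, contextLength: number) {
  // The step in question: obtain the prompt's token count from LM Studio
  // before issuing the completion request. `countTokens` is a placeholder.
  const promptTokens = await countTokens(prompt);
  const remaining = contextLength - promptTokens;
  console.log(`Prompt uses ${promptTokens} tokens, ${remaining} remain for the reply.`);

  const response = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local-model", // placeholder identifier
      messages: [{ role: "user", content: prompt }],
      max_tokens: remaining, // cap the reply to the remaining budget
    }),
  });
  return (await response.json()).choices[0].message.content;
}

// Placeholder for the token-counting functionality requested in this issue.
declare function countTokens(text: string): Promise<number>;
```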

ryan-the-crayon commented 3 weeks ago

Currently, this functionality is only available in LMStudio.js (https://github.com/lmstudio-ai/lmstudio.js/blob/main/packages/lms-client/src/llm/LLMDynamicHandle.ts#L432).
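
For reference, a minimal sketch of counting prompt tokens through the SDK. The model identifier is a placeholder, and accessor names may differ between SDK versions; check LLMDynamicHandle in the linked file for the exact API.

```ts
import { LMStudioClient } from "@lmstudio/sdk";

async function countPromptTokens(prompt: string): Promise<number> {
  // Connect to the local LM Studio instance.
  const client = new LMStudioClient();

  // "qwen2.5-7b-instruct" is a placeholder; use whichever model you have loaded.
  const model = await client.llm.model("qwen2.5-7b-instruct");

  // Tokenize locally (no inference request is made) and count the tokens.
  const tokens = await model.tokenize(prompt);
  return tokens.length;
}

countPromptTokens("Summarize the selected paragraph.")
  .then((count) => console.log(`Prompt tokens: ${count}`))
  .catch(console.error);
```

The remaining budget for a reply would then be the model's context length minus this count.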

In the future, we will expand this functionality to our RESTful API as well.