Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with full RAG and AI Agent capabilities.
https://anythingllm.com
MIT License

Ability to set max tokens for localai api? #430

Closed · dillfrescott closed this issue 7 months ago

dillfrescott commented 7 months ago

I need this feature!

timothycarambat commented 7 months ago

Would this limit apply to every workspace and every chat sent? If so, that can be configured for LocalAI LLM inferencing.

Is the chat running away and not terminating output?

dillfrescott commented 7 months ago

I actually added it globally, right around here:

https://github.com/Mintplex-Labs/anything-llm/blob/ce9233c258a6769ddf4e640dd49623c5c7b0f37d/server/utils/AiProviders/localAi/index.js#L119

So I guess that works for me!
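For anyone wanting to do the same, here is a minimal sketch of what such a global cap might look like, assuming the provider issues an OpenAI-compatible chat completion request (the `client` setup, the `LOCAL_AI_MAX_TOKENS` env var, and the other parameter names here are illustrative assumptions, not the exact code at the linked line):

```js
// Sketch only, not the exact anything-llm source: cap generated output
// globally by passing max_tokens into the OpenAI-compatible completion
// call the LocalAI provider makes. Env var names below are hypothetical.
const { OpenAI } = require("openai");

const client = new OpenAI({
  baseURL: process.env.LOCAL_AI_BASE_PATH, // e.g. http://localhost:8080/v1
  apiKey: process.env.LOCAL_AI_API_KEY ?? "not-needed",
});

async function chat(messages, temperature = 0.7) {
  return client.chat.completions.create({
    model: process.env.LOCAL_AI_MODEL_PREF,
    messages,
    temperature,
    // Hard cap on tokens generated per response; guards against
    // runaway output that never terminates.
    max_tokens: Number(process.env.LOCAL_AI_MAX_TOKENS ?? 1024),
  });
}
```

Since LocalAI exposes an OpenAI-compatible API, a `max_tokens` field set this way should be honored server-side on every request the provider sends, which matches the global behavior described above.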