Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License
26.53k stars · 2.65k forks

Possible to add "STOP" button if user does not like the AI's response? #540

Closed jameschen83 closed 8 months ago

jameschen83 commented 10 months ago

In ChatGPT, a user can click the "stop generation" button to stop the LLM from streaming. Is this available in AnythingLLM, or would it be possible to add this feature?

If not, I plan to fork the repo and add the feature myself.

duc-gp commented 8 months ago

If I were to pick this up and implement it, how should I start, and what is the process regarding UI/UX? Are there any design specs for this, or can the developer propose something in their pull request?

timothycarambat commented 8 months ago

We do not have a design for this yet. The main limitation here is that all this would do is disconnect the client from the response stream; it would not terminate the request on the LLM side, so a runaway response would keep generating and the LLM would stay occupied until it finished.

If the client just wants to "stop" seeing the output, then we can break the response stream, but I think this will lead to more issues with local LLMs that go wild and keep streaming: the user sends a new prompt and now the chat hangs because the local LLM is still streaming the old response.
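The limitation described above can be illustrated with a small self-contained sketch (illustrative names only, not AnythingLLM code): aborting only stops the client-side read loop, while the generator standing in for the LLM backend still holds the rest of the response it was going to send.

```javascript
// Sketch: the "backend" keeps its remaining tokens even after the client
// stops reading. Aborting cancels consumption, not generation.
function* backendStream() {
  // Pretend each yield is a token still being generated server-side.
  yield* ["the", " ", "quick", " ", "brown", " ", "fox"];
}

function readUntilStop(stream, stopAfterChars) {
  const controller = new AbortController();
  let text = "";
  let result = stream.next();
  while (!result.done && !controller.signal.aborted) {
    text += result.value;
    if (text.length >= stopAfterChars) {
      controller.abort(); // user clicks "Stop": we merely stop reading
    } else {
      result = stream.next();
    }
  }
  // The client saw only a prefix, but the backend still had tokens queued.
  return { text, leftover: [...stream].join("") };
}
```

Here `readUntilStop(backendStream(), 4)` returns `text` of `"the "` while `leftover` is `"quick brown fox"` — the "backend" was still mid-response when the client bailed out, which is exactly the hang scenario described for local LLMs.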

duc-gp commented 8 months ago

> We do not have a design for this yet. The main limitation here is that all this would do is disconnect the client from the response stream; it would not terminate the request on the LLM side, so a runaway response would keep generating and the LLM would stay occupied until it finished.
>
> If the client just wants to "stop" seeing the output, then we can break the response stream, but I think this will lead to more issues with local LLMs that go wild and keep streaming: the user sends a new prompt and now the chat hangs because the local LLM is still streaming the old response.

I dug into the code a little. I think a way to disconnect the client would be to declare an AbortController instance outside of https://github.com/Mintplex-Labs/anything-llm/blob/f4b09a8c794f4e1ae0a21a2f47790b680579e087/frontend/src/models/workspace.js#L76 and pass it in, so it can be aborted from outside where it is used. Would this be the correct way to implement this?

But then I still end up with a blinking cursor at the end of the response (screenshot attached).

Then I thought about emitting a chatResult with type: "finalizeResponseStream", but for that I would need the chatId?

Can you provide some guidance on this topic? Maybe I'm on completely the wrong track.

Thanks
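A minimal sketch of the AbortController idea proposed in the comment above (all names here — streamChat, handleStop, the null sentinel — are illustrative assumptions, not AnythingLLM's actual API): keep one controller at module scope so a "Stop" button handler elsewhere in the UI can reach it, and finalize the message in a finally block so the blinking cursor is cleared even when the stream is aborted.

```javascript
// One controller per in-flight chat, held at module scope so a UI
// "Stop" button can abort it from outside the streaming function.
let activeController = null;

async function streamChat(tokenSource, onToken) {
  activeController = new AbortController();
  const { signal } = activeController;
  try {
    for await (const token of tokenSource) {
      if (signal.aborted) break; // stop consuming the stream
      onToken(token);
    }
  } finally {
    activeController = null; // allow the next chat to start cleanly
    onToken(null); // null sentinel: finalize the message bubble so the
                   // blinking "still streaming" cursor is removed
  }
}

// Wired to the hypothetical "Stop" button.
function handleStop() {
  if (activeController) activeController.abort();
}
```

In the real workspace.js you would presumably also pass the controller's signal into the underlying fetch call, so that the HTTP connection itself is closed rather than just the read loop — though, as noted earlier in the thread, that still does not stop generation on the LLM side.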

sumitsodhi88 commented 8 months ago

Has it been added? I can't see it.

duc-gp commented 8 months ago

> Has it been added? I can't see it.

No, it hasn't been added yet, AFAIK. I'm thinking about implementing it, but I'm not sure what the correct approach would be; that's why I asked for guidance here: https://github.com/Mintplex-Labs/anything-llm/issues/540#issuecomment-1950430063