huggingface / chat-ui

Open source codebase powering the HuggingChat app
https://huggingface.co/chat
Apache License 2.0
7.1k stars 1.02k forks

Show character limit upfront #333

Open abitrolly opened 1 year ago

abitrolly commented 1 year ago

There are many frustrations with https://huggingface.co/chat giving incomplete answers. One is listed on this tracker - #287. Others are in the chat-ui community https://huggingface.co/spaces/huggingchat/chat-ui/discussions/195 and in model-specific chats https://huggingface.co/OpenAssistant/oasst-sft-6-llama-30b-xor/discussions/44

It would be nice to show actual model limitations upfront. I am not sure if there is a standard config setting for that, but here is the one for the current model. https://huggingface.co/OpenAssistant/oasst-sft-6-llama-30b-xor/discussions/53
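For context, chat-ui already reads per-model generation parameters from the `MODELS` environment config, so a limit to display could plausibly come from there. A hedged sketch of such an entry (the `truncate` and `max_new_tokens` field names follow the README examples; the values here are illustrative, not the model's actual limits):

```json
{
  "name": "OpenAssistant/oasst-sft-6-llama-30b-xor",
  "parameters": {
    "truncate": 1024,
    "max_new_tokens": 1024
  }
}
```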

(screenshot: suggested limit setting on the model card)

nsarrazin commented 1 year ago

Yeah, thanks for bringing it up to the surface, it's indeed a common issue. Not 100% sure, but I think the length limit is on tokens rather than characters, which makes it hard to display client-side. I did have plans for a "continue generation" button: https://github.com/huggingface/chat-ui/issues/280 so I think that would help with that as well.
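To illustrate why a token limit is awkward to show client-side: the character count is known in the browser, but the token count depends on the model's tokenizer. A minimal sketch (not chat-ui code; `TOKEN_LIMIT` and the ~4-characters-per-token heuristic are assumptions for illustration):

```typescript
// Assumed prompt budget for the model; the real value comes from its config.
const TOKEN_LIMIT = 1024;

// Crude client-side estimate: ~4 characters per token for English text.
// An exact count would require running the model's actual tokenizer.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

function remainingBudget(text: string): number {
  return Math.max(0, TOKEN_LIMIT - estimateTokens(text));
}
```

Because the heuristic can be off by a wide margin for code, non-English text, or unusual symbols, any character limit shown upfront would only be approximate.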

abitrolly commented 1 year ago

@nsarrazin will it really be able to "continue" the interrupted phrase? I've got a feeling that the interrupted answer is lost completely and that the model starts generating a new one from the available tail https://hf.co/chat/r/2zlDKDj

nsarrazin commented 11 months ago

Yeah, the goal is to have a button that just picks up where it last left off. Haven't added it yet, though!
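One way such a button could work, sketched here as an assumption rather than the actual chat-ui implementation: resend the conversation with the truncated assistant message left as the open turn, so the model appends to it instead of starting a fresh answer. `generate` is a hypothetical completion call standing in for the real backend:

```typescript
type Message = { from: "user" | "assistant"; content: string };

// Continue a truncated assistant answer by re-prompting with the partial
// text as the open assistant turn and appending the model's completion.
async function continueGeneration(
  messages: Message[],
  generate: (prompt: string) => Promise<string>
): Promise<Message[]> {
  const last = messages[messages.length - 1];
  if (!last || last.from !== "assistant") {
    throw new Error("last message is not an assistant answer to continue");
  }
  // Serialize the conversation without closing the assistant turn,
  // so the completion extends it rather than replacing it.
  const prompt = messages.map((m) => `${m.from}: ${m.content}`).join("\n");
  const continuation = await generate(prompt);
  last.content += continuation;
  return messages;
}
```

Whether this works in practice depends on the model respecting the open turn; some models restate or rephrase instead of continuing mid-sentence, which matches the behavior reported above.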