Closed peperunas closed 9 months ago
Good job - sounds like prompts for chat titles are LLM-dependent, and Ollama needs some other sort of encoding/prompting. Do you remember which model in Ollama? (not necessary for the ticket, just a curiosity)
Note that we are now using the Ollama /chat API, which improves the encoding of the prompt. Models should no longer have issues generating titles.
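To illustrate the difference: Ollama's older /api/generate endpoint takes a single raw prompt string, while /api/chat takes role-tagged messages and lets the model's own chat template handle the encoding. Below is a minimal sketch of how a title-generation request to /api/chat might be built; the system prompt text, model name, and helper function are hypothetical, not the app's actual code.

```python
import json

# Hypothetical title prompt; the prompt the app actually uses may differ.
TITLE_SYSTEM_PROMPT = "Summarize the conversation in a short title of at most five words."

def build_chat_title_request(model: str, conversation_excerpt: str) -> dict:
    """Build a request body for Ollama's /api/chat endpoint.

    With /api/chat, the server applies the model's chat template to the
    role-tagged messages, instead of the client concatenating raw text
    as it would for /api/generate.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": TITLE_SYSTEM_PROMPT},
            {"role": "user", "content": conversation_excerpt},
        ],
        "stream": False,
    }

body = build_chat_title_request("llama2", "User: How do I sort a list in Python?")
print(json.dumps(body, indent=2))
```

The body above would be POSTed to `http://localhost:11434/api/chat` on a default local Ollama install.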
Please let me know if this is correct. I believe this should be the case now.
It works, thank you!
Giulio De Pasquale
PhD Candidate in Program Analysis for Computer Security King’s College London, Strand Campus
Website: https://pepe.runas.rocks
Nice! Probably the only obscure part is that chat titles are generated with the currently selected "functions" model (which could be a local or a cloud LLM). How the "functions" model gets selected is still very opaque.
Describe the bug
Chat titles generated with Ollama models are unrelated to the conversation.
Where is it happening?
To Reproduce
Start a chat and read the chat title.
Screenshots / context
The title also has nothing to do with the conversation.