huggingface / chat-ui

Open source codebase powering the HuggingChat app
https://huggingface.co/chat
Apache License 2.0

How does huggingchat prompt the model to generate HTML output? #889

Open vgoklani opened 7 months ago

vgoklani commented 7 months ago

How does HuggingChat prompt the LLM to generate HTML output? Where can I find that prompt? I'd like to tweak it. Thanks!

nsarrazin commented 7 months ago

We don't prompt it specifically to generate HTML. Do you have an example of what you mean specifically?

vgoklani commented 7 months ago

Apologies, the question should have been: how are you formatting the text into HTML as it's streaming? Thanks!

nsarrazin commented 7 months ago

Aah no worries. We use marked to parse the text as markdown, which we then render to HTML.

You can see the component that handles this logic here

Let me know if that helps :grin:

This is the block that renders it, and the marked tokens are generated here
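For reference, here is a minimal sketch of that general idea using the marked package: accumulate the streamed text, lex it into markdown tokens, and render those tokens to HTML. This is not the actual chat-ui component; the function and variable names are illustrative only.

```ts
import { marked, type Token } from "marked";

// Accumulate the streamed text so far; re-tokenize on every new chunk.
let buffer = "";

function onChunk(chunk: string): { tokens: Token[]; html: string } {
  buffer += chunk;

  // Lex the accumulated markdown into tokens (headings, paragraphs, code blocks, ...).
  const tokens = marked.lexer(buffer);

  // Render the token list to an HTML string.
  const html = marked.parser(tokens);

  return { tokens, html };
}

// Example: simulate a few streamed chunks.
const chunks = ["# Hel", "lo\n\nSome **bold** text and `code`."];
for (const c of chunks) {
  const { html } = onChunk(c);
  console.log(html);
}
```

In practice the resulting HTML should also be sanitized before being injected into the DOM, since the model output is untrusted.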

julien-c commented 7 months ago

(hi @vgoklani! 👋 )