nod-ai / SHARK

SHARK - High Performance Machine Learning Distribution
Apache License 2.0

(Studio2) Centralize and minimize prompt handling for LLMs #2073

Open monorimet opened 5 months ago

monorimet commented 5 months ago

We don't want to lose track of efforts to separate UI from execution, so this issue tracks keeping the two separate moving forward. Currently, a small tweak to the user prompt lives outside of the API (in the UI) because of how the UI receives yielded tokens/history from the LLM API.
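
For reference, one way to centralize this is to keep all prompt construction behind the API call that streams tokens, so the UI only forwards raw user input and consumes the yielded tokens. A minimal sketch, assuming a generator-based API; the names `format_prompt`, `chat`, and `run_model` are illustrative placeholders, not SHARK's actual functions:

```python
from typing import Generator, Iterable


def run_model(prompt: str) -> Iterable[str]:
    # Placeholder for the actual inference call that streams tokens.
    return prompt.split()


def format_prompt(user_input: str, history: list[tuple[str, str]]) -> str:
    # The single, centralized place where the prompt is constructed/tweaked.
    turns = "".join(f"<user>{u}</user><bot>{b}</bot>" for u, b in history)
    return f"{turns}<user>{user_input.strip()}</user>"


def chat(user_input: str, history: list[tuple[str, str]]) -> Generator[str, None, None]:
    # API entry point: the UI passes raw input and history, and only
    # consumes yielded tokens; it never touches prompt formatting.
    prompt = format_prompt(user_input, history)
    for token in run_model(prompt):
        yield token
```

With this layout, any prompt tweak (system prompts, history truncation, whitespace handling) lives in one place inside the API, and the UI stays a thin consumer of the token stream.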