BoundaryML / baml

BAML is a language that helps you get structured data from LLMs, with the best DX possible. Works with all languages. Check out the promptfiddle.com playground
https://docs.boundaryml.com
Apache License 2.0

Guidance on BAML using Ollama `/chat` endpoint with history #1184

Closed. mjb2k closed this issue 1 week ago

mjb2k commented 1 week ago

Hello,

I am noticing that BAML queries the /chat/completions endpoint of the OpenAI-generic client (in my case Ollama) with no message history. My use case requires the LLM to remember its previous queries and responses, so each call needs to include all of the prior messages (and their responses). Is there a way to do this that I couldn't find, or is it unavailable at this time?
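For context, here is roughly how my client is configured (a minimal sketch; the model name and port are just my local setup, pointing BAML's openai-generic provider at Ollama's OpenAI-compatible endpoint):

```baml
// Sketch of the Ollama client config I am using locally.
// base_url and model are assumptions; Ollama exposes its
// OpenAI-compatible API under /v1, which is where BAML sends
// the /chat/completions request.
client<llm> OllamaClient {
  provider "openai-generic"
  options {
    base_url "http://localhost:11434/v1"
    model "llama3.1"
  }
}
```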

Thanks

mjb2k commented 1 week ago

Reference to the API call: https://github.com/ollama/ollama/blob/main/docs/api.md#chat-request-with-history

aaronvg commented 1 week ago

You can use this guide to see how to add messages from previous turns into the chat:

https://docs.boundaryml.com/examples/prompt-engineering/chat

If you create a baml test or look at the raw curl request in the VSCode playground, you can see that we create the same kind of messages array every time you add {{ _.role("user") }} or {{ _.role("assistant") }}.
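Roughly, the pattern from that guide looks like this (a sketch; the class and function names are illustrative, and OllamaClient is whatever openai-generic client you've defined for Ollama):

```baml
// Prior turns are passed in as data and replayed with _.role(),
// so each turn becomes its own message in the request.
class ChatMessage {
  role string      // "user" or "assistant"
  content string
}

function ChatWithHistory(messages: ChatMessage[]) -> string {
  client OllamaClient
  prompt #"
    Answer the user's latest question, taking the chat history into account.

    {% for message in messages %}
    {{ _.role(message.role) }}
    {{ message.content }}
    {% endfor %}
  "#
}
```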

mjb2k commented 1 week ago

> You can use this guide to see how to add messages from previous turns into the chat:
>
> https://docs.boundaryml.com/examples/prompt-engineering/chat
>
> If you create a baml test or look at the raw curl request in the VSCode playground, you can see that we create the same kind of messages array every time you add {{ _.role("user") }} or {{ _.role("assistant") }}.

Thank you.

mjb2k commented 1 week ago

Ok, I see, so you're including the chat history in the prompt, which ends up in the first (and only) message's content field. What about adding the ability to include message history separately from the prompt?

aaronvg commented 1 week ago

Have a look at this example: https://www.promptfiddle.com/New-Project--6IRn. If you check the "raw curl" checkbox, you'll see that we do send it as chat history. The prompt string gets converted into an array of chat-completion messages.
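For instance, a test along these lines (a rough sketch, reusing the ChatWithHistory example from above) lets you inspect the generated messages array in the playground:

```baml
// Rough sketch: run this in the playground and tick "raw curl" to see
// each turn sent as a separate chat-completion message.
test ChatHistoryRoundTrip {
  functions [ChatWithHistory]
  args {
    messages [
      {
        role "user"
        content "My favorite color is blue."
      }
      {
        role "assistant"
        content "Got it, blue."
      }
      {
        role "user"
        content "What is my favorite color?"
      }
    ]
  }
}
```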

mjb2k commented 1 week ago

Ok I understand now, thanks for your help!