Open mmurad2 opened 1 week ago
The model is configurable through the API :)
You can create a bee with a different model using the Python SDK; just switch the model (depends on your LLM_BACKEND):
https://github.com/i-am-bee/bee-python-sdk/blob/main/examples/basic_usage.py#L18
Or use this curl (for bee-stack):
```shell
# env for bee-stack
BEE_API=localhost:4000
BEE_API_KEY=sk-proj-testkey

curl -X POST \
  "${BEE_API}/v1/assistants" \
  -H "Authorization: Bearer ${BEE_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "meta-llama/llama-3-1-70b-instruct",
    "tools": [
      { "type": "code_interpreter" }
    ]
  }'
```
You can then edit your assistant in the UI.
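For reference, the same request can be sketched in Python. This is a minimal illustration, not the official SDK API: the function name, the use of `requests`-style dispatch, and the `http://` scheme are assumptions; the endpoint path and payload shape mirror the curl example above.

```python
import json

# Assumed base URL and key, matching the env vars in the curl example.
# The http:// scheme is an assumption; adjust for your bee-stack setup.
BEE_API = "http://localhost:4000"
BEE_API_KEY = "sk-proj-testkey"


def build_create_assistant_request(model, tools=None):
    """Hypothetical helper: returns (url, headers, body) for the
    POST /v1/assistants call shown in the curl example."""
    url = f"{BEE_API}/v1/assistants"
    headers = {
        "Authorization": f"Bearer {BEE_API_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "tools": tools or []})
    return url, headers, body


url, headers, body = build_create_assistant_request(
    "meta-llama/llama-3-1-70b-instruct",
    tools=[{"type": "code_interpreter"}],
)
# Send it with any HTTP client, e.g.:
#   requests.post(url, headers=headers, data=body)
```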
IMO it would make sense to make this configurable in the UI.
Issue description
New users don't know which model the bee-stack is using. Ideally they should have the flexibility to specify which model to use, as well as to modify its parameters.
Ideal solution
Related discussion: https://github.com/i-am-bee/internal/issues/2