If I am not mistaken, the default model is configured at the top of the config file with the `model` directive, e.g. `model: claude:claude-3-opus-20240229`.
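A minimal `config.yaml` sketch along those lines, assuming a Claude client (the API key is a placeholder, and field names may differ slightly between aichat versions):

```yaml
# config.yaml -- the top-level `model` directive selects the default model
model: claude:claude-3-opus-20240229
clients:
  - type: claude
    api_key: <your-api-key>   # placeholder, replace with your real key
```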
You are totally right @chiefMarlin, it works! Thanks 👍
If I may, can we leave this issue open for the two other questions:
I would like:
- to know what the current model is in chat mode
- bonus: to restrict the models that can be used for one client; for example, here I could restrict vertexai to only one model (a hypothetical config sketch follows the listing below):
```
$ aichat --list-models
vertexai:gemini-1.0-pro
ollama:llama2
```
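Purely to illustrate the request, a hypothetical `config.yaml` fragment showing what a per-client allow-list could look like. Treating `models` as a restriction is not an existing aichat feature; this is only a sketch of the ask:

```yaml
clients:
  - type: vertexai
    models:
      - name: gemini-1.0-pro   # only this model would be selectable for vertexai
  - type: ollama
    models:
      - name: llama2
```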
`aichat --info` (command mode) or `.info` (REPL mode).
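For example (output omitted, since the exact fields printed vary by version; roughly, it shows the resolved configuration, including the model in use):

```
$ aichat --info      # command mode
$ aichat             # start the REPL...
> .info              # ...then type .info at the prompt
```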
**Is your feature request related to a problem? Please describe.**
It is not really a problem, but I am confused about which model will be used by default in case I have various models set up.
That's my `config.yaml`:
Models are recognized:
**Describe the solution you'd like**
I would like to set the default model in `config.yaml`. For that I guess we can use the field `prelude: model:...`?

**Describe alternatives you've considered**
I tried playing with `config.yaml` to do that, but it didn't work. I tried to force one model like this:
Thanks for the help! Your tool is great.