Sorry, but I can't find a solution to this. I hope this problem won't waste your time.
My question is:
My extra-openai-models.yaml under the project root folder looks like this:
- model_id: test
  model_name: gpt-3.5-turbo
  api_base: "https://a_website/v1/chat"
  completion: true
But when I run llm -m test 'What is the capital of France?'
the output is:
Error: 'test' is not a known model
I don't know, maybe I've messed up the YAML file?
Where did you place the file? I put extra-openai-models.yaml in ~/Library/Application\ Support/io.datasette.llm/ (I'm on a Mac) and then it recognised the test model.
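A minimal sketch of that fix: create the file in llm's user directory rather than the project root. The macOS path below matches the reply above; on other platforms the directory differs, and (as far as I know) the LLM_USER_PATH environment variable can override it.

```shell
# Sketch, assuming the default llm user directory on macOS.
CONFIG_DIR="$HOME/Library/Application Support/io.datasette.llm"
mkdir -p "$CONFIG_DIR"

# Write the model definition there (note the two-space indent
# under the "- model_id" list item, which YAML requires):
cat > "$CONFIG_DIR/extra-openai-models.yaml" <<'EOF'
- model_id: test
  model_name: gpt-3.5-turbo
  api_base: "https://a_website/v1/chat"
  completion: true
EOF

# Then check that the model is registered:
# llm models   # 'test' should now appear in the list
```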