simonw / llm

Access large language models from the command-line
https://llm.datasette.io
Apache License 2.0

Add o1 support #570

Closed · kevinburkesegment closed 1 week ago

kevinburkesegment commented 1 week ago

I believe it needs to be added to default_models.py, and some documentation needs to be added as well.

I just tried a local patch and the API reported that o1-preview does not exist or that I don't have access to it, so I'm not sure whether API access hasn't been enabled yet or whether our company just doesn't have it yet.
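
For anyone else wanting to try this before it lands: the built-in OpenAI models are registered through llm's plugin hook, so the patch is presumably just a couple of extra registration lines plus docs. A rough sketch of the additions inside the existing hook (the hook and class names here are my guess from the existing gpt-* registrations, not the actual diff):

@hookimpl
def register_models(register):
    # ...existing gpt-3.5 / gpt-4 registrations stay as they are...
    register(Chat("o1-preview"))
    register(Chat("o1-mini"))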

Quantisan commented 1 week ago

You can add it locally with this config: https://llm.datasette.io/en/stable/openai-models.html#adding-more-openai-models

kevinburkesegment commented 1 week ago

The docs there are not quite right... you need to specify both model_name and model_id; without model_name I get errors.
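
Concretely, an entry along these lines in extra-openai-models.yaml (the file those docs describe) got the model registered for me once both fields were present; aliases are optional on top of this:

- model_id: o1-preview
  model_name: o1-preview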

kevinburkesegment commented 1 week ago

OK, it's there for me now; however, I'm getting this:

Error: Error code: 400 - {'error': {'message': "Unsupported value: 'stream' does not support true with this model. Only the default (false) value is supported.", 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}}

So it seems like you need to use --no-stream in order to get any output. I wonder if there is a way to make that the default.
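
In the meantime the explicit flag works, e.g.:

llm --no-stream -m o1-preview 'hello'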

Quantisan commented 1 week ago

The docs there are not quite right... you need to specify both model_name and model_id; without model_name I get errors.

Good catch! I submitted a PR to fix the doc.

simonw commented 1 week ago

So it seems like you need to use --no-stream in order to get any output. I wonder if there is a way to make that the default.

I think we need a new model option for setting can_stream = False.
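
Something like this shape, i.e. let Chat take a can_stream flag at registration time, so the CLI can skip streaming when it is False (sketch only, the parameter name and signature are not final):

register(Chat("o1-preview", can_stream=False))
register(Chat("o1-mini", can_stream=False))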

simonw commented 1 week ago

I've published a branch with this implemented, but I can't test it myself yet as I don't have a tier 5 API key.

Could someone else test this for me? You can install the branch like this:

pip install https://github.com/simonw/llm/archive/refs/heads/openai-o1.zip
# Or maybe even this:
llm install https://github.com/simonw/llm/archive/refs/heads/openai-o1.zip

Then:

llm -m o1-preview 'hello to o1-preview'
llm -m o1-mini 'hello to o1-preview'

If someone reports those running without incident I'll ship a release with them.

Quantisan commented 1 week ago

$ llm -m o1-mini 'hello to o1-preview'
Hello! It sounds like you're referring to "o1-preview." Could you provide a bit more context or let me know how I can assist you with it?

$ llm -m o1-preview 'hello to o1-preview'
Hello! How can I assist you today?

@simonw your branch works!

simonw commented 1 week ago

Merged that, about to release it.