Closed · oslfmt closed this 1 year ago
`id`, `object`, and `created` should be `Option<_>` as openrouter.ai does not provide them. This makes it hard for those of us using this library with OpenRouter.
Thank you @victor-wei126.
Which types do you determine to be `Option`? I've stripped the `Option` type from any required fields as specified by the API spec, but I'm not sure whether `Option` should be applied to the other types or not.
I determine `Option` from the spec: the fields which are not in the `required` list are `Option`al.
I've noticed there's an enum type `ChatCompletionFunctionCall` and just a `FunctionCall` type. Not sure which I should use.
The best is to look at the OpenAPI spec to determine which one (sometimes the spec doesn't have details about streaming responses; then the best is to execute and see the API response).
Beyond that, how do you go about updating the yaml file? The one in the current repo is out of date. Should I pull the updated spec into this repo as well?
My usual practice was to get the upstream spec in my pull request, and the diff was what I used to update the rest of the code / types.rs.
> ID, Object, and created should be Option<> as openrouter.ai does not provide them
@lpi in the current version of this library, `object` and `created` are not `Option` types. How do you deal with this now? Because in the OpenAI spec these are required fields. Just curious how this affects development - is it not possible to just ignore those fields in the returned response object?
@64bit Thanks for the response.
I've updated the types now to reflect what I think is proper (`Option` or not), trying to follow the latest spec as closely as possible. The main question is whether to leave the `id`, `object`, and `created` fields as `Option` or not; as @lpi mentioned, it is easier if they are, since openrouter.ai does not provide them.
> The best is to look at OpenAPI spec to determine which one
`FunctionCall` and `ChatCompletionFunctionCall` are types defined in the async-openai library, not the official specification itself. I chose to use the `FunctionCall` type as it more closely resembles the official spec.
> My usual practice was to get upstream spec in my pull request AND the diff was what I used to update the rest of the code / types.rs
By upstream spec do you mean the current spec in this repo or the latest openai-openapi spec?
Just out of curiosity - is there any way to automatically test the types against the yaml spec? I read a bit about OpenAPI and saw they have some functionality for automatic code generation. I'm not entirely sure about it, though - I was wondering if you had used this feature, so I can speed up the development here if possible.
> By upstream spec do you mean the current spec in this repo or the latest openai-openapi spec?
I mean the openai-openapi spec, the source of truth.
> Just out of curiosity - is there any way to automatically test the types against the yaml spec?
I shared my experience with auto generation in this comment https://github.com/64bit/async-openai/issues/85#issuecomment-1646752463
I think for auto generation to be practical, there's a need for a good openapi spec to Rust generator, with ability to customize generated names. (There could be one - I haven't searched in a long time)
But for now the best way to test is using examples. They serve two purposes: documentation for users and also testing when making changes.
> ID, Object, and created should be Option<> as openrouter.ai does not provide them
>
> @lpi in the current version of this library, `object` and `created` are not option types. How do you deal with this now? Because in the OpenAI spec these are required types. Just curious how this affects development - is it not possible to just ignore those types in the returned response object?
I created my own fork where those are `Option` types. No, it's not possible to just ignore them, since deserialization will fail and throw an error.
@64bit this PR is ready for review
Thanks!
I'll take a look, btw seems like your Github account changed?
Yup haha changed my name but still me
This is released in v0.13.0
Attempt to solve issue #75.
I've changed the following types to match the OpenAI API spec. Couple of questions:
- I've noticed there's an enum type `ChatCompletionFunctionCall` and just a `FunctionCall` type. Not sure which I should use.
- Beyond that, how do you go about updating the yaml file? The one in the current repo is out of date. Should I pull the updated spec into this repo as well?