Closed: ChuckJonas closed this 1 year ago
Something to consider here: I played around with this some locally and ended up needing to extract a `ChatCompletionMessage` type. The problem I was seeing is that there is some drift between the messages in `ChatCompletion` and `ChatCompletionOptions`, meaning I got typing issues when trying to pass message replies back in during subsequent requests.
> I played around with this some locally and ended up needing to extract a ChatCompletionMessage type

Ah, very interesting. Seems like a good idea.
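The drift described above could be resolved roughly as follows. This is a hedged sketch, not the library's actual code: the type names (`ChatCompletionMessage`, `ChatCompletionOptions`, `ChatCompletion`) come from the thread, but the field shapes and the `appendReply` helper are assumptions for illustration.

```typescript
// Hypothetical shared message type, extracted so that request options
// and responses agree. Field shapes are assumptions, not the library's API.
type ChatCompletionMessage = {
  role: "system" | "user" | "assistant" | "function";
  content: string | null; // null when the assistant returns a function_call
  name?: string;
  function_call?: { name: string; arguments: string };
};

// Both sides reuse the shared type, so a response message can be
// pushed straight back into the next request without a type error.
interface ChatCompletionOptions {
  model: string;
  messages: ChatCompletionMessage[];
}

interface ChatCompletion {
  choices: { index: number; message: ChatCompletionMessage }[];
}

// Illustrative helper: feeding a reply back into the conversation
// now type-checks, since both sides share ChatCompletionMessage.
function appendReply(
  opts: ChatCompletionOptions,
  completion: ChatCompletion
): ChatCompletionOptions {
  return {
    ...opts,
    messages: [...opts.messages, completion.choices[0].message],
  };
}
```

With separate (drifted) message types on the request and response side, the spread in `appendReply` is exactly where the typing errors would show up.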
LGTM. Can you add tests?
I don't see any actual unit tests in the project? I added an "example" file.
I also added the null coalesce. It's not the most beautiful solution, but as discussed, I'm not sure there is a better option.
Let me know if there is anything else needed to get this out.
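For context, the null coalesce likely matters because an assistant message that carries a `function_call` comes back with `content: null`. A minimal sketch of the idea, with an illustrative `contentOrEmpty` helper (the name and the `Message` shape are assumptions, not the PR's actual code):

```typescript
// When the model returns a function_call, `content` is null rather
// than a string, so it must be coalesced before being used as text.
type Message = { content: string | null };

function contentOrEmpty(msg: Message): string {
  // Nullish coalescing: fall back to "" only when content is null/undefined.
  return msg.content ?? "";
}
```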
Thanks for the PR @ChuckJonas!
OpenAI just released support for "Functions". This is a first pass at supporting that behavior.
https://openai.com/blog/function-calling-and-other-api-updates
NOTE: this has only been lightly tested, but it seems to be working.
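The flow from the announcement above can be sketched as follows. The message shapes follow OpenAI's function-calling API (functions are described with JSON Schema, the model may reply with a `function_call` whose `arguments` is a JSON string, and results go back as a `role: "function"` message); the `dispatch` helper and `FunctionSpec` name are illustrative assumptions, not this library's API.

```typescript
// Shape of a function declaration sent with the request,
// per the function-calling announcement (parameters are JSON Schema).
type FunctionSpec = {
  name: string;
  description?: string;
  parameters: Record<string, unknown>;
};

// An assistant reply may contain a function_call instead of content.
type AssistantMessage = {
  role: "assistant";
  content: string | null;
  function_call?: { name: string; arguments: string }; // arguments is a JSON string
};

// Illustrative dispatcher: if the model asked for a function, run the
// matching local implementation and build the role:"function" message
// that would be sent back on the next request.
function dispatch(
  msg: AssistantMessage,
  impls: Record<string, (args: any) => string>
): string | { role: "function"; name: string; content: string } {
  if (!msg.function_call) return msg.content ?? "";
  const { name, arguments: rawArgs } = msg.function_call;
  const result = impls[name](JSON.parse(rawArgs));
  return { role: "function", name, content: result };
}
```

The loop then repeats: the `role: "function"` message is appended to the conversation and the model is called again to produce a natural-language answer.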