Closed: gubatron closed this issue 1 year ago
Somewhat related topic: `ChatCompletion::builder::create()` works in an async fashion.
Is it out of the question to have a blocking version, so that I don't have to include tokio in my project and can manage my blocking requests in a simple thread, without increasing my dependencies with all the nasty patchwork tokio brings?
This is not a bug. Just add this to Cargo.toml if you want the latest revision:
openai = { git = "https://github.com/valentinegb/openai", rev = "309e2d09e8fd41c4ee18ca2aec33f4d93e67181e" }
Yeah, what @DhruvDh said. Also, v1.0.0-alpha.6 is planned to release on Monday (tomorrow) although I'll probably release it later today after any last-minute changes.
As for a blocking version of the library, sure, that can be done, but it's not a very simple task so please open another issue for it.
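Until a blocking variant of the library exists, the pattern the commenter asks about can be approximated without pulling in a full tokio runtime: a minimal single-threaded executor built from the standard library's `std::task::Wake`. This is a sketch, not the library's API; `fetch_completion` below is a hypothetical stand-in for an async call like `create()`.

```rust
use std::future::Future;
use std::sync::Arc;
use std::task::{Context, Poll, Wake, Waker};
use std::thread::{self, Thread};

/// Unparks the blocked thread when the future signals progress.
struct ThreadWaker(Thread);

impl Wake for ThreadWaker {
    fn wake(self: Arc<Self>) {
        self.0.unpark();
    }
}

/// Minimal blocking executor: polls `fut` on the current thread,
/// parking between polls until the waker fires.
fn block_on<F: Future>(fut: F) -> F::Output {
    let mut fut = Box::pin(fut);
    let waker: Waker = Arc::new(ThreadWaker(thread::current())).into();
    let mut cx = Context::from_waker(&waker);
    loop {
        match fut.as_mut().poll(&mut cx) {
            Poll::Ready(out) => return out,
            Poll::Pending => thread::park(),
        }
    }
}

// Hypothetical stand-in for an async library call such as
// ChatCompletion::builder(...).create(); it resolves immediately here.
async fn fetch_completion(prompt: &str) -> String {
    format!("echo: {prompt}")
}

fn main() {
    // Drive the async call to completion from plain blocking code.
    let reply = block_on(fetch_completion("hello"));
    println!("{reply}");
}
```

For a real network-backed future you would more likely use `tokio::runtime::Runtime::block_on` (since the library's HTTP stack needs a reactor), but the sketch shows why a blocking facade is feasible in principle.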
Bug Description
Kind of need `ChatCompletionMessage` to implement the `Clone` trait, which you did yesterday. I know I could fork the repo and point my Cargo to a local reference (which I have), but it'd be nice if you'd just issue a new tag and publish it to crates.io :), pretty please!
Thank you for this work!
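To illustrate why `Clone` on the message type matters: it lets callers keep a reusable conversation history while handing an owned copy to each request. A minimal sketch, where `ChatMessage` is a hypothetical stand-in for the library's `ChatCompletionMessage`:

```rust
// Hypothetical simplified message type; the real ChatCompletionMessage
// lives in the openai crate.
#[derive(Clone, Debug, PartialEq)]
struct ChatMessage {
    role: String,
    content: String,
}

fn main() {
    let history = vec![ChatMessage {
        role: "user".into(),
        content: "hello".into(),
    }];

    // Without #[derive(Clone)] this would have to move `history`,
    // consuming it and making it unusable for the next request.
    let request_messages = history.clone();

    assert_eq!(history, request_messages);
    println!("history still usable: {} message(s)", history.len());
}
```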
Terminal Output
Reproduction Steps
Expected Behavior
foo
Workaround
Just tagging and publishing the latest changes.
Library Version
1.0.0-alpha.5
rustc Version
1.67.0
cargo Version
1.67.0
Platform
macOS