Open: iprovalo opened this issue 4 months ago
Hi @iprovalo, thanks for raising this issue. I like the idea of adding this functionality. I have been thinking of a cleaner design and came up with setting a request timeout when configuring the `OpenAI` client. The timeout then applies to all requests, which should work for most use cases.
```java
OpenAI openAI = OpenAI.newBuilder(System.getenv("OPENAI_API_KEY"))
    .requestTimeout(Duration.ofSeconds(10))
    .build();
```
This has been implemented as part of https://github.com/StefanBratanov/jvm-openai/commit/28bb9c976d0dcea1c8f5a55f44da62a6aac8a69c
Thank you, @StefanBratanov, this is very helpful!
I agree it covers most cases!
However, per-request flexibility would still be very useful if possible. For example, in my case some request types are systematically slower than others, and it would help to be able to differentiate them.
Thank you!
I agree it would be very nice and flexible to modify the `HttpRequest` per request. I will think about a potential design; my main concern is cluttering the API too much. I will keep the issue open for now and update it if I find a solution.
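One way such a per-request design could look (a minimal sketch, not the library's actual API: the `buildRequest` helper and the customizer parameter are hypothetical) is to accept a `Consumer<HttpRequest.Builder>` so callers can tweak each request, e.g. give slower endpoints a longer timeout:

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.time.Duration;
import java.util.function.Consumer;

public class PerRequestTimeoutSketch {

    // Hypothetical helper: the customizer is applied to the builder
    // before the request is created, so callers can override defaults
    // such as the timeout on a per-request basis.
    static HttpRequest buildRequest(URI uri, Consumer<HttpRequest.Builder> customizer) {
        HttpRequest.Builder builder = HttpRequest.newBuilder(uri);
        customizer.accept(builder);
        return builder.build();
    }

    public static void main(String[] args) {
        // A systematically slower request type gets a longer timeout...
        HttpRequest slow = buildRequest(
                URI.create("https://api.openai.com/v1/audio/transcriptions"),
                b -> b.timeout(Duration.ofMinutes(5)));
        // ...while a fast one keeps a tight timeout.
        HttpRequest fast = buildRequest(
                URI.create("https://api.openai.com/v1/chat/completions"),
                b -> b.timeout(Duration.ofSeconds(15)));

        System.out.println(slow.timeout().orElseThrow()); // PT5M
        System.out.println(fast.timeout().orElseThrow()); // PT15S
    }
}
```

The customizer style keeps the client API surface small: one extra overload instead of one setter per `HttpRequest` property.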
In the ChatClient (or any client) it would be nice to have a timeout for the HttpRequest:
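For reference, the standard `java.net.http` API that jvm-openai builds on already supports a timeout on an individual `HttpRequest`; a request exceeding it fails with `HttpTimeoutException`. A minimal sketch (the endpoint URI is illustrative, not a statement about the library's internals):

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.time.Duration;

public class RequestTimeout {
    public static void main(String[] args) {
        // Build a request with its own timeout; if no response arrives
        // within the duration, sending it throws HttpTimeoutException.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.openai.com/v1/chat/completions"))
                .timeout(Duration.ofSeconds(30))
                .build();

        System.out.println(request.timeout().orElseThrow()); // PT30S
    }
}
```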