sinequa / sba-angular

Sinequa's Angular-based Search Based Application (SBA) Framework
https://sinequa.github.io/sba-angular/
MIT License
30 stars, 23 forks

set timeout for JsonMethodPlugin post requests #111

Closed. meunierfrederic closed this issue 1 year ago.

meunierfrederic commented 1 year ago

By default, the Angular HttpClient has a timeout of 1 minute. But since OpenAI ChatGPT completion requests may take several minutes to respond, I've added a way to specify a timeout (in minutes) via the options parameter. I'm sure it's not the best way to do it...
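
For context, here is a minimal sketch of the kind of change described above (not the actual PR code; `PluginClient` and `timeoutMinutes` are illustrative names), assuming the RxJS `timeout` operator is applied when the option is present:

```typescript
import { HttpClient } from '@angular/common/http';
import { Injectable } from '@angular/core';
import { Observable } from 'rxjs';
import { timeout } from 'rxjs/operators';

@Injectable({ providedIn: 'root' })
export class PluginClient {
  constructor(private http: HttpClient) {}

  post<T>(url: string, body: unknown, options?: { timeoutMinutes?: number }): Observable<T> {
    let request$ = this.http.post<T>(url, body);
    if (options?.timeoutMinutes) {
      // Fail the observable client-side once the requested duration has elapsed
      request$ = request$.pipe(timeout(options.timeoutMinutes * 60_000));
    }
    return request$;
  }
}
```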

ericleib commented 1 year ago

Are you sure this works? It seems to me that you are adding a timeout on top of the existing one (if any). Looking at this post, it doesn't look like Angular has a default timeout for requests. You might be facing a default timeout of the server (maybe from Sinequa or IIS?). https://stackoverflow.com/questions/45938931/default-and-specific-request-timeout

Either way, this problem will be solved soon, because we are implementing a streaming API, so that very long answers can be displayed right from the beginning.

meunierfrederic commented 1 year ago

> Are you sure this works? It seems to me that you are adding a timeout on top of the existing one (if any). Looking at this post, it doesn't look like Angular has a default timeout for requests. You might be facing a default timeout of the server (maybe from Sinequa or IIS?). https://stackoverflow.com/questions/45938931/default-and-specific-request-timeout

I was inspired by that StackOverflow post to handle a specific timeout per request (via the options parameter). As suggested in the post, using an HttpInterceptor + HttpHeaders would be much more elegant than my solution; see the sketch below. ;-)
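
A rough sketch of that interceptor approach (the `X-Request-Timeout` header name and the default value are illustrative, not part of the framework):

```typescript
import { Injectable } from '@angular/core';
import { HttpEvent, HttpHandler, HttpInterceptor, HttpRequest } from '@angular/common/http';
import { Observable } from 'rxjs';
import { timeout } from 'rxjs/operators';

const DEFAULT_TIMEOUT_MS = 60_000;

@Injectable()
export class TimeoutInterceptor implements HttpInterceptor {
  intercept(req: HttpRequest<any>, next: HttpHandler): Observable<HttpEvent<any>> {
    // A request opts into a longer timeout by setting a custom header
    const header = req.headers.get('X-Request-Timeout');
    const timeoutMs = header ? Number(header) : DEFAULT_TIMEOUT_MS;
    // Strip the marker header before the request leaves the application
    const outgoing = req.clone({ headers: req.headers.delete('X-Request-Timeout') });
    return next.handle(outgoing).pipe(timeout(timeoutMs));
  }
}
```

The interceptor would then be registered with the `HTTP_INTERCEPTORS` token (`multi: true`) in the module providers.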

By default, the Angular HttpClient kills requests that take longer than 1 minute and sets their HTTP status to 504 (I was able to reproduce that). Some claim it's 30s, but that's not what I've observed...

GLLM calls often take longer than 1 minute, since we are using the OpenAI API (not the Azure OpenAI Service; we are still on the waitlist for the Azure OpenAI GPT-4 preview).

> Either way, this problem will be solved soon, because we are implementing a streaming API, so that very long answers can be displayed right from the beginning.

You mean WebSockets?

ericleib commented 1 year ago

Are you sure it's the Angular HttpClient that kills the request, and not Sinequa/IIS/the dev proxy? The solution you propose adds a timeout, but I don't think it can replace the existing one (the pipe() method just appends operators; it has no control over the upstream pipe).
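
To illustrate the point with plain RxJS (an illustration only, not the PR code): a timeout appended downstream cannot override one already in the upstream pipe, so the shortest timeout in the chain still wins.

```typescript
import { of } from 'rxjs';
import { delay, timeout } from 'rxjs/operators';

const upstream$ = of('response').pipe(
  delay(120_000),   // simulate a slow LLM call (2 minutes)
  timeout(60_000)   // timeout already applied upstream (1 minute)
);

// Adding a more generous timeout afterwards does not help: the upstream
// timeout(60_000) still errors the stream after one minute.
upstream$.pipe(timeout(300_000)).subscribe({
  next: value => console.log(value),
  error: err => console.error('Still fails after 60s:', err.name) // TimeoutError
});
```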

> You mean WebSockets?

No, Server-Sent Events https://developer.mozilla.org/fr/docs/Web/API/Server-sent_events/Using_server-sent_events
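
For illustration, a minimal sketch of consuming such a stream in the browser with EventSource (the endpoint URL is hypothetical, not the actual Sinequa API):

```typescript
const source = new EventSource('/api/chat/stream');

source.onmessage = (event: MessageEvent) => {
  // Each chunk of the answer arrives as soon as the server produces it
  appendToAnswer(event.data);
};

source.onerror = () => {
  // The browser auto-reconnects by default; close explicitly on error here
  source.close();
};

function appendToAnswer(chunk: string): void {
  console.log(chunk);
}
```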

meunierfrederic commented 1 year ago

> Are you sure it's the Angular HttpClient that kills the request, and not Sinequa/IIS/the dev proxy? The solution you propose adds a timeout, but I don't think it can replace the existing one (the pipe() method just appends operators; it has no control over the upstream pipe).

Whoops... I tested this "fix" locally. It turns out the timeout is actually triggered by the AWS ELB...