grpc / grpc-web

gRPC for Web Clients
https://grpc.io
Apache License 2.0

server streaming javascript engine problem #1303

Closed grzegorzewskiflyingdog closed 3 months ago

grzegorzewskiflyingdog commented 1 year ago

Hello

I have tried to make a streaming request in a Photoshop UXP plugin (which uses its own JavaScript engine). The request works, but the results do not come in as a stream; instead they all arrive at once after the request has finished.

This is not a problem in common browsers (Chrome, Firefox, etc.), where the response arrives as a stream. In my example the server sends one "hello" every second.

This is the example code:

  let promiseEngines = new Promise((resolve, reject) => {
    let client = new MultiGreeterClient(SDConfig.url, { metadata }, null);
    let stream = client.sayHello(helloRequest, metadata);

    stream.on('data', (response) => {
      let answer = response.toObject();
      // Each streamed message should arrive here one by one; in UXP they all
      // show up together only after the request has finished.
      console.log("response arrives here as a single answer, not a streaming one!", answer);
    });

    stream.on('end', () => resolve());
    stream.on('error', (err) => reject(err));
  });

I use this command to generate the JavaScript gRPC files:

protoc -I=. engines.proto --js_out=import_style=commonjs:. --grpc-web_out=import_style=commonjs,mode=grpcwebtext:.
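
For completeness, this is roughly how the generated CommonJS modules are pulled into the snippet above (a sketch only; the file names follow from the protoc command for engines.proto):

  // Hypothetical wiring: file names come from protoc's CommonJS output for
  // engines.proto; HelloRequest / MultiGreeterClient are the names used in the snippet above.
  const { HelloRequest } = require('./engines_pb.js');
  const { MultiGreeterClient } = require('./engines_grpc_web_pb.js');

  const helloRequest = new HelloRequest();
  // ...set whatever request fields engines.proto defines before calling sayHello()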

Do you know what could be the reason for this behavior? How is the streaming response delivered to the browser in a gRPC-web request: as a WebSocket or something like that?

Maybe they should somehow improve their JavaScript engine to make it work?
This is the corresponding issue in the Adobe UXP forum: https://forums.creativeclouddeveloper.com/t/server-streaming-grpc-client-problem/5523

sampajano commented 1 year ago

Hi :)

Do you know what could be the reason for this behavior? How is the streaming response delivered to the browser in a gRPC-web request: as a WebSocket or something like that?

I think the most likely reason is that your runtime has its own implementation of XMLHttpRequest, which is what grpc-web uses to send the request and parse streaming responses. If that implementation doesn't propagate status codes and/or partial responses the way common browsers do, streaming would break.

If you'd like to report feedback to them, you could ask whether their XMLHttpRequest implementation supports streaming. And if they say yes, you could then try to dig into the details of how it differs from Chrome etc. (maybe using something like the Chrome debugger, but I'm unfamiliar with that :))
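
If it helps, here is a minimal standalone probe (no grpc-web involved) you could run inside UXP to check whether its XMLHttpRequest exposes partial data while the response is still open. The endpoint path is hypothetical; any URL that sends chunks over time would do:

  const xhr = new XMLHttpRequest();
  xhr.open('GET', SDConfig.url + '/some-streaming-endpoint');  // hypothetical endpoint
  let seen = 0;
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 3) {
      // In Chrome/Firefox this fires repeatedly while data is still arriving,
      // and responseText grows each time. If it never fires (or fires only once
      // right before readyState 4), the runtime is not exposing partial data,
      // which would explain why streaming responses show up all at once.
      console.log('partial bytes so far:', xhr.responseText.length - seen);
      seen = xhr.responseText.length;
    } else if (xhr.readyState === 4) {
      console.log('done, total bytes:', xhr.responseText.length);
    }
  };
  xhr.send();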

hope it helps :)

maja42 commented 1 year ago

Maybe something is buffering the responses, so the browser receives them "all at once" at the end. I had this issue with both nginx and Envoy. Disabling buffering solves the issue.

sampajano commented 3 months ago

@maja42 Hi. Sorry for the late response.

Disabling buffering solves the issue.

I'm assuming the issue is resolved now, so I'm closing it :) (Feel free to re-open if not.)


Also, if you could clarify how exactly you "disabled the buffer", it might be helpful to future readers. Thanks 😃

AF250329 commented 3 months ago

Maybe something is buffering the responses, so the browser receives them "all at once" at the end. I had this issue with both nginx and Envoy. Disabling buffering solves the issue.

What did you actually disable in Envoy and the other places?

maja42 commented 2 months ago

nginx: add proxy_buffering off; (see http://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_buffering)
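
For example (a hypothetical location block; the upstream address is made up, and proxy_buffering off; is the line that matters):

  location / {
      proxy_pass http://127.0.0.1:8080;   # e.g. an Envoy/grpc-web proxy sitting behind nginx
      proxy_http_version 1.1;
      proxy_buffering off;                # pass each chunk to the client immediately
  }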

Envoy: I'm not sure anymore. I think it was either stream_idle_timeout: 43200s (12h) on the listener filter, or timeout: 0s on each gRPC route, so that long-running streams are not aborted.
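
Roughly, those two settings sit here in the Envoy config (a sketch only; cluster and virtual-host names are illustrative):

  # Fragment of an HttpConnectionManager config; only the two timeouts are the point here.
  typed_config:
    "@type": type.googleapis.com/envoy.extensions.filters.network.http_connection_manager.v3.HttpConnectionManager
    stream_idle_timeout: 43200s          # 12h, so idle server-streaming connections are not reset
    route_config:
      name: local_route
      virtual_hosts:
      - name: grpc_service
        domains: ["*"]
        routes:
        - match: { prefix: "/" }
          route:
            cluster: grpc_backend
            timeout: 0s                  # disable the per-route timeout for long-lived streams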