dialohq / ocaml-grpc

gRPC library for OCaml
https://dialohq.github.io/ocaml-grpc
BSD 3-Clause "New" or "Revised" License

Server_streaming list_features not streaming responses gradually? #36

Closed aryx closed 1 year ago

aryx commented 1 year ago

Hi,

I have a server that serves a stream of responses, each of which takes a long time to compute, and I would like the client to display those responses gradually, as soon as they arrive.

I've modified the routeguide-lwt example in the ocaml-grpc repo with this diff:

diff -u -p -b -B -r -x .semantic.cache -x .depend -x CVS -x .hg -x .svn -x .git -x _darcs /home/pad/work/lang-ocaml/ocaml-grpc/examples/routeguide-lwt/src/ /home/pad/routeguide-lwt/src
diff -u -p -b -B -r -x .semantic.cache -x .depend -x CVS -x .hg -x .svn -x .git -x _darcs /home/pad/work/lang-ocaml/ocaml-grpc/examples/routeguide-lwt/src/server.ml /home/pad/routeguide-lwt/src/server.ml
--- /home/pad/work/lang-ocaml/ocaml-grpc/examples/routeguide-lwt/src/server.ml  2023-08-03 14:04:29.771438529 +0200
+++ /home/pad/routeguide-lwt/src/server.ml  2023-08-03 14:26:29.751509229 +0200
@@ -73,12 +73,21 @@ let calc_distance (p1 : Point.t) (p2 : P
   let c = 2.0 *. atan2 (sqrt a) (sqrt (1.0 -. a)) in
   Float.to_int (r *. c)

+let rec fibonacci n =
+  if n < 3 then
+    1
+  else
+    fibonacci (n-1) + fibonacci (n-2)
+
 let get_feature buffer =
   let decode, encode = Service.make_service_functions RouteGuide.getFeature in
   (* Decode the request. *)
   let point =
     Reader.create buffer |> decode |> function
-    | Ok v -> v
+    | Ok v -> 
+       let _ = fibonacci 40 in
+       Unix.sleepf 0.5;
+       v
     | Error e ->
         failwith
           (Printf.sprintf "Could not decode request: %s" (Result.show_error e))
@@ -123,9 +132,13 @@ let list_features (buffer : string) (f :
   let () =
     List.iter
       (fun (feature : Feature.t) ->
-        if in_range (Option.get feature.location) rectangle then
+        if in_range (Option.get feature.location) rectangle then begin
+            let res = fibonacci 40 in
+            Printf.printf "Computed one feature, fib = %d\n" res;
+            Unix.sleepf 0.001;
+            flush stdout;
           encode feature |> Writer.contents |> f
-        else ())
+        end else ())
       !features
   in
   Lwt.return Grpc.Status.(v OK)

Diff finished.  Thu Aug  3 14:36:48 2023

Unfortunately, when I run in one terminal

$./_build/default/src/server.exe routeguide/data/route_guide_db.json
Listening on port 8080 for grpc requests

and in the other

$ ./_build/default/src/client.exe
*** SIMPLE RPC ***
RESPONSE = {{ name = "Berkshire Valley Management Area Trail, Jefferson, NJ, USA";
  location = (Some { latitude = 409146138; longitude = -746188906 }) }}

*** SERVER STREAMING ***

even though the server seems to compute the responses gradually (it prints progress on stdout), the client seems blocked on SERVER STREAMING and needs to wait for all responses to be computed before finally printing all of them at once.

Is there a way to fix the server or the client so that the responses are displayed gradually? Is Server_streaming the right way to solve this problem, or do I need to use bidirectional streaming?

aryx commented 1 year ago

cc @quernd

aryx commented 1 year ago

Or is it a limitation of Grpc_lwt, and should I use Grpc_eio to get more flexibility?

quernd commented 1 year ago

Thanks for reporting this! It is a limitation of neither Lwt nor server_streaming, but of the example. The client code reads the entire stream and turns it into a list before logging anything. If you add logging inside Lwt_stream.map, you will see each message as it comes in.
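The difference can be illustrated without gRPC at all. In this minimal stdlib-only sketch (the name `produce` and the log messages are made up; a lazy `Seq.t` stands in for the gRPC stream), collecting the stream into a list forces every element before the client prints anything, while iterating prints each element as it is produced:

```ocaml
(* A lazy sequence standing in for the server stream: each element is
   "computed" (and logged) only when the consumer forces it. *)
let produce n : int Seq.t =
  Seq.init n (fun i ->
      Printf.printf "server: computed %d\n" i;
      i)

let () =
  (* Like the current example client: force the whole stream into a list
     first. All "server" lines print before any "client" line. *)
  print_endline "-- collect first --";
  let all = List.of_seq (produce 2) in
  List.iter (fun i -> Printf.printf "client: got %d\n" i) all;

  (* Incremental consumption: "server" and "client" lines interleave,
     so each response is visible as soon as it is produced. *)
  print_endline "-- iterate --";
  Seq.iter (fun i -> Printf.printf "client: got %d\n" i) (produce 2)
```

Lwt_stream behaves analogously: Lwt_stream.to_list waits for the whole stream, whereas Lwt_stream.iter_s (or logging inside Lwt_stream.map) handles each message on arrival.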

Also, in this specific case, you will want to use a non-blocking sleep (Lwt_unix.sleep) instead of the blocking Unix.sleepf. Otherwise the server executes the entire loop without a context switch, so no IO can happen in between.
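Why the blocking call matters can also be shown in miniature. The toy scheduler below is only a sketch: `spawn`/`run` are made-up names, and a queue of thunks stands in for the Lwt event loop (this is not Lwt's API). A handler that does all its work before yielding starves a pending "flush" task; a handler that yields between responses lets the flush interleave:

```ocaml
(* Toy cooperative scheduler: a queue of ready-to-run thunks. *)
let ready : (unit -> unit) Queue.t = Queue.create ()
let spawn f = Queue.add f ready
let run () =
  while not (Queue.is_empty ready) do (Queue.pop ready) () done

let trace = Buffer.create 64
let say s = Buffer.add_string trace (s ^ " ")

let () =
  (* Blocking style: compute every response without yielding once.
     The "flush" task only runs after all the work is done. *)
  spawn (fun () ->
      for i = 1 to 3 do say (Printf.sprintf "compute%d" i) done);
  spawn (fun () -> say "flush");
  run ();
  Printf.printf "blocking:    %s\n" (Buffer.contents trace);

  Buffer.clear trace;
  (* Cooperative style: re-queue after each response, giving the
     "flush" task a chance to run in between. *)
  let rec step i =
    if i <= 3 then begin
      say (Printf.sprintf "compute%d" i);
      spawn (fun () -> step (i + 1))
    end
  in
  spawn (fun () -> step 1);
  spawn (fun () -> say "flush");
  run ();
  Printf.printf "cooperative: %s\n" (Buffer.contents trace)
```

In the blocking run, "flush" appears only after all three computations; in the cooperative run it appears after the first one. Unix.sleepf behaves like the first case (it blocks the whole loop), while Lwt_unix.sleep behaves like the second (it yields to the event loop).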

Both are demonstrated in this commit.

In general, Grpc_eio is the future and probably just as mature or immature as the other backends at this point, so if you can, try it out!

tmcgilchrist commented 1 year ago

@aryx Do you think the example could be clearer about the Lwt behaviour? What could we improve to make it better?

I noticed that the async implementation of route guide prints out the responses as they appear at the client, while the other two implementations read the entire stream first and then print it out. That should get fixed so they all work the same.

aryx commented 1 year ago

I think just merging the commit quernd linked above would make all implementations behave the same and print out responses as they appear at the client.

aryx commented 1 year ago

Closing as switching to grpc-eio solves my issue too.