Nhowka / Elmish.Bridge

Create client-server Fable-Elmish apps keeping a single mindset
MIT License

Question: extend Elmish.Bridge to RPC with a ReplyChannel abstraction #29

Open 0x53A opened 4 years ago

0x53A commented 4 years ago

If this is outside the scope of your vision for Elmish.Bridge, please feel free to just close this ;)


I like the Elmish concept in general, and I already use and like Elmish.Bridge for pushing data from the server to the client.

It is really good for one-way, message-based communication.

But often you also need an RPC-style, request-response flow, and that is a bit cumbersome with just fire-and-forget messages.


I haven't thought too deeply about it, but what would you think about adding RPC capabilities by emulating the actor model with a ReplyChannel abstraction?

The proposed API would look very similar to MailboxProcessor's:

You have a ReplyChannel class/interface.

On the server side, on the hub, you have AskClient / AskClientIf, similar to BroadcastClient / SendClientIf.

On the client side you have Bridge.Ask, complementary to Bridge.Send.


IMO the Msg and update parts of Elmish are already very similar to an actor; this would just extend that capability.

Nhowka commented 4 years ago

I'm not sure if I understand it completely... Would it be the same as the following?

Today:

After that:

Is the idea to make it possible to write code like the following?

let someFunction n =
  async {
    let! value = Bridge.Ask GiveMeValueA
    match value with
    | HaveTheValue (Some a) -> return somethingWith a
    | _ -> return somethingElse
  }

If that's it, I like the idea! Maybe Bridge.Ask could have the same signature as MailboxProcessor.PostAndAsyncReply? Not sure how to implement it, to be honest...
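
For reference, the real MailboxProcessor.PostAndAsyncReply takes a function that builds the message around the reply channel, plus an optional timeout. A Bridge.Ask with the same shape (a purely hypothetical signature at this point, not an existing API) would read something like:

// F# core (real signature):
//   MailboxProcessor.PostAndAsyncReply :
//     buildMessage:(AsyncReplyChannel<'Reply> -> 'Msg) * ?timeout:int -> Async<'Reply>
//
// Hypothetical Bridge.Ask mirroring it (sketch only, nothing shipped):
//   Bridge.Ask :
//     buildMessage:(IReplyChannel<'Reply> -> 'ServerMsg) * ?timeout:int -> Async<'Reply>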

0x53A commented 4 years ago

Yeah, something similar to that.

I'd make it as similar to the MailboxProcessor as possible, so it could look like this:


// defined by Elmish.Bridge

type IReplyChannel<'T> =
    abstract member ReplyWithValue : 'T -> unit
    abstract member ReplyWithException : exn -> unit

// shared user code

type Msg =
| GiveMeValueA of rc: IReplyChannel<int>

// RPC-Client (web-server)

async {
    let! result = hub.Ask(fun rc -> GiveMeValueA rc)
    // result is 42
}

// RPC-server (browser)

let update msg model =
  match msg with
  | GiveMeValueA rc ->
      rc.ReplyWithValue(42)
      model, Cmd.none

Of course, this RPC mechanism should be two-way, so that the browser could also "ask" the web server.

"Not sure how to implement it, to be honest..."

You'd have to transparently track the request/response messages with internal message ids to correlate them, then add a timeout if the other side doesn't respond after some time, etc.

It would add a bunch of additional hidden state that must be tracked.
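
A minimal sketch of that hidden bookkeeping, assuming a Guid correlation id per ask (Envelope, PendingAsks, Register and Resolve are made-up names for illustration, not Elmish.Bridge internals; timeout handling is left out):

open System
open System.Collections.Concurrent
open System.Threading.Tasks

// Illustrative only: every ask is wrapped with a correlation id,
// and the reply carries the same id back so it can be matched up.
type Envelope<'Msg> =
    | Plain of 'Msg
    | Ask of correlationId: Guid * 'Msg
    | Reply of correlationId: Guid * Result<obj, exn>

type PendingAsks<'Msg>() =
    // correlation id -> the caller waiting for its reply
    let pending = ConcurrentDictionary<Guid, TaskCompletionSource<Result<obj, exn>>>()

    /// Wrap a message in an Ask envelope and return an Async that
    /// completes when the matching Reply arrives.
    member _.Register(msg: 'Msg) : Envelope<'Msg> * Async<Result<obj, exn>> =
        let id = Guid.NewGuid()
        let tcs = TaskCompletionSource<Result<obj, exn>>()
        pending.[id] <- tcs
        Ask(id, msg), Async.AwaitTask tcs.Task

    /// Called by the transport when a Reply envelope comes in from the other side.
    member _.Resolve(id: Guid, outcome: Result<obj, exn>) =
        match pending.TryRemove id with
        | true, tcs -> tcs.TrySetResult outcome |> ignore
        | false, _ -> ()  // unknown id (e.g. the ask already timed out): drop it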

Nhowka commented 3 years ago

I might now have the knowledge I was missing to implement this feature. I hope to have some fruitful experiments soon!

Nhowka commented 3 years ago

@0x53A I added a method AskClient on the ServerHub as an experiment in version 5.0.0-rc-4. Are you still interested in testing it?

Also, I'm not sure what to do in the case of multiple clients. Would a callback function receiving the clientDispatcher, the serverDispatcher and the returned value (maybe as Result<'T,exn>) be a good API?

I'll try to implement the reverse communication soon.

Nhowka commented 3 years ago

The client now has a Bridge.AskServer to do the same in the reverse direction as of rc-7!
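
A hedged usage sketch, assuming Bridge.AskServer takes the same kind of message-building function as hub.Ask above (the exact rc-7 signature may differ):

// Assumed usage only; check the rc-7 API for the real signature.
let fetchValue () =
    async {
        // GiveMeValueA carries the IReplyChannel, as in the shared Msg type above
        let! value = Bridge.AskServer(fun rc -> GiveMeValueA rc)
        // value is whatever the other side's update replied with, e.g. 42
        return value
    }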

Nhowka commented 3 years ago

rc-9 changed the transport a little, so the message that goes to the server is now smaller.

Nhowka commented 3 years ago

5.0.0-rc-9-1 now has AskAllClients and AskAllClientsIf. The latter takes a predicate function on the model, and then both take an IReplyChannel<'T> -> 'client to create the message to be sent, a Dispatch<'client> -> Dispatch<'server> -> 'T -> unit to process the value and send new messages if needed, and a Dispatch<'client> -> Dispatch<'server> -> exn -> unit for processing the exception. Not sure how the client sends the exceptions, to be honest.
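
Turning that description into code, a usage sketch could look like the following. The argument order, the overloads, and all the names here (TellMeYourName, NameCollected, hub, model.IsLoggedIn) are assumptions based on this comment, not the definitive rc-9-1 API:

// Sketch only: parameter shapes taken from the description above,
// argument order and all names here are assumptions.
type ClientMsg =
    | TellMeYourName of IReplyChannel<string>

type ServerMsg =
    | NameCollected of string

// hub is assumed to be the ServerHub for these message types
hub.AskAllClientsIf(
    (fun model -> model.IsLoggedIn),              // predicate on the connection's model
    (fun rc -> TellMeYourName rc),                // IReplyChannel<'T> -> 'client
    (fun clientDispatch serverDispatch name ->    // Dispatch<'client> -> Dispatch<'server> -> 'T -> unit
        serverDispatch (NameCollected name)),
    (fun clientDispatch serverDispatch ex ->      // Dispatch<'client> -> Dispatch<'server> -> exn -> unit
        eprintfn "a client failed to reply: %s" ex.Message))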