JuliaGizmos / Reactive.jl

Reactive programming primitives for Julia
http://juliagizmos.github.io/Reactive.jl/

Minimal Reactive attempt #115

Closed: tshort closed this issue 7 years ago

tshort commented 7 years ago

Hi everyone, I threw together a basic package that does a type of Functional Reactive Programming. It generally follows the API that Reactive.jl uses. My reason for implementing it was to use it with Sims to handle data flow. We have been using Reactive, but I wanted something more basic that propagates updates immediately (synchronously).

https://github.com/tshort/ReactiveBasics.jl

In the one benchmark I have in that repo, it runs almost ten times faster than Reactive. On the flip side, it has fewer features. I also pinched some of the tests from Reactive.

So, my reason for bringing this up here is to see if there's any interest in using the code from my package in Reactive.jl. It uses closures heavily, so that probably limits use to Julia v0.5 and up.

The code I have is sufficient for my dataflow needs, but I know a lot of the folks using Reactive use it for GUI stuff, so the use cases are different. I don't know if the style of FRP used in ReactiveBasics is suitable for GUI use cases.
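
Roughly, the flavor of the approach is a tiny closure-based signal that propagates updates in the same call stack. The sketch below is illustrative only, not the actual ReactiveBasics code, and the names (MiniSignal, update!) are made up:

    # Illustrative only: immediate, synchronous propagation via stored closures.
    mutable struct MiniSignal{T}
        value::T
        actions::Vector{Function}   # closures to run on every update
    end
    MiniSignal(x) = MiniSignal(x, Function[])

    function Base.map(f, s::MiniSignal)
        out = MiniSignal(f(s.value))
        push!(s.actions, v -> update!(out, f(v)))   # closure captures f and out
        return out
    end

    function update!(s::MiniSignal, v)
        s.value = v
        for a in s.actions
            a(v)                    # propagate in the same call stack, no task queue
        end
    end

    a = MiniSignal(1)
    b = map(x -> 2x, a)
    update!(a, 3)                   # b.value is now 6, updated synchronously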

shashi commented 7 years ago

Interesting! It is quite simple. Reactive was synchronous before #65, and problems were caused by signals that block for IO -- during the IO a new update might start to propagate and encounter state from a previously incomplete update. As long as you use this for things that do not block for IO, like Sims, this should work just fine! Most GUI stuff I've used Reactive for involves reading from a socket, so queuing and real asynchrony is desirable. I'm sure there are performance improvement opportunities in Reactive itself; I've only ever focused on making it feature complete. I'm happy to see the idea spread!

ps: zip is a better name for the merge you have. ;)

tshort commented 7 years ago

One day, I'll figure out how async stuff works :) Thanks for the name suggestion.

tshort commented 7 years ago

@shashi, I'm wondering if you can layer in asynchronous support on top of synchronous operation. As a trial run, I added an asyncmap here:

https://github.com/tshort/ReactiveBasics.jl/pull/2/files

It follows the same logic as your async_map. It seems to work for the test case you have for async_map. I have to use @sync instead of step in the tests. Is there more that's needed for GUI apps and asynchronous support?

Also, now that Base has asyncmap, you might want to consider using that in Reactive instead of async_map.
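
For reference, Base's asyncmap runs the function on each element in its own task and returns the results in input order; a quick illustration (the sleep is just to make the calls overlap):

    # each call runs in its own task; results keep the input order
    asyncmap(x -> (sleep(0.1); x^2), 1:4)   # == [1, 4, 9, 16]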

Although I don't need it (at least for Sims), I have been wanting to learn more about @async and friends. I can't say I fully follow everything yet.

SimonDanisch commented 7 years ago

I thought the same ;) It would be nice to have @async optional. After profiling in #113, it seems ~80% of the time is spent in task-switching functions and the like. I'd be interested to see the results from this approach (or maybe even help integrate it). For GLVisualize's core event handling I wanted something similar, since the @async introduces pure overhead without any gain there. One further step would be to have a bake function, to say at some point: this event tree is final, now inline everything into one big function, which should give an additional performance gain.
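
As a tiny illustration of the kind of fusion such a hypothetical bake could do, collapsing a chain of maps into one composed function (purely a sketch of the idea, nothing like this exists in Reactive today):

    f = x -> x + 1
    g = x -> 2x
    h = x -> x - 3

    # instead of three signal nodes, each with its own update action...
    baked = x -> h(g(f(x)))   # ...one fused function for the whole chain
    baked(10)                 # == h(g(f(10))) == 19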

shashi commented 7 years ago

There are some subtle race conditions that can arise when using @async with Reactive versus the same code with ReactiveBasics.

Consider the following function... The signal y has a sleep in it representing a blocking function being called during an update to the signal graph:

julia> function test()
           x = Signal(5)
           y = map(a->(sleep(5); -a), x)
           w = Signal(10)
           z = map(println, map(-, y, w))
           # suddenly two events appear in this order:
           @async push!(x, 20) # the @async just represents updates coming from a different task
           @async push!(w, 40)
           sleep(5.5); # wait till previous update goes through
       end

using Reactive:

julia> test()
-15
-30
-60

using ReactiveBasics (exact same code :) - pretty cool that that works):

julia> test()
-15
-45
-60

In the case of ReactiveBasics, it goes ahead and carries out the @async push!(w, 40) task while @async push!(x, 20) is still blocked, because y is still updating.

But in Reactive, push! simply puts the update on a channel (a queue), which lets Reactive preserve the order in which events come in.
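
A very rough sketch of that queued approach (invented names, not Reactive's actual internals):

    # push!es only enqueue; a single runner task drains the queue in order, so a
    # later push cannot overtake an earlier one that is still propagating.
    const updates = Channel{Any}(32)              # bounded update queue

    propagate!(s::Ref, v) = (sleep(0.1); s[] = v) # stand-in for a slow synchronous update

    enqueue!(s, v) = put!(updates, (s, v))        # what push! would do instead of updating directly

    runner = @async while true
        s, v = take!(updates)
        propagate!(s, v)                          # updates are applied one at a time, in order
    end

    a = Ref(0)
    enqueue!(a, 1); enqueue!(a, 2)                # applied in order by the runner task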

To give a real-world scenario where this might be important, imagine an app with two input widgets - one to upload/read a file, the other to process the loaded data. A user might request to read the file (which you handle by push!ing to a signal), and then quickly click the button to process it (which you handle by push!ing to a second signal).

In the Reactive case, this will do the generally expected thing of waiting until the file is loaded and then processing it. In the ReactiveBasics case, it might end up processing previously loaded data while your new file loads, or might even end up trying to process some invalid previous state...

@SimonDanisch Reactive does not use @async by default, only when the channel's queue is full, which should not happen if updates are coming at a pace your code can handle... I think something like bake would be really useful in Tom's use case.

tshort commented 7 years ago

Hmm. That's an interesting case. I'm also wondering if different situations might want different answers. Some apps might want the ReactiveBasics sequence (or at least be okay with it). Others might want all updates to block as long as there are in-progress calculations, so in this example the middle value shouldn't appear. I'm not sure how that could be specified and handled.

shashi commented 7 years ago

In versions before #65, Reactive would just crash if a situation like this occurred. That was fine until it wasn't (https://github.com/JuliaGL/GLVisualize.jl/issues/26, https://github.com/shashi/Escher.jl/issues/100).

I guess there are many different concerns a reactive library can address...

SimonDanisch commented 7 years ago

@shashi, yeah, but it does use the task scheduler because of the put!/take!, which seems to introduce overhead, right? That's what I meant; I just said it in a confusing (wrong) way.

I was thinking about more control over the behavior as well. E.g. when doing the computation in another process, sometimes you simply want to skip new computations as long as the old one is running, and sometimes you want to queue them ;) E.g. when moving a slider, you probably want to skip them, because all the in-between values you scrub past are not really valid anymore. Maybe having different Signal types, e.g. a BufferedSignal, could help... and confuse? :)

shashi commented 7 years ago

There's throttle for that use case... Say you want to allow users to move the slider really fast but only update at a maximum of 60 fps; you can use throttle(1/60, ...). You can also simulate

> when doing the computation in another process, sometimes you simply want to skip new computations as long as the old one is running, and sometimes you want to queue them ;)

using filterwhen and foldp...
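
For the throttle part, something along these lines (a sketch; the slider signal and the println stand in for real UI work):

    using Reactive

    slider = Signal(0.0)                     # stand-in for a slider position signal

    # limit downstream updates to at most ~60 per second
    smoothed = throttle(1/60, slider)

    # the expensive work only runs at the throttled rate
    output = map(x -> println("render at ", x), smoothed)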

SimonDanisch commented 7 years ago

yeah but that's kinda difficult to set up with processes, channels and so forth ;) So it'd be nice to have a default implementation for that!

shashi commented 7 years ago

I agree

tshort commented 7 years ago

I've been looking into ways of managing asynchronous events in a synchronous reactive setup. I found a great example that I ported to Julia here. It uses flatmap to organize processing of signals. That function might be useful in Reactive.

For @shashi's example above with the race condition, I looked at a couple of ways of tackling the problem here.
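
For readers who haven't run into flatmap, the rough idea is: apply a Signal-returning function to each value of an input signal, and follow the latest inner Signal downstream. A sketch, assuming ReactiveBasics' flatmap takes the form flatmap(f, input) (check its docs for the exact signature); lookup here is a made-up stand-in for work that yields a Signal:

    using ReactiveBasics

    lookup(q) = Signal("result for $q")   # made-up stand-in for work that yields a Signal

    queries = Signal("first")
    results = flatmap(lookup, queries)    # assumed form: flatmap(f, input)

    map(println, results)                 # follow whatever the latest inner Signal holds
    push!(queries, "second")              # expected to print "result for second"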