Oh yeah! Seriously the best idea ever! All workers will be required to provide generators as output, and the plugin manager will validate this. Since every worker outputs a generator, as Workbench 'connects' worker to worker they become a set of 'chained generators'. Each worker simply yields data at whatever granularity is natural to it: in some cases that might be one item at a time, but in lots of cases you're sending 'rows' of data, so those workers would have very high granularity. Since our server --> client --> dataframe pipeline is already a generator chain (if you didn't know that, you aren't paying attention), we would literally have a chain all the way from sample to dataframe on the client side! (Insert exploding head here.)
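The chaining idea can be sketched in plain Python. To be clear, the worker names and the `connect()` helper below are hypothetical illustrations, not the actual Workbench plugin-manager API; the point is that each worker consumes a generator and returns a generator, and the connector can validate that contract with `inspect.isgenerator`:

```python
import inspect

def sample_source():
    """Hypothetical first worker: yields raw 'samples' (here, just strings)."""
    for sample in ["alpha", "beta", "gamma"]:
        yield sample

def tokenizer(stream):
    """Hypothetical middle worker: yields at its own, finer granularity
    (one 'row' per character)."""
    for sample in stream:
        for token in sample:
            yield token

def uppercase(stream):
    """Hypothetical final worker: one output per input."""
    for token in stream:
        yield token.upper()

def connect(*workers):
    """A plugin-manager-style connector (illustrative): chain the workers
    and validate that each one actually returns a generator."""
    stream = workers[0]()
    if not inspect.isgenerator(stream):
        raise TypeError(f"{workers[0].__name__} must return a generator")
    for worker in workers[1:]:
        stream = worker(stream)
        if not inspect.isgenerator(stream):
            raise TypeError(f"{worker.__name__} must return a generator")
    return stream

pipeline = connect(sample_source, tokenizer, uppercase)
# Nothing executes until the chain is consumed -- the whole pipeline is lazy.
print("".join(pipeline))
```

Because every link is a generator, the entire chain is lazy: no worker does any work until the far end pulls on it, which is exactly why the server --> client --> dataframe chain composes for free.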
Also, since we're going to be leveraging Flask/NodeJS for our upcoming set of 'web_views', it will be a generator chain all the way up to a web page (view).
A lot of work here, but if it's even remotely possible we're going to give it a shot.