Open · honnibal opened this issue 6 years ago
At a high level I think this falls under another issue, allowing tasks to eagerly "eat" the entire rest of the parser token list, which maps most closely to #378 at the moment (specifically the `*args` sub-case). That enables things like what you're discussing here: having a single up-front task basically redefine parsing so it can do "meta" things with the rest of the command line.
The other component concerns improvements around `Executor` and `Context`, so that this type of userland code has an easier time recreating "most of" what the default execution loop currently does. `Executor` is a publicly documented/designed API, but it's got some pending work to make it easier to truly reuse in ways the internals had no need for when it was written.
Finally, I wonder how this might dovetail with another recent ticket, #526, since the entire process of "set up things to run, then run them async" seems to map super closely to asyncio's "generate coroutines, create a loop, feed coroutines to loop, tell loop to run somehow" workflow.
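For concreteness, that workflow looks roughly like this in plain asyncio (nothing Invoke-specific; `run_job` here is just a stand-in for running a single command):

```python
import asyncio


async def run_job(cmd):
    # Stand-in for "run one command/task"; here we just shell out asynchronously.
    proc = await asyncio.create_subprocess_shell(cmd)
    return await proc.wait()


def run_all(cmds):
    # Generate coroutines...
    coros = [run_job(cmd) for cmd in cmds]
    # ...create a loop, feed it the coroutines, tell it to run.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        return loop.run_until_complete(asyncio.gather(*coros))
    finally:
        loop.close()


run_all(["echo one", "echo two"])
```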
Don't have time right now to get deeper in this besides those thoughts, unfortunately!
Invoke is super awesome! Congrats :tada:
I see you've been debating designs for parallel execution and DAG structure pretty extensively. I'd like to suggest a pretty simple "userspace" strategy that I think can help, until a more full-featured solution lands in the library.
Inside the `parallel` task, we set a runner that simply collects the commands to be run, and doesn't run anything. Then when `block` exits, we start them all and wait until they're all complete.

For my purposes, I think this will usually be better than the `make -j` style. I'll usually want to parametrise the jobs I run in parallel, so I'd like them to be tasks. I'm also not that excited to have the library decide which parts should be parallel. I mean, sure that's good in theory --- but it's super difficult to get right. It's nice to have synchronous be the default.

Example implementation below. I don't think I understand how to pass state through the `Context` object properly yet. I'm sure there's a solution that's less hacky. It should also be simple to support nested parallelism.
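Something along these lines is what I have in mind, as a rough sketch (the `parallel` helper and `Collector` object are just illustrative names, and driving `invoke.runners.Local` from a thread pool is an assumption about what's safe, not established Invoke usage):

```python
from concurrent.futures import ThreadPoolExecutor
from contextlib import contextmanager

from invoke import task
from invoke.runners import Local


@contextmanager
def parallel(c):
    """Collect commands while the block is open, run them all on exit."""
    queued = []

    class Collector:
        def run(self, command, **kwargs):
            # Don't run anything yet; just remember what was asked for.
            queued.append((command, kwargs))

    yield Collector()
    # Block is closing: start everything at once and wait for completion.
    with ThreadPoolExecutor(max_workers=max(len(queued), 1)) as pool:
        futures = [
            pool.submit(Local(c).run, command, **kwargs)
            for command, kwargs in queued
        ]
        for future in futures:
            future.result()  # re-raises any failure from that command


@task
def build(c):
    with parallel(c) as block:
        block.run("echo docs")
        block.run("echo wheels")
    # Both commands have finished by this point.
```

Each command gets its own `Local` runner so per-command state stays separate; whether sharing one `Context`'s config across threads like this is actually safe is part of what I'm unsure about, and a real version would presumably swap the runner on the `Context` itself rather than yielding a separate collector object.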