Closed: vkottler closed this 2 years ago
This is something we should do with #3 in mind as the next major development effort
The first thing that needs to be done is coming up with a data model for something that mimics `make`: "targets", "variables", "dependency declarations", etc.
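A minimal sketch of what such a data model could look like, assuming a `dataclass`-based design (the names `Target` and `Project` are illustrative, not an existing API):

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# Hypothetical make-like data model: names and fields are assumptions.
@dataclass
class Target:
    name: str
    dependencies: List[str] = field(default_factory=list)
    # Optional predicate deciding whether this target is already satisfied.
    satisfied: Callable[[], bool] = lambda: False

@dataclass
class Project:
    # "variables" and "targets", mirroring make's vocabulary.
    variables: Dict[str, str] = field(default_factory=dict)
    targets: Dict[str, Target] = field(default_factory=dict)

    def add(self, target: Target) -> None:
        self.targets[target.name] = target
```

Keeping targets as plain data (rather than parsed rule text) is what makes the later introspection ideas, like dependency-graph visualization, straightforward.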
Since we're probably going to load modules or `eval` code, we don't need to be quite as terse as `make`. A good example of an improvement would be allowing a more "semantic" declaration of what resolves a dependency (e.g. like `make`: file present, file dependencies newer than the target, or something else).
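Those "semantic" dependency declarations could be small named predicates instead of implicit `make` timestamp rules. A sketch, with hypothetical helper names:

```python
import os

# Illustrative dependency predicates; these names are assumptions,
# not part of any existing vmklib API.
def file_present(path: str) -> bool:
    """Dependency satisfied if the file exists at all."""
    return os.path.isfile(path)

def newer_than(dep: str, target: str) -> bool:
    """Satisfied only if 'target' exists and is at least as new as 'dep'
    (the classic make rebuild rule, but stated explicitly)."""
    if not os.path.isfile(target):
        return False
    return os.path.getmtime(target) >= os.path.getmtime(dep)
```

Because each check is a named function rather than implicit behavior, a tool can report *which* predicate failed when explaining why a target isn't satisfied.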
This avenue also enables better introspection, like visualizing the dependency graph or explaining why a target may or may not be satisfied.
I guess there needs to be some kind of `TargetRegistry` that knows how to load modules, has ownership of the state of variables (copies of `os.environ`?), and probably knows how to resolve targets.
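A rough sketch of what that registry could look like. Everything here is an assumption about the eventual design, not vmklib's actual implementation; in particular, the `register(registry)` hook convention for loaded modules is invented for illustration:

```python
import importlib
import os
from typing import Callable, Dict

# Hypothetical TargetRegistry sketch.
class TargetRegistry:
    def __init__(self) -> None:
        # Own a *copy* of the environment so resolvers can't mutate os.environ.
        self.variables: Dict[str, str] = dict(os.environ)
        self.resolvers: Dict[str, Callable[[], bool]] = {}

    def register(self, name: str, resolver: Callable[[], bool]) -> None:
        self.resolvers[name] = resolver

    def load_module(self, module_name: str) -> None:
        # Assumed convention: modules expose a 'register(registry)' hook.
        importlib.import_module(module_name).register(self)

    def resolve(self, name: str) -> bool:
        resolver = self.resolvers.get(name)
        return resolver() if resolver is not None else False
```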
There will definitely be some cross-over with how datazen does what it does (should that be split out into a separate package, maybe?).
Starting to think that maybe we should leave the source of this tool mostly untouched (e.g. don't increase complexity to the point where we need to add dependencies).
What we could do is split off a new package (a core library or something?), figure out how to get this module to dispatch to some external backend, and then maybe also commit canonical recipes for that backend into this package's data (and move the current `make`-based ones into their own directory)?
If we start a new package, we can use all of the existing tooling to bootstrap the workflow until we can upgrade vmklib (to support external / alternative backends).
Maybe each backend is given, say, the `conf.*` file path in this package's install data (for it to initially load), and is otherwise just given the command-line arguments as-is?
We upgraded some of datazen's dynamic task / dependency management and created an event-loop based dynamic dependency resolver: https://github.com/vkottler/vcorelib/issues/9
It can run shell commands or `exec` some other process.
We need to build a framework around this so that we can slowly replace the `make` implementations of things.
We should probably try to first resolve targets using the Python dependency resolver, make the dependency resolver return a list/set of targets that weren't resolved, and then pass those to the `make` implementation (which obviously won't really work on Windows / Mac).
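That precedence could look something like this. The `resolve` interface and function names here are assumptions for the sake of the sketch:

```python
import subprocess
from typing import Iterable, List

# Sketch of the proposed flow: Python resolver first, 'make' for leftovers.
def unresolved_targets(resolver, targets: Iterable[str]) -> List[str]:
    """Return the targets the Python resolver couldn't satisfy."""
    return [target for target in targets if not resolver.resolve(target)]

def dispatch(resolver, targets: Iterable[str]) -> None:
    leftover = unresolved_targets(resolver, targets)
    if leftover:
        # Fall back to make (won't really work on Windows / Mac).
        subprocess.run(["make", *leftover], check=True)
```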
In addition to the dependency resolver taking precedence over the `make` implementations, we need a way to register new tasks with the task resolver from external modules, and generally come up with a framework for how that should work (e.g. common tasks via this package / external code modules, etc.).
> make the dependency resolver return a list/set of targets that weren't resolved
That's complete, along with a few other improvements.
Next task is:
> we need a way to register new tasks to the task resolver from external modules
Once that's in place, it makes sense to incorporate the task manager into vmklib, because it can actually be used. We need to come up with an interface that external modules should have.
The interface for external modules on the `TaskManager` class is added. Now we can instantiate one and come up with the ways we want to use it, and probably also load package modules as tasks that can be executed (but also have a default external-script entry point).
We implement this by allowing both an internal `conf.py` and an external / project-specific `conf.py` to register tasks.
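A project-specific `conf.py` under this scheme might be as small as a single hook. The `register` / `register_task` names are assumptions about the convention, not the confirmed interface:

```python
# Hypothetical project-specific conf.py: the tool imports this file and
# calls its 'register' hook with the task manager instance.
def register(manager) -> None:
    manager.register_task("lint", lambda: print("running lint"))
    manager.register_task("test", lambda: print("running tests"))
```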
We allow disabling `make` target resolution; if it's disabled, the command will error when some of the input targets (e.g. from the command line) weren't resolved.
This is the path forward for running on Windows etc.
We now need to implement tasks in Python that mimic and/or interoperate with the tasks defined in Makefiles.
This enables us to build a backend for parsing rules in Python (if we want to just `eval` files / load modules and require modules to declare certain things, we could do that).
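Loading a rules file as a module and enforcing a required declaration can be done with `importlib` rather than a raw `eval`. A sketch, assuming the same hypothetical `register` convention as above:

```python
import importlib.util

# Sketch: load a Python file as a module and require it to declare a
# 'register' callable (an assumed convention, not an existing API).
def load_tasks(path: str, manager) -> None:
    spec = importlib.util.spec_from_file_location("project_conf", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    if not hasattr(module, "register"):
        raise RuntimeError(f"{path} doesn't declare 'register'!")
    module.register(manager)
```

This keeps the "require modules to declare certain things" check in one place, and fails loudly when a rules file doesn't conform.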