Closed: fblanqui closed this issue 2 years ago
`Set.lpo` was compiled with a different version of Lambdapi, but is still up to date with respect to the file modification date. In that case you need to delete the object files manually: it is better not to blindly overwrite a file that may not be part of your library's code. This is not a bug but a feature.
Then I don't understand the point of this option, all the more so since the doc says: "Recompile the files even if they have an up-to-date object file."
This option is still useful for several reasons. For instance, if you want to do performance evaluation, you might want to make sure that every file that you check is actually checked (and not simply loaded from an object file).
Note that this kind of behaviour is the same in most languages. If you try to link with an incompatible `.cmx` file in OCaml, you will get an error.
For the performance evaluation issue, you could just as well remove all `.lpo` files beforehand. Anyway, it seems that the doc should be clarified, shouldn't it?
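As a concrete sketch of that workaround (assuming object files use the `.lpo` extension, as in the example above), one `find` invocation is enough:

```shell
# Demo: wipe all .lpo object files under a directory before a benchmarking
# run, so that every file is actually re-checked rather than loaded.
dir=$(mktemp -d)
touch "$dir/Set.lp" "$dir/Set.lpo"   # a source file and its object file
find "$dir" -name '*.lpo' -delete    # in a real library: find . -name '*.lpo' -delete
ls "$dir"                            # only Set.lp remains
```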
I don't know, I think this behaviour is quite expected. And even if not, the error message is pretty clear.
You're funny Rodolphe. It's always clear to you. Perhaps I'm stupid. But when I read "Recompile the files even if they have an up-to-date object file", I expect LP to recompile them...
Well, I never said that, I'm just expressing my opinion. Everyone has the right to disagree. If that makes you feel better we can expand the doc a bit on that, but I still think this is the right behaviour. If you disagree, I'd be happy to hear your arguments.
I'd say this behavior is a bug, or at least a severe limitation of the incremental rebuilding.
You are right, this used to be hell in OCaml for example, but Dune solved it by properly tracking the dependency of object files on the compiler, and actually on the flags used as well.
Solving this usually requires tracking digests of the dependencies, or some other kind of better dependency tracking. In Coq we solved this by switching to Dune; I dunno, all I can say is that using timestamps is IMVHO a very bad method to decide when to update targets.
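The digest-based idea can be sketched in a few lines of shell (everything here is hypothetical, not how Lambdapi currently works): record a checksum of the compiler binary next to the build products, and rebuild whenever it changes, regardless of timestamps.

```shell
# Hypothetical sketch: detect a compiler upgrade by digest, not by timestamp.
workdir=$(mktemp -d)
compiler=/bin/sh                      # stand-in for the lambdapi binary
stamp="$workdir/compiler.digest"      # digest recorded at the last build

needs_rebuild() {
  current=$(md5sum "$compiler" | awk '{print $1}')
  if [ ! -f "$stamp" ] || [ "$(cat "$stamp")" != "$current" ]; then
    printf '%s' "$current" > "$stamp" # remember which binary we built with
    echo recompile
  else
    echo up-to-date
  fi
}

needs_rebuild   # first call: no digest recorded yet -> recompile
needs_rebuild   # same binary as before -> up-to-date
```

A real build system would of course also digest the source files and flags, but the principle is the same.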
I don't see how that is a severe limitation: the tool recompiles exactly what it needs to. If you have modified only one file, then only this file and those that transitively depend on it are recompiled. That is the usual mode of operation for a user of Lambdapi. The small limitation that we are discussing here only materialises when one updates Lambdapi. If there are object files that were compiled with another version (in fact, currently, with a different binary), then the tool refuses to overwrite them. In that case, running `make clean` would be the typical solution. It is not a big inconvenience.
The rationale behind this choice is that, currently, Lambdapi does not stop recompiling files at package boundaries. And for obvious reasons, it should not go and recompile arbitrary proof libraries that were installed independently on the system. I agree that this behaviour is not the best we could hope for in the long run, but that is what we have now and it works well enough. What I am planning for the future is to have a proper package manager and (parallel) build system, but this is not yet necessary. What we have now is good enough, assuming that users install libraries following the guidelines given in the documentation.
Experience in OCaml and Coq has shown that this is indeed a severe limitation and causes a lot of pain to users as they, for example, change opam switch, update Coq, update Lambdapi, etc. I would not underestimate the impact this has.
For me it boils down to the correctness of the build rules. I guess a build rule for LP looks like:
`file.lobj <- %{bin:lambdapi} /\ file.lp /\ %{bin_deps:file.lp}`
so indeed, if you skip the dependency on lambdapi, your rule is unsound.
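In Dune syntax, such a rule might look like the sketch below; the action line is an assumption about the CLI, not Lambdapi's actual interface.

```
; Hypothetical Dune rule: listing %{bin:lambdapi} among the deps makes the
; object file depend on the compiler binary itself, so upgrading Lambdapi
; invalidates file.lpo automatically (no timestamp comparison involved).
(rule
 (targets file.lpo)
 (deps file.lp %{bin:lambdapi})
 (action (run %{bin:lambdapi} file.lp)))
```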
Writing a build system and a package manager IMVHO seems like very hard tasks, in particular the build system.
It wouldn't be hard to use Dune for building LP files, using a variation of `coq_dune.ml`, or even the extension API that is coming pretty soon; that would solve Frédéric's problem, but yeah, YMMV.
> Writing a build system and a package manager IMVHO seems like very hard tasks, in particular the build system.
All of that is non-trivial, but not so hard either.
As a general principle, it is better to rely on and reuse existing tools whenever possible. We are not going to write and then have to maintain a build system if we can reuse an existing one. The same applies to confluence checkers, termination checkers, automated theorem provers, etc.
Yeah, I agree with that, but then there must be existing tools that do what you want. For instance, I don't think that opam will do the trick (the inability to install two versions of the same library in a single switch is a pain), and I am not convinced that Dune will either. I guess we will have to see when this "extension API" is released.
Another point I was anticipating is integration with Logipedia. I thought it would be a good idea to have whatever package manager we use interact with it, to propose extraction to other tools automatically. And I don't see how to do that kind of thing cleanly without having our own tooling. But maybe that kind of thing is not in Deducteam's plans.
If you want to switch back and forth between several versions of the same piece of code, IMO the solution is a functional package manager such as Guix or Nix. On the other hand, that looks like overkill here.
FYI, if the European e-infrastructure project is accepted, OCamlPro will develop the package manager of Logipedia.