Closed: jonahbeckford closed this issue 3 months ago.
Computing an over-approximation (counting every external that is defined and might be linked) sounds like a good idea, since externals are a form of dependency (on a flat external namespace).
By contrast, computing exact dependencies for bytecode would require implementing name resolution down to the term level (excluding type-directed disambiguation), which goes far beyond the scope of a module-level language.
The first option sounds sensible, but I fear that the second one would run counter to the idea of codept: computing as little as possible to extract the dependency graph, and deferring everything else to the dependency graph.
Okay, that is perfect. I had no intention of getting codept to compute exact dependencies. Just the union of all externals (which is the first option) is all that is needed.
Thanks!
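The "union of all externals" over-approximation agreed on above can be sketched roughly as follows. This is a hypothetical illustration, not codept's implementation: the name `external_symbols` is made up, and a real analyzer would walk the parsetree rather than scan text. It assumes only the `Str` library shipped with the compiler (run with `ocaml str.cma scan.ml`).

```ocaml
(* Naive sketch: collect the C symbol named by every [external] declaration
   in a source string, without deciding whether the declaration is actually
   reachable or linked.  This over-approximates the C dependencies. *)
let external_symbols source =
  (* matches: external <name> : <type> = "<c_symbol>" *)
  let re = Str.regexp "external[ \t]+[^=]*=[ \t]*\"\\([^\"]+\\)\"" in
  let rec scan pos acc =
    match Str.search_forward re source pos with
    | exception Not_found -> List.rev acc
    | i ->
      scan (i + String.length (Str.matched_string source))
           (Str.matched_group 1 source :: acc)
  in
  scan 0 []

let () =
  let src =
    "let x = 1\n\
     external now : unit -> float = \"caml_example_now\"\n\
     external peek : int -> char = \"caml_example_peek\"\n"
  in
  assert (external_symbols src = ["caml_example_now"; "caml_example_peek"]);
  print_endline "ok"
```

Note that this deliberately ignores whether `now` or `peek` is ever used: deciding that would require the term-level name resolution ruled out above.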
Today `M2l` is focused on the module sublanguage. That makes sense for a dependency analyzer, because the module sublanguage expresses dependencies between OCaml compilation units. However, the `external` keyword introduces a dependency on C compilation units. I'd like to add the C dependencies to `M2l` for several reasons: `external` declarations ... `.cma` (etc.) files ... are all making an ELF-centric flat-namespace assumption that only linker flags are required to bind correctly to libraries, forgetting that two-level-namespace shared libraries exist (Windows DLLs and Mach-O dylibs). On Windows the sane way to load a plugin is to specify the first level of the two-level namespace by executing a `LoadLibrary` call ... and that needs C glue code. I'd like to explore using dependency analysis to discover what the linker flags and the (C) glue code would be. That approach might even remove the need for `ctypes` in simple cases.

Thoughts?
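To make the flat-namespace point concrete, here is a minimal sketch of how an `external` declaration binds an OCaml name to a C symbol purely by string. It reuses `caml_create_bytes`, the runtime primitive behind `Bytes.create`, so it compiles and links with no extra C stubs; any module can claim any primitive in the single flat namespace, which is exactly what makes these dependencies worth tracking.

```ocaml
(* The dependency here is on the C symbol "caml_create_bytes",
   not on any OCaml compilation unit. *)
external create : int -> bytes = "caml_create_bytes"

let () =
  assert (Bytes.length (create 3) = 3);
  print_endline "ok"
```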
This is not urgent, and I wouldn't be starting it tomorrow :)