MLton / mlton

The MLton repository
http://mlton.org

Support incremental parsing and type-checking #141

Open · DemiMarie opened 8 years ago

DemiMarie commented 8 years ago

While MLton is a whole-program compiler, it might be possible to parse and typecheck files individually, and then store the parsed and typechecked data structures in a compact binary format for fast loading later. This could make the common (in my experience) case of reporting an error in one of a few changed files much faster.

jonsterling commented 8 years ago

If it is indeed possible, this would be _amazing_! It might allow us to adopt MLton for all of sml-red-jonprl, without having to support SML/NJ builds for development.

MatthewFluet commented 8 years ago

Anything is possible. :smile:

This is something that I've long thought would be very helpful for speeding up the front-end; one does spend a non-trivial amount of time elaborating the Basis Library and other stable libraries before getting to the newly changed code and the reported error.

Unfortunately, I think that it would require a substantial rewrite of the MLton front-end. Type inference in general, and MLton's implementation of type-checking in particular, uses a lot of mutable state; so it isn't as simple as pickling the output from type-checking a single unit. (I would argue that the unit of caching should be an MLB file, rather than a source file; the MLKit works that way: A Framework for Cut-Off Incremental Recompilation and Inter-Module Optimization.)

Another approach, which MLton used to use, is to elaborate the Basis Library and then use the MLton.World facilities to dump the heap; the bin/mlton script would then load this heap and continue. But the interaction between the compile-time arguments and the MLB semantics makes robust support for that difficult; see the discussion at "Eliminating library elaboration time". Also, the increasingly common default of address space layout randomization (ASLR) for executables breaks MLton.World on many systems.
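For reference, the heap-dump approach looks roughly like the following. This is only a minimal sketch: `elaborateBasis` and `"basis.world"` are placeholders, and only `MLton.World.save`/`MLton.World.load` and the `Original`/`Clone` status constructors are actual MLton library facilities.

```sml
(* Sketch of the MLton.World pattern: do expensive setup once, snapshot
 * the heap, and let later runs resume from the snapshot. *)

fun elaborateBasis () = ()  (* placeholder for the expensive work *)

val worldFile = "basis.world"  (* hypothetical snapshot path *)

val () =
   (elaborateBasis ()
    ; case MLton.World.save worldFile of
         MLton.World.Original =>
            (* First run: the world file was just written; exit here
             * and let a wrapper script reuse it on later runs. *)
            print "saved world\n"
       | MLton.World.Clone =>
            (* A later run resumed from the saved world, with the
             * elaborated state already in the heap. *)
            print "continuing from saved world\n")

(* A wrapper (like the old bin/mlton script) would restart from the
 * snapshot via:  MLton.World.load worldFile  -- which never returns
 * normally; execution continues at the save point as a Clone. *)
```

The fragility Matthew mentions follows directly from this shape: the saved world bakes in absolute heap addresses, so ASLR (which relocates the executable on each run) can make the restored heap inconsistent, and any compile-time flags that affect elaboration must match between the run that saved the world and the run that loads it.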

But, again, this would certainly be a nice feature to have.