VictorTaelin / coc-with-math-prims


Evaluation efficiency in type checker #1

Open AndrasKovacs opened 7 years ago

AndrasKovacs commented 7 years ago

Two things seem most relevant here.

First, it seems to me that the type checker evaluates everything call-by-value (following standard JS evaluation order). This can be rather wasteful, because most bindings and many type annotations don't need to be evaluated during checking; strict CBV essentially weak-head normalizes everything in the source code. I think it makes sense to have call-by-need at least for environment entries (the other place where it makes sense is arguments of weak-head-normal applications). Thunks can be implemented directly in JS as objects that overwrite themselves with the resulting value the first time they are forced.

Second, you could use the JIT capabilities of JS during type checking as well. This is not essential, and could result in slower typechecking on some inputs or when used indiscriminately, but the potential speedups are pretty large.

VictorTaelin commented 7 years ago

About (1), I've noticed that... not sure I know how to implement call-by-need correctly, though. I'll keep that in mind.

About (2), how? It is easy to extend it to also compute the final types, but actual type checking requires applying that deep equality check at each application; I thought that wouldn't be possible. I'm not sure now, though...

AndrasKovacs commented 7 years ago

Call-by-need in JS

It could look like this:

    // constructor tags (but we don't have constructor functions, only delaying and forcing)
    var VAL = 0;
    var THUNK = 1;

    function force(th){
        switch (th.ctor) {
            case VAL:
                return th.val;
            case THUNK:
                var v = th.eval();
                th.ctor = VAL; // overwrite the thunk in place, so that
                th.val  = v;   // every reference to it sees the cached value
                return v;
        }
    }

    function delay(f){ return {ctor: THUNK, eval: f};}

And then you delay expressions by wrapping them in a nullary function, for example

    var additionThunk = delay(function(){return 100 + 100;});

Call-by-need is then realized by putting thunks of values into environments instead of values.
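To make that concrete, here is a minimal sketch (not from the project: the `evalCounts` bookkeeping and `veryExpensive` are purely illustrative, and the `force`/`delay` pair from above is repeated, with the small typos fixed, so the snippet runs on its own). Unused environment entries are never evaluated, and used ones are evaluated at most once:

```javascript
// force/delay as above: a thunk is an object that overwrites itself
// with its value the first time it is forced
var VAL = 0;
var THUNK = 1;

function force(th) {
    switch (th.ctor) {
        case VAL:
            return th.val;
        case THUNK:
            var v = th.eval();
            th.ctor = VAL; // overwrite in place: later forces
            th.val  = v;   // just read the cached value
            return v;
    }
}

function delay(f) { return {ctor: THUNK, eval: f}; }

function veryExpensive() { throw new Error("never runs under call-by-need"); }

// the environment maps names to thunks, not values
var evalCounts = {x: 0, y: 0};
var env = {
    x: delay(function(){ evalCounts.x++; return 100 + 100; }),
    y: delay(function(){ evalCounts.y++; return veryExpensive(); })
};

// only variable lookups force; "y" is never looked up, so
// veryExpensive() is never called
force(env.x); // 200: the body runs once
force(env.x); // 200 again: cached, the body does not rerun
// evalCounts is now {x: 1, y: 0}
```

Object identity is what makes the sharing work: if the same thunk object is stored under several names, forcing it once pays for all of them.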

JIT in typechecking

The most convenient way to JIT is to take terms as input and convert them to weak-head-normal values (exactly the same representation as used in the type checker) by outputting a JS expression that evaluates to that value, with actual JS functions in lambda and pi binders. Haskell example with untyped LC:

    {-# language LambdaCase, OverloadedStrings #-}

    import Data.String
    import Data.HashMap.Lazy (HashMap, (!))
    import qualified Data.HashMap.Lazy as M

    data Term = Var !String | !Term :$ !Term | Lam !String !Term deriving (Show)
    data Val  = VVar !String | VApp !Val Val | VLam !String !(Val -> Val)

    instance IsString Term where fromString = Var . fromString
    instance IsString Val  where fromString = VVar . fromString

    infixl 5 :$
    infixl 5 $$

    ($$) :: Val -> Val -> Val
    VLam v t $$ x = t x
    t        $$ x = VApp t x

    eval :: HashMap String Val -> Term -> Val
    eval e = \case
      Var v   -> e ! v
      f :$  x -> eval e f $$ eval e x
      Lam v t -> VLam v $ \val -> eval (M.insert v val e) t

    quote :: Val -> Term
    quote = \case
      VVar v   -> Var v
      VApp f x -> quote f :$ quote x
      VLam v f -> Lam v (quote (f (VVar v)))

    nf :: Term -> Term
    nf = quote . eval M.empty

    -- JIT-ing terms
    ------------------------------------------------------------

    t1 :: Term
    t1 = Lam "f" $ Lam "x" $ "f" :$ "x"

    t1JIT :: Val
    t1JIT = VLam "f" $ \f -> VLam "x" $ \x -> f $$ x

    t2 :: Term
    t2 = (Lam "x" $ Lam "y" "x") :$ "z"

    t2JIT :: Val
    t2JIT = (VLam "x" $ \x -> VLam "y" $ \y -> x) $$ "z"

So, JIT-ing just converts :$ into $$ and named binders into functions. This way JIT can be a drop-in replacement for non-JIT evaluation, as JIT output values have the same observable behavior as non-JIT values. Hence equality checking can be done the same way as well.
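For reference, here is what the same trick could look like in JS, sketched under my own naming (`VVar`/`VApp`/`VLam`, `apply`, `jit` and `quote` are illustrative, not the project's API): `jit` outputs a JS source string whose lambda binders are real JS functions; that string is `eval`-ed once, and afterwards evaluation runs as plain JS calls.

```javascript
// weak-head-normal values, the same shape a checker would use internally
function VVar(name)    { return {tag: "VVar", name: name}; }
function VApp(f, x)    { return {tag: "VApp", f: f, x: x}; }
function VLam(name, f) { return {tag: "VLam", name: name, fun: f}; }

// application on values: beta-reduce when the head is a lambda
function apply(f, x) {
    return f.tag === "VLam" ? f.fun(x) : VApp(f, x);
}

// jit : Term -> string. Emits a JS expression that evaluates to the
// term's value. `bound` records which names are lambda-bound (and thus
// valid JS variables); free variables are emitted as VVar literals.
function jit(t, bound) {
    switch (t.tag) {
        case "Var":
            return bound[t.name] ? t.name : 'VVar("' + t.name + '")';
        case "App":
            return "apply(" + jit(t.f, bound) + ", " + jit(t.x, bound) + ")";
        case "Lam":
            var bound2 = Object.create(bound); // extend the bound set
            bound2[t.name] = true;
            return 'VLam("' + t.name + '", function(' + t.name +
                   '){ return ' + jit(t.body, bound2) + '; })';
    }
}

// quote : Val -> Term, reading values back to syntax
// (for printing, or for the deep equality check)
function quote(v) {
    switch (v.tag) {
        case "VVar": return {tag: "Var", name: v.name};
        case "VApp": return {tag: "App", f: quote(v.f), x: quote(v.x)};
        case "VLam": return {tag: "Lam", name: v.name,
                             body: quote(v.fun(VVar(v.name)))};
    }
}

// t2 from the Haskell example: (\x. \y. x) z
var t2 = {tag: "App",
          f: {tag: "Lam", name: "x",
              body: {tag: "Lam", name: "y", body: {tag: "Var", name: "x"}}},
          x: {tag: "Var", name: "z"}};

var t2Val = eval(jit(t2, Object.create(null))); // compile once
var t2Nf  = quote(t2Val);                       // Lam "y" (Var "z")
```

This assumes binder names are valid JS identifiers; a real implementation would have to sanitize or rename them before splicing them into the generated source.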

VictorTaelin commented 7 years ago

I think I'll leave it that way for now. Reason being I can't afford the time to do it right now... but I'll keep that in mind. Can't thank you enough for coming here, noticing those issues and giving that feedback. And, well, for practically being my professor so many times.