hakaru-dev / hakaru

A probabilistic programming language
BSD 3-Clause "New" or "Revised" License

Maple produces identically non-positive weight (but a sensible answer) #83

Closed yuriy0 closed 7 years ago

yuriy0 commented 7 years ago

Haskell throws an error while parsing (or typechecking) the result of the following:

@Yuriy-PC:~/hakaru/maple $ hk-maple ../tests/RoundTrip/t65.expected.hk
hk-maple.exe: primCoerceFrom@Literal: negative HInt CallStack (from HasCallStack):  
error, called at haskell\Language\Hakaru\Syntax\AST.hs:167:24 in hakaru-0.4.0-8IG2WtWQS8YBJ7MLg4Aabh:Language.Hakaru.Syntax.AST

@Yuriy-PC:~/hakaru/maple $ cat ../tests/RoundTrip/t65.expected.hk
fn x1 real: 
 x0 <~ uniform(0, 1)
 weight((match ((0
                  < 
                 (log(real2prob((x1 + (x0 * (-1))))) * (-1))) && 
                (x0 < (((x1 * prob2real(exp(1))) - 1) * prob2real(exp((-1))))) && 
                (x0 < x1)): 
          true: real2prob(recip(((x1 * (-1)) + x0)))
          false: 0),
        return ())

Presumably Haskell produces this error because the output from Maple contains a weight which is identically non-positive:

lam(x1, HReal(), 
  Weight(piecewise(x1 < exp(-1), 0
                  ,x1 < 1, -ln(x1)-1
                  ,x1 < 1+exp(-1), -1
                  ,x1 < 2, ln(x1-1)
                  ,2 <= x1, 0)
        ,Ret(Hakaru:-Datum(unit, Inl(Done)))))
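To see that this weight really is identically non-positive, one can transcribe the piecewise into plain Python (a throwaway sketch, not Hakaru code; the function name `weight` and the sample grid are mine) and check it numerically:

```python
import math

# Transcription of the piecewise weight Maple returned, checked on a grid
# to confirm it is never positive ("identically non-positive").
def weight(x1):
    if x1 < math.exp(-1):
        return 0.0
    elif x1 < 1:
        return -math.log(x1) - 1      # in (-1, 0] on [exp(-1), 1)
    elif x1 < 1 + math.exp(-1):
        return -1.0
    elif x1 < 2:
        return math.log(x1 - 1)       # negative on [1+exp(-1), 2)
    else:
        return 0.0

xs = [0.01 * k for k in range(1, 301)]   # grid over (0, 3]
print(all(weight(x) <= 0 for x in xs))   # True
```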

You can optionally squint at the above, or look at the nice plots Maple will make:

[plot of the piecewise weight returned by Maple, non-positive everywhere]

The 'simplify' interpretation of weights is that Maple may scale weights to its heart's content; but Hakaru has a stricter interpretation. The weight we really want is:

[plot of the intended weight, the negation of the one above]

This handwritten program is the expected result:

fn x real: 
  x1 = real2prob(x)
  weight( real2prob(negate(
    match( x < exp(-1) ):
     true : 0 
     false:  
     match( x < 1 ):
       true : -log(x1)-1
       false: 
       match( x < 1 + exp(-1) ):
         true : -1 
         false: 
         match( x < 2 ):
           true : log(real2prob(x1 - 1.0))
           false: 0 
   )) , dirac () )

which only differs from the actual result by a negate, and correctly roundtrips through 'simplify'.

It seems the correct thing to do here is to detect in Maple when a weight is identically non-positive, and simply negate it. But such a check is potentially expensive, so it should likely be done only as a final step before sending the result back to Haskell. It is also an extremely specific operation with essentially zero likelihood of affecting Maple's ability to simplify the program. The open question is whether this particular case should be implemented, or whether Maple should instead perform some more general class of 'weight normalization'.
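As a toy illustration of what such a 'weight normalization' pass might decide, here is a sketch over a made-up representation of a piecewise constant weight (plain Python, illustration only; the real check would live in Maple and would have to reason about symbolic branch bodies like -ln(x1)-1, not numeric constants):

```python
# Hypothetical representation: a piecewise weight as a list of
# (guard_description, constant_weight) pairs. If every branch is
# non-positive, the whole weight is identically non-positive and
# the proposed normalization just negates it.

def non_positive(branches):
    return all(w <= 0 for _, w in branches)

def normalize(branches):
    if non_positive(branches):
        return [(g, -w) for g, w in branches]
    return branches

print(normalize([("x < 1", -1.0), ("otherwise", 0.0)]))
```

The point of doing this only as a final step is that negating a weight never helps (or hurts) simplification itself; it only restores the non-negativity invariant Hakaru expects.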

JacquesCarette commented 7 years ago

A negative weight is always an error. Something, somewhere, went wrong. Looks like a - got dropped.

ccshan commented 7 years ago

If you look at (or even trace) how factorize (in NewSLO/From.mpl) calls list_of_mul, you'll see that we do try to keep weights positive. For example, applying unintegrate to Int((1-p^2)*(1+p)*applyintegrand(...),p=0..1) should not yield Weight(-1,...)

But the problem in this issue seems different because, as @JacquesCarette puts it, it looks like a - got dropped, and it's not canceled out by another -. Maybe it would still help to trace(unintegrate).

yuriy0 commented 7 years ago

As it turns out, this wasn't any machine making a mistake - the expected output program itself was completely bogus. It was:

t65' =
     lam $ \t ->
     uniform_0_1  >>= \x->
     withWeight (... &&
                      x < t)
                 (unsafeProb (recip (t * real_ (-1) + x)))

but of course if x < t then x - t is negative.
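Concretely (hypothetical sample values, transcribed into plain Python): whenever the guard x < t holds, the factor handed to unsafeProb is negative, so there is no non-negative weight for Maple to recover:

```python
# Any sample point with x < t, as the guard in t65' requires
# (values chosen arbitrarily for illustration):
t, x = 0.7, 0.3
factor = 1 / (t * (-1) + x)   # recip (t * (-1) + x) = 1 / (x - t)
print(factor < 0)             # True: negative whenever x < t
```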

For example, applying unintegrate to Int((1-p^2)*(1+p)*applyintegrand(...),p=0..1) should not yield Weight(-1,...)

For the sake of posterity: this Int does not get to unintegrate - it is evaluated by elim_intsum.