mikeizbicki / HLearn

Homomorphic machine learning

Migration to ghc-8.0 with subhask-branch ghc-8.0 #84

Open Drezil opened 7 years ago

Drezil commented 7 years ago

Many things have changed between subhask for ghc-7.10 and the subhask branch for ghc-8.0.

I tried to compile it naively, but I ran into many problems when building HLearn. I am currently stuck on Distributions.hs, where many things are no longer defined the way HLearn expects: >< is now a data kind instead of a type family, VectorSpace has been renamed to Vector, and there are several other changes I cannot resolve properly without deeper knowledge of the library.

A few examples of the errors I currently get:

src/HLearn/Models/Distributions.hs:60:12: error:
    • Could not deduce: v ~ (forall g. Monoid g => g)
      from the context: Hilbert v
        bound by the instance declaration
        at src/HLearn/Models/Distributions.hs:59:10-40
      ‘v’ is a rigid type variable bound by
        the instance declaration
        at src/HLearn/Models/Distributions.hs:59:10
      Expected type: Moments v
        Actual type: Moments (forall g. Monoid g => g)
    • In the expression: Moments zero zero zero
      In an equation for ‘zero’: zero = Moments zero zero zero
      In the instance declaration for ‘Monoid (Moments v)’
    • Relevant bindings include
        zero :: Moments v
          (bound at src/HLearn/Models/Distributions.hs:60:5)
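To check that I read the first error correctly: as far as I can tell from the output, the instance builds a Moments value out of three nested zeros, and GHC 8.0 no longer instantiates the inner ones on its own. Below is a self-contained mock-up of that shape (my own minimal Zero class and type families, not the actual subhask/HLearn code), with the inner zeros pinned by explicit annotations under ScopedTypeVariables; an annotation like this is the first thing I would try against the error above.

{-# LANGUAGE ScopedTypeVariables  #-}
{-# LANGUAGE TypeFamilies         #-}
{-# LANGUAGE FlexibleContexts     #-}
{-# LANGUAGE UndecidableInstances #-}

-- Mock-up only: a class with a polymorphic 'zero' method and a
-- Moments-like record, reconstructed from the error output above.
module ZeroSketch where

type family Scalar v
type family Square v

class Zero a where
  zero :: a

data Moments v = Moments (Scalar v) v (Square v)

instance (Zero (Scalar v), Zero v, Zero (Square v)) => Zero (Moments v) where
  -- Annotating each field pins the inner zeros to the concrete field
  -- types instead of leaving GHC to pick a polymorphic type for them.
  zero = Moments (zero :: Scalar v) (zero :: v) (zero :: Square v)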

src/HLearn/Models/Distributions.hs:71:51: error:
    • Could not deduce: Scalar v ~ Scalar (Square v)
      from the context: Hilbert v
        bound by the instance declaration
        at src/HLearn/Models/Distributions.hs:70:10-40
      Expected type: Scalar (Square v)
        Actual type: Scalar (Moments v)
      NB: ‘Scalar’ is a type function, and may not be injective
    • In the second argument of ‘(.*)’, namely ‘r’
      In the third argument of ‘Moments’, namely ‘(c .* r)’
      In the expression: Moments (r * a) (b .* r) (c .* r)
    • Relevant bindings include
        r :: Scalar (Moments v)
          (bound at src/HLearn/Models/Distributions.hs:71:22)
        c :: Square v (bound at src/HLearn/Models/Distributions.hs:71:18)
        b :: v (bound at src/HLearn/Models/Distributions.hs:71:16)
        a :: Scalar v (bound at src/HLearn/Models/Distributions.hs:71:14)
        (.*) :: Moments v -> Scalar (Moments v) -> Moments v
          (bound at src/HLearn/Models/Distributions.hs:71:5)

src/HLearn/Models/Distributions.hs:78:51: error:
    • Could not deduce: Scalar v ~ Scalar (Square v)
      from the context: Hilbert v
        bound by the instance declaration
        at src/HLearn/Models/Distributions.hs:77:10-40
      Expected type: Scalar (Square v)
        Actual type: Scalar (Moments v)
      NB: ‘Scalar’ is a type function, and may not be injective
    • In the second argument of ‘(./)’, namely ‘r’
      In the third argument of ‘Moments’, namely ‘(c ./ r)’
      In the expression: Moments (r / a) (b ./ r) (c ./ r)
    • Relevant bindings include
        r :: Scalar (Moments v)
          (bound at src/HLearn/Models/Distributions.hs:78:22)
        c :: Square v (bound at src/HLearn/Models/Distributions.hs:78:18)
        b :: v (bound at src/HLearn/Models/Distributions.hs:78:16)
        a :: Scalar v (bound at src/HLearn/Models/Distributions.hs:78:14)
        (./) :: Moments v -> Scalar (Moments v) -> Moments v
          (bound at src/HLearn/Models/Distributions.hs:78:5)
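The second and third errors are the same complaint twice: GHC cannot prove Scalar (Square v) ~ Scalar v from Hilbert v alone. Here is a mock-up (again my own minimal Module class and type families, not the real subhask hierarchy) of the shape of those instances, with the missing equality stated explicitly in the instance context, which is how I would expect such instances to be repaired if subhask no longer provides that equality as a superclass.

{-# LANGUAGE TypeFamilies         #-}
{-# LANGUAGE FlexibleContexts     #-}
{-# LANGUAGE UndecidableInstances #-}

-- Mock-up only: the field types and the scaling body are reconstructed
-- from the error output above.
module ScaleSketch where

type family Scalar v
type family Square v

data Moments v = Moments (Scalar v) v (Square v)

class Module m where
  (.*) :: m -> Scalar m -> m

type instance Scalar (Moments v) = Scalar v

instance ( Num (Scalar v)
         , Module v
         , Module (Square v)
         , Scalar (Square v) ~ Scalar v   -- the equality GHC says it cannot deduce
         ) => Module (Moments v) where
  Moments a b c .* r = Moments (r * a) (b .* r) (c .* r)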

src/HLearn/Models/Distributions.hs:117:17: error:
    • Could not deduce: Scalar (Square v) ~ r
      from the context: (FiniteModule v, Hilbert v)
        bound by the instance declaration
        at src/HLearn/Models/Distributions.hs:112:10-63
      ‘r’ is a rigid type variable bound by
        a type expected by the context:
          forall r. Real r => r
        at src/HLearn/Models/Distributions.hs:117:17
      Expected type: forall r. Real r => r
        Actual type: Scalar (Square v)
    • In the second argument of ‘(*)’, namely ‘size sigma’
      In the first argument of ‘(**)’, namely ‘(2 * pi * size sigma)’
      In the first argument of ‘(*)’, namely
        ‘(2 * pi * size sigma) ** (- fromIntegral (dim v) / 2)’
    • Relevant bindings include
        v' :: v (bound at src/HLearn/Models/Distributions.hs:119:13)
        sigma :: Square v
          (bound at src/HLearn/Models/Distributions.hs:122:13)
        mu :: v (bound at src/HLearn/Models/Distributions.hs:121:13)
        v :: Elem (Normal v)
          (bound at src/HLearn/Models/Distributions.hs:116:37)
        m2 :: Square v (bound at src/HLearn/Models/Distributions.hs:116:32)
        m1 :: v (bound at src/HLearn/Models/Distributions.hs:116:29)
        (Some bindings suppressed; use -fmax-relevant-binds=N or -fno-max-relevant-binds)

src/HLearn/Models/Distributions.hs:117:67: error:
    • Could not deduce: Scalar v ~ (forall r. Real r => r)
      from the context: (FiniteModule v, Hilbert v)
        bound by the instance declaration
        at src/HLearn/Models/Distributions.hs:112:10-63
    • In the second argument of ‘(*)’, namely
        ‘(v' `vXm` reciprocal sigma) <> v'’
      In the first argument of ‘exp’, namely
        ‘((- 1 / 2) * (v' `vXm` reciprocal sigma) <> v')’
      In the second argument of ‘(*)’, namely
        ‘exp ((- 1 / 2) * (v' `vXm` reciprocal sigma) <> v')’
    • Relevant bindings include
        v' :: v (bound at src/HLearn/Models/Distributions.hs:119:13)
        sigma :: Square v
          (bound at src/HLearn/Models/Distributions.hs:122:13)
        mu :: v (bound at src/HLearn/Models/Distributions.hs:121:13)
        v :: Elem (Normal v)
          (bound at src/HLearn/Models/Distributions.hs:116:37)
        m2 :: Square v (bound at src/HLearn/Models/Distributions.hs:116:32)
        m1 :: v (bound at src/HLearn/Models/Distributions.hs:116:29)
        (Some bindings suppressed; use -fmax-relevant-binds=N or -fno-max-relevant-binds)
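The last two errors look different: the call site at Distributions.hs:117 apparently expects a value that is polymorphic over every Real instance, and a concrete Scalar (Square v) (or Scalar v) is not such a value. Below is a tiny stand-alone demonstration of that error shape, assuming RankNTypes and nothing from subhask; it only shows why GHC rejects the concrete scalar there, not how the pdf code should be rewritten, which I do not understand well enough yet.

{-# LANGUAGE RankNTypes #-}

-- Demonstration only: a position of type 'forall r. Real r => r' accepts
-- values that work at every Real instance (e.g. integer literals), but
-- not an already-concrete scalar.
module RankSketch where

needsPoly :: (forall r. Real r => r) -> Double
needsPoly x = x            -- instantiate the polymorphic argument at Double

ok :: Double
ok = needsPoly 3           -- a numeric literal is polymorphic enough, so this compiles

-- bad :: Double -> Double
-- bad d = needsPoly d     -- rejected, just like 'size sigma' in the error above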

Are you planning on upgrading HLearn or do you want to focus on subhask for now?

Regards,

Drezil

mikeizbicki commented 7 years ago

Thanks for the report.

I don't have immediate plans for upgrading hlearn. I want to do it, but I have a lot of other work getting in the way. I'm not sure when I'll get around to it.