denizyuret / AutoGrad.jl

Julia port of the Python autograd package.

type tests are added #92

Closed ekinakyurek closed 6 years ago

ekinakyurek commented 6 years ago

I added Float32 tests to the testsets.

It turns out that some of the tests fail only for Float32. I added them here:

https://github.com/denizyuret/AutoGrad.jl/blob/89ff1bce4f2e4d394bdda3767615dec243d3210d/test/base.jl#L37

https://github.com/denizyuret/AutoGrad.jl/blob/89ff1bce4f2e4d394bdda3767615dec243d3210d/test/base.jl#L74

https://github.com/denizyuret/AutoGrad.jl/blob/89ff1bce4f2e4d394bdda3767615dec243d3210d/test/linearalgebra.jl#L46

https://github.com/denizyuret/AutoGrad.jl/blob/89ff1bce4f2e4d394bdda3767615dec243d3210d/test/neuralnet.jl#L19
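For context on why only Float32 fails: gradcheck compares the analytic gradient against a finite-difference estimate, and the finite-difference error scales with the element type's machine epsilon. A minimal sketch of the precision gap, not from the PR (sin and cos are just stand-ins for a tested function and its derivative):

for T in (Float64, Float32)
    x = T(0.3)
    h = sqrt(eps(T))                      # typical finite-difference step size
    num = (sin(x + h) - sin(x - h)) / 2h  # central-difference estimate of the derivative
    println(T, ": error = ", abs(num - cos(x)))  # several orders of magnitude larger for Float32
end

A tolerance tuned for Float64 can therefore reject a correct Float32 gradient.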

coveralls commented 6 years ago

Pull Request Test Coverage Report for Build 313

Totals coverage status:
  Change from base Build 311: 0.0%
  Covered Lines: 264
  Relevant Lines: 297

💛 - Coveralls
ekinakyurek commented 6 years ago

We can use a trick if you want the Float32 tests to check only for runtime errors rather than gradient precision.

Here is an example:

function justcheck(f, w, x...; kwargs=[], o...) # define this function in test/gradcheck.jl
    # Unlike gradcheck, this only verifies that the forward and backward
    # passes run without throwing; it does not compare gradient values.
    y = f(w, x...; kwargs...)
    if !isa(y, Number); f = gc_scalar(f); end  # reduce non-scalar outputs to a scalar, as gradcheck does
    g = grad(f)
    g(w, x...; kwargs...)  # run the backward pass; the result is not checked
    return true
end
@testset "base" begin    
    for ft in [Float64,Float32]
        if ft == Float32
            gradcheck=gradcheckN=justcheck # since it is defined in this @testset, it does not create a problem for other tests
        end
        .
        .
        .
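With that rebinding in place, the test lines in the loop run unchanged. A hypothetical example (not from the PR):

@test gradcheck(sum, randn(ft, 3))

On the Float64 pass this still checks the gradient values; on the Float32 pass it only verifies that the forward and backward passes run without throwing.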

What do you think?

denizyuret commented 6 years ago

Manually merged this.