Running into a problem with adding L2 regularization. As an example, I changed the loss function in examples/mnist.jl from
function loss(w,x,ygold)
    ypred = predict(w,x)
    ynorm = logp(ypred,1)  # ypred .- log(sum(exp(ypred),1))
    -sum(ygold .* ynorm) / size(ygold,2)
end
to
function loss(w,x,ygold)
    ypred = predict(w,x)
    ynorm = logp(ypred,1)  # ypred .- log(sum(exp(ypred),1))
    -sum(ygold .* ynorm) / size(ygold,2) + 1e-3*sum(sumabs2(w[1]))
end
I get the following error when running the code:
$ julia mnist.jl
mnist.jl (c) Deniz Yuret, 2016. Multi-layer perceptron model on the MNIST handwritten digit recognition problem from http://yann.lecun.com/exdb/mnist.
opts=(:seed,-1)(:batchsize,100)(:hidden,Int64[])(:epochs,10)(:lr,0.5)(:atype,"KnetArray{Float32}")(:gcheck,0)(:winit,0.1)(:fast,false)
INFO: Loading MNIST...
(:epoch,0,:trn,(0.07851667f0,2.5425951f0),:tst,(0.0784f0,2.5669396f0))
ERROR: LoadError: MethodError: Cannot `convert` an object of type Float32 to an object of type Array{Float64,N}
This may have arisen from a call to the constructor Array{Float64,N}(...),
since type constructors fall back to convert methods.
in sum(::Type{AutoGrad.Grad{1}}, ::Float64, ::Float32, ::AutoGrad.Rec{Float32}) at ./<missing>:0
in backward_pass(::AutoGrad.Rec{Array{Any,1}}, ::AutoGrad.Rec{Float64}, ::Array{AutoGrad.Node,1}) at /home/ju17693/.julia/v0.5/AutoGrad/src/core.jl:254
in (::AutoGrad.##gradfun#1#3{MNIST.#loss,Int64})(::Array{Any,1}, ::Function, ::Array{Any,1}, ::Vararg{Any,N}) at /home/ju17693/.julia/v0.5/AutoGrad/src/core.jl:40
in (::AutoGrad.#gradfun#2)(::Array{Any,1}, ::Vararg{Any,N}) at /home/ju17693/.julia/v0.5/AutoGrad/src/core.jl:39
in #train#1(::Float64, ::Int64, ::Function, ::Array{Any,1}, ::Array{Any,1}) at /home/ju17693/.julia/v0.5/Knet/examples/mnist.jl:51
in (::MNIST.#kw##train)(::Array{Any,1}, ::MNIST.#train, ::Array{Any,1}, ::Array{Any,1}) at ./<missing>:0
in macro expansion at /home/ju17693/.julia/v0.5/Knet/examples/mnist.jl:150 [inlined]
in macro expansion at ./util.jl:188 [inlined]
in main(::Array{String,1}) at /home/ju17693/.julia/v0.5/Knet/examples/mnist.jl:149
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
while loading /home/ju17693/.julia/v0.5/Knet/examples/mnist.jl, in expression starting on line 164
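For what it's worth, sumabs2 already returns a scalar, so the outer sum is redundant, and the failing method in the trace is sum's gradient applied to a scalar Rec{Float32}. Dropping the outer sum, and keeping the constant in Float32 so the loss stays Float32, should sidestep that method, though I haven't verified this end to end and don't know if it's the intended way to write a regularizer in Knet:
function loss(w,x,ygold)
    ypred = predict(w,x)
    ynorm = logp(ypred,1)  # ypred .- log(sum(exp(ypred),1))
    -sum(ygold .* ynorm) / size(ygold,2) + Float32(1e-3)*sumabs2(w[1])
end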