denizyuret / Knet.jl

Koç University deep learning framework.
https://denizyuret.github.io/Knet.jl/latest

minibatch cannot support 5 dims KnetArray. #406

Open lannisite110 opened 5 years ago

lannisite110 commented 5 years ago
  1. This part works fine, but the part that follows does not:
    julia> dtrn
    Data(Float32[0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0], UInt8[0x05 0x0a … 0x06 0x08], 100, 60000, false, 1:60000, false, (28, 28, 1, 60000), (60000,), KnetArray{Float32,N} where N, Array{UInt8,1})
    julia> train!(LeNet, dtrn)
  2. The next part fails (see the sketch after the stack trace below):

    julia> # minibatching
    
    dtrn = minibatch(xtrn, ytrn, nbatch)
    Data([0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0], [18 16 … 14 20], 100, 1834, false, 1:1834, false, (30, 30, 30, 3, 1834), (1834,), KnetArray{Float64,5}, Array{Int64,1})
    julia> train!(ThreeD_model, dtrn)
    Stacktrace:
    [1] error(::String, ::UInt32) at ./error.jl:42
    [2] macro expansion at /home/users/xdlan/.julia/packages/Knet/rKugL/src/gpu.jl:18 [inlined]
    [3] #conv4_algo#371(::Ptr{Nothing}, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::KnetArray{Float64,5}, ::KnetArray{Float64,5}, ::KnetArray{Float64,5}) at /home/users/xdlan/.julia/packages/Knet/rKugL/src/conv.jl:604
    [4] (::getfield(Knet, Symbol("#kw##conv4_algo")))(::NamedTuple{(:handle,),Tuple{Ptr{Nothing}}}, ::typeof(Knet.conv4_algo), ::KnetArray{Float64,5}, ::KnetArray{Float64,5}, ::KnetArray{Float64,5}) at ./none:0
    [5] #conv4#216(::Ptr{Nothing}, ::Int64, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::KnetArray{Float64,5}, ::KnetArray{Float64,5}) at /home/users/xdlan/.julia/packages/Knet/rKugL/src/conv.jl:39
    [6] conv4(::KnetArray{Float64,5}, ::KnetArray{Float64,5}) at /home/users/xdlan/.julia/packages/Knet/rKugL/src/conv.jl:37
    [7] #forw#4(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Function, ::Param{KnetArray{Float64,5}}, ::Vararg{Any,N} where N) at /home/users/xdlan/.julia/packages/AutoGrad/eAmjh/src/core.jl:91
    [8] forw at /home/users/xdlan/.julia/packages/AutoGrad/eAmjh/src/core.jl:89 [inlined]
    [9] #conv4#219 at ./none:0 [inlined]
    [10] conv4(::Param{KnetArray{Float64,5}}, ::AutoGrad.Result{KnetArray{Float64,5}}) at ./none:0
    [13] #nll#607(::Int64, ::Bool, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Chain, ::KnetArray{Float64,5}, ::Array{Int64,1}) at /home/users/xdlan/.julia/packages/Knet/rKugL/src/loss.jl:328
    [14] nll at /home/users/xdlan/.julia/packages/Knet/rKugL/src/loss.jl:328 [inlined]
    [15] (::getfield(Knet, Symbol("##567#568")){typeof(nll),Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},Chain,KnetArray{Float64,5},Array{Int64,1}})() at /home/users/xdlan/.julia/packages/AutoGrad/eAmjh/src/core.jl:82
    [16] #differentiate#1(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Function) at /home/users/xdlan/.julia/packages/AutoGrad/eAmjh/src/core.jl:54
    [17] differentiate(::Function) at /home/users/xdlan/.julia/packages/AutoGrad/eAmjh/src/core.jl:45
    [18] #train!#566(::typeof(nll), ::Adam, ::getfield(Knet, Symbol("#callback#569")){Float64}, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Chain, ::Data) at /home/users/xdlan/.julia/packages/Knet/rKugL/src/model.jl:60
    [19] train!(::Chain, ::Data) at /home/users/xdlan/.julia/packages/Knet/rKugL/src/model.jl:54
    [20] top-level scope at none:0
    [21] eval(::Module, ::Any) at ./boot.jl:319
    [22] eval_user_input(::Any, ::REPL.REPLBackend) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/REPL/src/REPL.jl:85
    [23] macro expansion at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/REPL/src/REPL.jl:117 [inlined]
    [24] (::getfield(REPL, Symbol("##28#29")){REPL.REPLBackend})() at ./task.jl:259
    ERROR: cudnn.cudnnFindConvolutionForwardAlgorithm error 4
    Stacktrace:
    [1] #differentiate#1(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Function) at /home/users/xdlan/.julia/packages/AutoGrad/eAmjh/src/core.jl:57
    [2] differentiate(::Function) at /home/users/xdlan/.julia/packages/AutoGrad/eAmjh/src/core.jl:45
    [3] #train!#566(::typeof(nll), ::Adam, ::getfield(Knet, Symbol("#callback#569")){Float64}, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Chain, ::Data) at /home/users/xdlan/.julia/packages/Knet/rKugL/src/model.jl:60
    [4] train!(::Chain, ::Data) at /home/users/xdlan/.julia/packages/Knet/rKugL/src/model.jl:54
    [5] top-level scope at none:0
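
The two runs differ in both dimensionality and element type: the working LeNet run feeds 4-D KnetArray{Float32} minibatches, while the failing run feeds 5-D KnetArray{Float64,5} minibatches. The report does not show how either dtrn was constructed, so the following is only a hedged sketch of the presumed calls; the array names, sizes, and xtype values are assumptions for illustration.

    using Knet

    # Hypothetical volumetric data matching the shapes in the failing Data object:
    # (30, 30, 30, 3, 1834) inputs and 1834 integer class labels.
    xtrn = rand(Float64, 30, 30, 30, 3, 1834)
    ytrn = rand(1:20, 1834)
    nbatch = 100

    # Failing configuration as reported: batches arrive as KnetArray{Float64,5}.
    dtrn64 = minibatch(xtrn, ytrn, nbatch; xtype=KnetArray{Float64})

    # The working LeNet run used KnetArray{Float32}; converting the 5-D data to
    # Float32 as well would rule the element type in or out as the trigger.
    dtrn32 = minibatch(Float32.(xtrn), ytrn, nbatch; xtype=KnetArray{Float32})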
ekinakyurek commented 5 years ago

@lannisite110, some personal suggestions before opening issues:

1) Please check out https://help.github.com/articles/basic-writing-and-formatting-syntax/ and https://help.github.com/articles/creating-and-highlighting-code-blocks/ to learn the basics of Markdown formatting, especially code blocks with backticks.

2) The title of an issue should be neither a complete question nor a single word; it should be a short description of the error. You may look at other issues to see what I mean.

3) Errors should be reproducible, so we can understand which part of the current Knet causes the issue. It helps a lot if you can provide a stripped-down Julia snippet that everybody here can run on their own system without needing your full code (see the sketch below this list). Again, please check out issues in other repositories to see how people report them.

4) If you have questions, first look at the Knet docs and the Knet tutorials under this repo. For further questions you can use the email groups.
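
For instance, a self-contained reproduction of the report above could drop the model and training loop entirely and isolate the conv4 call on 5-D KnetArrays. The snippet below is only a hedged sketch assuming a CUDA GPU with cuDNN; all sizes and the choice of element types are made up for illustration.

    using Knet

    # 3-D convolution filter: (Wx, Wy, Wz, Cin, Cout)
    w = KnetArray(rand(Float64, 3, 3, 3, 3, 8))
    # One small batch of volumetric input: (X, Y, Z, Cin, N)
    x = KnetArray(rand(Float64, 30, 30, 30, 3, 4))

    # The reported failure comes from conv4's cuDNN algorithm search, so this
    # single call should be enough to trigger (or not trigger) the same error.
    y = conv4(w, x)
    @show size(y)

    # The same call in Float32, matching the working 4-D LeNet run, for comparison.
    y32 = conv4(KnetArray(rand(Float32, 3, 3, 3, 3, 8)),
                KnetArray(rand(Float32, 30, 30, 30, 3, 4)))
    @show size(y32)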

Best, Ekin