pluskid / Mocha.jl

Deep Learning framework for Julia

Warnings/errors after upgrading to julia 0.4 #158

Closed jskDr closed 8 years ago

jskDr commented 8 years ago

On Julia 0.3.10 there were no such warnings or errors. After upgrading to 0.4, I get the following warnings and errors when I run test.jl from the examples in Mocha.jl.

WARNING: Base.String is deprecated, use AbstractString instead. likely near /home/jamessungjinkim/.julia/v0.4/Logging/src/Logging.jl:24
WARNING: Base.String is deprecated, use AbstractString instead. likely near /home/jamessungjinkim/.julia/v0.4/Logging/src/Logging.jl:24
WARNING: Base.String is deprecated, use AbstractString instead. likely near /home/jamessungjinkim/.julia/v0.4/Logging/src/Logging.jl:24
WARNING: Base.String is deprecated, use AbstractString instead. likely near /home/jamessungjinkim/.julia/v0.4/Logging/src/Logging.jl:40
WARNING: Base.String is deprecated, use AbstractString instead. likely near /home/jamessungjinkim/.julia/v0.4/Logging/src/Logging.jl:40
WARNING: Base.String is deprecated, use AbstractString instead. likely near /home/jamessungjinkim/.julia/v0.4/Logging/src/Logging.jl:40

Configuring Mocha...

LoadError: MethodError: `call` has no method matching call(::Type{Dict{Symbol,Any}})
Closest candidates are:
  BoundsError()
  BoundsError(, !Matched::Any...)
  DivideError()
  ...
while loading In[1], in expression starting on line 44

Mocha.SquareLossLayer(loss)
  Inputs ----------------------------
    pred:  Blob(10 x 500)
    label: Blob(10 x 500)


I found a commit in another Julia library where a similar issue was reported and fixed, with the renames noted as follows: https://github.com/carlobaldassi/ArgParse.jl/commit/08dd98f7891d8738b9cbccad3cab15cf338239d9 ("Fix julia 0.4 bindings")

String -> AbstractString
Union(...) -> Union{...}
Nothing -> Void
None -> Union{}
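For context, these are mechanical renames of 0.3-era type names. A minimal sketch of what the substitution looks like in user code (hypothetical function, not taken from Mocha.jl or Logging.jl):

```julia
# Julia 0.3 style -- emits deprecation warnings on 0.4:
#   describe(s::String, n::Union(Int, Nothing)) = ...
# Julia 0.4 style after applying the renames above:
describe(s::AbstractString, n::Union{Int, Void}) =
    n === nothing ? length(s) : n  # Void is the 0.4 type of `nothing`
```

Packages like Logging.jl just need the same substitutions applied throughout their source.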

pluskid commented 8 years ago

Those are warnings from the package Logging.jl. You could either use the git version of Logging.jl, or open an issue in their repo to persuade them to make a new release so that everybody is happy. :)
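Switching to the git version with Julia 0.4's built-in package manager is one line (standard Pkg commands of that era, assuming the fix has landed on Logging.jl's master branch):

```julia
Pkg.checkout("Logging")  # track the package's git master instead of the tagged release
# To return to the registered release later:
# Pkg.free("Logging")
```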

jskDr commented 8 years ago

I got it. I will try to use a git version of Logging.jl.

jskDr commented 8 years ago

Instead of test.jl and the other examples from the Julia 0.3 version of Mocha.jl, I tried the same test files from the Julia 0.4 version. I no longer get any warnings or errors, except the warnings from Logging.jl. All the examples, including mnist.jl, work well on my machines. I have also opened an issue on the Logging.jl GitHub page asking them to fix the problem.

WARNING: Base.String is deprecated, use AbstractString instead. likely near /home/sungjin/.julia/v0.4/Logging/src/Logging.jl:24
WARNING: Base.String is deprecated, use AbstractString instead. likely near /home/sungjin/.julia/v0.4/Logging/src/Logging.jl:24
WARNING: Base.String is deprecated, use AbstractString instead. likely near /home/sungjin/.julia/v0.4/Logging/src/Logging.jl:24
WARNING: Base.String is deprecated, use AbstractString instead. likely near /home/sungjin/.julia/v0.4/Logging/src/Logging.jl:40
WARNING: Base.String is deprecated, use AbstractString instead. likely near /home/sungjin/.julia/v0.4/Logging/src/Logging.jl:40
WARNING: Base.String is deprecated, use AbstractString instead. likely near /home/sungjin/.julia/v0.4/Logging/src/Logging.jl:40
Configuring Mocha...
INFO: Recompiling stale cache file /home/sungjin/.julia/lib/v0.4/HDF5.ji for module HDF5.

14-Oct 23:01:13:DEBUG:root:#DEBUG Checking network topology for back-propagation
14-Oct 23:01:13:DEBUG:root:Init network TEST
14-Oct 23:01:13:DEBUG:root:Init parameter weight for layer ip
14-Oct 23:01:13:DEBUG:root:Init parameter bias for layer ip
14-Oct 23:01:13:DEBUG:root:#DEBUG Initializing coffee breaks
14-Oct 23:01:14:INFO:root: TRAIN iter=000000 obj_val=137.87589131
14-Oct 23:01:14:DEBUG:root:#DEBUG Entering solver loop
14-Oct 23:01:14:INFO:root: TRAIN iter=000100 obj_val=0.99267456
14-Oct 23:01:14:INFO:root: TRAIN iter=000200 obj_val=0.85095908
14-Oct 23:01:14:INFO:root: TRAIN iter=000300 obj_val=0.73049708
14-Oct 23:01:14:INFO:root: TRAIN iter=000400 obj_val=0.62827269
14-Oct 23:01:14:INFO:root: TRAIN iter=000500 obj_val=0.54146989
14-Oct 23:01:14:INFO:root: TRAIN iter=000600 obj_val=0.46770412
14-Oct 23:01:14:INFO:root: TRAIN iter=000700 obj_val=0.40496286
14-Oct 23:01:14:INFO:root: TRAIN iter=000800 obj_val=0.35154738
14-Oct 23:01:14:INFO:root: TRAIN iter=000900 obj_val=0.30602356
14-Oct 23:01:14:INFO:root: TRAIN iter=001000 obj_val=0.26718051
14-Oct 23:01:14:INFO:root: TRAIN iter=001100 obj_val=0.23399562
14-Oct 23:01:14:INFO:root: TRAIN iter=001200 obj_val=0.20560512
14-Oct 23:01:14:INFO:root: TRAIN iter=001300 obj_val=0.18127928
14-Oct 23:01:14:INFO:root: TRAIN iter=001400 obj_val=0.16040144
14-Oct 23:01:14:INFO:root: TRAIN iter=001500 obj_val=0.14245041
14-Oct 23:01:14:INFO:root: TRAIN iter=001600 obj_val=0.12698552
14-Oct 23:01:14:INFO:root: TRAIN iter=001700 obj_val=0.11363412
14-Oct 23:01:14:INFO:root: TRAIN iter=001800 obj_val=0.10208097
14-Oct 23:01:14:INFO:root: TRAIN iter=001900 obj_val=0.09205929
14-Oct 23:01:14:INFO:root: TRAIN iter=002000 obj_val=0.08334323
14-Oct 23:01:14:INFO:root: TRAIN iter=002100 obj_val=0.07574154
14-Oct 23:01:14:INFO:root: TRAIN iter=002200 obj_val=0.06909217
14-Oct 23:01:14:INFO:root: TRAIN iter=002300 obj_val=0.06325775
14-Oct 23:01:14:INFO:root: TRAIN iter=002400 obj_val=0.05812178
14-Oct 23:01:15:INFO:root: TRAIN iter=002500 obj_val=0.05358537
14-Oct 23:01:15:INFO:root: TRAIN iter=002600 obj_val=0.04956456
14-Oct 23:01:15:INFO:root: TRAIN iter=002700 obj_val=0.04598801
14-Oct 23:01:15:INFO:root: TRAIN iter=002800 obj_val=0.04279503
14-Oct 23:01:15:INFO:root: TRAIN iter=002900 obj_val=0.03993396
14-Oct 23:01:15:INFO:root: TRAIN iter=003000 obj_val=0.03736080
14-Oct 23:01:15:INFO:root: TRAIN iter=003100 obj_val=0.03503801
14-Oct 23:01:15:INFO:root: TRAIN iter=003200 obj_val=0.03293353
14-Oct 23:01:15:INFO:root: TRAIN iter=003300 obj_val=0.03101998
14-Oct 23:01:15:INFO:root: TRAIN iter=003400 obj_val=0.02927388
14-Oct 23:01:15:INFO:root: TRAIN iter=003500 obj_val=0.02767513
14-Oct 23:01:15:INFO:root: TRAIN iter=003600 obj_val=0.02620645
14-Oct 23:01:15:INFO:root: TRAIN iter=003700 obj_val=0.02485298
14-Oct 23:01:15:INFO:root: TRAIN iter=003800 obj_val=0.02360192
14-Oct 23:01:15:INFO:root: TRAIN iter=003900 obj_val=0.02244221
14-Oct 23:01:15:INFO:root: TRAIN iter=004000 obj_val=0.02136427
14-Oct 23:01:15:INFO:root: TRAIN iter=004100 obj_val=0.02035979
14-Oct 23:01:15:INFO:root: TRAIN iter=004200 obj_val=0.01942156
14-Oct 23:01:15:INFO:root: TRAIN iter=004300 obj_val=0.01854326
14-Oct 23:01:15:INFO:root: TRAIN iter=004400 obj_val=0.01771940
14-Oct 23:01:15:INFO:root: TRAIN iter=004500 obj_val=0.01694513
14-Oct 23:01:15:INFO:root: TRAIN iter=004600 obj_val=0.01621621
14-Oct 23:01:15:INFO:root: TRAIN iter=004700 obj_val=0.01552888
14-Oct 23:01:15:INFO:root: TRAIN iter=004800 obj_val=0.01487981
14-Oct 23:01:15:INFO:root: TRAIN iter=004900 obj_val=0.01426606
14-Oct 23:01:15:INFO:root: TRAIN iter=005000 obj_val=0.01368499
14-Oct 23:01:15:INFO:root: TRAIN iter=005100 obj_val=0.01313424
14-Oct 23:01:15:INFO:root: TRAIN iter=005200 obj_val=0.01261171
14-Oct 23:01:15:INFO:root: TRAIN iter=005300 obj_val=0.01211547
14-Oct 23:01:15:INFO:root: TRAIN iter=005400 obj_val=0.01164381
14-Oct 23:01:15:INFO:root: TRAIN iter=005500 obj_val=0.01119517
14-Oct 23:01:15:INFO:root: TRAIN iter=005600 obj_val=0.01076811
14-Oct 23:01:15:INFO:root: TRAIN iter=005700 obj_val=0.01036135
14-Oct 23:01:15:INFO:root: TRAIN iter=005800 obj_val=0.00997368
14-Oct 23:01:15:INFO:root: TRAIN iter=005900 obj_val=0.00960402
14-Oct 23:01:15:INFO:root: TRAIN iter=006000 obj_val=0.00925135
14-Oct 23:01:15:INFO:root: TRAIN iter=006100 obj_val=0.00891473
14-Oct 23:01:15:INFO:root: TRAIN iter=006200 obj_val=0.00859332
14-Oct 23:01:15:INFO:root: TRAIN iter=006300 obj_val=0.00828629
14-Oct 23:01:15:INFO:root: TRAIN iter=006400 obj_val=0.00799291
14-Oct 23:01:15:INFO:root: TRAIN iter=006500 obj_val=0.00771247
14-Oct 23:01:15:INFO:root: TRAIN iter=006600 obj_val=0.00744432
14-Oct 23:01:16:INFO:root: TRAIN iter=006700 obj_val=0.00718786
14-Oct 23:01:16:INFO:root: TRAIN iter=006800 obj_val=0.00694249
14-Oct 23:01:16:INFO:root: TRAIN iter=006900 obj_val=0.00670770
14-Oct 23:01:16:INFO:root: TRAIN iter=007000 obj_val=0.00648296
14-Oct 23:01:16:INFO:root: TRAIN iter=007100 obj_val=0.00626780
14-Oct 23:01:16:INFO:root: TRAIN iter=007200 obj_val=0.00606177
14-Oct 23:01:16:INFO:root: TRAIN iter=007300 obj_val=0.00586444
14-Oct 23:01:16:INFO:root: TRAIN iter=007400 obj_val=0.00567540
14-Oct 23:01:16:INFO:root: TRAIN iter=007500 obj_val=0.00549427
14-Oct 23:01:16:INFO:root: TRAIN iter=007600 obj_val=0.00532070
14-Oct 23:01:16:INFO:root: TRAIN iter=007700 obj_val=0.00515433
14-Oct 23:01:16:INFO:root: TRAIN iter=007800 obj_val=0.00499485
14-Oct 23:01:16:INFO:root: TRAIN iter=007900 obj_val=0.00484193
14-Oct 23:01:16:INFO:root: TRAIN iter=008000 obj_val=0.00469529
14-Oct 23:01:16:INFO:root: TRAIN iter=008100 obj_val=0.00455465
14-Oct 23:01:16:INFO:root: TRAIN iter=008200 obj_val=0.00441975
14-Oct 23:01:16:INFO:root: TRAIN iter=008300 obj_val=0.00429032
14-Oct 23:01:16:INFO:root: TRAIN iter=008400 obj_val=0.00416612
14-Oct 23:01:16:INFO:root: TRAIN iter=008500 obj_val=0.00404694
14-Oct 23:01:16:INFO:root: TRAIN iter=008600 obj_val=0.00393254
14-Oct 23:01:16:INFO:root: TRAIN iter=008700 obj_val=0.00382272
14-Oct 23:01:16:INFO:root: TRAIN iter=008800 obj_val=0.00371729
14-Oct 23:01:16:INFO:root: TRAIN iter=008900 obj_val=0.00361605
14-Oct 23:01:16:INFO:root: TRAIN iter=009000 obj_val=0.00351882
14-Oct 23:01:16:INFO:root: TRAIN iter=009100 obj_val=0.00342544
14-Oct 23:01:16:INFO:root: TRAIN iter=009200 obj_val=0.00333573
14-Oct 23:01:16:INFO:root: TRAIN iter=009300 obj_val=0.00324953
14-Oct 23:01:16:INFO:root: TRAIN iter=009400 obj_val=0.00316671
14-Oct 23:01:16:INFO:root: TRAIN iter=009500 obj_val=0.00308712
14-Oct 23:01:16:INFO:root: TRAIN iter=009600 obj_val=0.00301061
14-Oct 23:01:16:INFO:root: TRAIN iter=009700 obj_val=0.00293707
14-Oct 23:01:16:INFO:root: TRAIN iter=009800 obj_val=0.00286636
14-Oct 23:01:16:INFO:root: TRAIN iter=009900 obj_val=0.00279836
14-Oct 23:01:16:INFO:root: TRAIN iter=010000 obj_val=0.00273298
14-Oct 23:01:16:INFO:root: Square-loss (avg over 5000500) = 0.2433