JuliaDiff / ReverseDiff.jl

Reverse Mode Automatic Differentiation for Julia

Please use 'ReverseDiff.value' #211

Closed · bdas123 closed this 2 years ago

bdas123 commented 2 years ago

I am getting the following error when using ReverseDiff.jl instead of Zygote.jl. I'd like to resolve it, since ReverseDiff.jl, unlike Zygote.jl, can handle mutating arrays.

'ArgumentError: Converting an instance of ReverseDiff.TrackedReal{Float64, Float64, ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}} to Float64 is not defined. Please use `ReverseDiff.value` instead.' occurred while calling julia code:
optimization(x0)
mohamed82008 commented 2 years ago

Your code is probably not generic enough. Read about generic programming in Julia. If you post an example of a function that gives this error I can help.
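A minimal, package-free sketch of what "generic enough" means here (the function names `f_nongeneric`/`f_generic` are illustrative, not from the thread): a buffer allocated with `zeros(n)` is always `Vector{Float64}`, so any non-Float64 `Real` written into it must be converted to Float64, which is exactly the conversion that `TrackedReal` refuses.

```julia
# Non-generic: the buffer is always Vector{Float64}, whatever eltype(x) is.
function f_nongeneric(x)
    out = zeros(length(x))          # Vector{Float64}
    for i in eachindex(x)
        out[i] = 2 * x[i]           # forces convert(Float64, x[i])
    end
    return sum(out)
end

# Generic: the buffer's element type follows the input's.
function f_generic(x)
    out = zeros(eltype(x), length(x))
    for i in eachindex(x)
        out[i] = 2 * x[i]
    end
    return sum(out)
end

f_generic([1//2, 1//2])     # 2//1 — the Rational element type survives
f_nongeneric([1//2, 1//2])  # 2.0  — silently collapsed to Float64
```

With Rationals the non-generic version merely loses the element type; with ReverseDiff's `TrackedReal`, which defines no conversion to Float64, the same pattern throws the error above.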

mohamed82008 commented 2 years ago

If this issue is resolved, please close the issue. If not, please post a minimal example to help.

bdas123 commented 2 years ago

Posting a minimal example right now

bdas123 commented 2 years ago

Here is the Python code:

import numpy as np

from julia.api import Julia
jl = Julia(compiled_modules=False)
from julia import Main

a = np.random.rand(408)
b = 8
c = 34
d = 41
l = np.random.rand(41, 34)
m = np.random.rand(41, 34)
p = 1
q = 1
r = np.random.rand(41, 34)
s = np.random.rand(34)
t = np.random.rand(41, 8)
u = np.random.rand(41, 34)
v = np.random.rand(34)
w = 1

Main.a = a
Main.b = b
Main.c = c
Main.d = d
Main.l = l
Main.m = m
Main.p = p
Main.q = q
Main.r = r
Main.s = s
Main.t = t
Main.u = u 
Main.v = v
Main.w = w

%%time

jl.eval('include("Dummy Julia v2.jl")')

%%time

jl.eval('calc_sse(a)')

%%time

jl.eval('optimization(a)')

Here is the Julia code:

import Pkg
Pkg.add("Nonconvex")

using AbstractDifferentiation, ReverseDiff
backend = AbstractDifferentiation.ReverseDiffBackend()

using Nonconvex
Nonconvex.@load NLopt

function calc_sse(a)
    f = a[1:c]
    g = a[c+1:2*c]
    h = a[2*c+1:3*c]
    k = a[3*c+1:4*c]

    n = rand(b, c)

    o = zeros((d, c))

    mf = 1.0 ./ (1.0 .- (r * f))

    fi = ((t .* mf) * n) .* u

    qt = g

    for e in 1:d
        for j in 1:c
            if l[e, j] != 0.0
                o[e, j] = qt[j] + (1 - n[j]) * (fi[e, j] + v[j])
                qt[j] = o[e, j]
            end
        end
    end

    diff = sum((o .- l).^2, dims=1) ./ transpose(s)

    sse = sum(diff) / c

    return sse
end

function optimization(a)
    model = Model(calc_sse)
    addvar!(model, zeros(408), zeros(408) .+ 1000, init=a)
    alg = NLoptAlg(:LD_MMA)
    options = NLoptOptions(ftol_rel = 1.0)
    ad_model = abstractdiffy(model, backend)
    result = optimize(ad_model, alg, a, options = options)
    return result.minimum, result.minimizer, result
end
bdas123 commented 2 years ago

Here is the error I am getting:

JuliaError: Exception 'ArgumentError: Converting an instance of ReverseDiff.TrackedReal{Float64, Float64, Nothing} to Float64 is not defined. Please use `ReverseDiff.value` instead.' occurred while calling julia code:
optimization(a)
mohamed82008 commented 2 years ago
o = zeros((d, c))

The line above creates a Matrix of Float64. You need to do "generic programming" by using the element type of `a` as the element type of `o`. ReverseDiff doesn't compute with Float64 but with TrackedReal numbers, so your code must be generic over the element type for ReverseDiff to differentiate it.

o = zeros(eltype(a), d, c)
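The failure mode can be reproduced without ReverseDiff using a toy wrapper type (hypothetical, purely for illustration) that, like `TrackedReal`, defines no conversion to Float64:

```julia
# Toy stand-in for TrackedReal: a Real with no convert-to-Float64 defined.
struct Toy <: Real
    v::Float64
end
Base.zero(::Type{Toy}) = Toy(0.0)

a = [Toy(1.0), Toy(2.0)]

o_bad = zeros(2, 1)              # Matrix{Float64}
# o_bad[1] = a[1]                # errors: no conversion from Toy to Float64

o_good = zeros(eltype(a), 2, 1)  # Matrix{Toy} — no conversion needed
o_good[1] = a[1]                 # works
```

ReverseDiff turns the same failed conversion into the "Please use `ReverseDiff.value`" ArgumentError seen above.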
bdas123 commented 2 years ago

Awesome, my solution to the problem was the following line:

o = zeros(Real,d,c)

Now the code is working.
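Both fixes work, though they differ: `zeros(Real, d, c)` allocates an abstractly typed `Matrix{Real}` that accepts any Real (including `TrackedReal`) at the cost of boxed element storage, while `zeros(eltype(a), d, c)` stays concretely typed whenever the input array is. A quick sketch of the difference (plain Float64 input here for illustration):

```julia
a = rand(4)

o1 = zeros(Real, 2, 2)        # Matrix{Real}: flexible, but elements are boxed
o2 = zeros(eltype(a), 2, 2)   # Matrix{Float64} here; tracked types under AD

eltype(o1)                    # Real (abstract)
isconcretetype(eltype(o2))    # true
```

For performance-sensitive code the `eltype(a)` form is usually preferable, since Julia compiles faster code for concretely typed arrays.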