Felix-Petersen / algovision

Differentiable Algorithms and Algorithmic Supervision.
MIT License

Embedding external function calls for string outputs #1

Closed · karims closed this 2 years ago

karims commented 2 years ago

Is it possible to call external library functions that return string output inside the Algorithm class?

Or is it possible to mix and match standard Python code inside the written DSL? I need the string output at the end of the program, but I don't need a gradient for it.

Example:

import torch
from algovision import Algorithm, Input, Variable, While, LT, Let, Print, Output

def inc(x):
    # Helper assumed by the example: increments its argument.
    return x + 1

def rand_str():
    return 'something'

a = Algorithm(
    Input('values'),
    Variable('x', torch.tensor(0.)),
    Variable('loss', torch.tensor(0.)),
    Variable('sample', rand_str()),
    While(
        LT('x', 5),
        Let('x', lambda x: inc(x))
    ),
    Print(lambda x: x),
    Output('sample'),
    Output('x'),
    beta=1.25,
    # debug=True,
)

v = torch.randn(3)
print(v)
print(a(v))
Felix-Petersen commented 2 years ago

Thanks for your question.

I am not certain why you would want to store a string in a variable. As you said, this would not yield a gradient, and it also does not affect the rest of the program. If you just want a call to a to also return rand_str(), you could wrap a as follows:

def a_new(v):
    return rand_str(), a(v)

Variable modules only allow float tensors to ensure that everything remains differentiable. VariableInt allows storing ints and lists of ints. Allowing the example would technically be possible; it is currently prevented only by assertions intended to catch bugs.
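To illustrate why differentiable state must live in float tensors, here is a minimal plain-PyTorch sketch (this shows PyTorch's own behavior, not algovision's internals): only floating-point tensors can carry gradients.

```python
import torch

# Float tensors can require gradients and participate in backprop.
x = torch.tensor(0.0, requires_grad=True)
y = (x + 1.0) ** 2
y.backward()
print(x.grad)  # d/dx (x+1)^2 at x=0 is 2

# Integer tensors cannot require gradients at all.
try:
    torch.tensor(0, requires_grad=True)
except RuntimeError as e:
    print('integers cannot require grad:', e)
```

This is the same constraint the Variable assertions enforce: any state you want gradients through has to be a float tensor, while integer state goes into VariableInt and stays non-differentiable.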

If you could clarify what exactly you want to do (and why), or what your goal is, I may be able to help you further.

karims commented 2 years ago

Ok, I will describe here what I am doing. Sorry for the delayed response.

I am trying to do some operations on a string that represents a mutation of a molecule. The code is in this file; it is self-contained, provided the rdkit library is installed.

https://github.com/sungsoo-ahn/genetic-expert-guided-learning/blob/main/model/genetic_operator/mutate.py

You can add the following at the bottom of the code and run a sample:

if __name__ == '__main__':
    mol = 'CC(C)(C)c1ccc2occ(CC(=O)Nc3ccccc3F)c2c1'
    mol = Chem.MolFromSmiles(mol)
    res = mutate(mol, 0.01)
    print(res)

So it manipulates a string to create a mutation of a molecule. I want to get the gradient of the mutate function. Let me know your thoughts and any questions you have.

Felix-Petersen commented 2 years ago

I see. algovision provides gradients for the control flow of programs. For this, it requires all functions involved to already be automatically differentiable. You cannot use it to make a non-differentiable black-box function call differentiable. Everything you want a gradient with respect to has to be a PyTorch tensor.

I am not sure what kind of gradients you would like to obtain. mol is a string or a custom object (at least it does not appear to be a PyTorch tensor), so there are no gradients there, and 0.01 is a Python float but not a tensor, so there are no gradients there either.
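To make the last point concrete, here is a hypothetical sketch: a quantity like the mutation rate only receives a gradient if it enters the computation as a float tensor with requires_grad=True, and every operation from it to the loss is differentiable. The surrogate loss below is invented purely for illustration; string manipulation via rdkit provides no such path.

```python
import torch

# Hypothetical: treat the mutation rate as a learnable tensor parameter.
rate = torch.tensor(0.01, requires_grad=True)

# Invented differentiable surrogate loss, for illustration only.
loss = (10.0 * rate - 1.0) ** 2
loss.backward()

# d(loss)/d(rate) = 20 * (10 * rate - 1) = -18 at rate = 0.01
print(rate.grad)
```

If rate were passed as a plain Python float (as in mutate(mol, 0.01)), it would never appear in the autograd graph and no gradient could be computed for it.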