Rose-STL-Lab / LIMO

generative model for drug discovery

About Reverse Optimization #18

Closed — tszslovewanpu closed this issue 9 months ago

tszslovewanpu commented 9 months ago

Hello, excellent work! I have a question regarding the reverse optimization process. ^v^

In Section 3.3, you mention freezing the weights of f_dec and g_theta while keeping z trainable, where z is a vector. My understanding is that backpropagation typically operates on the weights of neural networks. However, in this scenario, the weights of both the decoder and the property prediction model are frozen. Does this imply that we actually fine-tune the weights of the encoder to generate the desired z?

Alternatively, is there something about backpropagation I haven't understood clearly, namely that it can directly modify the vector z based on the loss between the predicted property and the target property?

Thank you very much!

PeterEckmann1 commented 9 months ago

Thanks! In this case, backpropagation is used to calculate the gradient with respect to z, not to change the neural network weights. z can be thought of as the input to f_dec and g_theta, and we propagate the output from g_theta all the way backwards through the network (without changing the weights) until we reach the input z, which is then altered directly.
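To make this concrete, here is a minimal PyTorch sketch of optimizing an input vector while all network weights stay frozen. The two `nn.Linear` modules are stand-ins for f_dec and g_theta, not LIMO's actual architecture, and the dimensions and learning rate are made up for illustration:

```python
import torch
import torch.nn as nn

# Stand-ins for f_dec and g_theta; the real LIMO models differ.
f_dec = nn.Linear(64, 128)
g_theta = nn.Linear(128, 1)

# Freeze all network weights.
for p in list(f_dec.parameters()) + list(g_theta.parameters()):
    p.requires_grad = False

# z is the only trainable quantity.
z = torch.randn(1, 64, requires_grad=True)

prop = g_theta(f_dec(z))  # predicted property
loss = prop.sum()         # e.g., minimize the predicted property
loss.backward()           # gradient flows back through the frozen nets to z

with torch.no_grad():
    z -= 0.1 * z.grad     # update z directly
    z.grad.zero_()
```

Because the parameters have `requires_grad = False`, `loss.backward()` only populates `z.grad`, so the update step changes z and nothing else.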

tszslovewanpu commented 9 months ago

So this process is the "optimization": we only have to train the decoder/predictor once, but can adapt to many target/desired properties by updating z with backpropagation (iteratively updating z until the predicted property is satisfactory). Am I right? Thank you! By the way, on average, how many times do we update z after freezing the networks and starting the reverse optimization process? mua~

PeterEckmann1 commented 9 months ago

Yes, that is correct. Although I wouldn't say we can adapt to many different targets with a single predictor, since we only train each property predictor for one target; but yes, we can swap in desired properties by using their associated property predictors.

The number of times we update z is given by the num_steps parameter in get_optimized_z (https://github.com/Rose-STL-Lab/LIMO/blob/dc55c299010c62a9a8a3b5acdfc86ba50500256d/generate_molecules.py#L19), which defaults to 10 gradient updates.
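For reference, that loop amounts to something like the sketch below. Everything here except the num_steps default of 10 is an assumption: the function name, optimizer choice, learning rate, and loss definition are illustrative and may differ from LIMO's actual get_optimized_z.

```python
import torch

def optimize_z(z, f_dec, g_theta, num_steps=10, lr=0.1):
    """Iteratively update z to optimize the predicted property.

    Sketch only: the real get_optimized_z in LIMO may use a different
    optimizer, learning rate, and loss.
    """
    z = z.clone().detach().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(num_steps):          # 10 gradient updates by default
        opt.zero_grad()
        loss = g_theta(f_dec(z)).sum()  # lower predicted property = better
        loss.backward()                 # gradient w.r.t. z only; weights are untouched
        opt.step()
    return z.detach()
```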

tszslovewanpu commented 9 months ago

Thank you!