martinpacesa / BindCraft

User friendly and accurate binder design pipeline
MIT License

Question on backprop #46

Closed: henrySmith659 closed 6 days ago

henrySmith659 commented 1 week ago

Hello, I don't have any issues with the code, but I have a question about the pipeline. Generally, the term "backpropagation" refers to updating the weights of a neural network during training. However, in this pipeline, it seems that the network weights only change in the sense that AF2 selects one of the five pretrained model weights in each iteration. The only variable being optimized is the sequence, so that its complex with the target protein better fits the loss function after each iteration; the model weights themselves do not change at all (aside from the swapping among the five sets of model weights I mentioned). Is that correct?

sokrypton commented 6 days ago

Generally, the term "backpropagation" refers to propagating the error through the model and updating some parameters/weights. In this case, the free parameters/weights are the input sequence of the binder; the rest of the AlphaFold model parameters are frozen (not updated).
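
To illustrate the idea, here is a minimal JAX sketch of this kind of sequence optimization. It is not BindCraft or AlphaFold code: the "model" is a toy stand-in with made-up frozen parameters, and the loss is a hypothetical interface score, but the mechanics are the same: gradients are backpropagated through the frozen model and applied only to the binder sequence logits.

```python
import jax
import jax.numpy as jnp

# Toy stand-in for a frozen structure-prediction model (AF2 is far more complex);
# `frozen_params` plays the role of the pretrained AlphaFold weights.
binder_len, n_aa, feat_dim = 10, 20, 16
frozen_params = {
    "w": jax.random.normal(jax.random.PRNGKey(0), (n_aa, feat_dim)) * 0.1,
    "target_feat": jax.random.normal(jax.random.PRNGKey(1), (feat_dim,)),
}

def loss_fn(seq_logits, params):
    """Loss as a function of the soft sequence; gradients flow back to the
    logits, not to `params`, which are never updated."""
    seq_probs = jax.nn.softmax(seq_logits, axis=-1)   # soft one-hot sequence
    embedded = seq_probs @ params["w"]                # (binder_len, feat_dim)
    # Hypothetical "interface" score: how well the binder matches the target
    score = jnp.mean(embedded @ params["target_feat"])
    return -score                                     # minimize negative score

# The only trainable variable: the binder sequence logits.
seq_logits = jnp.zeros((binder_len, n_aa))

grad_fn = jax.jit(jax.grad(loss_fn, argnums=0))       # gradient w.r.t. sequence only
lr = 0.1
for step in range(100):
    g = grad_fn(seq_logits, frozen_params)            # backprop through the frozen model
    seq_logits = seq_logits - lr * g                  # update the sequence, not the weights

designed_seq = jnp.argmax(seq_logits, axis=-1)        # discretize at the end
print(designed_seq)
```

The same pattern applies with AF2 in place of the toy model: the error still propagates backward through all the network layers, but only the input (the binder sequence) sits in the set of parameters the optimizer is allowed to change.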