agitter opened 7 years ago
@agitter thanks for posting this. I actually don't think that the code they provide for DeepBind includes the model construction. They include pre-trained models, which you can use to score new sequences. But the DeepBind model is very straightforward; I don't know why they didn't just code it themselves.
It's not important enough for me to download and check their code, but I saw text in the DeepBind README like:
> Train TF models on ENCODE ChIP-seq peaks, then test on held-out subset of peaks. See supplementary information if training/testing set is not clear from descriptions below. Use top 500 even to train, top 500 odd to test:
>
> `python deepbind_train_encode.py top calib,train,test,report`
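For readers unfamiliar with that protocol, the even/odd split the README describes could be sketched roughly like this (a hypothetical illustration only; `split_top_peaks` is not the authors' actual `deepbind_train_encode.py` logic):

```python
def split_top_peaks(peaks, n=500):
    """Split score-ranked peaks into train/test sets.

    peaks: list of peaks sorted by score, best first (rank 1 at index 0).
    Returns even-ranked peaks (rank 2, 4, ...) for training and
    odd-ranked peaks (rank 1, 3, ...) for testing, n of each.
    """
    top = peaks[: 2 * n]       # keep the top 2n peaks overall
    train = top[1::2]          # even ranks -> odd 0-based indices
    test = top[0::2]           # odd ranks  -> even 0-based indices
    return train, test

# Example with dummy peak labels ranked 1..2000:
peaks = [f"peak{i}" for i in range(1, 2001)]
train, test = split_top_peaks(peaks)
```

Here `train` holds 500 even-ranked peaks (`peak2`, `peak4`, ...) and `test` holds 500 odd-ranked peaks (`peak1`, `peak3`, ...).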
I believe there are two versions of the code, only one of which supports training.
I'm the corresponding author of this work. Your responses are really fast. For DeepBind, we may have missed the training code. Actually, we have implemented a simple version of DeepBind ourselves. However, the tricky part is the hyperparameter tuning: we cannot guarantee that we can repeat the whole process reported in the original DeepBind paper. Therefore, we decided to run TFImpute directly on the data used by DeepBind. We believe that this is a fair comparison.
@knowledgefold Thanks for clarifying. I think that makes a lot of sense. I also found it hard to find the DeepBind training code. Last year I posted that it wasn't available, and one of the authors corrected me by providing the link above.
We'll (eventually) discuss your paper here for our review, so please come back with comments if you have anything else to add.
http://doi.org/10.1371/journal.pcbi.1005403
@jacklanchantin can you please look at this for #236?
One minor comment is that they say
but I believe the DeepBind (#11) code here does include the training step; it's just hard to find from the main page.