I re-cloned your code and replaced the data with my own molecular and protein data (the data structure is exactly the same as the Davis sample data), then ran the drive4_d_warm.sh file, but I got this error:
/home/zh/anaconda3/envs/deep2.0.0/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
from ._conv import register_converters as _register_converters
Traceback (most recent call last):
File "driver.py", line 715, in <module>
tf.app.run(main=run_analysis, argv=[sys.argv[0]] + unparsed)
File "/home/zh/anaconda3/envs/deep2.0.0/lib/python3.6/site-packages/tensorflow/python/platform/app.py", line 124, in run
_sys.exit(main(argv))
File "driver.py", line 144, in run_analysis
'nci60': dcCustom.molnet.load_nci60
AttributeError: module 'dcCustom.molnet' has no attribute 'load_nci60'
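For context, this AttributeError happens because driver.py builds a dict that maps dataset names to loader functions, and one referenced loader (load_nci60) apparently does not exist in this copy of dcCustom.molnet. A minimal, hypothetical sketch of the pattern and a defensive workaround (the module stand-in and names here are illustrative, not the actual PADME code):

```python
import types

# Stand-in for dcCustom.molnet: it has load_davis but no load_nci60,
# mirroring the module state that triggers the AttributeError.
molnet = types.SimpleNamespace(load_davis=lambda: "davis data")

# Building the registry with getattr(..., None) skips missing loaders
# instead of crashing at dict-construction time.
loaders = {}
for name, attr in [('davis', 'load_davis'), ('nci60', 'load_nci60')]:
    fn = getattr(molnet, attr, None)  # None instead of AttributeError
    if fn is not None:
        loaders[name] = fn

print(sorted(loaders))  # only the loaders that actually exist
```

This is only a sketch of why commenting out the 'nci60' entry makes the first error go away; the real fix is up to the repo author.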
Then I commented out line 144 of driver.py ('nci60': dcCustom.molnet.load_nci60) and re-ran drive4_d_warm.sh, but I got the following error:
Traceback (most recent call last):
File "driver.py", line 715, in <module>
tf.app.run(main=run_analysis, argv=[sys.argv[0]] + unparsed)
File "/home/zh/anaconda3/envs/deep2.0.0/lib/python3.6/site-packages/tensorflow/python/platform/app.py", line 124, in run
_sys.exit(main(argv))
File "driver.py", line 182, in run_analysis
prot_seq_dict=prot_seq_dict, oversampled=oversampled)
File "/project/git3/PADME/dcCustom/molnet/load_function/davis_dataset.py", line 117, in load_davis
frac_test=0)
File "/project/git3/PADME/dcCustom/splits/splitters.py", line 166, in train_valid_test_split
log_every_n=log_every_n)
File "/project/git3/PADME/dcCustom/splits/splitters.py", line 865, in split
remain_this_mol_entries = mol_entries[molecule] - removed_entries
UnboundLocalError: local variable 'removed_entries' referenced before assignment
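The UnboundLocalError suggests that in splitters.py, removed_entries is only assigned inside a branch that my settings (frac_test=0) never reach, yet it is read unconditionally afterwards. A minimal reproduction of that pattern, with hypothetical names (not the actual splitter code), plus the usual fix of binding the variable before the branch:

```python
def split(frac_test):
    # Variable assigned only inside a conditional branch...
    if frac_test > 0:
        removed_entries = set()
    # ...but read unconditionally: with frac_test == 0 this raises
    # UnboundLocalError, just like line 865 of splitters.py.
    return removed_entries

def split_fixed(frac_test):
    removed_entries = set()  # always bound before any branch
    if frac_test > 0:
        removed_entries.add("test-set entry")
    return removed_entries
```

If the cause is the same, initialising removed_entries before the conditional in splitters.py should make frac_test=0 work, but I may be misreading the code.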