gcorso / DiffDock

Implementation of DiffDock: Diffusion Steps, Twists, and Turns for Molecular Docking
https://arxiv.org/abs/2210.01776
MIT License
976 stars 238 forks

GPU memory is used while running inference.py changing the device to cpu #206

Open srilekha1993 opened 3 months ago

srilekha1993 commented 3 months ago

Hi, while running the command `time python -m inference --config default_inference_args.yaml --protein_ligand_csv data/test_1.csv --out_dir results_L_cpu/user_predictions_small`, I observed that GPU memory is still used even after changing the device to 'cpu' in inference.py. Can anyone help me run the above script on the CPU only?

jsilter commented 2 months ago

The simplest approach is to hide the GPU from the process with an environment variable (note that `export` must be a separate statement, not prefixed onto the command itself):

export CUDA_VISIBLE_DEVICES=-1
time python -m inference --config default_inference_args.yaml --protein_ligand_csv data/test_1.csv --out_dir results_L_cpu/user_predictions_small
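The variable has to be set before PyTorch initializes CUDA, so it can also be set from inside Python, as long as that happens before the first `import torch`. A minimal sketch of the mechanism (the `-1` value makes CUDA report no visible devices; the torch-related behavior described in the comments is an assumption about how DiffDock selects its device):

```python
import os

# Hide every CUDA device from this process. This must run before any
# library initializes CUDA, i.e. before the first `import torch`.
os.environ["CUDA_VISIBLE_DEVICES"] = "-1"

# After this point, torch.cuda.is_available() would return False, so the
# common pattern
#     device = "cuda" if torch.cuda.is_available() else "cpu"
# falls back to the CPU without any code changes.
print(os.environ["CUDA_VISIBLE_DEVICES"])
```

Setting the variable on the command line (as in the answer above) achieves the same thing without touching the script, which is usually preferable.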