Please refer to https://github.com/google-deepmind/alphafold3/blob/main/docs/installation.md#obtaining-genetic-databases, in particular the section about using the gcp_mount_ssd.sh and copy_to_ssd.sh scripts.
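Roughly, that step looks like the sketch below; the mount point and argument order shown here are only illustrative, so check the two scripts themselves (and the linked docs) for their exact usage:

```sh
# Illustrative usage -- verify against src/scripts/gcp_mount_ssd.sh and copy_to_ssd.sh.
# Format and mount the instance's local SSD (mount point is a placeholder).
./src/scripts/gcp_mount_ssd.sh /mnt/disks/ssd

# Copy as many of the genetic databases as fit from the full database
# directory onto the SSD (argument order is a placeholder).
./src/scripts/copy_to_ssd.sh <DB_DIR> /mnt/disks/ssd/public_databases
```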
Once you have run those scripts to copy the relevant databases onto the SSD, you will have to launch as follows:
```sh
docker run -it \
    --volume $HOME/af_input:/root/af_input \
    --volume $HOME/af_output:/root/af_output \
    --volume <MODEL_PARAMETERS_DIR>:/root/models \
    --volume <SSD_DB_DIR>:/root/public_databases \
    --volume <DB_DIR>:/root/public_databases_fallback \
    --gpus all \
    alphafold3 \
    python run_alphafold.py \
    --json_path=/root/af_input/fold_input.json \
    --model_dir=/root/models \
    --db_dir=/root/public_databases \
    --db_dir=/root/public_databases_fallback \
    --output_dir=/root/af_output
```
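Note that --db_dir is passed twice on purpose: run_alphafold.py accepts multiple database directories and searches them in the order given, so the copies on the SSD are used first and the full directory on the persistent disk acts as a fallback for anything that did not fit.

If the goal is only to produce the MSAs on a machine without a GPU, a sketch of the same command restricted to the data pipeline is below; it assumes the --run_inference=false flag described in the repository docs and simply drops --gpus all:

```sh
# Data-pipeline (MSA) stage only, CPU-only machine -- a sketch, assuming
# run_alphafold.py accepts --run_inference=false to skip model inference.
docker run -it \
    --volume $HOME/af_input:/root/af_input \
    --volume $HOME/af_output:/root/af_output \
    --volume <MODEL_PARAMETERS_DIR>:/root/models \
    --volume <SSD_DB_DIR>:/root/public_databases \
    --volume <DB_DIR>:/root/public_databases_fallback \
    alphafold3 \
    python run_alphafold.py \
    --json_path=/root/af_input/fold_input.json \
    --model_dir=/root/models \
    --run_inference=false \
    --db_dir=/root/public_databases \
    --db_dir=/root/public_databases_fallback \
    --output_dir=/root/af_output
```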
If I simply want an SSD or RAM disk without a GPU to run the MSA, which GCP machine type do you recommend?
Besides the example above for setting up this GPU instance, could you show an example of setting up a RAM disk on Google Cloud specifically for the task of computing the MSA?
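For context, what I have in mind is a plain tmpfs mount along these lines, with the databases copied into it the same way as for a local SSD (the mount point and size are just placeholders; which machine type to put it on is exactly what I am asking about):

```sh
# Generic Linux tmpfs mount used as a RAM disk for the databases -- a sketch,
# not a GCP-specific setup; the size must exceed the total database size.
sudo mkdir -p /mnt/ramdisk
sudo mount -t tmpfs -o size=700G tmpfs /mnt/ramdisk
```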