baker-laboratory / RoseTTAFold-All-Atom


Size limitations based on memory allocation #48

Open · gulseva opened this issue 8 months ago

gulseva commented 8 months ago

I am trying to run RFAA in multimer mode with two sequences, 223 and 563 aa long, but I am getting an OOM error on an A4500 with 20 GB of GPU memory. I tried setting the maximum split size in the setup_model.py script to values from 32 to 512 MB, but the OOM errors persisted. This makes me wonder whether I am hitting a size limit for my GPU card or facing a memory-allocation issue. If it is the former, what is the maximum system size for this card? If it is the latter, what other options can I try to avoid memory-allocation errors?
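For reference, the split-size knob I was editing corresponds to PyTorch's caching-allocator setting, which can also be changed without touching setup_model.py via the PYTORCH_CUDA_ALLOC_CONF environment variable. A minimal sketch (the `protein` config name and invocation are taken from the README pattern and may differ for your run):

```bash
# Cap the allocator's maximum split size (in MB) before launching inference,
# instead of hard-coding it in setup_model.py.
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128

# Config name and inputs are placeholders; use your own multimer config.
python -m rf2aa.run_inference --config-name protein
```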

amorehead commented 8 months ago

When you invoke the inference script, you can try setting loader_params.MAXCYCLE to a value lower than its default of 4. Setting it to 1 yields a single-iteration (less accurate) prediction of your structure, but it should be noticeably less memory-hungry.
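A minimal sketch of that override, assuming the Hydra-style inference command from the README (the config name and any input paths are placeholders for your actual run):

```bash
# Reduce the number of model iterations from the default 4 to 1
# to lower peak GPU memory, at some cost in accuracy.
python -m rf2aa.run_inference --config-name protein loader_params.MAXCYCLE=1
```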

cyangNYU commented 8 months ago

Similar issue here: https://github.com/baker-laboratory/RoseTTAFold-All-Atom/issues/26