Looking at the logs, it appears the GPU was found, and model inference ran on the GPU.
The line `Found local devices: [CudaDevice(id=0), CudaDevice(id=1)]` indicates the GPUs were found.
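If you want to double-check GPU visibility outside of AlphaFold 3, a minimal sketch is below (it assumes the same JAX installation AlphaFold 3 uses is importable, e.g. inside the Docker container):

```python
# Quick check that JAX can see the CUDA devices, independent of AlphaFold 3.
# Assumes JAX is installed with CUDA support (as in the AlphaFold 3 Docker image).
import jax

print(jax.devices())          # Expect something like [CudaDevice(id=0), CudaDevice(id=1)]
print(jax.default_backend())  # Expect "gpu" when CUDA is actually being used
```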
The lines `Calculating bucket size for input with 192 tokens.` and `I1120 03:07:01.489473 131374588219392 pipeline.py:264] Got bucket size 256 for input with 192 tokens, resulting in 64 padded tokens.`
indicate the bucket size for your input was 256.
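For illustration, the bucketing works roughly as sketched below: the input is padded up to the smallest bucket that fits it, so compiled shapes can be reused across inputs. The bucket values in the sketch are assumptions for illustration, not necessarily the exact list the pipeline uses.

```python
# Illustrative sketch of token bucketing; bucket sizes here are assumed values.
BUCKETS = [256, 512, 768, 1024, 1280, 2560, 5120]

def bucket_for(num_tokens: int) -> tuple[int, int]:
    """Return (bucket_size, padded_tokens) for an input of num_tokens."""
    for bucket in BUCKETS:
        if num_tokens <= bucket:
            return bucket, bucket - num_tokens
    raise ValueError(f"Input of {num_tokens} tokens exceeds the largest bucket")

print(bucket_for(192))  # (256, 64): matches the log line above
```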
The line `Running model inference for seed 1 took 53.03 seconds.`
indicates model inference took 53.03 seconds. I think that is a reasonable amount of time for bucket size 256 on an A100.
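If you want to confirm the GPU is actually busy during the inference stage (an instantaneous `nvidia-smi` reading can show 0% if it happens to be taken while a CPU-only stage such as the data pipeline is running), a rough polling sketch is below. It assumes `nvidia-smi` is on your PATH; the query flags are standard nvidia-smi options.

```python
# Rough sketch: poll GPU utilization once per second while the AlphaFold 3 job runs,
# to see whether the GPUs are ever busy or stay at 0% the whole time.
# Run this in a separate terminal alongside the job; assumes nvidia-smi is on PATH.
import subprocess
import time

while True:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=index,utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    print(time.strftime("%H:%M:%S"), out.strip().replace("\n", " | "))
    time.sleep(1)
```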
I'll close the issue for now, but please feel free to re-open if that doesn't answer your question.
I have installed alphafold3 successfully on my server with A100 GPUs. My GPU information is as follows:

```
Wed Nov 20 03:28:50 2024
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 550.54.14              Driver Version: 550.54.14      CUDA Version: 12.6     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA A100-PCIE-40GB          Off |   00000000:04:00.0 Off |                    0 |
| N/A   26C    P0             36W /  250W |   15734MiB /  40960MiB |      0%      Default |
|                                         |                        |             Disabled |
+-----------------------------------------+------------------------+----------------------+
|   1  NVIDIA A100-PCIE-40GB          Off |   00000000:1B:00.0 Off |                    0 |
| N/A   26C    P0             31W /  250W |       3MiB /  40960MiB |      0%      Default |
|                                         |                        |             Disabled |
+-----------------------------------------+------------------------+----------------------+
```

When I ran one example, I found the GPU utilization was at 0% and the task took much more time than I expected. I think alphafold3 can't run on my GPU.
The command is here.
The log from running alphafold is as follows.
Can anyone help me find out why? Thank you very much.