rocketman8080 opened this issue 1 year ago
Hello, how can I tell whether the GPU is being used during the computation? There are some warnings during initialization in the output below that I am unable to interpret.
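(One way to check, since AlphaFold runs its model through JAX: ask JAX which devices it sees from inside the AlphaFold environment. A minimal sketch, assuming the CUDA-enabled jaxlib from the AlphaFold setup is installed:)

```python
# Minimal check that JAX (the library AlphaFold computes with) can see the GPU.
# If the CUDA-enabled jaxlib is installed correctly, the default backend is
# 'gpu' and the device list contains a GPU entry; otherwise only CPU devices
# are listed and the model will run on the CPU.
import jax

print(jax.default_backend())  # expect 'gpu' on a working V100 setup
print(jax.devices())          # expect a GPU device entry, not just a CpuDevice
```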
I see the CPU hitting full throttle; however, the GPU stats (at least the ones I know of, shown below) suggest that the GPU is not engaged. Do only certain phases of the run use the GPU, or should the GPU be used uniformly throughout the process?
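(To answer that empirically, one option is to sample nvidia-smi for the duration of the run and watch when utilization changes. A minimal sketch; the query fields and the 5-second interval are arbitrary choices:)

```python
# Sample GPU utilization every few seconds for the whole run, so you can see
# whether usage stays at 0% through the MSA stage and jumps later during
# inference. Assumes nvidia-smi is on PATH; stop with Ctrl-C.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=timestamp,utilization.gpu,memory.used,power.draw",
    "--format=csv,noheader",
]

while True:
    sample = subprocess.run(QUERY, capture_output=True, text=True)
    print(sample.stdout.strip())
    time.sleep(5)
```

(nvidia-smi can also do this on its own with its loop flag, e.g. `nvidia-smi --loop=5`, which avoids the wrapper entirely.)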
Also, roughly how long should T1050 take to complete with and without a GPU on an AWS DLAMI instance?
Thank you for your help!
Output observed during a sample run:
And some basic GPU stats below:
```
(base) ubuntu@alphafold20:/datavol$ nvidia-smi
Tue Jun 13 03:07:03 2023
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 525.85.12    Driver Version: 525.85.12    CUDA Version: 12.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla V100-SXM2...  On   | 00000000:00:1E.0 Off |                    0 |
| N/A   32C    P0    49W / 300W |   2102MiB / 16384MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      5250      C   python                           308MiB  |
+-----------------------------------------------------------------------------+
```

```
nvidia-smi --format=csv --query-gpu=power.draw,utilization.gpu,fan.speed,temperature.gpu
power.draw [W], utilization.gpu [%], fan.speed [%], temperature.gpu
49.46 W, 0 %, [N/A], 32
```

Reply:

My understanding is that the GPU is not used during the MSA steps, only during structure generation and relaxation. The output you shared stops near the end of the MSA. Based on the first few lines of output, it looks like your GPU driver spun up just fine, so your GPU utilization should go up later in the process.