
MULTEM is a powerful and advanced collection of C++ routines with CUDA support, designed to perform efficient and accurate multislice simulations for various TEM experiments such as HRTEM, STEM, ISTEM, ED, PED, CBED, ADF-TEM, ABF-HC, EFTEM, and EELS.

GPU usage by MULTEM #22

Closed · Quantumstud closed this issue 5 years ago

Quantumstud commented 5 years ago

Hi, I am using the MULTEM package on Ubuntu 18.04 LTS with an NVIDIA GeForce GTX 1060 (6 GB) GPU. Whenever I run the package, only about 1 GB of the 6 GB of GPU memory is used by MULTEM and the rest stays free. Is this a limitation of the package or of the algorithm? Is it possible to make the simulation faster by using more of the GPU, since so much capacity appears to be unused? Thanks.

thomasaarholt commented 5 years ago

You are mixing up GPU RAM and GPU processing power. You should monitor the GPU load instead of the memory.

As you increase the size of your simulation, e.g. by increasing the real-space or sampling-space sizes, your memory requirement will go up. The advantage of having more RAM on your GPU is that it can then process larger matrices at once, rather than having to (slowly) move data from GPU RAM to "normal" RAM or (much more slowly) to hard-disk storage.
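
For a quick sanity check of the memory side, here is a minimal CUDA sketch (not part of MULTEM; it just uses the standard `cudaMemGetInfo` runtime call) that prints free versus total device memory. The numbers should match what `nvidia-smi` reports:

```cpp
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t free_bytes = 0, total_bytes = 0;
    // cudaMemGetInfo reports the free and total memory on the current CUDA device.
    cudaError_t err = cudaMemGetInfo(&free_bytes, &total_bytes);
    if (err != cudaSuccess) {
        std::fprintf(stderr, "cudaMemGetInfo failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    std::printf("Device memory: %.2f GB free of %.2f GB total\n",
                free_bytes / 1e9, total_bytes / 1e9);
    return 0;
}
```

Compile with e.g. `nvcc meminfo.cu -o meminfo` (the filename is just an example). A small simulation will show most of the 6 GB still free even while the GPU cores themselves are fully busy.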

Quantumstud commented 5 years ago

Hi Thomas, thank you for your crisp reply. Yes, using the `nvidia-smi` command I can now monitor the GPU utilization, which shoots up to 97% during the simulation; in the idle state it is around 3%. Wonderful!
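
For reference, the same utilization figure can also be read programmatically via NVML, the library `nvidia-smi` is built on. A minimal sketch, assuming the NVML header and library that ship with the NVIDIA driver are installed:

```cpp
#include <cstdio>
#include <nvml.h>  // from the NVIDIA driver / CUDA toolkit; link with -lnvidia-ml

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;

    nvmlDevice_t dev;
    // Device index 0: the first GPU (here, the GTX 1060).
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
        nvmlUtilization_t util;
        // util.gpu and util.memory are percentages over the last sample period,
        // i.e. the same "GPU-Util" style figures nvidia-smi prints.
        if (nvmlDeviceGetUtilizationRates(dev, &util) == NVML_SUCCESS) {
            std::printf("GPU utilization: %u%%  memory-controller utilization: %u%%\n",
                        util.gpu, util.memory);
        }
    }
    nvmlShutdown();
    return 0;
}
```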