Open MartinPeschel opened 5 months ago
Hi Martin, our collaborator in Minnesota has made some changes to the allocation of arrays, especially for the NAC processing. In particular, some large arrays that scale as (3*natom)^2 are now only allocated if required (for NAC projection or angular momentum conservation). If the relevant options are off, this should dramatically reduce the memory needs of SHARC (especially in QM/MM). The changes are in the main branch.
Best, Sebastian
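[Editor's note: for context, a minimal sketch of what such guarded heap allocation can look like in Fortran. The subroutine, flag, and array names below are illustrative placeholders, not SHARC's actual identifiers.]

```fortran
module nac_alloc_sketch
  implicit none
contains
  ! Hypothetical sketch of the guarded allocation described above; the
  ! names do_projection, do_angmom, and work are illustrative only.
  subroutine nac_processing_sketch(natom, do_projection, do_angmom)
    integer, intent(in) :: natom
    logical, intent(in) :: do_projection, do_angmom
    double precision, allocatable :: work(:,:)

    if (do_projection .or. do_angmom) then
      ! Pay the (3*natom)^2 memory cost only when a feature needs it.
      allocate(work(3*natom, 3*natom))
      work = 0.d0
      ! ... projection / angular-momentum processing would go here ...
      deallocate(work)
    endif
  end subroutine nac_processing_sketch
end module nac_alloc_sketch
```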
Hi everyone!
I recently tried to run a QM/MM calculation with 1359 atoms and 4 singlet states, which crashed with a surprising segfault. After a (lengthy) investigation, the culprit seemed to be the variable declarations in `subroutine NAC_processing(traj,ctrl)`, whose stack usage exceeds the default stack size on my machine. The problem can be solved by setting `ulimit -s unlimited` in `run.sh`. It would be nice to document this somewhere (or allocate the large arrays in this subroutine on the heap).

Best wishes, Martin
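[Editor's note: to illustrate the underlying issue, a self-contained sketch of the stack-vs-heap distinction Martin raises. Fortran automatic arrays sized from a dummy argument are typically placed on the stack, whereas `allocatable` arrays go on the heap. The identifiers are illustrative, not the actual SHARC code.]

```fortran
program stack_vs_heap
  implicit none
  call demo(1359)
contains
  subroutine demo(natom)
    integer, intent(in) :: natom
    ! An automatic array such as
    !   double precision :: work(3*natom, 3*natom)
    ! is typically placed on the stack; for natom = 1359 it needs
    ! (3*1359)**2 * 8 bytes, roughly 133 MB, far beyond the common
    ! 8 MB default of `ulimit -s`, hence the segfault.
    !
    ! An allocatable array lives on the heap instead and is limited
    ! only by available memory:
    double precision, allocatable :: work(:,:)
    allocate(work(3*natom, 3*natom))
    work = 0.d0
    print *, 'allocated ', size(work), ' elements on the heap'
    deallocate(work)
  end subroutine demo
end program stack_vs_heap
```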