zhangxiaoyu2046 opened this issue 1 week ago
Dear @zhangxiaoyu2046,
I think that, as you say, the size of your trajectory does not fit into the available RAM and numpy cannot allocate the memory. You can take a look at this section of dynaphopy's manual:
https://abelcarreras.github.io/DynaPhoPy/special.html
There, I describe different strategies that you can use to deal with large trajectories.
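As a rough sanity check (this is just a back-of-the-envelope sketch, not something dynaphopy itself provides), you can estimate how much RAM the velocity array alone would need: a double-precision array of shape (steps, atoms, 3) takes steps × atoms × 3 × 8 bytes. The step count of 100,000 below is an assumed example value, not taken from your trajectory:

```python
import numpy as np

def estimate_velocity_array_gb(n_atoms, n_steps, dtype=np.float64):
    """Rough size in GiB of an (n_steps, n_atoms, 3) velocity array."""
    return n_steps * n_atoms * 3 * np.dtype(dtype).itemsize / 1024**3

# 5000 atoms and, say, 100,000 frames -> roughly 11 GiB for velocities alone,
# before any intermediate arrays the analysis may allocate on top of that
print(estimate_velocity_array_gb(5000, 100_000))
```

If this number (plus a safety factor for intermediate arrays) exceeds your node's RAM, numpy's allocation error is expected, and the strategies in the manual section above are the way to go.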
I am not sure about the second error. A segmentation fault is too vague to tell what is happening; it may also be a memory issue. Did you try running some of the provided examples (which are much smaller) to make sure that the installation is correct?
The way dynaphopy works is to first read all the data from the trajectory into memory, compute the atomic velocities, and then compute the power spectrum (this is where memory usage is highest). The -sdata option just tells dynaphopy to store the results in a file; this last step should not add much to the memory usage.
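To illustrate the idea behind that last step (this is a minimal sketch of a velocity power spectrum via FFT, not dynaphopy's actual implementation, and the 1 THz test signal is invented for the example):

```python
import numpy as np

def velocity_power_spectrum(velocities, dt):
    """Minimal sketch: one-sided power spectrum of a single velocity trace.

    velocities: 1-D array, one degree of freedom sampled every dt
    dt: time step between frames (here in ps, so frequencies are in THz)
    """
    n = len(velocities)
    v_hat = np.fft.rfft(velocities)
    spectrum = (np.abs(v_hat) ** 2) * dt / n   # periodogram estimate
    freqs = np.fft.rfftfreq(n, d=dt)
    return freqs, spectrum

# Hypothetical test signal: a damped 1 THz oscillation sampled every 1 fs.
# The spectral peak position gives the mode frequency; the peak width is
# related to the phonon lifetime, which is what the -pa analysis fits.
dt = 1e-3                                      # ps per frame
t = np.arange(0, 20, dt)
v = np.exp(-0.1 * t) * np.cos(2 * np.pi * 1.0 * t)
freqs, spec = velocity_power_spectrum(v, dt)
print(freqs[np.argmax(spec)])                  # peak near 1 THz
```

In dynaphopy the velocities are additionally projected onto the phonon eigenvectors before this step, so each mode gets its own spectrum, but the memory-heavy part is holding the full velocity array needed as input.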
Hi everyone,
I want to obtain the lifetime of each phonon mode. When I tried to obtain the quasiparticle phonon data by running $ dynaphopy input_file TRAJECTORY -sdata, there was an error:
It seems that the numpy array is too large to be created? My structure has 5000 atoms, and the velocity.lammpstrj file is about 14 GB. When I switched to a fat compute node with very large RAM, there was no output after 3 days; the process seemed stuck somewhere in the dynaphopy code. When I tried to test on a smaller structure with fewer atoms (162 atoms), there was another error:
I don't know where I went wrong. Could someone explain a bit about the workflow for generating the quasiparticle info? Is it necessary to perform a peak analysis with $ dynaphopy input_file TRAJECTORY -pa before running the -sdata command?
Many thanks!