An application for parameterization of biological models available in SBML and BNGL formats. Features include parallelization, metaheuristic optimization algorithms, and an adaptive Markov chain Monte Carlo (MCMC) sampling algorithm.
Fitting process stops because of high memory usage #296
Dear pybnf developers,
I have a problem running a scatter search fit. If I set a high number of iterations (around 50), then some time after the calculation starts, the log fills with many copies of the following message:
2020-11-04 08:04:31,716 distributed.worker WARNING ForkServerProcess-3 Memory use is high but worker has no data to store to disk. Perhaps some other process is leaking memory? Process memory: 2.26 GB -- Worker memory limit: 2.78 GB
The log file keeps growing to 20 MB or more as this message repeats, and no fitting progress is observed. The computer has 32 GB of memory and the parallel_count option is set to 6, so the system has plenty of spare memory.
How can I increase the memory limit in this case?
Kind regards, Oleksii
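For reference, the `distributed.worker` prefix in the log indicates the warning comes from dask.distributed's per-worker memory monitor, not from the operating system. As a sketch (assuming PyBNF's workers pick up the standard Dask user configuration file, which is not confirmed here), the monitor's thresholds are ordinary dask.distributed settings that can be tuned in `~/.config/dask/distributed.yaml`:

```yaml
# Standard dask.distributed worker memory thresholds, expressed as
# fractions of the per-worker memory limit (values shown are the
# library defaults; raise or disable them to change the monitor's
# behavior).
distributed:
  worker:
    memory:
      target: 0.60     # start spilling data to disk at 60% of the limit
      spill: 0.70      # spill more aggressively at 70%
      pause: 0.80      # pause worker threads at 80%
      terminate: 0.95  # restart the worker at 95%
```

The per-worker limit itself (the "Worker memory limit: 2.78 GB" in the log) is set when the worker is launched, e.g. via the `--memory-limit` flag of the `dask-worker` CLI or the `memory_limit` argument of `LocalCluster`; whether PyBNF exposes that knob to the user would need to be confirmed by its maintainers.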