To make the code more supercomputer friendly, it needs to handle dirty interruptions, for example as a result of running out of time in a supercomputer job.
My idea is to have the analysis script save intermediate results, e.g., each completed greedy search, to temporary files as soon as they finish. It should then be possible to restart the analysis without having to rerun the iterations that already completed.
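Below is a minimal sketch of what such a checkpoint/restart loop could look like, assuming the analysis iterates over a list of greedy-search tasks. All names here (run_greedy_search, the task ids, the checkpoints directory) are hypothetical placeholders, not the actual API of the script.

```python
import json
from pathlib import Path

CHECKPOINT_DIR = Path("checkpoints")  # assumed location for intermediate results


def run_with_checkpoints(tasks, run_greedy_search):
    """Run each greedy search once, writing its result to its own file.

    On restart, tasks whose checkpoint file already exists are loaded instead
    of recomputed, so a job killed by the scheduler can resume where it left off.
    """
    CHECKPOINT_DIR.mkdir(exist_ok=True)
    results = {}
    for task_id in tasks:
        ckpt = CHECKPOINT_DIR / f"{task_id}.json"
        if ckpt.exists():
            # Completed in a previous run: reload the saved result.
            results[task_id] = json.loads(ckpt.read_text())
            continue
        result = run_greedy_search(task_id)
        # Write to a temporary file first, then rename, so a job killed
        # mid-write never leaves a truncated checkpoint behind.
        tmp = ckpt.with_suffix(".json.tmp")
        tmp.write_text(json.dumps(result))
        tmp.replace(ckpt)
        results[task_id] = result
    return results
```

The write-then-rename step matters for dirty interruptions: the rename is effectively atomic on most filesystems, so a checkpoint file either exists in full or not at all, and a restart never tries to load a half-written result.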