golipe closed this issue 6 years ago.
todo: document how this option works and how to use it with predict.py script.
todo: also restore predictions.csv
Restore functionality should now work as expected; several major changes were made.
It should now be possible to stop and continue at any point. If the user stops during the burn-in phase, then the next time the --root argument is passed, the last burn-in iteration is restored and burn-in continues. If the user stops during the sampling phase, then the next time the --root argument is passed, the last sampling iteration is restored and sampling continues.
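The resume rule above can be sketched as a small decision function. This is a hypothetical illustration only (the name `resume_phase` is not part of SMURFF's API): compare the last saved iteration against the burn-in count to decide which phase to continue in.

```python
# Hypothetical sketch of the resume rule, not SMURFF's actual code:
# a run that stopped before completing burn-in resumes in burn-in;
# otherwise it resumes in the sampling phase.
def resume_phase(last_saved_iter, nburnin):
    """Return (phase, next_iteration) to continue from after a restart."""
    if last_saved_iter < nburnin:
        return ("burnin", last_saved_iter + 1)    # still in burn-in
    return ("sampling", last_saved_iter + 1)      # burn-in done; keep sampling

# e.g. stopped at iteration 150 with 100 burn-in iterations:
print(resume_phase(150, 100))   # -> ('sampling', 151)
print(resume_phase(50, 100))    # -> ('burnin', 51)
```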
The latest commits include the following improvement: when a previous burn-in iteration's state is removed, an additional comment line is added to the root file.
I think this issue can be considered done. Additional things are addressed in a separate issue: https://github.com/ExaScience/smurff/issues/91
```
-- Restoring model, predictions,... from 'demo*.csv'.
terminate called after throwing an instance of 'std::runtime_error'
  what():  [ERROR] File 'demo-U0-latents.csv' not found.
```
The file is not found because the restore should look for all files of the form `demo-sample-*-U0-latents.csv`, not a single `demo-U0-latents.csv`.
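The expected lookup can be sketched with a glob over the per-sample naming scheme. This is a hypothetical helper, not SMURFF's actual restore code; the function name `find_latent_files` and its parameters are assumptions for illustration.

```python
import glob
import os
import tempfile

def find_latent_files(prefix, mode, directory="."):
    """Collect per-sample latent files, e.g. 'demo-sample-*-U0-latents.csv'.

    Hypothetical helper illustrating the correct pattern; restoring must
    glob one file per saved sample rather than a single 'demo-U0-latents.csv'.
    """
    pattern = os.path.join(directory, f"{prefix}-sample-*-{mode}-latents.csv")
    return sorted(glob.glob(pattern))

# Demo: create two per-sample files and show that the glob finds both.
with tempfile.TemporaryDirectory() as d:
    for i in (1, 2):
        open(os.path.join(d, f"demo-sample-{i}-U0-latents.csv"), "w").close()
    found = find_latent_files("demo", "U0", d)
    print([os.path.basename(f) for f in found])
```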