ModellingWebLab / project_issues

An issues-only repository for issues that cut across multiple repositories

Version number stored with all simulation results #76

Open MichaelClerx opened 3 years ago

MichaelClerx commented 3 years ago

It would be good to have a version number to store with all simulation results, and ideally something snappy that we can write into methods sections. (Just came across this working with Aditi & @mirams.)

Is it doable to have a WL number that increases whenever any part of the front- or back-end gets updated, @jonc125? Or are we better off with a date?

jonc125 commented 3 years ago

A date + time is probably easiest to implement! There are many different components that you might want version numbers from...

Although we could possibly set the deployment to increment a single version number any time it is run - whether or not that run results in 'actual' changes to Web Lab code?
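As a rough illustration of that idea, the deployment script could bump a counter and record a timestamp each time it runs. This is only a sketch, not how the current deployment works; the file name and JSON layout are assumptions:

```python
import datetime
import json
from pathlib import Path

# Hypothetical location for the deployment stamp; not part of the current setup.
VERSION_FILE = Path("weblab_version.json")

def bump_deployment_version(path=VERSION_FILE):
    """Increment a deployment counter and record the deployment time (UTC)."""
    if path.exists():
        info = json.loads(path.read_text())
    else:
        info = {"deploy_count": 0}
    info["deploy_count"] += 1
    info["deployed_at"] = datetime.datetime.now(datetime.timezone.utc).isoformat()
    path.write_text(json.dumps(info, indent=2))
    return info

# Simulation results could then be stamped with e.g. f"WL-{info['deploy_count']}".
```

The counter would increase on every deployment, whether or not the Web Lab code actually changed, which is exactly the caveat raised above.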

mirams commented 2 years ago

Yeah, Aditi is putting together a methods section where ideally we would say "these are the results from Web Lab version X". We could say "these are the results from Web Lab in Dec 2020", but then there's no obvious way for someone to ask "oh, so how do I get that version of Web Lab up and running to replicate that?"

mirams commented 2 years ago

Update: the existing Web Lab does replicate the results (not just the Dec 2020 ones), so if there's a way we can give it a version number that would let people reproduce it, that would be great. Is it all in a Docker image somewhere, for instance, that pulls in all the relevant dependencies? A version number linked to a commit hash for a Dockerfile with full build instructions, or something similar, would do the trick for reproducibility?
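The simplest form of "version number linked to a commit hash" would be to record the deployed commit alongside each set of results. A minimal sketch, assuming the deployed directory is a git checkout (which it may not be in practice):

```python
import subprocess

def deployed_commit(repo_dir="."):
    """Return the git commit hash of the deployed checkout, or None if unknown."""
    try:
        out = subprocess.run(
            ["git", "rev-parse", "HEAD"],
            cwd=repo_dir, capture_output=True, text=True, check=True,
        )
    except (OSError, subprocess.CalledProcessError):
        # Not a git checkout, or git is unavailable.
        return None
    return out.stdout.strip()
```

As the next comment notes, a commit hash alone only pins the build instructions, not the unpinned dependencies they pull in.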

jonc125 commented 2 years ago

The challenge in deploying a matching Web Lab is that not every dependency version is pinned in the Ansible scripts, so there is potential for changes even if you use the deployment script from a known commit. This is actually true for Dockerfiles too; it's only the built containers that are guaranteed to be the same, and old versions of those might not remain available long term.

What we could do with some work is get the deployment to dump key version info, e.g. for all packages installed in each virtualenv, and OS version. Although even there you could have different versions of apt packages. There are many things that could subtly affect the results!
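For illustration, dumping that kind of environment manifest for one virtualenv could look something like the following. This is a sketch only; where such a manifest would be stored, and how apt package versions would be captured, are open questions:

```python
import json
import platform
import sys
from importlib import metadata

def environment_manifest():
    """Collect versions of all installed Python packages plus interpreter/OS info."""
    packages = {
        dist.metadata["Name"]: dist.version
        for dist in metadata.distributions()
        if dist.metadata["Name"]
    }
    return {
        "python": sys.version,
        "os": platform.platform(),
        "packages": dict(sorted(packages.items())),
    }

# print(json.dumps(environment_manifest(), indent=2))
```

Running this at deployment time and storing the output with the version stamp would at least record what was installed, even if it can't guarantee re-creating it.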

mirams commented 2 years ago

Hmmm, I mean in theory we have so many tests that the results can't change accidentally between versions, so it shouldn't matter. But I would guess it is a matter of time until we do decide we need finer solver tolerances or something and results do shift a little.

I guess if we care about reproducibility rather than replicability, it isn't the end of the world if results shift within solver tolerances. But it is probably only a matter of time until a bug in some dependency is fixed, there are qualitative changes, and we need to stamp things with a version that can be replicated...?