Open dsj976 opened 2 days ago
@ots22 wrote:
As far as I can tell, Docker is used only to create a fresh repository and to then run some tests that mutate the git repository with some git/dvc commands. Would isolating these tests in a temporary directory be enough, rather than in a full-blown container? That is, could this be a shell script instead of a Dockerfile?
@dsj976 replied:
yes, it could be done, and it would definitely be a cheaper setup for testing. However, given how the package works currently, I see one main disadvantage to doing that: you currently need to run the tool from the root of the repo, and you cannot choose where the downloaded data will be saved. Additionally, I'm personally not a big fan of having tests creating/deleting files on my local file system. Will open an issue about this in any case; it can be looked at in the future.
I see one main disadvantage to doing that: you currently need to run the tool from the root of the repo, and you cannot choose where the downloaded data will be saved
It sounds like this might be easier to solve after #17, then. Although, shouldn't it be possible just to run the same commands in a script rather than a Dockerfile (making a fresh clone of the repository first, if that's necessary, for instance)?
I'm personally not a big fan of having tests creating/deleting files on my local file system.
I don't think it is unreasonable for a test to interact with the filesystem if that's related to what it's testing! It's possible to do this safely, e.g. with mktemp to give you a unique temporary directory.
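For example, a minimal sketch of the mktemp approach (the git/dvc test commands are illustrative placeholders, not the package's actual test suite):

```shell
#!/usr/bin/env bash
# Sketch: isolate filesystem-mutating tests in a unique temporary
# directory instead of a container. Assumes git is installed; the
# commands inside the repo are placeholders for the real test steps.
set -euo pipefail

tmpdir="$(mktemp -d)"            # unique, private temporary directory
trap 'rm -rf "$tmpdir"' EXIT     # always clean up, even if a test fails

cd "$tmpdir"
git init -q test-repo            # fresh repository for the tests to mutate
cd test-repo
# ... run the git/dvc test commands here ...
echo "tests ran in $tmpdir/test-repo"
```

The `trap ... EXIT` line is what makes this safe on the local filesystem: the temporary directory is removed whether the tests pass or fail, so nothing is left behind outside of it.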
The other benefit of running 'locally' is that it tests with the versions of the tools that you have installed, rather than some other versions that happen to get installed in the container (on a different OS potentially).
Containers provide a much stronger degree of isolation than is really needed at the cost of quite a heavy dependency.
See https://github.com/ClimeTrend/DMD-ERA5/pull/14/files/71795001e2e9f3e9a5099dbe7d4626ecf8adeb2c#r1861114346