Open oroszgy opened 6 months ago
This happens because Weasel hashes a command's dependencies and outputs to check whether they've changed. To prevent this, simply don't list the large file as an input or output of the command; then it won't be processed or validated.
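The workaround could look roughly like this in `project.yml` (a hypothetical sketch; the command name, script, and file paths are made up for illustration):

```yaml
commands:
  - name: preprocess
    script:
      - "python scripts/preprocess.py assets/huge_corpus.jsonl corpus/train.spacy"
    # Only list small files under deps/outputs so Weasel hashes them.
    # The large file (assets/huge_corpus.jsonl) is deliberately omitted,
    # so Weasel never tries to hash it.
    deps:
      - "scripts/preprocess.py"
    outputs:
      - "corpus/train.spacy"
```

The trade-off is that Weasel can no longer detect changes to the omitted file, so the command won't automatically re-run when the large file is updated.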
Thanks. I just discovered this workaround for myself as well. Do you think it would be feasible to use the last-modification date instead of hashes? Alternatively, would computing the hash over file chunks solve this issue (see https://stackoverflow.com/questions/1131220/get-the-md5-hash-of-big-files-in-python)?
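For reference, the chunked-hashing idea from that Stack Overflow thread can be sketched like this (a minimal example, not Weasel's actual implementation; the function name and chunk size are arbitrary):

```python
import hashlib


def hash_file_in_chunks(path: str, chunk_size: int = 8192) -> str:
    """Compute the MD5 of a file without loading it fully into memory.

    Reads the file in fixed-size chunks and feeds each chunk to the
    incremental hasher, so peak memory stays at ``chunk_size`` bytes
    regardless of the file's size.
    """
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        # iter() with a sentinel keeps calling f.read(chunk_size)
        # until it returns b"" at end of file.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
    return md5.hexdigest()
```

The resulting digest is identical to hashing the whole file at once, so this would be a drop-in change for any code that currently calls `hashlib.md5(f.read())`.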
When creating a command that depends on a large file (one that cannot fit into memory), Weasel still tries to load the whole file, which results in a
MemoryError
. The traceback for such a run: