🚀 Feature
Pare down the requirements (i.e., requirements.txt) to only include "top-level" packages (i.e., those that are actually used and are not dependencies of said packages). Furthermore, instead of pinning the versions of these packages, specify minimum versions instead.

Motivation
Over-specifying dependencies in the requirements makes it much trickier to use the library out of the box on systems where you don't have full control over the available dependencies. This is very common when using public/enterprise computing clusters (e.g., Compute Canada). Furthermore, tying sub-dependencies to specific versions while leaving the versions of core dependencies (e.g., PyTorch, NumPy) flexible leads to dependency version conflicts. This will become increasingly problematic as the core dependencies are updated.
I understand the desire to fix the versions of everything for reproducibility. However, when the versions of sub-dependencies are fixed but the versions of core dependencies (again, PyTorch etc.) are not, pinning those sub-dependencies is redundant.
Pitch
Core dependencies (e.g., PyTorch) are already specified using minimum versions. I propose removing sub-dependencies from requirements.txt and specifying the remaining core dependencies as minimum versions (with the currently specified version as the new minimum). The resulting requirements file would look like this:
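For illustration only, a pared-down file in that style might look like the sketch below. The package names and version floors are assumptions based on the dependencies mentioned in this issue (PyTorch, NumPy, timm), not the project's actual list:

    # illustrative sketch only; the real top-level packages and floors
    # would be taken from what the code imports and the currently pinned versions
    torch>=1.8.0
    numpy>=1.19.0
    timm>=0.4.12

Any other package the code imports directly would be listed the same way, with its currently pinned version as the new minimum.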
I've already tested this to check that the training script still runs. It only requires one change to vit.py: changing the import path of to_2tuple to from timm.models.layers import to_2tuple.

Alternatives
None.
Additional context
None.