Describe the issue
In the onnxruntime training API for Python, some packages in onnxruntime.training.optim import torch. Due to how Python package imports work, this results in a dependency of all packages in onnxruntime.training on torch. The torch package can be rather large, which is problematic in some environments. It would be great to not require this package when using parts of onnxruntime that don't need it (e.g. onnxblock).

To reproduce
In a fresh virtual env run:
This will fail with
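For illustration, the failure mode (and the guard mentioned under Urgency below) can be simulated standalone. Every name in this sketch (fake_training, blocklike, optimlike, torch_stand_in_not_installed) is a hypothetical stand-in, not the real onnxruntime layout:

```python
# Standalone simulation (hypothetical names throughout): a package __init__
# that unconditionally imports an absent dependency makes every submodule
# import fail, even for submodules that never use that dependency.
import importlib
import os
import sys
import tempfile

pkg_root = tempfile.mkdtemp()
pkg = os.path.join(pkg_root, "fake_training")
os.makedirs(pkg)

# __init__.py pulls in a dependency that is not installed (torch stand-in).
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("import torch_stand_in_not_installed\n")

# A submodule that does not need the dependency (plays the role of onnxblock).
with open(os.path.join(pkg, "blocklike.py"), "w") as f:
    f.write("VALUE = 42\n")

sys.path.insert(0, pkg_root)
try:
    import fake_training.blocklike
    missing = None
except ModuleNotFoundError as exc:
    missing = exc.name  # the absent dependency, not the submodule we asked for

# The guard described under "Urgency": only expose the torch-dependent part
# when the dependency is importable.
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write(
        "import importlib.util\n"
        "__all__ = ['blocklike']\n"
        "if importlib.util.find_spec('torch_stand_in_not_installed'):\n"
        "    from . import optimlike  # torch-dependent part, skipped here\n"
        "    __all__.append('optimlike')\n"
    )

sys.modules.pop("fake_training", None)
importlib.invalidate_caches()
import fake_training.blocklike  # now succeeds without the heavy dependency
```

In onnxruntime.training itself, the equivalent guard would presumably check importlib.util.find_spec("torch") before importing the optim package, which matches the workaround described under Urgency.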
Urgency
Not too urgent; I have a workaround in place that patches the API in onnxruntime.training.__init__.py to only add the optim package if torch is available.

ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.19.2
PyTorch Version
not using PyTorch
Execution Provider
Default CPU
Execution Provider Library Version
No response