Pinning transformers to the older version breaks other custom nodes, particularly ones which depend on onnxruntime-gpu. In particular, downgrading transformers to 4.26.1 results in onnxruntime defaulting to the CPU instead of the GPU, slowing down the GPU-dependent processes of other nodes.
I'm not actually certain the transformers downgrade itself is what causes onnxruntime-gpu to fail. As a workaround for now, I reinstall onnxruntime-gpu after this module has been installed.
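The workaround described above can be sketched as the following shell commands (a sketch, assuming a standard pip environment; the key point is that the GPU wheel is reinstalled last, after the node suite's requirements have run):

```shell
# Remove whichever onnxruntime build the requirements install left behind,
# then put the GPU build back so it is the active one.
pip uninstall -y onnxruntime onnxruntime-gpu
pip install onnxruntime-gpu

# Verify the active device afterwards:
python -c "import onnxruntime as ort; print(ort.get_device())"
```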
Steps to reproduce:
pip install onnxruntime-gpu
git clone https://github.com/WASasquatch/was-node-suite-comfyui
cd was-node-suite-comfyui && pip install -r requirements.txt
python -c "import onnxruntime as ort; print(ort.get_device())"
Expected results:
Should output
GPU
Actual results:
Outputs
CPU
instead, causing tasks such as OpenPose detection for DWPose to run on the CPU, as well as many other tasks from other nodes that depend on the GPU.
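A quick way to see why the device falls back is to list the execution providers the installed build actually exposes. This is a sketch: it assumes some onnxruntime build (CPU-only or GPU) may or may not be present, and guards the import accordingly. If the requirements install replaced onnxruntime-gpu with the CPU wheel, "CUDAExecutionProvider" will be missing from the list.

```python
import importlib.util


def available_providers():
    """Return onnxruntime's execution providers, or [] if it isn't installed."""
    if importlib.util.find_spec("onnxruntime") is None:
        return []
    import onnxruntime as ort
    return ort.get_available_providers()


if __name__ == "__main__":
    providers = available_providers()
    print(providers)
    if "CUDAExecutionProvider" not in providers:
        print("WARNING: onnxruntime will fall back to the CPU provider")
```

`get_available_providers()` is more informative than `get_device()` here, since it shows whether the CUDA provider was registered at all rather than just which device won.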