Open lancejpollard opened 3 months ago
I actually resolved the issue by updating Optimum to the latest version and keeping all other packages in requirements.txt the same.
I also get the error:

TypeError: quantize_dynamic() got an unexpected keyword argument 'optimize_model'

The `optimize_model` argument was removed in https://github.com/microsoft/onnxruntime/pull/16422 (merged June 21, 2023). (I am using onnxruntime 1.18.1, the current latest version.)
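One way to stay compatible across onnxruntime versions is to drop keyword arguments that the installed `quantize_dynamic` no longer accepts. The sketch below is illustrative, not from the thread: it uses a stand-in function so it runs without onnxruntime installed; with the real library you would pass `onnxruntime.quantization.quantize_dynamic` instead.

```python
import inspect

def call_with_supported_kwargs(func, *args, **kwargs):
    """Call func, silently dropping keyword arguments it does not accept."""
    params = inspect.signature(func).parameters
    # If the function accepts **kwargs, pass everything through unchanged.
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return func(*args, **kwargs)
    supported = {k: v for k, v in kwargs.items() if k in params}
    return func(*args, **supported)

# Stand-in for onnxruntime.quantization.quantize_dynamic (>= 1.16),
# which no longer has an `optimize_model` parameter.
def quantize_dynamic(model_input, model_output, per_channel=False):
    return {"input": model_input, "output": model_output, "per_channel": per_channel}

result = call_with_supported_kwargs(
    quantize_dynamic,
    "model.onnx",
    "model_quant.onnx",
    per_channel=True,
    optimize_model=True,  # silently dropped: not a parameter of the stand-in
)
print(result)
```

This keeps older calling code working on newer onnxruntime releases without a hard version pin.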
I just tried the v3 branch and upgraded onnxruntime to 1.18.1. The command `python -m scripts.convert --quantize --model_id bert-base-uncased` works for me with no problems on Windows.
System Info
Environment/Platform
Description
I am trying to run the model locally, since running a remote model in Node.js doesn't appear to work.
First I followed https://github.com/xenova/transformers.js/blob/main/scripts/convert.py (which is linked in the README). But the pinned `onnxruntime<1.16.0` does not seem to exist on pip. Can you update that script?
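For reference, one possible updated pin set might look like the following. These pins are my own assumption, not the script's actual contents, so adjust versions to match the repository:

```
transformers[torch]
onnx
onnxruntime==1.18.1
optimum
```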
Second, I tried just installing the latest versions of everything instead, by making this the `requirements.txt`:

But after I ran this:

I got an error:
Full stack trace:
It only seems to have output these files:
So then when I run my Node.js script (full script code at the bottom of the question in the SO link above), I get:
How do I get this working?
Reproduction
As described above.
1. Install the `requirements.txt` from the `convert.py` script linked in the README. It fails.
2. Install the latest pip packages, and run the `convert` script. It also fails.
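For completeness, the two reproduction attempts above roughly correspond to commands like these (repository layout and file paths are assumed from the README, not verbatim from the thread):

```shell
# Clone the repo and enter it (paths assumed).
git clone https://github.com/xenova/transformers.js.git
cd transformers.js

# Attempt 1: install the pinned requirements shipped with the convert script.
# Fails because the onnxruntime<1.16.0 pin is no longer installable from PyPI.
pip install -r scripts/requirements.txt

# Attempt 2: install the latest versions instead, then run the converter.
# Also fails, with the optimize_model TypeError on onnxruntime >= 1.16.
pip install transformers onnx onnxruntime optimum
python -m scripts.convert --quantize --model_id bert-base-uncased
```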