Open NarwhalChen opened 2 days ago
Hey @NarwhalChen, do you mind sharing a fully reproducible code snippet that runs into this issue? Also this seems to run in node rather than python?
cc @xenova, this seems linked to transformers.js rather than transformers
Moved to Transformers.js repo 👍
@NarwhalChen The reason you are facing that error is that the checkpoint you are trying to load is not compatible with Transformers.js. Notice that the "transformers.js" tag is not present on the repo.
Fortunately, I have made a conversion for this model already, which can be found at https://huggingface.co/Xenova/bert-base-uncased (set `modelName` to `"Xenova/bert-base-uncased"`).
System Info
When using `AutoModel.from_pretrained(modelName)`, I get the error message `Error: Could not locate file: "https://huggingface.co/google-bert/bert-base-uncased/resolve/main/onnx/model.onnx"`, but there is nothing at this URL. I think the bug is in `src/transformers/utils/hub.py`, line 149: `HUGGINGFACE_CO_PREFIX = HUGGINGFACE_CO_RESOLVE_ENDPOINT + "/{model_id}/resolve/{revision}/{filename}"`
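For reference, a small illustration of that URL template (the names mirror `hub.py`; this is a sketch of how the requested URL is assembled, not the library's actual code):

```javascript
// Illustration only: mirrors the HUGGINGFACE_CO_PREFIX template to show
// the exact URL the loader ends up requesting for this checkpoint.
const HUGGINGFACE_CO_RESOLVE_ENDPOINT = "https://huggingface.co";

function resolveUrl(modelId, revision, filename) {
  return `${HUGGINGFACE_CO_RESOLVE_ENDPOINT}/${modelId}/resolve/${revision}/${filename}`;
}

console.log(resolveUrl("google-bert/bert-base-uncased", "main", "onnx/model.onnx"));
// -> https://huggingface.co/google-bert/bert-base-uncased/resolve/main/onnx/model.onnx
```

The request fails because the `google-bert/bert-base-uncased` repo simply has no file at `onnx/model.onnx`, not because the template is wrong.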
Who can help?
No response
Information

Tasks

An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction
```ts
async function loadModel(modelName: string) {
  try {
    const tokenizer = await AutoTokenizer.from_pretrained(modelName);
    const model = await AutoModel.from_pretrained(modelName);
  } catch (error) {
    console.log(error);
  }
}
```

This logs `Error: Could not locate file: "https://huggingface.co/google-bert/bert-base-uncased/resolve/main/onnx/model.onnx"`.
Expected behavior
Expected the model to be pulled from the correct URL.