huggingface / transformers.js

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
https://huggingface.co/docs/transformers.js
Apache License 2.0

BUG: AutoModel.from_pretrained(modelName) #1031

Open NarwhalChen opened 2 days ago

NarwhalChen commented 2 days ago

System Info

When using AutoModel.from_pretrained(modelName), it fails with the error message Error: Could not locate file: "https://huggingface.co/google-bert/bert-base-uncased/resolve/main/onnx/model.onnx". There is nothing at this URL, and I think the bug is in the file src/transformers/utils/hub.py, line 149: HUGGINGFACE_CO_PREFIX = HUGGINGFACE_CO_RESOLVE_ENDPOINT + "/{model_id}/resolve/{revision}/(unknown)"
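For reference, the URL in the error message is assembled from a model id, a revision, and a file path, matching the template quoted above. A minimal sketch of that assembly, using a hypothetical resolveHubUrl helper (not part of the library's API):

```typescript
// Hypothetical helper illustrating how the failing URL is assembled
// from the pieces in the template above (model id, revision, file path).
function resolveHubUrl(modelId: string, revision: string, filename: string): string {
  return `https://huggingface.co/${modelId}/resolve/${revision}/${filename}`;
}

console.log(resolveHubUrl("google-bert/bert-base-uncased", "main", "onnx/model.onnx"));
// → https://huggingface.co/google-bert/bert-base-uncased/resolve/main/onnx/model.onnx
```

Note that the URL itself is well-formed; the 404 occurs because the repo simply has no onnx/ folder.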

Who can help?

No response

Information

Tasks

Reproduction

async function loadModel(modelName: string) {
    console.log(modelName);
    try {
        const tokenizer = await AutoTokenizer.from_pretrained(modelName);
        const model = await AutoModel.from_pretrained(modelName);

        console.log(model);
        console.log(tokenizer);
        return { model, tokenizer };
    } catch (error) {
        console.log(error);
        return null;
    }
}

and the test reports:

   7 | try {
   8 |     const tokenizer = await AutoTokenizer.from_pretrained(modelName);
   9 |     const model = await AutoModel.from_pretrained(modelName);
     |                   ^
  10 |
  11 |     console.log(model);
  12 |     console.log(tokenizer);

  at getSession (node_modules/.pnpm/@huggingface+transformers@3.0.2/node_modules/@huggingface/transformers/dist/webpack:/@huggingface/transformers/src/models.js:178:1)
  at node_modules/.pnpm/@huggingface+transformers@3.0.2/node_modules/@huggingface/transformers/dist/webpack:/@huggingface/transformers/src/models.js:297:63
  at Array.map (<anonymous>)
  at constructSessions (node_modules/.pnpm/@huggingface+transformers@3.0.2/node_modules/@huggingface/transformers/dist/webpack:/@huggingface/transformers/src/models.js:296:1)
  at Function.from_pretrained (node_modules/.pnpm/@huggingface+transformers@3.0.2/node_modules/@huggingface/transformers/dist/webpack:/@huggingface/transformers/src/models.js:920:1)
  at Function.from_pretrained (node_modules/.pnpm/@huggingface+transformers@3.0.2/node_modules/@huggingface/transformers/dist/webpack:/@huggingface/transformers/src/models.js:6034:1)
  at loadModel (src/model/model.service.ts:9:23)
  at loadAllChatsModels (src/model/model.service.ts:28:31)
  at Object.<anonymous> (__tests__/loadAllChatsModels.spec.ts:23:9)

console.log Error: Could not locate file: "https://huggingface.co/google-bert/bert-base-uncased/resolve/main/onnx/model.onnx".

Expected behavior

Expected the model to be pulled from the correct URL.

LysandreJik commented 8 hours ago

Hey @NarwhalChen, do you mind sharing a fully reproducible code snippet that runs into this issue? Also, this seems to run in Node rather than Python?

LysandreJik commented 8 hours ago

cc @xenova, this seems linked to transformers.js rather than transformers

xenova commented 5 hours ago

Moved to Transformers.js repo 👍

@NarwhalChen The reason you are facing that error is that the checkpoint you are trying to load is not compatible with Transformers.js. Notice that the "transformers.js" tag is not present on the repo.

Fortunately, I have already made a conversion of this model, which can be found at https://huggingface.co/Xenova/bert-base-uncased (set modelName to "Xenova/bert-base-uncased").
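For anyone hitting the same error, one way to apply this workaround is to redirect known-incompatible checkpoints to their converted counterparts before calling from_pretrained. This is only a sketch: the CONVERTED table and pickCompatibleModel helper are hypothetical, and the only entry taken from this thread is Xenova/bert-base-uncased.

```typescript
// Hypothetical lookup of Transformers.js-compatible conversions for
// checkpoints that lack ONNX weights; only the bert entry comes from this thread.
const CONVERTED: Record<string, string> = {
  "google-bert/bert-base-uncased": "Xenova/bert-base-uncased",
};

// Return a repo id that Transformers.js can load, falling back to the input.
function pickCompatibleModel(modelName: string): string {
  return CONVERTED[modelName] ?? modelName;
}

console.log(pickCompatibleModel("google-bert/bert-base-uncased")); // → Xenova/bert-base-uncased
console.log(pickCompatibleModel("some-org/other-model"));          // → some-org/other-model
```

The mapped name can then be passed to AutoTokenizer.from_pretrained and AutoModel.from_pretrained in the loadModel function from the reproduction above.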