huggingface / transformers.js

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
https://huggingface.co/docs/transformers.js
Apache License 2.0

Error: Can't create a session #502

Closed wizardAEI closed 9 months ago

wizardAEI commented 10 months ago

System Info

OS: macOS (arm)
Node: v18.18.2
Electron: 28.0.0
electron-vite: ^1.0.27

Environment/Platform

Electron (main process)

Description

Hello! I want to use transformers.js in an Electron application, but I've run into an issue. Could you please take a look?

The error occurs when loading an ONNX file in the main process of Electron.

Reproduction

When I try to load the model:

export async function embedding(text: string) {
  const { AutoTokenizer, CLIPTextModelWithProjection, env } = await import('@xenova/transformers')
  env.localModelPath = getResourcesPath('models')
  env.backends.onnx.wasm.numThreads = 1
  env.cacheDir = getResourcesPath('cache')
  let tokenizer = await AutoTokenizer.from_pretrained('Xenova/bert-base-chinese')
  const text_model = await CLIPTextModelWithProjection.from_pretrained('Xenova/bert-base-chinese', {
    model_file_name: 'model'
  })
  // Run tokenization
  let text_inputs = tokenizer([text], { padding: true, truncation: true })
  // Compute embeddings
  const res = await text_model(text_inputs)
  console.log(res.logits.data)
}

Here is the error:

Error: Can't create a session
    at e.createSessionFinalize (/Users/wangdejiang/Desktop/Gomoon/Gomoon/node_modules/onnxruntime-web/dist/ort-web.node.js:6:450535)
    at e.createSession (/Users/wangdejiang/Desktop/Gomoon/Gomoon/node_modules/onnxruntime-web/dist/ort-web.node.js:6:451133)
    at e.createSession (/Users/wangdejiang/Desktop/Gomoon/Gomoon/node_modules/onnxruntime-web/dist/ort-web.node.js:6:443359)
    at e.OnnxruntimeWebAssemblySessionHandler.loadModel (/Users/wangdejiang/Desktop/Gomoon/Gomoon/node_modules/onnxruntime-web/dist/ort-web.node.js:6:446253)
    at async Object.createSessionHandler (/Users/wangdejiang/Desktop/Gomoon/Gomoon/node_modules/onnxruntime-web/dist/ort-web.node.js:6:156051)
    at async m.create (/Users/wangdejiang/Desktop/Gomoon/Gomoon/node_modules/onnxruntime-common/dist/ort-common.node.js:6:11924)
    at async constructSession (file:///Users/wangdejiang/Desktop/Gomoon/Gomoon/node_modules/@xenova/transformers/src/models.js:143:16)
    at async Promise.all (index 1)
    at async BertModel.from_pretrained (file:///Users/wangdejiang/Desktop/Gomoon/Gomoon/node_modules/@xenova/transformers/src/models.js:785:20)
    at async AutoModel.from_pretrained (file:///Users/wangdejiang/Desktop/Gomoon/Gomoon/node_modules/@xenova/transformers/src/models.js:4993:20)

Node.js v18.18.2
wizardAEI commented 10 months ago

Additionally, I'm confused about why the error comes from '/node_modules/onnxruntime-web' instead of onnxruntime-node.

xenova commented 10 months ago

Hi there 👋 Have you tried our example Electron app template (link)? Also, can you confirm whether the code is running in the main process or the renderer process?
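
(As a rough sketch, not part of the template: Electron exposes process.type, which is 'browser' in the main process and 'renderer' in renderer processes, so a couple of log lines before any model is loaded will show where the code runs.)

// Rough sketch: log which Electron process is executing this code.
// process.type is 'browser' in the main process and 'renderer' in renderers.
console.log('Electron process type:', process.type)
console.log('Electron version:', process.versions.electron)
console.log('Node version:', process.versions.node)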

wizardAEI commented 10 months ago

I just tried it on the example Electron template and got the same error. Could it be that I'm doing something wrong? [screenshot]

xenova commented 9 months ago

The onnxruntime backend (web or node) is selected using this line: https://github.com/xenova/transformers.js/blob/07df34ff3308cf3b1ab830a547bd9bcf22869783/src/backends/onnx.js#L32

Could you manually check the values used there?
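
Roughly paraphrased (not the exact source), the selection comes down to an environment check along these lines:

// Paraphrased sketch of the backend selection (not the exact source):
// when the runtime looks like Node.js, onnxruntime-node is used;
// otherwise the WASM build from onnxruntime-web is used.
const isNode = typeof process !== 'undefined' && process?.release?.name === 'node'
const selectedBackend = isNode ? 'onnxruntime-node' : 'onnxruntime-web'
console.log('Selected ONNX backend:', selectedBackend)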

wizardAEI commented 9 months ago

I can't add a console.log there, but I commented out part of the code and still got errors. [screenshot] I also printed the value of process?.release?.name at the point where the function is called, and it is 'node'. [screenshot]

wizardAEI commented 9 months ago

I also added a log in /node_modules/@xenova/transformers/src/utils/hub.js:

[screenshot]

It prints 'node' in the terminal.

xenova commented 9 months ago

I just tried it on the example Electron template and got the same error. Could it be that I'm doing something wrong?

Are you saying that without any modifications to the example template, you are experiencing these issues?

wizardAEI commented 9 months ago

I just tried it on the example Electron template and got the same error. Could it be that I'm doing something wrong?

Are you saying that without any modifications to the example template, you are experiencing these issues?

No, I made some changes, mainly to load a local model. All the modifications are annotated in the screenshot above. (I added two lines of code and several files, all highlighted in red boxes.)

And the error only occurs when I try to load an ONNX file.

xenova commented 9 months ago

Can you confirm that the path does indeed map to the correct location? The default local path is /models/, so perhaps you are missing the trailing slash? Also, if you are able to add debug statements, could you try to see what paths are used here:

https://github.com/xenova/transformers.js/blob/07df34ff3308cf3b1ab830a547bd9bcf22869783/src/utils/hub.js#L331
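
As a rough sketch (getResourcesPath is the helper from the snippet above, and the onnx/ subfolder and file name are assumptions that depend on the quantized option), something like this in the main process can confirm that the resolved file exists and is non-empty:

import fs from 'node:fs'
import path from 'node:path'

// Rough sanity check: confirm the resolved model file exists and is non-empty
// before onnxruntime tries to create a session. The directory layout and file
// name below are assumptions and may differ in your setup.
const modelDir = path.join(getResourcesPath('models'), 'Xenova/bert-base-chinese', 'onnx')
const modelFile = path.join(modelDir, 'model.onnx') // or model_quantized.onnx
console.log('Looking for model at:', modelFile)
console.log('Exists:', fs.existsSync(modelFile))
if (fs.existsSync(modelFile)) {
  console.log('Size (bytes):', fs.statSync(modelFile).size)
}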

wizardAEI commented 9 months ago

It's the right path: [screenshot]

wizardAEI commented 9 months ago

I'm really sorry, I just noticed that my onnx file somehow became an empty file at some point.
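
(In hindsight, a small guard like this sketch would have surfaced the problem much earlier than the opaque session error; the path below is a placeholder.)

import fs from 'node:fs'

// Small guard (sketch): fail early with a clear message if the ONNX file is
// missing or empty, instead of letting onnxruntime fail with
// "Can't create a session". The path below is a placeholder.
const modelFile = '/path/to/local/model.onnx'
if (!fs.existsSync(modelFile) || fs.statSync(modelFile).size === 0) {
  throw new Error(`ONNX model file is missing or empty: ${modelFile}`)
}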

wizardAEI commented 9 months ago

I'll try again after re-downloading the onnx file.

xenova commented 9 months ago

No worries! 😇 Let's hope that's the fix! 🤞

wizardAEI commented 9 months ago

I got it running!

I'm really sorry that my negligence caused such a long delay!

Thank you very much. [screenshot]

xenova commented 9 months ago

Great! 🥳 Absolutely no worries :) Glad it's sorted 🚀