xenova / transformers.js

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
https://huggingface.co/docs/transformers.js
Apache License 2.0

"intfloat / multilingual-e5-large" model not working in Node.js #938

Open marcosr-diipai opened 5 days ago

marcosr-diipai commented 5 days ago

System Info

Transformers.js: 2.17.2
"@nestjs/common": "^10.0.0", "@nestjs/config": "^3.2.3", "@nestjs/core": "^10.0.0", "@nestjs/platform-express": "^10.0.0"

node v20.17.0

Environment/Platform

Description

When running the code with the model "intfloat/multilingual-e5-large", I get the following error:

Error: Exception during initialization: /onnxruntime_src/onnxruntime/core/optimizer/initializer.cc:30 onnxruntime::Initializer::Initializer(const onnx::TensorProto&, const onnxruntime::Path&) !model_path.IsEmpty() was false. model_path must not be empty. Ensure that a path is provided when the model is created or loaded.

When using "Xenova/multilingual-e5-large" instead, it does not give the same results as "intfloat/multilingual-e5-large" does with Python.

Reproduction

  1. Install Transformers.js in a Node.js application
  2. Run an AutoTokenizer and AutoModel with "intfloat/multilingual-e5-large"

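The steps above can be sketched as follows. This is a minimal sketch assuming Transformers.js 2.x (the `@xenova/transformers` package); the `reproduce` helper name is hypothetical, and the snippet is wrapped in a function rather than executed, since loading the model is what triggers the error.

```javascript
// Hypothetical reproduction helper (not invoked here): loading the
// original "intfloat/multilingual-e5-large" repo in Node.js fails
// during ONNX Runtime initialization.
async function reproduce() {
  const { AutoTokenizer, AutoModel } = await import('@xenova/transformers');

  // Tokenizer loading works fine.
  const tokenizer = await AutoTokenizer.from_pretrained('intfloat/multilingual-e5-large');

  // This call throws:
  //   Error: Exception during initialization: ... model_path must not be empty.
  const model = await AutoModel.from_pretrained('intfloat/multilingual-e5-large');

  return { tokenizer, model };
}
```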
DavidGOrtega commented 3 days ago

Might this be related to https://github.com/xenova/transformers.js/issues/941?

xenova commented 2 days ago

This is because we don't currently support the external data format in Node.js (support will be added soon).

If you are able to, it will work if you use the fp16 or q8 variants (by specifying dtype: 'fp16' or dtype: 'q8').
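A minimal sketch of that workaround, assuming the v3 `@huggingface/transformers` package where the `dtype` option is available (in 2.17.2 the package is `@xenova/transformers` and `dtype` is not supported). The `embed` and `cosineSimilarity` names are hypothetical helpers for illustration; `cosineSimilarity` can be used to compare a Node.js embedding against the Python one.

```javascript
// Hypothetical helper (a sketch, not run here): requests the quantized
// weights so the full-precision external-data files are never loaded,
// then returns a mean-pooled, L2-normalized embedding.
async function embed(text, dtype = 'q8') {
  const { pipeline } = await import('@huggingface/transformers');
  const extractor = await pipeline(
    'feature-extraction',
    'Xenova/multilingual-e5-large',
    { dtype }, // 'q8' or 'fp16', as suggested above
  );
  const output = await extractor(text, { pooling: 'mean', normalize: true });
  return Array.from(output.data);
}

// Cosine similarity between two embedding vectors, e.g. to quantify
// how close the Node.js output is to the Python output.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

On the result mismatch versus Python: E5 models expect a "query: " or "passage: " prefix on the input text, and the reference implementation mean-pools and normalizes the hidden states, so differences there (or quantization of the q8 variant) may explain small discrepancies.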