xenova / transformers.js

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
https://huggingface.co/docs/transformers.js
Apache License 2.0

Example not working on Chrome/Arc v.124(M1 Mac) #764

Closed: justrach closed this issue 1 month ago

justrach commented 1 month ago

System Info

M1 Mac, macOS Sonoma, using the example code.

Environment/Platform

Description

Using the repository from https://github.com/xenova/transformers.js/tree/v3/examples/webgpu-chat with "@xenova/transformers": "github:xenova/transformers.js#v3" as the dependency, the app gets stuck on "...loading model...".
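
For context, a minimal sketch of the kind of setup the example uses, assuming the v3 pipeline API; the model id and options below are illustrative rather than copied from webgpu-chat:

```js
// package.json pulls the v3 branch straight from GitHub, as quoted above:
//   "dependencies": { "@xenova/transformers": "github:xenova/transformers.js#v3" }
import { pipeline } from '@xenova/transformers';

// In the v3 branch a pipeline can be asked to run on WebGPU. The hang reported
// here happens while the model files are being fetched and initialized.
const generator = await pipeline(
  'text-generation',
  'Xenova/Phi-3-mini-4k-instruct_fp16', // illustrative model id
  {
    device: 'webgpu',
    progress_callback: (p) => console.log(p), // surfaces the "loading model" progress
  },
);
```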

On the Hugging Face demo, here is the console output:

(screenshot of the console output on the Hugging Face demo)

Meanwhile, on the local page, here is what the output looks like:

(screenshot of the console output on the local page)

Reproduction

  1. Clone the repo from https://github.com/xenova/transformers.js/tree/v3/examples/webgpu-chat
  2. Run npm i, then npm run dev.
  3. Open the local URL in the browser.
  4. Click "Load Model".
  5. The model gets stuck on "...loading...".
xenova commented 1 month ago

Duplicate of https://github.com/xenova/transformers.js/issues/748.

xenova commented 1 month ago

It should now work (commits: here and here)!

justrach commented 1 month ago

Thanks! Do you have a page with documentation on how to convert ONNX models to ONNX-Web models, by any chance?

xenova commented 1 month ago

Sure, we use Optimum, and you can find additional information there.

You can also use our helper conversion script, which also helps with quantization.
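
For later readers, a rough sketch of what invoking the helper script looks like, assuming the scripts/convert.py entry point described in the transformers.js README (the model id is a placeholder, so double-check the flags against the current docs):

```bash
# Export (and optionally quantize) a Hugging Face model to ONNX for use with transformers.js.
# Run from the root of the transformers.js repository.
python -m scripts.convert --quantize --model_id <model_name_or_path>
```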

justrach commented 1 month ago

> Sure, we use Optimum, and you can find additional information there.
>
> You can also use our helper conversion script, which also helps with quantization.

Thanks Joshua!