xenova / transformers.js

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
https://huggingface.co/docs/transformers.js
Apache License 2.0
9.87k stars · 582 forks

Transformers.js seems to need an internet connection when it shouldn't? (Error: no available backend found.) #685

Closed flatsiedatsie closed 2 months ago

flatsiedatsie commented 2 months ago

Question

What is the recommended way to get Transformers.js to work even when, later on, there is no internet connection?

Is it using a service worker? Or are there other (perhaps hidden) settings for managing caching of files?

I'm assuming here that the Error: no available backend found error message is related to Transformers.js not being able to find files once Wi-Fi has been turned off. I was a bit surprised by that, since I do see a cache called transformers-cache being created. Is that not caching all the required files?
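As context for the service-worker question above, a cache-first worker is the usual way to keep an app working offline. The sketch below is an illustrative assumption, not part of Transformers.js: the cache name, the `cacheFirst` helper, and the handler structure are all made up for the example.

```javascript
// sw.js — a minimal cache-first sketch for serving app assets offline.
// CACHE_NAME is an assumption; 'transformers-cache' is the separate cache
// that Transformers.js itself creates for model files.
const CACHE_NAME = 'app-shell-cache';

// Cache-first strategy, factored out so it is easy to test:
// return the cached response if present, otherwise fetch and cache it.
async function cacheFirst(cache, request, fetchFn) {
  const cached = await cache.match(request);
  if (cached) return cached; // served from cache, works offline
  const response = await fetchFn(request);
  if (response.ok) await cache.put(request, response.clone());
  return response;
}

// Only register the fetch handler inside a real service-worker scope.
if (typeof self !== 'undefined' && 'addEventListener' in self) {
  self.addEventListener('fetch', (event) => {
    event.respondWith(
      caches.open(CACHE_NAME).then((cache) =>
        cacheFirst(cache, event.request, (req) => fetch(req))
      )
    );
  });
}
```

Once a request has been served and cached while online, `cacheFirst` never touches the network for it again, which is the behavior the question is after.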

flatsiedatsie commented 2 months ago

Looking a bit further at which files are in transformers-cache (for a translation pipeline):

"https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/tokenizer_config.json": "transformers-cache",
    "https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/config.json": "transformers-cache",
    "https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/tokenizer.json": "transformers-cache",
    "https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/generation_config.json": "transformers-cache",
    "https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/onnx/encoder_model_quantized.onnx": "transformers-cache",
    "https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/onnx/decoder_model_merged_quantized.onnx": "transformers-cache",
    "https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/tokenizer_config.json": "transformers-cache",
    "https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/preprocessor_config.json": "transformers-cache",
    "https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/config.json": "transformers-cache",
    "https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/generation_config.json": "transformers-cache",
    "https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/tokenizer.json": "transformers-cache",
    "https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/onnx/encoder_model_quantized.onnx": "transformers-cache",
    "https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/onnx/decoder_model_merged_quantized.onnx": "transformers-cache",

The transformers.js file itself doesn't seem to be in that cache, only model files.

I'm assuming that's by design.
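The listing above was pulled from the browser's Cache API. A small helper like the one below can reproduce it; the helper name and the parameterized `cacheStorage` argument are assumptions made so the function is easy to test outside a browser.

```javascript
// List the URLs Transformers.js has stored in its 'transformers-cache'.
// Pass the global `caches` object (CacheStorage) in a browser.
async function listCachedUrls(cacheStorage, name = 'transformers-cache') {
  const cache = await cacheStorage.open(name); // CacheStorage.open()
  const requests = await cache.keys();         // Cache.keys() -> Request[]
  return requests.map((req) => req.url);
}
```

In a page this would be called as `await listCachedUrls(caches)`, which should print the same huggingface.co URLs shown above.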

jonathanpv commented 2 months ago

Hmm, I'm fairly new to web dev, but this information may help:

[screenshot]

jonathanpv commented 2 months ago

If you find anything out please share :) I'd like my app to be working offline too

flatsiedatsie commented 2 months ago

I actually did go the service worker route. Have a look at the long script at the bottom here.

But what I find strange is this situation:

But instead I get this error. So Transformers.js, or something it relies on, seems to need an internet connection when it shouldn't?

[screenshot: 2024-04-13 at 10:11:45]

flatsiedatsie commented 2 months ago

OMG, of course. It must be the env settings.

env.allowLocalModels = true;
// env.allowRemoteModels = false;

Th3G33k commented 2 months ago

@flatsiedatsie

You should try setting these env settings:

env.backends.onnx.wasm.wasmPaths
env.localModelPath 
env.useBrowserCache
env.allowRemoteModels

By default, the wasm files are downloaded from a CDN.
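Putting the settings from this thread together, a fully offline setup could look like the sketch below. The property names (`env.allowLocalModels`, `env.allowRemoteModels`, `env.localModelPath`, `env.useBrowserCache`, `env.backends.onnx.wasm.wasmPaths`) are the ones named in this thread; the paths and values are assumptions for illustration.

```javascript
import { env, pipeline } from '@xenova/transformers';

env.allowLocalModels = true;     // look for models under localModelPath
env.allowRemoteModels = false;   // never reach out to huggingface.co
env.localModelPath = '/models/'; // assumed path where models are self-hosted
env.useBrowserCache = true;      // keep fetched files in the Cache API

// Serve the ONNX Runtime wasm files yourself instead of the default CDN:
env.backends.onnx.wasm.wasmPaths = '/wasm/';

const translator = await pipeline('translation', 'Xenova/opus-mt-nl-en');
```

With `allowRemoteModels` off and the wasm files self-hosted, nothing in the pipeline should require an internet connection after the first load.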

If you are using the browser cache, make sure no 404 error pages have been cached. Otherwise, requests for the JSON and ONNX files will not fall back to huggingface.co and will keep returning the cached error pages.
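A cached 404 can be cleared with the Cache API. The helper below is a hedged sketch (the function name and the parameterized `cacheStorage` argument are assumptions): it walks the cache and deletes any non-OK responses so the next request goes back to the network.

```javascript
// Remove cached error responses (e.g. 404 pages) from transformers-cache,
// so a bad cached entry does not shadow the real file on huggingface.co.
// Pass the global `caches` object (CacheStorage) in a browser.
async function evictErrorResponses(cacheStorage, name = 'transformers-cache') {
  const cache = await cacheStorage.open(name);
  let evicted = 0;
  for (const request of await cache.keys()) {
    const response = await cache.match(request);
    if (response && !response.ok) { // status outside 200-299, e.g. 404
      await cache.delete(request);
      evicted++;
    }
  }
  return evicted; // number of bad entries removed
}
```

In a page this would be `await evictErrorResponses(caches)`, run once before retrying the pipeline.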

flatsiedatsie commented 2 months ago

@Th3G33k Thanks! That's very helpful. It might help me squash this wrinkle:

[screenshot: 2024-04-13 at 11:52:04]

flatsiedatsie commented 2 months ago

I'll actually close this for now, since the original question is answered.

MarketingPip commented 1 month ago

@flatsiedatsie - though this issue is closed and I don't need it myself, I am VERY IMPRESSED with your last comment helping other users and developers facing issues like this. (I hate when people don't summarize things like this for other developers.) Keep up the good work.