Closed flatsiedatsie closed 2 months ago
Looking a bit further at which files are in `transformers-cache` (for a translation pipeline):
"https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/tokenizer_config.json": "transformers-cache",
"https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/config.json": "transformers-cache",
"https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/tokenizer.json": "transformers-cache",
"https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/generation_config.json": "transformers-cache",
"https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/onnx/encoder_model_quantized.onnx": "transformers-cache",
"https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/onnx/decoder_model_merged_quantized.onnx": "transformers-cache",
"https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/tokenizer_config.json": "transformers-cache",
"https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/preprocessor_config.json": "transformers-cache",
"https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/config.json": "transformers-cache",
"https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/generation_config.json": "transformers-cache",
"https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/tokenizer.json": "transformers-cache",
"https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/onnx/encoder_model_quantized.onnx": "transformers-cache",
"https://huggingface.co/Xenova/whisper-tiny.en/resolve/main/onnx/decoder_model_merged_quantized.onnx": "transformers-cache",
The `transformers.js` file itself doesn't seem to be in that cache; only the models are.
I'm assuming that's by design.
Hmm, I'm fairly new to web dev, but maybe this information helps?
If you find anything out, please share :) I'd like my app to work offline too.
I actually did go that serviceworker route. Have a look at the long script at the bottom here.
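For reference, the service-worker route can be sketched roughly like this: a minimal cache-first worker. The cache name `offline-cache-v1`, the helper name, and the "don't store error responses" guard are illustrative, not taken from the script referenced above:

```javascript
// sw.js — minimal cache-first service worker sketch.
const OFFLINE_CACHE = "offline-cache-v1"; // illustrative cache name

// Serve from cache when possible; otherwise fetch and store a copy.
async function cacheFirst(request) {
  const cache = await caches.open(OFFLINE_CACHE);
  const hit = await cache.match(request);
  if (hit) return hit;
  const response = await fetch(request);
  if (response.ok) cache.put(request, response.clone()); // never store 404 pages
  return response;
}

// Only register the handler when actually running inside a service worker.
if (typeof self !== "undefined" && typeof caches !== "undefined"
    && typeof self.addEventListener === "function") {
  self.addEventListener("fetch", (event) => {
    event.respondWith(cacheFirst(event.request));
  });
}
```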
But what I find strange is this situation:
localhost.dd/project/js
But instead I get this error. So Transformers.js - or something it relies on - seems to need an internet connection when it shouldn't?
OMG, of course. It must be the env settings.
```javascript
env.allowLocalModels = true;
// env.allowRemoteModels = false;
```
@flatsiedatsie
You should try setting these `env` settings:
- `env.backends.onnx.wasm.wasmPaths`
- `env.localModelPath`
- `env.useBrowserCache`
- `env.allowRemoteModels`
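A sketch of what an offline-friendly configuration of those settings might look like. In a real app these fields are set on the `env` object imported from `@xenova/transformers`; it is shown here on a plain object so the shape is visible without the package installed, and the paths are assumptions about where you self-host the files:

```javascript
// In a real app:  import { env } from '@xenova/transformers';
// Plain object here so the field shape is visible; paths are assumptions.
const env = {
  allowLocalModels: true,
  allowRemoteModels: false,   // never fall back to huggingface.co
  localModelPath: "/models/", // serve the model folders yourself
  useBrowserCache: true,      // keep using the browser Cache Storage
  backends: {
    onnx: {
      wasm: { wasmPaths: "/wasm/" }, // self-hosted WASM instead of the CDN
    },
  },
};

console.log(env.allowRemoteModels); // false
```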
By default, the WASM file is downloaded from a CDN.
If you are using the cache, make sure no 404 error page has been cached. Otherwise, for the JSON and ONNX files, it will not redirect to huggingface.co and will keep retrieving the cached error pages.
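The cached-404 problem described above can be handled by sweeping the cache for stored error responses. A sketch, assuming the `transformers-cache` name from earlier; `evictErrorResponses` is a hypothetical helper:

```javascript
// Remove any cached error responses (e.g. stored 404 pages) so that
// Transformers.js can re-fetch the real files. Hypothetical helper.
async function evictErrorResponses(cacheName) {
  if (typeof caches === "undefined") return 0; // Cache API is browser-only
  const cache = await caches.open(cacheName);
  let evicted = 0;
  for (const request of await cache.keys()) {
    const response = await cache.match(request);
    if (response && !response.ok) { // a stored 404/500 page, not a real file
      await cache.delete(request);
      evicted++;
    }
  }
  return evicted;
}

evictErrorResponses("transformers-cache").then((n) =>
  console.log(`removed ${n} bad entries`)
);
```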
@Th3G33k Thanks! That's very helpful. It might help me squash this wrinkle:
I'll close this for now actually, since the original question is answered: caching can be managed through the `cache.add()` functionality, or by implementing a service worker. The `env` options did the trick, and in the code I copied 'from the internet' there was a setting that disabled grabbing files locally. Doh.

@flatsiedatsie - though this issue is closed and I do not need it any more, I am VERY IMPRESSED with your last comment helping other users / developers who face issues like this. (I hate it when people do not summarize things like this for other developers.) Keep up the good work.
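The `cache.add()` route mentioned above can be sketched as a small pre-caching step, here using `cache.addAll` (which fetches each URL and stores it, rejecting if any response is not OK). The file list mirrors the cache dump earlier in the thread, and the Cache API guard makes the sketch browser-only:

```javascript
// Pre-populate the cache so the files are available offline later.
// URLs taken from the transformers-cache listing above.
const PRECACHE_URLS = [
  "https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/config.json",
  "https://huggingface.co/Xenova/opus-mt-nl-en/resolve/main/tokenizer.json",
];

if (typeof caches !== "undefined") {
  caches.open("transformers-cache").then((cache) =>
    // addAll fetches and stores every URL; it rejects on any non-2xx response,
    // so a 404 page never ends up in the cache this way.
    cache.addAll(PRECACHE_URLS)
  );
}

console.log(PRECACHE_URLS.length); // 2
```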
Question
What is the recommended way to get Transformers.js to work even when, later on, there is no internet connection?
Is it using a service worker? Or are there other (perhaps hidden) settings for managing caching of files?
I'm assuming here that the `Error: no available backend found` error message is related to Transformers.js not being able to find the files once Wi-Fi has been turned off. I was a bit surprised by that, since I do see a cache called `transformers-cache` being created. Is that not caching all the required files?