Open mram0509 opened 5 months ago
Question

When I set `env.allowLocalModels = true` and inspect the `env` object, I see both `env.allowLocalModels` and `env.allowRemoteModels` set to `true`. Does this mean it will look for models locally first and then, if not found, go to the remote host?

Hi there 👋 That's exactly correct! Like the Python library, we first look for the model locally before attempting to download it from the Hugging Face Hub. If the model is present locally, we use it; if it is not, we fall back to the Hugging Face Hub.

For anyone stumbling over this: local models are loaded automatically from e.g. `public/models/` (see `localModelPath` at https://huggingface.co/docs/transformers.js/api/env). That's what caused the `Unexpected token '<', "<!DOCTYPE "... is not valid JSON` error for me — the local lookup returned the app's HTML index page instead of a model file.
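For reference, the local-first lookup can be sketched as configuration (a minimal sketch assuming the `@xenova/transformers` package; the model name and path below are illustrative, and defaults may differ between library versions):

```javascript
import { env, pipeline } from '@xenova/transformers';

// Allow both sources: local files are checked first,
// then the Hugging Face Hub is used as a fallback.
env.allowLocalModels = true;
env.allowRemoteModels = true;

// Directory (relative to your served app) where local models live,
// e.g. public/models/ in many bundler setups. Illustrative value.
env.localModelPath = '/models/';

// If '/models/Xenova/all-MiniLM-L6-v2/' exists locally, it is used;
// otherwise the files are fetched from the Hugging Face Hub.
const extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');
```

Note that if `localModelPath` points at a directory your server doesn't actually serve, many dev servers respond with the app's `index.html`, which produces exactly the `"<!DOCTYPE "... is not valid JSON` error mentioned above.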