Open · yashhshah opened this issue 3 months ago
Hi yashhshah,
Looking at this, it sounds reasonable. We could easily make INTENT_MODEL_VERSION configurable via an env var. What is the flow you are looking to achieve with respect to downloading the intent model? Is it to predownload the files yourself and then point to / load from a specific on-disk path?
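For concreteness, a minimal sketch of what the env-var override could look like (the default string below is just a placeholder for whatever value is currently hardcoded at that line):

```python
# Sketch of a possible change in backend/shared_configs/configs.py:
# let an env var override the model identifier, falling back to the
# current default. "danswer/intent-model" is a placeholder here.
import os

INTENT_MODEL_VERSION = os.environ.get(
    "INTENT_MODEL_VERSION", "danswer/intent-model"
)
```

Since Hugging Face loaders generally accept a local directory wherever they accept a hub id, setting the env var to an on-disk path should then work without further code changes.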
Hello!
Love the product. We are currently trying to self-host it, and our corporate policy requires us to vendor any Hugging Face models we use. As a result, we would like to download the intent model ourselves and point the app at a local file path. Currently, INTENT_MODEL_VERSION is hardcoded to a Hugging Face path and will always make a network call (https://github.com/danswer-ai/danswer/blob/main/backend/shared_configs/configs.py#L17C2-L17C21). Happy to assist with a PR if this seems like a reasonable request.
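Roughly, this is the flow we have in mind, sketched under the assumption that INTENT_MODEL_VERSION becomes overridable as suggested above (the repo id and paths below are placeholders, not the real values from the config):

```python
# One-time, network-connected step to vendor the model files locally.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="danswer/intent-model",        # placeholder repo id
    local_dir="/opt/models/intent-model",  # example vendored location
)
print(f"Model vendored to: {local_dir}")

# At deploy time, with no outbound network access, we would point the
# app at the vendored copy, e.g.:
#   export INTENT_MODEL_VERSION=/opt/models/intent-model
#   export TRANSFORMERS_OFFLINE=1
```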