I chose the multilingual embedding model from those provided in the constants.py file: EMBEDDING_MODEL_NAME = "intfloat/multilingual-e5-large"  # Uses 2.5 GB of VRAM
When I run run_localGPT_v2.py, I get the error mentioned above. I have no idea how to fix it so that I can use this model.