amaiya / onprem

A tool for running on-premises large language models with non-public data
https://amaiya.github.io/onprem
Apache License 2.0

ssl issue to get embedding model files for local usage of onprem #56

Closed: ringo70 closed this 7 months ago

ringo70 commented 7 months ago

First of all thank you for this great tool and a happy new year!

I am able to download the model using 'ssl_verify=False'; however, the issue I am running into is that I can't do the same for the embedding model files, i.e. when using this:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer('sentence-transformers/all-MiniLM-L6-v2')
model.save('/some/folder')

... I get:

SSLError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/sentence-transformers/all-MiniLM-L6-v2 (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate

I suppose this is not an issue with 'onprem' itself, but it does impair usage of the 'onprem' ingest function. Or am I misunderstanding how this is supposed to work?

Thanks and kind regards,

Ringo

amaiya commented 7 months ago

You can download the embedding model (used by LLM.ingest and LLM.ask) as follows:

wget --no-check-certificate https://public.ukp.informatik.tu-darmstadt.de/reimers/sentence-transformers/v0.2/all-MiniLM-L6-v2.zip
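If wget is not available, the same download and unzip can be done in Python with certificate verification turned off, mirroring --no-check-certificate (a minimal standard-library sketch; only the URL above comes from this answer, the file and folder names are placeholders):

import ssl
import zipfile
import urllib.request

url = ('https://public.ukp.informatik.tu-darmstadt.de/reimers/'
       'sentence-transformers/v0.2/all-MiniLM-L6-v2.zip')

# Disable certificate verification, equivalent to wget --no-check-certificate.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

# Download the archive and extract it to a local folder.
with urllib.request.urlopen(url, context=ctx) as resp, open('all-MiniLM-L6-v2.zip', 'wb') as f:
    f.write(resp.read())
with zipfile.ZipFile('all-MiniLM-L6-v2.zip') as zf:
    zf.extractall('all-MiniLM-L6-v2')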

Supply the path to the unzipped folder as the embedding_model_name argument to LLM.
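
Assuming the archive was extracted to ./all-MiniLM-L6-v2, passing the local folder to LLM would look roughly like this (LLM, ingest, and ask are the onprem entry points mentioned above; the documents folder and the question are placeholders):

from onprem import LLM

# Point onprem at the locally saved embedding model instead of letting it
# download from huggingface.co.
llm = LLM(embedding_model_name='./all-MiniLM-L6-v2')

# Ingest local documents and query them without reaching huggingface.co.
llm.ingest('./my_documents')
result = llm.ask('What do these documents say about the project?')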