brumik / ollama-obsidian-indexer

ollama-obsidian-indexer

Why the need for huggingface? #7

Closed intrepidsilence closed 7 months ago

intrepidsilence commented 7 months ago

I noticed that in llm.py there is the following call:

# Set up service context with our local llm and embedding
embed_model = HuggingFaceBgeEmbeddings(model_name="BAAI/bge-large-en-v1.5")

Why is this reaching out and setting up another model than the one specified in the env file? This prevents me from using it as I need everything to be local. Is there a way to fix/bypass this?

brumik commented 7 months ago

You can replace it with your own model; however, Hugging Face just downloads the embedding model and then runs it on your machine, so it's all local after setup. That said, it would be a good idea to make the embedding model name a configurable string so you can swap in whichever embedding you want from Hugging Face.
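For example, the hard-coded model name could be read from the env file like the other settings. This is only a sketch: the env-var name EMBED_MODEL_NAME is hypothetical, not something the repo currently defines.

```python
import os

# Hypothetical env-var name; adjust to whatever key the repo's .env uses.
# Falls back to the model currently hard-coded in llm.py.
EMBED_MODEL_NAME = os.getenv("EMBED_MODEL_NAME", "BAAI/bge-large-en-v1.5")

# The existing call in llm.py would then become:
# embed_model = HuggingFaceBgeEmbeddings(model_name=EMBED_MODEL_NAME)
```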

intrepidsilence commented 7 months ago

I have no idea how to:

  1. download my own model that is comparable to the Hugging Face model
  2. install it and make it available in a way that it can be used.

I am hoping to have a fully offline solution that never reaches out to the Internet on its own to pull anything. Can you help, or make that an option? Maybe adding docs to that effect would be a nice addition for those who do not know how to do this.

The system I am putting this on will never be able to reach out directly to HuggingFace to get the model.

brumik commented 7 months ago

@intrepidsilence Downloading a model from Hugging Face works exactly the same way as with Ollama. When this script first starts, it does something similar to ollama pull modelname. Then, when you create a new note, it does the same as ollama run modelname, but on your own hardware. I do not know how you intend to install an Ollama model on your system (which is a requirement for this script to run), but you can do the same with Hugging Face.
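For an air-gapped machine, one workaround is to fetch the model files once on a connected machine and copy them over. A rough sketch, assuming the huggingface_hub package is available (the local paths here are illustrative, not part of this repo):

```python
import os

# 1. On a machine WITH internet access, fetch the model files once:
#    from huggingface_hub import snapshot_download
#    snapshot_download("BAAI/bge-large-en-v1.5", local_dir="bge-large-en-v1.5")
#
# 2. Copy the resulting directory to the offline machine.

# 3. Point the embedding at the local copy instead of the hub name
#    (hypothetical path -- use wherever you copied the files):
local_model_path = os.path.expanduser("~/models/bge-large-en-v1.5")
# embed_model = HuggingFaceBgeEmbeddings(model_name=local_model_path)

# 4. Belt and braces: forbid any outbound calls from the HF libraries.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"
```

With the offline env vars set, the Hugging Face libraries will raise instead of silently attempting a network request, which is exactly what you want on a machine that must never phone home.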

intrepidsilence commented 7 months ago

Ollama and its models installed fine. I cannot install from Hugging Face. Is there a workaround you are willing to help with?

brumik commented 7 months ago

What do you mean "cannot"? I can offer limited support for this. If you could elaborate what is the exact issue I might advise.

intrepidsilence commented 7 months ago

The site is blocked.
