different-ai / file-organizer-2000

AI-powered organization and chat assistant for Obsidian
https://fileorganizer2000.com
MIT License
261 stars 23 forks

What are possible values for local llm with ollama. #145

Open Malli88 opened 2 months ago

Malli88 commented 2 months ago

Hi, and first of all thank you for the superb plugin. It's just awesome! Could you please give a little more documentation about the local LLM configuration? Specifically, what are the possible values for the "LLM Model" field? I use llama3:70B and couldn't get it up and running. And what do you mean by the hint about the Ollama Origins?

Thanks a lot!
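For reference, the model name generally has to match a model that Ollama has pulled locally, and the "Origins" hint most likely refers to Ollama's `OLLAMA_ORIGINS` environment variable, which controls which browser/app origins may call the local server. A sketch, assuming a default Ollama install (the `app://obsidian.md` origin is an assumption based on Obsidian being an Electron app; check what your setup actually reports):

```shell
# List locally available models; the value for "LLM Model" should match
# one of these names exactly (including the tag, e.g. "llama3:70b").
ollama list

# Pull the model first if it is not listed yet.
ollama pull llama3:70b

# Widen the allowed origins before starting the server so requests from
# Obsidian are not rejected by CORS. "*" allows any origin (use with care).
OLLAMA_ORIGINS="app://obsidian.md" ollama serve
```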

muhammadammarzahid commented 3 weeks ago

Can you please guide me on how to use a local installation of Ollama? From my understanding, I should use the following environment:

environment:
# Uncomment lines below for a fully local setup
MODEL_FOLDERS=llama3
MODEL_RELATIONSHIPS=llama3
MODEL_TAGGING=llama3
MODEL_NAME=llama3
MODEL_TEXT=llama3
MODEL_VISION=llava-llama3

# The fastest way to get started is just to add your OpenAI API key to the .env file.
# OPENAI_API_KEY="sk"

But requests always look for an OPENAI_API_KEY. Do I need to start the Ollama server myself and make any changes to the configuration?
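Whether the plugin's server still insists on an OPENAI_API_KEY is a question for the maintainers, but on the Ollama side the server does need to be running and the models pulled before anything can connect to it. A minimal sketch, assuming a default install listening on port 11434:

```shell
# Start the Ollama server (listens on http://localhost:11434 by default).
ollama serve &

# Pull the models referenced in the environment file above.
ollama pull llama3
ollama pull llava-llama3

# Verify the API responds and lists the pulled models before pointing
# the plugin at it.
curl http://localhost:11434/api/tags
```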