clscott opened this issue 1 year ago
Please, this would take it to another level!
Has my vote if it also uses LiteLLM, so it can support many more models as new ones come out faster.
@lzumot thanks for mentioning litellm - I'm the maintainer of LiteLLM
```shell
$ litellm --model ollama/codellama --temperature 0.3 --max_tokens 2048
```
Now just set the OpenAI API base and you can use Ollama with this project:
```python
import openai

openai.api_base = "http://0.0.0.0:8000"  # the LiteLLM proxy started above
openai.api_key = "anything"  # the client requires a key; the proxy ignores it
print(openai.ChatCompletion.create(model="test", messages=[{"role": "user", "content": "Hey!"}]))
```
This would be awesome! Any updates on when it could be incorporated into the plugin? Love your work
Ollama has been integrated, and the recent addition of OpenAI-compatible APIs to Ollama should make integration more straightforward for all future work. Once embeddings are also integrated into Ollama, this will be possible to do locally in a straightforward manner too.
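For anyone who wants to try that OpenAI-compatible route, here is a minimal sketch, assuming openai>=1.0 and an Ollama server running on its default port (the model name is illustrative):

```python
# Minimal sketch: point the openai client at Ollama's OpenAI-compatible API.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's default host/port
    api_key="ollama",  # required by the client, ignored by Ollama
)
response = client.chat.completions.create(
    model="codellama",  # any model already pulled with `ollama pull`
    messages=[{"role": "user", "content": "Hey!"}],
)
print(response.choices[0].message.content)
```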
This is a really great local LLM backend that works on a lot of platforms (including Intel Macs) and is basically a one-click install.
Main site: https://ollama.ai/
API docs: https://github.com/jmorganca/ollama/blob/main/docs/api.md
Article about indexing an Obsidian vault: https://ollama.ai/blog/llms-in-obsidian
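To give a flavor of the indexing idea in that article, here is a hypothetical sketch against the embeddings endpoint described in Ollama's API docs; the model name, note contents, and query below are illustrative assumptions, not code from the article:

```python
# Hypothetical sketch: embed a few notes locally via Ollama's /api/embeddings
# endpoint, then rank them against a query with cosine similarity.
import json
import math
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/embeddings"

def embed(text: str, model: str = "llama2") -> list[float]:
    # POST {"model": ..., "prompt": ...} and read back {"embedding": [...]}.
    payload = json.dumps({"model": model, "prompt": text}).encode()
    req = urllib.request.Request(OLLAMA_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Index two toy "notes", then rank them against a query.
notes = {
    "daily.md": "Meeting notes about the plugin roadmap",
    "ollama.md": "How to run local models with Ollama",
}
index = {path: embed(text) for path, text in notes.items()}
query = embed("local model setup")
scores = {path: cosine(query, vec) for path, vec in index.items()}
for path, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(path, round(score, 3))
```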