Closed OrionRandD closed 1 year ago
Like most local LLM efforts, it's not quite there yet. From the about page:
We currently use OpenAI's GPT family of Large Language Models for the chat layer. We plan to support offline models for chat soon.
And from the README,
These notes, the last few messages and associated metadata is passed to ChatGPT along with your query for a response.
I haven't even been able to get OpenLLaMA to run on my system. That said, I'm keeping my eyes open for progress, and plan to add support for local LLMs to gptel when it's more convenient to use them.
Thx for the reply...
"I'm keeping my eyes open for progress..." ==> thx...
I'm closing this now, will reopen to update when I start working on local LLM support.
Support for local LLMs has been added to gptel.
Yeah, it's really nice via ollama.
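For anyone landing here, a minimal sketch of pointing gptel at a locally served Ollama model, based on gptel's README (the host, port, and `mistral:latest` model name are examples; substitute whatever model you've pulled):

```elisp
;; Register a local Ollama backend for gptel and make it the default.
;; Assumes Ollama is running on its default port (11434) and that the
;; model has already been fetched with `ollama pull mistral'.
(setq gptel-backend
      (gptel-make-ollama "Ollama"     ; backend name shown in gptel's menu
        :host "localhost:11434"
        :stream t                     ; stream responses as they arrive
        :models '(mistral:latest))
      gptel-model 'mistral:latest)
```

After evaluating this, `M-x gptel` should talk to the local model instead of OpenAI.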
Have you looked at adding support for khoj?
Have you seen this?
https://khoj.dev/about
https://github.com/khoj-ai/khoj#khoj-in-emacs-browser
Perhaps, when you have time, you can implement a gptel-like app for khoj...