Closed: MadBomber closed this issue 3 months ago.
Hi @MadBomber, thanks for the feedback! Just confirming: are you using the following to run Ollama locally?
brew install ollama
ollama serve
e.g. to pull Meta's Llama 3 model:
ollama pull llama3:latest
If so, I believe Ollama is compatible with the OpenAI API, so this may work for you without any changes:
require 'omniai/openai'

# Point the OpenAI client at the local Ollama server (no API key needed).
client = OmniAI::OpenAI::Client.new(host: 'http://127.0.0.1:11434', api_key: nil)

completion = client.chat('Tell me a joke.', model: 'llama3')
completion.choice.message.content # "Why couldn't the bicycle stand up by itself? Because it was two-tired!"
The host can be configured globally with:
OmniAI::OpenAI.config do |configuration|
configuration.host = 'http://127.0.0.1:11434'
end
If so, it probably warrants some better documentation for usage with Ollama (and maybe a provider that just inherits from OpenAI, or one that includes only the subset of needed APIs?).
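To illustrate that inheritance idea, here is a minimal sketch of what such a provider might look like; the OmniAI::Ollama module, the DEFAULT_HOST constant, and the initializer signature are assumptions for illustration, not part of the published gems:

require 'omniai/openai'

module OmniAI
  # Hypothetical Ollama provider that simply reuses the OpenAI client,
  # defaulting to the local Ollama server and dropping the API key.
  module Ollama
    class Client < OmniAI::OpenAI::Client
      DEFAULT_HOST = 'http://127.0.0.1:11434'

      def initialize(host: DEFAULT_HOST, **options)
        super(host: host, api_key: nil, **options)
      end
    end
  end
end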
Just following up here: some documentation / fixes have been merged that outline the usage of both LocalAI and Ollama:
https://github.com/ksylvest/omniai#usage-with-localai
https://github.com/ksylvest/omniai#usage-with-ollama
Going to close for now, but feel free to re-open if you find any issues. Make sure you have at least v1.2.2 of omniai and v1.2.1 of omniai-openai installed:
gem update omniai
gem update omniai-openai
Yep, you are right. I've forked your gem. One of the things I'm looking at is removing the need to explicitly state a back-end client when the model name has already been specified. I would also like the client interface within the OmniAI namespace to be as consistent as possible across all clients.
For Ollama, I would make that an OmniAI class that subclasses the OmniAI OpenAI class. I am planning to review the entire code base this weekend.
I do something similar in another project. The library doesn't currently have an exhaustive list of models for each provider. The approach I've taken is to instantiate the client based on a regex mapping of known names:
case model
when /claude/ then OmniAI::Anthropic::Client.new
when /gemini/ then OmniAI::Google::Client.new
when /mistral/ then OmniAI::Mistral::Client.new
else OmniAI::OpenAI::Client.new
end
The above snippet probably needs a few extra cases and is not exhaustive (e.g. Mistral also has codestral-latest).
A challenge I'd try to avoid is having the per-model client lookup require that every provider gem be installed. For example, a user who only cares about Google and OpenAI should only need to install the omniai-google and omniai-openai gems.
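One way to keep the regex mapping without forcing every provider gem to be installed is to defer the require to lookup time and fail with a helpful message when a gem is missing. This is only a sketch; the client_for helper is hypothetical, and the omniai/anthropic, omniai/google, and omniai/mistral require paths are assumed by analogy with omniai/openai:

# Hypothetical helper: resolve a client from a model name, requiring the
# matching provider gem only when it is actually needed.
def client_for(model)
  gem_path, builder = case model
                      when /claude/  then ['omniai/anthropic', -> { OmniAI::Anthropic::Client.new }]
                      when /gemini/  then ['omniai/google',    -> { OmniAI::Google::Client.new }]
                      when /mistral/ then ['omniai/mistral',   -> { OmniAI::Mistral::Client.new }]
                      else                ['omniai/openai',    -> { OmniAI::OpenAI::Client.new }]
                      end
  require gem_path
  builder.call
rescue LoadError
  raise "#{model.inspect} requires the #{gem_path.sub('/', '-')} gem to be installed"
end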
@MadBomber I know this issue is closed, but I wanted to give a heads up that I've been working on including some basic CLI support. As part of that, the latest version (v1.4.1) of OmniAI::Client now supports a method to look up a provider dynamically by name:
https://github.com/ksylvest/omniai/blob/main/lib/omniai/client.rb#L75-L91
OmniAI::Client.find(provider: "anthropic")
OmniAI::Client.find(provider: "google")
OmniAI::Client.find(provider: "mistral")
OmniAI::Client.find(provider: "openai")
Hopefully this helps, since you'd only need to provide a model -> provider mapping.
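For example, combining that lookup with the earlier regex mapping might look something like this (the regex cases are the same illustrative subset as above, not a complete list):

# Map a model name to a provider name, then let OmniAI resolve the client.
provider = case model
           when /claude/  then 'anthropic'
           when /gemini/  then 'google'
           when /mistral/ then 'mistral'
           else                'openai'
           end

client = OmniAI::Client.find(provider: provider)
completion = client.chat('Tell me a joke.', model: model)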
It would also be interesting if the providers' APIs or developer websites offered a way to "download" the list of model names and their attributes.
I think that's a great idea. I think some of the LLM providers support that API:
https://platform.openai.com/docs/api-reference/models/list
https://docs.mistral.ai/api/#operation/listModels
https://ai.google.dev/api/rest/v1beta/models/list
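For illustration, here is a minimal sketch of querying OpenAI's list endpoint with the Ruby standard library; it assumes an OPENAI_API_KEY environment variable and is independent of OmniAI:

require 'net/http'
require 'json'
require 'uri'

# Fetch the list of model ids from OpenAI's /v1/models endpoint.
uri = URI('https://api.openai.com/v1/models')
request = Net::HTTP::Get.new(uri)
request['Authorization'] = "Bearer #{ENV['OPENAI_API_KEY']}"

response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
  http.request(request)
end

model_ids = JSON.parse(response.body).fetch('data').map { |model| model['id'] }
puts model_ids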
The only one missing seems to be Claude (Anthropic). For that use case, though, I'd say it'd be fairly straightforward to support a hard-coded list using this:
Thank you very much for this capability. It's going to make my aia CLI tool so much nicer. In order for me to drop all of my current dependencies on other back-end CLI programs, I need to be able to access the LocalAI and Ollama APIs.