Open xiaominame opened 3 months ago
Since there are many AI models and they are updated very quickly, this may be scheduled as a later plan.
Thank you, looking forward to your plan.
I found that Ollama's API is compatible with OpenAI's, so you can use it by entering Ollama's address in place of the OpenAI endpoint.
The URL should be:
http://localhost:11434/v1/chat/completions
However, you need to set an environment variable before starting Ollama:
OLLAMA_ORIGINS=*
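As a minimal sketch of the workaround above: because Ollama's endpoint is OpenAI-compatible, a standard chat-completions request body can be sent to it unchanged. The model name "llama3" here is an assumption; substitute whatever model you have pulled locally.

```python
import json

# Assumed local Ollama address from the workaround above.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

# Standard OpenAI-style chat-completions payload; "llama3" is a
# hypothetical model name, replace it with a model you have pulled.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello"}],
}

# With Ollama running locally (and OLLAMA_ORIGINS=* set if the request
# comes from a browser), the request would be sent like this:
#   import requests
#   resp = requests.post(OLLAMA_URL, json=payload)
print(json.dumps(payload))
```

This only shows the request shape; the actual POST is commented out so the sketch does not require a running server.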
I will preset more OpenAI-compatible endpoints in the next version.
Can Ollama be integrated?