Open SongPing opened 3 months ago
Try using
http://localhost:11434/v1
as the base URL.
How do I set them? (How do I change the Endpoint URL?) @HavenDV
OpenAIClientOptions settings = new()
{
Endpoint = new Uri("https://my-domain.tld")
};
Has anybody got this to work? I am having the same issue as @SongPing
This is what's working for me:
if (!config.UseOpenAi())
{
var clientOptions = new OpenAIClientOptions
{
Endpoint = new Uri(options.Value.Endpoint),
};
client = new ChatClient(
model: options.Value.Model,
credential: apiKey,
clientOptions);
}
else
{
client = new ChatClient(
model: options.Value.Model,
credential: apiKey);
}
my config options for both cases:
"EnvironmentSettings": {
"UseOpenAi": true
},
"OpenAiOptions": {
"ApiKey": "*****",
"Model": "gpt-4o-mini"
},
"OllamaOptions": {
"ApiKey": "key",
"Model": "gemma2:27b",
"Endpoint": "http://localhost:11434/v1"
}
@yarmoliq Thanks, this is a step closer. This works if I host the Ollama server on my local computer and point to it over HTTP. However, if I host the Ollama server in the cloud, I get:
An error occurred: Retry failed after 4 tries. (The SSL connection could not be established, see inner exception.) (The SSL connection could not be established, see inner exception.) (The SSL connection could not be established, see inner exception.) (The SSL connection could not be established, see inner exception.)
I have verified that the Ollama server works, since I can reach it using Python. I just really need it to work with the .NET library. I am using OpenAI 2.0.0.
I have also verified that the page has a valid certificate.
So, what does the inner exception tell you?
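In case it helps with digging that out: the retry wrapper tends to bury the root cause several levels deep. A minimal sketch for surfacing the whole chain (assuming `client` is the ChatClient you already built; the prompt text is just a placeholder):

```csharp
try
{
    ChatCompletion completion = client.CompleteChat("ping");
    Console.WriteLine(completion.Content[0].Text);
}
catch (Exception ex)
{
    // Walk the InnerException chain; the actual cause of "The SSL connection
    // could not be established" (hostname mismatch, untrusted CA, TLS version)
    // is usually the innermost entry.
    for (var e = ex; e != null; e = e.InnerException)
        Console.WriteLine($"{e.GetType().Name}: {e.Message}");
}
```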
I've tried to access a local Ollama instance via openai-dotnet. I set the Endpoint in OpenAIClientOptions to http://localhost:11434/ and set the model name to an existing model, but I always get a 404 error. Has somebody successfully called a local Ollama model?
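For what it's worth, in my case the 404 went away once the endpoint included the /v1 path segment; the client appends the route (e.g. chat/completions) to whatever base you configure, and Ollama's OpenAI-compatible routes live under /v1, so http://localhost:11434/ alone 404s. A minimal sketch, assuming a default local Ollama install and a model name you have actually pulled:

```csharp
using System;
using System.ClientModel;
using OpenAI;
using OpenAI.Chat;

// The endpoint must include /v1 — "http://localhost:11434/" returns 404.
var options = new OpenAIClientOptions
{
    Endpoint = new Uri("http://localhost:11434/v1"),
};

// Ollama ignores the API key, but the client requires a non-empty credential.
var client = new ChatClient(
    model: "gemma2:27b",   // replace with a model you have pulled locally
    credential: new ApiKeyCredential("ollama"),
    options);

ChatCompletion completion = client.CompleteChat("Say hello in one word.");
Console.WriteLine(completion.Content[0].Text);
```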