OllamaSharp provides .NET bindings for the Ollama API, simplifying interactions with Ollama both locally and remotely.
OllamaSharp wraps each Ollama API endpoint in awaitable methods that fully support response streaming.
The following list shows a few simple code examples.
ℹ Try our full-featured Ollama API client app OllamaSharpConsole to interact with your Ollama instance.
```csharp
// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);

// select a model which should be used for further operations
ollama.SelectedModel = "llama3.1:8b";
```
Listing all models that are available locally:

```csharp
var models = await ollama.ListLocalModels();
```
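To display what was returned, a minimal sketch could simply iterate the result. This assumes the returned model objects expose a `Name` property:

```csharp
// print the name of every locally available model
// (assumes each model object exposes a Name property)
foreach (var model in models)
    Console.WriteLine(model.Name);
```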
Pulling a model and reporting progress:

```csharp
await foreach (var status in ollama.PullModel("llama3.1:405b"))
    Console.WriteLine($"{status.Percent}% {status.Status}");
```
Streaming a completion directly into the console:

```csharp
await foreach (var stream in ollama.Generate("How are you today?"))
    Console.Write(stream.Response);
```
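If the complete answer is needed as a single string rather than console output, one possible sketch builds on the same `Generate` call and collects the streamed chunks:

```csharp
// collect the streamed response chunks into one string
var builder = new System.Text.StringBuilder();

await foreach (var stream in ollama.Generate("How are you today?"))
    builder.Append(stream.Response);

var answer = builder.ToString();
```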
Building interactive chats:

```csharp
// messages including their roles and tool calls will automatically be tracked within the chat object
// and are accessible via the Messages property
var chat = new Chat(ollama);

while (true)
{
    var message = Console.ReadLine();
    await foreach (var answerToken in chat.Send(message))
        Console.Write(answerToken);
}
```
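Because the chat object keeps track of the conversation, the history can be inspected after each exchange. A rough sketch, assuming each tracked message exposes `Role` and `Content` properties:

```csharp
// inspect the conversation history tracked by the chat object
// (assumes each message exposes Role and Content properties)
foreach (var msg in chat.Messages)
    Console.WriteLine($"{msg.Role}: {msg.Content}");
```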
The icon and name were reused from the amazing Ollama project.
I would like to thank all the contributors who take the time to improve OllamaSharp. First and foremost mili-tan, who always keeps OllamaSharp in sync with the Ollama API. ❤