
The easiest way to use the Ollama API in .NET
https://www.nuget.org/packages/OllamaSharp
MIT License


OllamaSharp 🦙

OllamaSharp provides .NET bindings for the Ollama API, simplifying interactions with Ollama both locally and remotely.

Features

Usage

OllamaSharp wraps each Ollama API endpoint in awaitable methods that fully support response streaming.

The following sections show a few simple code examples.

Try our full-featured Ollama API client app OllamaSharpConsole to interact with your Ollama instance.

Initializing

// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);

// select the model to be used for all subsequent operations
ollama.SelectedModel = "llama3.1:8b";
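Long-running operations such as pulling large models can exceed the default HTTP timeout. A minimal sketch, assuming the HttpClient-based constructor overload, that raises the timeout:

// a sketch: bring your own HttpClient to control the request timeout
// (assumes the HttpClient-based constructor overload)
var httpClient = new HttpClient
{
    BaseAddress = new Uri("http://localhost:11434"),
    Timeout = TimeSpan.FromMinutes(10)
};
var patientOllama = new OllamaApiClient(httpClient);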

Listing all models that are available locally

var models = await ollama.ListLocalModels();
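The returned listing can be inspected like any other collection; a small sketch, assuming each entry exposes a Name property as in current OllamaSharp versions:

// print the name of each locally available model
foreach (var model in models)
    Console.WriteLine(model.Name);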

Pulling a model and reporting progress

await foreach (var status in ollama.PullModel("llama3.1:405b"))
    Console.WriteLine($"{status.Percent}% {status.Status}");

Generating a completion directly into the console

await foreach (var stream in ollama.Generate("How are you today?"))
    Console.Write(stream.Response);
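If you need the complete answer as a single string rather than writing tokens as they arrive, you can buffer the stream yourself; a minimal sketch using only the streaming API shown above:

// collect the streamed tokens into one string
var builder = new System.Text.StringBuilder();
await foreach (var stream in ollama.Generate("How are you today?"))
    builder.Append(stream.Response);
var answer = builder.ToString();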

Building interactive chats

var chat = new Chat(ollama);
while (true)
{
    var message = Console.ReadLine() ?? string.Empty;
    await foreach (var answerToken in chat.Send(message))
        Console.Write(answerToken);
}
// messages, including their roles and tool calls, are automatically tracked
// within the chat object and are accessible via its Messages property
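Because the chat object keeps the full transcript, you can replay or log it after the fact; a small sketch, assuming each tracked message exposes Role and Content properties:

// dump the conversation history tracked by the chat object
foreach (var message in chat.Messages)
    Console.WriteLine($"{message.Role}: {message.Content}");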

Credits

The icon and name were reused from the amazing Ollama project.

I would like to thank all the contributors who take the time to improve OllamaSharp. First and foremost, thanks to mili-tan, who always keeps OllamaSharp in sync with the Ollama API. ❤