OllamaSharp provides .NET bindings for the Ollama API, simplifying interactions with Ollama both locally and remotely.
OllamaSharp wraps each Ollama API endpoint in awaitable methods that fully support response streaming.
The following list shows a few simple code examples.
ℹ Try our full-featured demo application that's included in this repository
```csharp
// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);

// select a model which should be used for further operations
ollama.SelectedModel = "llama3.1:8b";
```

### Listing all models that are available locally
```csharp
var models = await ollama.ListLocalModelsAsync();
```

### Pulling a model and reporting progress
```csharp
await foreach (var status in ollama.PullModelAsync("llama3.1:405b"))
    Console.WriteLine($"{status.Percent}% {status.Status}");
```

### Generating a completion directly into the console
```csharp
await foreach (var stream in ollama.GenerateAsync("How are you today?"))
    Console.Write(stream.Response);
```

### Building interactive chats
```csharp
// messages including their roles and tool calls will automatically be tracked within the chat object
// and are accessible via the Messages property
var chat = new Chat(ollama);

while (true)
{
    var message = Console.ReadLine();
    await foreach (var answerToken in chat.SendAsync(message))
        Console.Write(answerToken);
}
```
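Since the chat object tracks the conversation in its `Messages` property, you can read the history back at any time. A minimal sketch, assuming each tracked message exposes `Role` and `Content` properties:

```csharp
// print the conversation history that the chat object tracked automatically
foreach (var trackedMessage in chat.Messages)
    Console.WriteLine($"{trackedMessage.Role}: {trackedMessage.Content}");
```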
### Usage with Microsoft.Extensions.AI

Microsoft built an abstraction library to streamline the usage of different AI providers. This is a really interesting concept if you plan to build apps that might use different providers, like ChatGPT, Claude, and local models with Ollama.
I encourage you to read their announcement Introducing Microsoft.Extensions.AI Preview – Unified AI Building Blocks for .NET.
OllamaSharp is the first full implementation of their `IChatClient` and `IEmbeddingGenerator` interfaces, making it possible to use Ollama just like any other chat provider. To do this, simply use the `OllamaApiClient` as `IChatClient` instead of `IOllamaApiClient`.
```csharp
// install package Microsoft.Extensions.AI.Abstractions

private static IChatClient CreateChatClient(Arguments arguments)
{
    if (arguments.Provider.Equals("ollama", StringComparison.OrdinalIgnoreCase))
        return new OllamaApiClient(arguments.Uri, arguments.Model);
    else
        return new OpenAIChatClient(new OpenAI.OpenAIClient(arguments.ApiKey), arguments.Model); // ChatGPT or compatible
}
```
The `OllamaApiClient` implements both interfaces from Microsoft.Extensions.AI; you just need to cast it accordingly:

- `IChatClient` for model inference
- `IEmbeddingGenerator<string, Embedding<float>>` for embedding generation
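For illustration, a minimal sketch of that cast (the host URL and model name are placeholders):

```csharp
using Microsoft.Extensions.AI;
using OllamaSharp;

// one OllamaApiClient instance can be used through both abstractions
var ollama = new OllamaApiClient(new Uri("http://localhost:11434"), "llama3.1:8b");

IChatClient chatClient = ollama;                                           // model inference
IEmbeddingGenerator<string, Embedding<float>> embeddingGenerator = ollama; // embedding generation
```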
I would like to thank all the contributors who take the time to improve OllamaSharp. First and foremost, mili-tan, who always keeps OllamaSharp in sync with the Ollama API.
The icon and name were reused from the amazing Ollama project.
Special thanks to JetBrains for supporting this project.