
Ollama Client integration reference



To get started with the Aspire Ollama integrations, follow the Get started with Ollama integrations guide.

This article includes full details about the Aspire Ollama Client integration.

To get started with the Aspire OllamaSharp integration, install the 📦 CommunityToolkit.Aspire.OllamaSharp NuGet package in the client-consuming project, that is, the project for the application that uses the Ollama client:

.NET CLI — Add CommunityToolkit.Aspire.OllamaSharp package
dotnet add package CommunityToolkit.Aspire.OllamaSharp

In the Program.cs file of your client-consuming project, call the AddOllamaApiClient extension method to register an IOllamaApiClient for use via the dependency injection container. If the resource provided in the AppHost and referenced in the client-consuming project is an OllamaModelResource, the AddOllamaApiClient method registers that model as the default model for the IOllamaApiClient:

builder.AddOllamaApiClient("llama3");
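For context, the "llama3" connection name corresponds to a resource defined in the AppHost. A minimal sketch of that AppHost registration, assuming the CommunityToolkit.Aspire.Hosting.Ollama package and a hypothetical project named ExampleProject, might look like this:

```csharp
// AppHost Program.cs — a sketch, assuming the
// CommunityToolkit.Aspire.Hosting.Ollama package is installed.
var builder = DistributedApplication.CreateBuilder(args);

// Register the Ollama server resource and a model resource on it.
// The model resource name ("llama3") is the connection name that the
// client-consuming project passes to AddOllamaApiClient.
var ollama = builder.AddOllama("ollama");
var llama = ollama.AddModel("llama3");

// ExampleProject is a placeholder for your own project reference.
builder.AddProject<Projects.ExampleProject>("example")
       .WithReference(llama);

builder.Build().Run();
```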

After adding IOllamaApiClient to the builder, you can get the IOllamaApiClient instance using dependency injection. For example, to retrieve the client from a service:

public class ExampleService(IOllamaApiClient ollama)
{
    // Use ollama...
}
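As an illustration, the injected client can be used directly with OllamaSharp's API. The sketch below assumes OllamaSharp's streaming GenerateAsync extension that accepts a prompt string; check the OllamaSharp documentation for the exact signatures in your version:

```csharp
public class ExampleService(IOllamaApiClient ollama)
{
    public async Task<string> GenerateTextAsync(string prompt)
    {
        // Stream the response chunks and concatenate them into one string.
        var builder = new System.Text.StringBuilder();
        await foreach (var chunk in ollama.GenerateAsync(prompt))
        {
            builder.Append(chunk?.Response);
        }
        return builder.ToString();
    }
}
```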

There might be situations where you want to register multiple IOllamaApiClient instances with different connection names. To register keyed Ollama clients, call the AddKeyedOllamaApiClient method:

builder.AddKeyedOllamaApiClient(name: "chat");
builder.AddKeyedOllamaApiClient(name: "embeddings");

Then you can retrieve the IOllamaApiClient instances using dependency injection. For example, to retrieve the keyed clients from an example service:

public class ExampleService(
    [FromKeyedServices("chat")] IOllamaApiClient chatOllama,
    [FromKeyedServices("embeddings")] IOllamaApiClient embeddingsOllama)
{
    // Use ollama...
}

To configure the Ollama client integration, use a connection string from the ConnectionStrings configuration section, providing the name of the connection string when calling the AddOllamaApiClient method:

builder.AddOllamaApiClient("llama");

Then the connection string will be retrieved from the ConnectionStrings configuration section:

{
  "ConnectionStrings": {
    "llama": "Endpoint=http://localhost:1234;Model=llama3"
  }
}

The 📦 Microsoft.Extensions.AI NuGet package provides an abstraction over the Ollama client API, using generic interfaces. OllamaSharp supports these interfaces, and they can be registered by chaining either the IChatClient or IEmbeddingGenerator<string, Embedding<float>> registration methods to the AddOllamaApiClient method.

To register an IChatClient, chain the AddChatClient method to the AddOllamaApiClient method:

builder.AddOllamaApiClient("llama")
    .AddChatClient();

Similarly, to register an IEmbeddingGenerator, chain the AddEmbeddingGenerator method:

builder.AddOllamaApiClient("llama")
    .AddEmbeddingGenerator();

After adding IChatClient to the builder, you can get the IChatClient instance using dependency injection. For example, to retrieve the chat client from a service:

public class ExampleService(IChatClient chatClient)
{
    // Use chat client...
}
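As an illustration of what "Use chat client..." might look like, the sketch below assumes the GetResponseAsync method from recent Microsoft.Extensions.AI versions (earlier previews named this method CompleteAsync), so verify the method name against the package version you have installed:

```csharp
public class ChatExampleService(IChatClient chatClient)
{
    public async Task<string?> AskAsync(string question)
    {
        // Send the prompt and await the complete (non-streaming) response,
        // then return its aggregated text content.
        var response = await chatClient.GetResponseAsync(question);
        return response.Text;
    }
}
```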

There might be situations where you want to register multiple AI client instances with different connection names. To register keyed AI clients, use the keyed versions of the registration methods:

builder.AddOllamaApiClient("chat")
    .AddKeyedChatClient("chat");
builder.AddOllamaApiClient("embeddings")
    .AddKeyedEmbeddingGenerator("embeddings");

Then you can retrieve the AI client instances using dependency injection. For example, to retrieve the clients from an example service:

public class ExampleService(
    [FromKeyedServices("chat")] IChatClient chatClient,
    [FromKeyedServices("embeddings")] IEmbeddingGenerator<string, Embedding<float>> embeddingGenerator)
{
    // Use AI clients...
}
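For completeness, the embedding generator side can be sketched similarly. The example below assumes the batch-oriented GenerateAsync method on IEmbeddingGenerator from Microsoft.Extensions.AI, which returns one embedding per input:

```csharp
public class EmbeddingExampleService(
    IEmbeddingGenerator<string, Embedding<float>> embeddingGenerator)
{
    public async Task<ReadOnlyMemory<float>> EmbedAsync(string text)
    {
        // GenerateAsync accepts a batch of inputs; pass a single-element
        // collection and take the vector of the single resulting embedding.
        var embeddings = await embeddingGenerator.GenerateAsync([text]);
        return embeddings[0].Vector;
    }
}
```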