Connect to Ollama

This page describes how consuming apps connect to an Ollama model resource that’s already modeled in your AppHost. For the AppHost API surface — adding an Ollama server, models, data volumes, GPU support, and more — see Ollama hosting integration.

When you reference an Ollama model resource from your AppHost, Aspire injects the connection information into the consuming app as environment variables. Your app can either read those environment variables directly — the pattern works the same from any language — or, in C#, use the Aspire OllamaSharp client integration for automatic dependency injection.

Aspire exposes each property as an environment variable named [RESOURCE]_[PROPERTY]. For instance, the Uri property of a resource called ollama-llama3 becomes OLLAMA_LLAMA3_URI.
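As a hedged illustration of that naming scheme, the mapping can be sketched as follows. The helper below is hypothetical (not part of Aspire): it upper-cases the resource name and replaces non-alphanumeric characters such as hyphens with underscores, which matches the examples on this page.

```csharp
using System;
using System.Linq;

static class EnvVarNames
{
    // Hypothetical helper mirroring the [RESOURCE]_[PROPERTY] scheme:
    // the resource name is upper-cased and characters like '-' become '_'.
    public static string For(string resourceName, string property) =>
        $"{Sanitize(resourceName)}_{property.ToUpperInvariant()}";

    static string Sanitize(string name) =>
        new string(name.ToUpperInvariant()
            .Select(c => char.IsLetterOrDigit(c) ? c : '_')
            .ToArray());
}

// EnvVarNames.For("ollama-llama3", "Uri") yields "OLLAMA_LLAMA3_URI"
```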

The Ollama server resource exposes the following connection properties:

| Property Name | Description |
|---------------|-------------|
| Host | The hostname or IP address of the Ollama server |
| Port | The port number the Ollama server is listening on (default: 11434) |
| Uri | The full HTTP endpoint URI, in the format http://{Host}:{Port} |

Example connection string:

Uri: http://localhost:11434

The Ollama model resource inherits all properties from its parent server resource and adds:

| Property Name | Description |
|---------------|-------------|
| Model | The name of the model, for example llama3 or phi3.5 |

The model resource connection string combines both:

Endpoint=http://localhost:11434;Model=llama3
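If you consume this value yourself rather than through the client integration, the Endpoint/Model pair parses like any semicolon-delimited connection string. The sketch below is illustrative only, not an Aspire API; real code should validate the input more defensively.

```csharp
using System;
using System.Linq;

static class OllamaConnectionString
{
    // Minimal sketch: split "Endpoint=...;Model=..." into its two parts.
    // Keys are matched case-insensitively; values are trimmed.
    public static (Uri Endpoint, string Model) Parse(string connectionString)
    {
        var parts = connectionString
            .Split(';', StringSplitOptions.RemoveEmptyEntries)
            .Select(pair => pair.Split('=', 2))
            .ToDictionary(kv => kv[0].Trim(), kv => kv[1].Trim(),
                          StringComparer.OrdinalIgnoreCase);
        return (new Uri(parts["Endpoint"]), parts["Model"]);
    }
}
```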

Pick the language your consuming app is written in. Each example assumes your AppHost adds an Ollama model resource named llama3 on an Ollama server resource named ollama — producing a consuming resource called ollama-llama3 with the env var prefix OLLAMA_LLAMA3_.

For C# apps, the recommended approach is the Aspire OllamaSharp client integration. It registers an IOllamaApiClient through dependency injection and supports Microsoft.Extensions.AI abstractions (IChatClient, IEmbeddingGenerator). If you’d rather read environment variables directly, see the Read environment variables section at the end of this tab.

Install the 📦 CommunityToolkit.Aspire.OllamaSharp NuGet package in the client-consuming project:

.NET CLI — Add CommunityToolkit.Aspire.OllamaSharp package
dotnet add package CommunityToolkit.Aspire.OllamaSharp

In Program.cs, call AddOllamaApiClient on your IHostApplicationBuilder to register an IOllamaApiClient. When the resource provided in the AppHost is an OllamaModelResource, the model is set as the default model automatically:

C# — Program.cs
builder.AddOllamaApiClient(connectionName: "ollama-llama3");

Resolve the client through dependency injection:

C# — ExampleService.cs
public class ExampleService(IOllamaApiClient ollama)
{
// Use ollama...
}
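As a sketch of what "Use ollama..." might look like, the service below streams a completion with OllamaSharp's GenerateAsync API, relying on the default model the integration sets from the OllamaModelResource. The type and member names (GenerateRequest, GenerateResponseStream.Response) are assumptions about OllamaSharp's surface; verify them against your package version.

```csharp
using System.Text;
using System.Threading.Tasks;
using OllamaSharp;
using OllamaSharp.Models;

public class ExampleService(IOllamaApiClient ollama)
{
    // Streams a completion from the default model and concatenates
    // the response chunks into a single string.
    public async Task<string> CompleteAsync(string prompt)
    {
        var text = new StringBuilder();
        await foreach (var chunk in ollama.GenerateAsync(new GenerateRequest { Prompt = prompt }))
        {
            text.Append(chunk?.Response);
        }
        return text.ToString();
    }
}
```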

To register multiple IOllamaApiClient instances with different connection names, use AddKeyedOllamaApiClient:

C# — Program.cs
builder.AddKeyedOllamaApiClient(name: "chat");
builder.AddKeyedOllamaApiClient(name: "embeddings");

Then resolve each instance by key:

C# — ExampleService.cs
public class ExampleService(
[FromKeyedServices("chat")] IOllamaApiClient chatOllama,
[FromKeyedServices("embeddings")] IOllamaApiClient embeddingsOllama)
{
// Use ollama clients...
}

The 📦 Microsoft.Extensions.AI package provides portable IChatClient and IEmbeddingGenerator<string, Embedding<float>> abstractions. OllamaSharp supports these interfaces and you can register them by chaining onto AddOllamaApiClient:

C# — Program.cs
// Register IChatClient
builder.AddOllamaApiClient("ollama-llama3")
.AddChatClient();
// Register IEmbeddingGenerator
builder.AddOllamaApiClient("ollama-llama3")
.AddEmbeddingGenerator();

Resolve through dependency injection:

C# — ExampleService.cs
public class ExampleService(IChatClient chatClient)
{
// Use chat client...
}
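As a sketch of what "Use chat client" might look like with the portable abstraction: the method name below assumes a recent Microsoft.Extensions.AI release, where single-shot requests go through GetResponseAsync (earlier previews named this CompleteAsync); check the version you reference.

```csharp
using System.Threading.Tasks;
using Microsoft.Extensions.AI;

public class ExampleService(IChatClient chatClient)
{
    // Sends one prompt and returns the model's text reply.
    public async Task<string> AskAsync(string question)
    {
        var response = await chatClient.GetResponseAsync(question);
        return response.Text;
    }
}
```

Because the code depends only on IChatClient, the same service works unchanged if you later swap Ollama for another Microsoft.Extensions.AI-compatible provider.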
To register keyed chat and embedding clients instead, chain the keyed variants onto the corresponding AddOllamaApiClient calls:

C# — Program.cs
builder.AddOllamaApiClient("chat")
.AddKeyedChatClient("chat");
builder.AddOllamaApiClient("embeddings")
.AddKeyedEmbeddingGenerator("embeddings");

Then resolve by key:

C# — ExampleService.cs
public class ExampleService(
[FromKeyedServices("chat")] IChatClient chatClient,
[FromKeyedServices("embeddings")] IEmbeddingGenerator<string, Embedding<float>> embeddingGenerator)
{
// Use AI clients...
}

Connection strings. When using a connection string from the ConnectionStrings configuration section, pass the connection name to AddOllamaApiClient:

C# — Program.cs
builder.AddOllamaApiClient("ollama-llama3");

The connection string is resolved from the ConnectionStrings section:

JSON — appsettings.json
{
"ConnectionStrings": {
"ollama-llama3": "Endpoint=http://localhost:11434;Model=llama3"
}
}

If you prefer not to use the Aspire client integration, you can read the Aspire-injected environment variables directly and construct an OllamaApiClient:

C# — Program.cs
using OllamaSharp;
var endpoint = Environment.GetEnvironmentVariable("OLLAMA_LLAMA3_URI");
var modelName = Environment.GetEnvironmentVariable("OLLAMA_LLAMA3_MODEL");
var client = new OllamaApiClient(new Uri(endpoint!))
{
SelectedModel = modelName
};
// Use client...