Get started with the Ollama integrations

Ollama is a powerful, open-source tool for running large language models locally. The Aspire Ollama integration provides a way to host Ollama models using the docker.io/ollama/ollama container image and access them via the OllamaSharp client.

In this introduction, you’ll see how to install and use the Aspire Ollama integrations in a simple configuration. If you already have this knowledge, see Ollama hosting integration for full reference details.

To begin, install the Aspire Ollama Hosting integration in your Aspire AppHost project. This integration allows you to create and manage Ollama model instances from your Aspire hosting projects:

Aspire CLI — Add the CommunityToolkit.Aspire.Hosting.Ollama package
aspire add communitytoolkit-ollama

The Aspire CLI is interactive; select the appropriate search result when prompted:

Aspire CLI — Example output
Select an integration to add:
> communitytoolkit-ollama (CommunityToolkit.Aspire.Hosting.Ollama)
> Other results listed as selectable options...

Next, in the AppHost project, register and consume the Ollama integration using the AddOllama extension method to add the Ollama container to the application builder. You can then add models to the container, which are downloaded and run when the container starts, using the AddModel extension method:

C# — AppHost.cs
var builder = DistributedApplication.CreateBuilder(args);
var ollama = builder.AddOllama("ollama");
var phi35 = ollama.AddModel("phi3.5");
var exampleProject = builder.AddProject<Projects.ExampleProject>()
.WithReference(phi35);
builder.Build().Run();
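The hosting integration also exposes optional configuration on the Ollama resource. As a minimal sketch, the following assumes the WithDataVolume and WithGPUSupport extension methods are available in your version of CommunityToolkit.Aspire.Hosting.Ollama; check the hosting integration reference for the exact members your package version provides:

```csharp
var builder = DistributedApplication.CreateBuilder(args);

// Persist downloaded models across container restarts with a data volume,
// and enable GPU acceleration on hosts that support it. Both methods are
// assumptions about the installed package version.
var ollama = builder.AddOllama("ollama")
    .WithDataVolume()
    .WithGPUSupport();

var phi35 = ollama.AddModel("phi3.5");

builder.Build().Run();
```

Without a data volume, the container re-downloads each model on every start, which can be slow for multi-gigabyte models.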

Now that the hosting integration is ready, the next step is to install and configure the client integration in any projects that need to use it.

Install the Aspire OllamaSharp client integration in the client-consuming project:

.NET CLI — Add CommunityToolkit.Aspire.OllamaSharp package
dotnet add package CommunityToolkit.Aspire.OllamaSharp

In the Program.cs file of your client-consuming project, call the AddOllamaApiClient extension to register an IOllamaApiClient for use via the dependency injection container. The connection name you pass must match the name of the model resource referenced from the AppHost, which in the earlier example is phi3.5:

builder.AddOllamaApiClient("phi3.5");

After adding IOllamaApiClient to the builder, you can get the IOllamaApiClient instance using dependency injection. For example, to retrieve the client in a service:

public class ExampleService(IOllamaApiClient ollama)
{
    // Use ollama...
}
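With the client injected, the service can call the Ollama API. As a minimal sketch, the following streams a completion; it assumes GenerateAsync on IOllamaApiClient returns an async stream of partial responses, as in recent OllamaSharp versions (ExampleService and GetAnswerAsync are illustrative names):

```csharp
using System.Text;
using System.Threading.Tasks;
using OllamaSharp;

public class ExampleService(IOllamaApiClient ollama)
{
    // Stream a completion token by token and return the full text.
    // GenerateAsync returning IAsyncEnumerable is an assumption about
    // the installed OllamaSharp version.
    public async Task<string> GetAnswerAsync(string prompt)
    {
        var answer = new StringBuilder();
        await foreach (var chunk in ollama.GenerateAsync(prompt))
        {
            answer.Append(chunk?.Response);
        }
        return answer.ToString();
    }
}
```

Streaming lets you surface partial output to users as the model generates it, rather than waiting for the full response.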

For full reference details, see Ollama hosting integration and Ollama client integration.