
Get started with the Ollama integrations



Ollama is a powerful, open-source tool for running language models locally and generating text from a given prompt. The Aspire Ollama integration provides a way to host Ollama models using the docker.io/ollama/ollama container image and access them via the OllamaSharp client.

In this introduction, you’ll see how to install and use the Aspire Ollama integrations in a simple configuration. If you already have this knowledge, see Ollama hosting integration for full reference details.

To begin, install the Aspire Ollama Hosting integration in your Aspire AppHost project. This integration allows you to create and manage Ollama model instances from your Aspire hosting projects:

Aspire CLI — Add the CommunityToolkit.Aspire.Hosting.Ollama package
aspire add communitytoolkit-ollama

The Aspire CLI is interactive; choose the appropriate result when prompted:

Aspire CLI — Example output
Select an integration to add:
> communitytoolkit-ollama (CommunityToolkit.Aspire.Hosting.Ollama)
> Other results listed as selectable options...

Next, in the AppHost project, register and consume the Ollama integration using the AddOllama extension method to add the Ollama container to the application builder. You can then add models to the container, which download and run when the container starts, using the AddModel extension method:

C# — AppHost.cs
var builder = DistributedApplication.CreateBuilder(args);
var ollama = builder.AddOllama("ollama");
var phi35 = ollama.AddModel("phi3.5");
var exampleProject = builder.AddProject<Projects.ExampleProject>()
.WithReference(phi35);
builder.Build().Run();
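Model downloads can be several gigabytes, so you may want downloaded models to survive container restarts. A minimal sketch of the AppHost using a data volume for this purpose, assuming the WithDataVolume extension is available in CommunityToolkit.Aspire.Hosting.Ollama:

```csharp
var builder = DistributedApplication.CreateBuilder(args);

// Persist downloaded models across container restarts.
// WithDataVolume is assumed here; if it isn't available in your
// version of the hosting integration, the container will re-download
// models on each fresh start.
var ollama = builder.AddOllama("ollama")
    .WithDataVolume();

var phi35 = ollama.AddModel("phi3.5");

var exampleProject = builder.AddProject<Projects.ExampleProject>()
    .WithReference(phi35);

builder.Build().Run();
```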

Now that the hosting integration is ready, the next step is to install and configure the client integration in any projects that need to use it.

Install the Aspire OllamaSharp client integration in the client-consuming project:

.NET CLI — Add CommunityToolkit.Aspire.OllamaSharp package
dotnet add package CommunityToolkit.Aspire.OllamaSharp

In the Program.cs file of your client-consuming project, call the AddOllamaApiClient extension method to register an IOllamaApiClient for use via the dependency injection container. The connection name you pass must match the name of the model resource defined in the AppHost project:

builder.AddOllamaApiClient("phi3.5");
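If your AppHost defines more than one model, you can register a separate client per model. A sketch using keyed registration, assuming the AddKeyedOllamaApiClient extension is available in CommunityToolkit.Aspire.OllamaSharp and a second hypothetical model resource named "llama3":

```csharp
// Register one keyed client per model resource defined in the AppHost.
// "llama3" is a hypothetical second model for illustration.
builder.AddKeyedOllamaApiClient("phi3.5");
builder.AddKeyedOllamaApiClient("llama3");
```

A consuming service can then select a specific client with the [FromKeyedServices("phi3.5")] attribute on its constructor parameter.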

After adding IOllamaApiClient to the builder, you can obtain the IOllamaApiClient instance through dependency injection. For example, to inject the client into a service:

public class ExampleService(IOllamaApiClient ollama)
{
    // Use ollama...
}
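With the client injected, you can call OllamaSharp's streaming API to generate text. A sketch, assuming OllamaSharp's GenerateAsync extension that streams the response token by token:

```csharp
using System.Text;
using OllamaSharp;

public class ExampleService(IOllamaApiClient ollama)
{
    // Collects the streamed tokens into a single completion string.
    public async Task<string> GetCompletionAsync(string prompt)
    {
        var response = new StringBuilder();

        // GenerateAsync yields partial responses as the model produces them.
        await foreach (var token in ollama.GenerateAsync(prompt))
        {
            response.Append(token?.Response);
        }

        return response.ToString();
    }
}
```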

For full reference details, see Ollama hosting integration and Ollama client integration.