Get started with the Ollama integrations
Ollama is a powerful, open source tool for running language models locally and generating text from a given prompt. The Aspire Ollama integration provides a way to host Ollama models using the docker.io/ollama/ollama container image and access them via the OllamaSharp client.
In this introduction, you’ll see how to install and use the Aspire Ollama integrations in a simple configuration. If you already have this knowledge, see Ollama hosting integration for full reference details.
Set up hosting integration
To begin, install the Aspire Ollama Hosting integration in your Aspire AppHost project. This integration allows you to create and manage Ollama model instances from your Aspire hosting projects:
```shell
aspire add communitytoolkit-ollama
```

The Aspire CLI is interactive; choose the appropriate result when prompted:
```
Select an integration to add:

> communitytoolkit-ollama (CommunityToolkit.Aspire.Hosting.Ollama)
> Other results listed as selectable options...
```

Or, for a file-based app, add a package directive:

```
#:package CommunityToolkit.Aspire.Hosting.Ollama@*
```

Or add the package reference to your project file directly:

```xml
<PackageReference Include="CommunityToolkit.Aspire.Hosting.Ollama" Version="*" />
```

Next, in the AppHost project, register and consume the Ollama integration using the AddOllama extension method to add the Ollama container to the application builder. You can then add models to the container, which are downloaded and run when the container starts, using the AddModel extension method:
```csharp
var builder = DistributedApplication.CreateBuilder(args);

var ollama = builder.AddOllama("ollama");

var phi35 = ollama.AddModel("phi3.5");

var exampleProject = builder.AddProject<Projects.ExampleProject>()
    .WithReference(phi35);

builder.Build().Run();
```
Set up client integration

Now that the hosting integration is ready, the next step is to install and configure the client integration in any projects that need to use it.
Install the Aspire OllamaSharp client integration in the client-consuming project:
```shell
dotnet add package CommunityToolkit.Aspire.OllamaSharp
```

Or, for a file-based app, add a package directive:

```
#:package CommunityToolkit.Aspire.OllamaSharp@*
```

Or add the package reference to your project file directly:

```xml
<PackageReference Include="CommunityToolkit.Aspire.OllamaSharp" Version="*" />
```

In the Program.cs file of your client-consuming project, call the AddOllamaApiClient extension method to register an IOllamaApiClient for use via the dependency injection container:
```csharp
builder.AddOllamaApiClient("phi3.5");
```

After adding the IOllamaApiClient to the builder, you can retrieve the IOllamaApiClient instance using dependency injection. For example, to retrieve the client from a service:
```csharp
public class ExampleService(IOllamaApiClient ollama)
{
    // Use ollama...
}
```

For full reference details, see Ollama hosting integration and Ollama client integration.
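As a concrete illustration of using the injected client, the sketch below streams a completion with OllamaSharp's Chat helper. The AskAsync method name and prompt handling are illustrative, and the exact OllamaSharp API may differ between versions:

```csharp
using System.Text;
using OllamaSharp;

public class ExampleService(IOllamaApiClient ollama)
{
    public async Task<string> AskAsync(string prompt)
    {
        // Chat tracks conversation history and streams the model's
        // reply token by token as an IAsyncEnumerable<string>.
        var chat = new Chat(ollama);

        var answer = new StringBuilder();
        await foreach (var token in chat.SendAsync(prompt))
        {
            answer.Append(token);
        }

        return answer.ToString();
    }
}
```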