OpenAI integration

OpenAI provides access to chat/completions, embeddings, image, and audio models via a REST API. The OpenAI integration lets you:

  • Model an OpenAI account (endpoint + API key) once in the AppHost.
  • Add one or more model resources that compose their connection strings from the parent.
  • Reference those model resources from projects to get strongly-named connection strings.
  • Consume those connection strings with the Aspire.OpenAI component to obtain an OpenAIClient and (optionally) an IChatClient.

The hosting integration models OpenAI with two resource types:

  • OpenAIResource: Parent that holds the shared API key and base endpoint (defaults to https://api.openai.com/v1).
  • OpenAIModelResource: Child representing a specific model; composes a connection string from the parent (Endpoint + Key + Model).

To access these types and APIs, install the 📦 Aspire.Hosting.OpenAI NuGet package in your AppHost project:

Aspire CLI — Add Aspire.Hosting.OpenAI package
aspire add openai

The Aspire CLI is interactive; select the correct result when prompted:

Aspire CLI — Output di esempio
Select an integration to add:
> openai (Aspire.Hosting.OpenAI)
> Other results listed as selectable options...
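
If you prefer not to use the Aspire CLI, you can add the hosting package directly with the .NET CLI:

.NET CLI — Add Aspire.Hosting.OpenAI package
dotnet add package Aspire.Hosting.OpenAI

Then register the parent OpenAI resource in the AppHost and reference it from a project: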
C# — AppHost.cs
var builder = DistributedApplication.CreateBuilder(args);

var openai = builder.AddOpenAI("openai");

builder.AddProject<Projects.ExampleProject>()
       .WithReference(openai);

// After adding all resources, run the app...

Add one or more model children beneath the parent and reference them from projects:

C# — AppHost.cs
var builder = DistributedApplication.CreateBuilder(args);

var openai = builder.AddOpenAI("openai");

var chat = openai.AddModel("chat", "gpt-4o-mini");
var embeddings = openai.AddModel("embeddings", "text-embedding-3-small");

builder.AddProject<Projects.ExampleProject>()
       .WithReference(chat);

// After adding all resources, run the app...

Referencing chat passes a connection string named chat to the project. Multiple models can share the single API key and endpoint via the parent resource.
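
A project that needs both models can reference each child resource; every reference contributes its own named connection string. A minimal sketch, building on the chat and embeddings resources defined above:

C# — AppHost.cs
builder.AddProject<Projects.ExampleProject>()
       .WithReference(chat)
       .WithReference(embeddings);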

Calling AddOpenAI("openai") creates a secret parameter named openai-openai-apikey. Aspire resolves its value in this order:

  1. The Parameters:openai-openai-apikey configuration key (user secrets, appsettings.*, or environment variables).
  2. The OPENAI_API_KEY environment variable.

If neither source provides a value, startup throws an exception. Provide the key via user-secrets:

Terminal window
dotnet user-secrets set Parameters:openai-openai-apikey sk-your-api-key
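
Alternatively, set the OPENAI_API_KEY environment variable before launching the AppHost (shown here for a bash-style shell; adjust the syntax for your environment):

Terminal window
export OPENAI_API_KEY=sk-your-api-key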

Replace the default parameter by creating your own secret parameter and calling WithApiKey on the parent:

C# — AppHost.cs
var builder = DistributedApplication.CreateBuilder(args);

var apiKey = builder.AddParameter("my-api-key", secret: true);

var openai = builder.AddOpenAI("openai")
                    .WithApiKey(apiKey);

var chat = openai.AddModel("chat", "gpt-4o-mini");

builder.AddProject<Projects.ExampleProject>()
       .WithReference(chat);

Override the default endpoint (for example to use a proxy or compatible gateway):

C# — AppHost.cs
var builder = DistributedApplication.CreateBuilder(args);

var openai = builder.AddOpenAI("openai")
                    .WithEndpoint("https://my-gateway.example.com/v1");

var chat = openai.AddModel("chat", "gpt-4o-mini");

builder.AddProject<Projects.ExampleProject>()
       .WithReference(chat);

Add an optional single-run health check per model when diagnosing issues:

C# — AppHost.cs
var chat = builder.AddOpenAI("openai")
                  .AddModel("chat", "gpt-4o-mini")
                  .WithHealthCheck();

The model health check validates endpoint reachability, API key validity (401), and model existence (404). It runs only once per application instance to limit its impact on API rate limits. A status-page check against https://status.openai.com/api/v2/status.json is registered automatically for each parent resource.

Common model identifiers:

  • gpt-5
  • gpt-4o-mini
  • gpt-4o
  • gpt-4-turbo
  • gpt-realtime
  • text-embedding-3-small
  • text-embedding-3-large
  • dall-e-3
  • whisper-1

For more information, see the OpenAI models documentation.

When you reference an OpenAI resource using WithReference, the following connection properties are made available to the consuming project.

The parent OpenAI resource exposes these connection properties:

Property Name | Description
Endpoint      | The base endpoint URI for the OpenAI API, with the format https://api.openai.com/v1
Uri           | The endpoint URI (same as Endpoint), with the format https://api.openai.com/v1
Key           | The API key for authentication

Example properties:

Uri: https://api.openai.com/v1
Key: sk-proj-abc123...

The OpenAI model resource combines the parent properties above and adds the following connection property:

Property Name | Description
ModelName     | The model identifier for inference requests, for instance gpt-4o-mini
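
In the consuming project, the composed connection string is surfaced through standard .NET configuration under the connection name. A minimal sketch, assuming the chat model resource defined earlier:

C# — Program.cs
var builder = Host.CreateApplicationBuilder(args);

// The referenced model resource arrives as a connection string named "chat",
// for example: Endpoint=https://api.openai.com/v1;Key=...;Model=gpt-4o-mini
var chatConnectionString = builder.Configuration.GetConnectionString("chat");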

To get started with the Aspire OpenAI client integration, install the 📦 Aspire.OpenAI NuGet package:

.NET CLI — Add Aspire.OpenAI package
dotnet add package Aspire.OpenAI

In the Program.cs file of your client-consuming project, use AddOpenAIClient to register an OpenAIClient for dependency injection. The method requires a connection name parameter:

builder.AddOpenAIClient(connectionName: "chat");

After adding the OpenAIClient, you can retrieve the client instance using dependency injection:

public class ExampleService(OpenAIClient client)
{
    // Use client...
}
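
As a usage sketch, the injected client can create a chat client for a specific model and request a completion. This assumes the official OpenAI .NET library types that OpenAIClient comes from; the model name and prompt are illustrative:

C# — ExampleService.cs
using OpenAI;
using OpenAI.Chat;

public class ExampleService(OpenAIClient client)
{
    public async Task<string> GetGreetingAsync()
    {
        // Ask a specific model for a single chat completion and return its text.
        ChatClient chatClient = client.GetChatClient("gpt-4o-mini");
        ChatCompletion completion = await chatClient.CompleteChatAsync("Say hello in one short sentence.");
        return completion.Content[0].Text;
    }
}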

Add OpenAI client with registered IChatClient

builder.AddOpenAIClient("chat")
       .AddChatClient(); // Model inferred from connection string (Model=...)

If only a parent resource was defined (no child model), provide the model name explicitly:

builder.AddOpenAIClient("openai")
       .AddChatClient("gpt-4o-mini");

AddChatClient optionally accepts a model/deployment name; if omitted it comes from the connection string’s Model entry. Inject OpenAIClient or IChatClient as needed.
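
When the chat client is registered, consumption goes through the Microsoft.Extensions.AI abstraction rather than the raw OpenAI types. A minimal sketch, assuming a recent Microsoft.Extensions.AI version (where IChatClient exposes the GetResponseAsync extension) and a hypothetical ChatService:

C# — ChatService.cs
using Microsoft.Extensions.AI;

public class ChatService(IChatClient chatClient)
{
    public async Task<string> AskAsync(string prompt)
    {
        // Send a single user prompt and return the assistant's reply text.
        var response = await chatClient.GetResponseAsync(prompt);
        return response.Text;
    }
}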

The OpenAI library provides multiple options to configure the OpenAI connection. Either an Endpoint or a ConnectionString is required.

Resolved connection string shapes:

Parent (no model):

Endpoint={endpoint};Key={api_key}

Model child:

Endpoint={endpoint};Key={api_key};Model={model_name}

Configure via Aspire:OpenAI keys (global) and Aspire:OpenAI:{connectionName} (per named client). Example appsettings.json:

{
  "ConnectionStrings": {
    "chat": "Endpoint=https://api.openai.com/v1;Key=${OPENAI_API_KEY};Model=gpt-4o-mini"
  },
  "Aspire": {
    "OpenAI": {
      "DisableTracing": false,
      "DisableMetrics": false,
      "ClientOptions": {
        "UserAgentApplicationId": "myapp",
        "NetworkTimeout": "00:00:30"
      }
    }
  }
}

Inline configuration:

builder.AddOpenAIClient("chat", settings => settings.DisableTracing = true);
builder.AddOpenAIClient("chat", configureOptions: o => o.NetworkTimeout = TimeSpan.FromSeconds(30));
The integration emits telemetry under the following identifiers:

  • Logging categories: OpenAI.*
  • Tracing activity sources: OpenAI.* (when telemetry is enabled and not disabled)
  • Metrics meters: OpenAI.* (when telemetry is enabled and not disabled)