# OpenAI client integration
To get started with the Aspire OpenAI client integration, install the 📦 Aspire.OpenAI NuGet package:
```shell
dotnet add package Aspire.OpenAI
```

Or add a `PackageReference` to your project file:

```xml
<PackageReference Include="Aspire.OpenAI" Version="*" />
```

In a file-based app, use the `#:package` directive instead: `#:package Aspire.OpenAI@*`

## Add an OpenAI client
In the Program.cs file of your client-consuming project, use `AddOpenAIClient` to register an `OpenAIClient` for dependency injection. The method requires a connection name parameter:
```csharp
builder.AddOpenAIClient(connectionName: "chat");
```

After adding the `OpenAIClient`, you can retrieve the client instance using dependency injection:
```csharp
public class ExampleService(OpenAIClient client)
{
    // Use client...
}
```
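As a hypothetical usage sketch (the model name, method, and prompt are illustrative assumptions, not part of the integration), the injected client can create a `ChatClient` from the underlying OpenAI .NET library and request a completion:

```csharp
using OpenAI;
using OpenAI.Chat;

public class ExampleService(OpenAIClient client)
{
    public async Task<string> SummarizeAsync(string text)
    {
        // GetChatClient comes from the OpenAI .NET library;
        // "gpt-4o-mini" is an illustrative model name.
        ChatClient chat = client.GetChatClient("gpt-4o-mini");

        ChatCompletion completion = await chat.CompleteChatAsync(
            $"Summarize the following text in one sentence: {text}");

        return completion.Content[0].Text;
    }
}
```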
## Add OpenAI client with registered IChatClient

To also register an `IChatClient` (from Microsoft.Extensions.AI), chain `AddChatClient` onto the client registration:

```csharp
builder.AddOpenAIClient("chat")
    .AddChatClient(); // Model inferred from connection string (Model=...)
```

If only a parent resource was defined (no child model), provide the model name explicitly:
```csharp
builder.AddOpenAIClient("openai")
    .AddChatClient("gpt-4o-mini");
```

`AddChatClient` optionally accepts a model/deployment name; if omitted, it comes from the connection string's `Model` entry. Inject `OpenAIClient` or `IChatClient` as needed.
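As a hedged sketch of consuming the registered `IChatClient` (the minimal-API route and prompt are illustrative; note that older Microsoft.Extensions.AI previews used `CompleteAsync` where current versions use `GetResponseAsync`):

```csharp
using Microsoft.Extensions.AI;

// Illustrative minimal-API endpoint; the route and prompt are assumptions.
app.MapGet("/haiku", async (IChatClient chatClient) =>
{
    var response = await chatClient.GetResponseAsync(
        "Write a haiku about dependency injection.");

    return response.Text;
});
```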
## Connection properties

When you reference an OpenAI resource using `WithReference`, the following connection properties are made available to the consuming project:

### OpenAI

The OpenAI resource exposes the following connection properties:
| Property Name | Description |
|---|---|
| `Endpoint` | The base endpoint URI for the OpenAI API, in the format `https://api.openai.com/v1` |
| `Uri` | The endpoint URI (same as `Endpoint`), in the format `https://api.openai.com/v1` |
| `Key` | The API key for authentication |
Example properties:

```
Uri: https://api.openai.com/v1
Key: sk-proj-abc123...
```

### OpenAI model

The OpenAI model resource combines the parent properties above and adds the following connection property:
| Property Name | Description |
|---|---|
| `ModelName` | The model identifier for inference requests, for instance `gpt-4o-mini` |
## Configuration

The OpenAI library provides multiple options to configure the OpenAI connection. Either an `Endpoint` or a `ConnectionString` is required.
### Use a connection string

Resolved connection string shapes:

Parent (no model):

```
Endpoint={endpoint};Key={api_key}
```

Model child:

```
Endpoint={endpoint};Key={api_key};Model={model_name}
```

### Use configuration providers
Configure via `Aspire:OpenAI` keys (global) and `Aspire:OpenAI:{connectionName}` (per named client). Example appsettings.json:

```json
{
  "ConnectionStrings": {
    "chat": "Endpoint=https://api.openai.com/v1;Key=${OPENAI_API_KEY};Model=gpt-4o-mini"
  },
  "Aspire": {
    "OpenAI": {
      "DisableTracing": false,
      "DisableMetrics": false,
      "ClientOptions": {
        "UserAgentApplicationId": "myapp",
        "NetworkTimeout": "00:00:30"
      }
    }
  }
}
```

Inline configuration:
```csharp
builder.AddOpenAIClient("chat", settings => settings.DisableTracing = true);

builder.AddOpenAIClient("chat", configureOptions: o => o.NetworkTimeout = TimeSpan.FromSeconds(30));
```

## Observability and telemetry
### Logging

Log categories: `OpenAI.*`

### Tracing

Activity sources: `OpenAI.*` (when telemetry is enabled and tracing is not disabled)

### Metrics

Meter: `OpenAI.*` (when telemetry is enabled and metrics are not disabled)