Connect to Azure OpenAI
This page describes how consuming apps connect to an Azure OpenAI resource that’s already modeled in your AppHost. For the AppHost API surface — adding an Azure OpenAI account, deployment resources, managed identity, and infrastructure customization — see Azure OpenAI hosting integration.
When you reference an Azure OpenAI deployment from your AppHost, Aspire injects the connection information into the consuming app as environment variables. Your app can either read those environment variables directly — the pattern works the same from any language — or, in C#, use the Aspire Azure OpenAI client integration for automatic dependency injection, health checks, and telemetry.
Connection properties
Aspire exposes each property as an environment variable named [RESOURCE]_[PROPERTY]. For instance, the Uri property of a resource called chat becomes CHAT_URI.
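The naming rule can be sketched in a few lines (a hypothetical helper for illustration only; this is not an Aspire API):

```python
def aspire_env_var_name(resource: str, prop: str) -> str:
    """Map a resource/property pair to Aspire's [RESOURCE]_[PROPERTY] form."""
    return f"{resource.upper()}_{prop.upper()}"

# The Uri property of a resource named "chat" is exposed as CHAT_URI.
print(aspire_env_var_name("chat", "Uri"))        # CHAT_URI
print(aspire_env_var_name("chat", "ModelName"))  # CHAT_MODELNAME
```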
Azure OpenAI account resource
The Azure OpenAI account resource exposes the following connection properties:
| Property Name | Description |
|---|---|
| Uri | The endpoint URI for the Azure OpenAI account, for example https://{account-name}.openai.azure.com/ |
Example connection string:
```
https://{account-name}.openai.azure.com/
```

Azure OpenAI deployment resource
The Azure OpenAI deployment resource inherits all properties from its parent account resource and adds:
| Property Name | Description |
|---|---|
| ModelName | The name of the deployment, for example chat |
Example environment variables for a deployment resource named chat:
```
CHAT_URI=https://{account-name}.openai.azure.com/
CHAT_MODELNAME=chat
```

Connect from your app
Pick the language your consuming app is written in. Each example assumes your AppHost adds an Azure OpenAI deployment resource named chat and references it from the consuming app.
For C# apps, the recommended approach is the Aspire Azure OpenAI client integration. It registers an AzureOpenAIClient through dependency injection and, optionally, registers an IChatClient via Microsoft.Extensions.AI. If you’d rather read environment variables directly, see the Read environment variables section at the end of this tab.
Install the client integration
Install the Aspire.Azure.AI.OpenAI NuGet package in the consuming client project:
```shell
dotnet add package Aspire.Azure.AI.OpenAI
```

For file-based apps, add a package directive instead:

```csharp
#:package Aspire.Azure.AI.OpenAI@*
```

Or add a PackageReference to the project file:

```xml
<PackageReference Include="Aspire.Azure.AI.OpenAI" Version="*" />
```

Add an Azure OpenAI client
In Program.cs, call AddAzureOpenAIClient on your IHostApplicationBuilder to register an AzureOpenAIClient:
```csharp
builder.AddAzureOpenAIClient(connectionName: "chat");
```

Resolve the client through dependency injection:
```csharp
public class ExampleService(AzureOpenAIClient client)
{
    // Use client...
}
```

Add a chat client
Call AddChatClient after AddAzureOpenAIClient to also register an IChatClient from Microsoft.Extensions.AI. The deployment name is passed as an argument:
```csharp
builder.AddAzureOpenAIClient("chat")
    .AddChatClient("chat");
```

Use AddKeyedChatClient to register a keyed IChatClient:
```csharp
builder.AddAzureOpenAIClient("chat")
    .AddKeyedChatClient("serviceKey", "chat");
```

Resolve the IChatClient through dependency injection:
```csharp
public class ExampleService(IChatClient chatClient)
{
    public async Task<string> GetResponseAsync(string userMessage)
    {
        var response = await chatClient.CompleteAsync(userMessage);
        return response.Message.Text ?? string.Empty;
    }
}
```

For more information on IChatClient and Microsoft.Extensions.AI, see Unified AI Building Blocks for .NET.
Add keyed Azure OpenAI clients
To register multiple AzureOpenAIClient instances with different connection names, use AddKeyedAzureOpenAIClient:
```csharp
builder.AddKeyedAzureOpenAIClient(name: "chat");
builder.AddKeyedAzureOpenAIClient(name: "embeddings");
```

Then resolve each instance by key:
```csharp
public class ExampleService(
    [FromKeyedServices("chat")] AzureOpenAIClient chatClient,
    [FromKeyedServices("embeddings")] AzureOpenAIClient embeddingsClient)
{
    // Use clients...
}
```

Configuration
The Aspire Azure OpenAI client integration offers multiple ways to provide configuration.
Connection strings. When using a connection string from the ConnectionStrings configuration section, pass the connection name to AddAzureOpenAIClient:
```csharp
builder.AddAzureOpenAIClient("chat");
```

The connection string is resolved from the ConnectionStrings section. Two formats are supported:
- Managed identity (recommended): the endpoint URI only.

  appsettings.json:

  ```json
  {
    "ConnectionStrings": {
      "chat": "https://{account-name}.openai.azure.com/"
    }
  }
  ```

- API key: endpoint plus key.

  appsettings.json:

  ```json
  {
    "ConnectionStrings": {
      "chat": "Endpoint=https://{account-name}.openai.azure.com/;Key={api-key}"
    }
  }
  ```
Configuration providers. The client integration supports Microsoft.Extensions.Configuration. It loads AzureOpenAISettings from appsettings.json (or any other configuration source) using the Aspire:Azure:AI:OpenAI key:
```json
{
  "Aspire": {
    "Azure": {
      "AI": {
        "OpenAI": {
          "DisableTracing": false
        }
      }
    }
  }
}
```

Inline delegates. Configure settings inline using an Action<AzureOpenAISettings>:
```csharp
builder.AddAzureOpenAIClient(
    "chat",
    settings => settings.DisableTracing = true);
```

To configure the AzureOpenAIClientOptions inline:
```csharp
builder.AddAzureOpenAIClient(
    connectionName: "chat",
    configureClientBuilder: clientBuilder =>
    {
        clientBuilder.ConfigureOptions(options =>
        {
            options.UserAgentApplicationId = "MyApp";
        });
    });
```

Add Azure OpenAI client from configuration
Use AddOpenAIClientFromConfiguration to register an OpenAIClient or AzureOpenAIClient based on the connection string value:
```csharp
builder.AddOpenAIClientFromConfiguration("chat");
```

The method selects the client type according to these rules:
| Connection string example | Registered client type |
|---|---|
| https://{account}.openai.azure.com/ | AzureOpenAIClient |
| Endpoint=https://{account}.openai.azure.com/;Key={key} | AzureOpenAIClient |
| Endpoint=https://{account}.openai.azure.com/;Key={key};IsAzure=false | OpenAIClient |
| Endpoint=https://localhost:18889;Key={key} | OpenAIClient |
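The selection rules in the table can be sketched as follows (an illustrative reimplementation inferred from the table, not Aspire's actual logic; in particular, the .openai.azure.com endpoint check is an assumption):

```python
def select_client_type(connection_string: str) -> str:
    """Apply the client-type selection rules from the table above."""
    # A bare URI (no Key=Value segments) is treated as an Azure endpoint.
    if "=" not in connection_string.split(";", 1)[0]:
        return "AzureOpenAIClient"

    parts = {}
    for segment in connection_string.split(";"):
        if segment:
            key, _, val = segment.partition("=")
            parts[key] = val

    # An explicit IsAzure=false forces the plain OpenAI client.
    if parts.get("IsAzure", "").lower() == "false":
        return "OpenAIClient"
    # Azure endpoints get the Azure client; other endpoints are
    # treated as OpenAI-compatible services.
    if ".openai.azure.com" in parts.get("Endpoint", ""):
        return "AzureOpenAIClient"
    return "OpenAIClient"

print(select_client_type("https://myaccount.openai.azure.com/"))
print(select_client_type("Endpoint=https://localhost:18889;Key=abc"))
```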
Client integration health checks
Aspire client integrations enable health checks by default. The Azure OpenAI client integration participates in the standard Aspire health check pipeline.
Observability and telemetry
The Aspire Azure OpenAI client integration automatically configures logging, tracing, and metrics through OpenTelemetry.
Logging categories:
- Azure
- Azure.Core
- Azure.Identity
- Azure.AI.OpenAI
Tracing activities:
- Azure.AI.OpenAI.*
- gen_ai.system: generic AI system tracing
- gen_ai.operation.name: operation names for AI calls
Metrics:
The Aspire Azure OpenAI integration currently does not emit metrics by default due to limitations with the Azure SDK for .NET.
Any of these telemetry features can be disabled through the configuration options above.
Read environment variables in C#
If you prefer not to use the Aspire client integration, you can read the Aspire-injected endpoint from the environment and construct an AzureOpenAIClient directly using DefaultAzureCredential for managed identity:
```csharp
using Azure.AI.OpenAI;
using Azure.Identity;

var endpoint = Environment.GetEnvironmentVariable("CHAT_URI");
var deploymentName = Environment.GetEnvironmentVariable("CHAT_MODELNAME");

var client = new AzureOpenAIClient(
    new Uri(endpoint!),
    new DefaultAzureCredential());

var chatClient = client.GetChatClient(deploymentName);
// Use chatClient...
```

Use the Azure SDK for Go OpenAI client:
```shell
go get github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai
go get github.com/Azure/azure-sdk-for-go/sdk/azidentity
```

Read the injected environment variables and connect using managed identity:
```go
package main

import (
    "context"
    "fmt"
    "os"

    "github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai"
    "github.com/Azure/azure-sdk-for-go/sdk/azidentity"
)

func main() {
    // Read the Aspire-injected connection properties
    endpoint := os.Getenv("CHAT_URI")
    deploymentName := os.Getenv("CHAT_MODELNAME")

    credential, err := azidentity.NewDefaultAzureCredential(nil)
    if err != nil {
        panic(err)
    }

    client, err := azopenai.NewClient(endpoint, credential, nil)
    if err != nil {
        panic(err)
    }

    resp, err := client.GetChatCompletions(
        context.Background(),
        azopenai.ChatCompletionsOptions{
            DeploymentName: &deploymentName,
            Messages: []azopenai.ChatRequestMessageClassification{
                &azopenai.ChatRequestUserMessage{
                    Content: azopenai.NewChatRequestUserMessageContent("Hello!"),
                },
            },
        },
        nil,
    )
    if err != nil {
        panic(err)
    }

    fmt.Println(*resp.Choices[0].Message.Content)
}
```

Install the official OpenAI Python library and the Azure Identity library:
```shell
pip install openai azure-identity
```

Read the injected environment variables and connect using managed identity:
```python
import os

from openai import AzureOpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

# Read the Aspire-injected connection properties
endpoint = os.environ["CHAT_URI"]
deployment_name = os.environ["CHAT_MODELNAME"]

# Obtain a token provider for managed identity
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

client = AzureOpenAI(
    azure_endpoint=endpoint,
    azure_ad_token_provider=token_provider,
    api_version="2024-10-21",
)

response = client.chat.completions.create(
    model=deployment_name,
    messages=[{"role": "user", "content": "Hello!"}],
)

print(response.choices[0].message.content)
```

Install the OpenAI JavaScript library and the Azure Identity library:
```shell
npm install openai @azure/identity
```

Read the injected environment variables and connect using managed identity (the AzureOpenAI class is exported by the openai package, not @azure/openai):

```typescript
import { AzureOpenAI } from 'openai';
import { DefaultAzureCredential, getBearerTokenProvider } from '@azure/identity';

// Read Aspire-injected connection properties
const endpoint = process.env.CHAT_URI!;
const deploymentName = process.env.CHAT_MODELNAME ?? 'chat';

const credential = new DefaultAzureCredential();
const tokenProvider = getBearerTokenProvider(
  credential,
  'https://cognitiveservices.azure.com/.default'
);

const client = new AzureOpenAI({
  endpoint,
  azureADTokenProvider: tokenProvider,
  apiVersion: '2024-10-21',
  deployment: deploymentName,
});

const response = await client.chat.completions.create({
  model: deploymentName,
  messages: [{ role: 'user', content: 'Hello!' }],
});

console.log(response.choices[0].message.content);
```