Get started with the Azure AI Inference integrations
Azure AI Inference provides serverless API endpoints for deploying and using AI models. The Aspire Azure AI Inference integration enables you to connect to Azure AI Inference services from your applications, making it easy to call models for chat, completions, embeddings, and more.
In this introduction, you’ll see how to install and use the Aspire Azure AI Inference integrations in a simple configuration. If you already have this knowledge, see Azure AI Inference Hosting integration for full reference details.
Set up hosting integration
Although the Azure AI Inference library doesn't currently offer a direct hosting integration, you can still integrate it into your AppHost project by adding a connection string that references an existing Azure AI Foundry resource.
If you already have an Azure AI Foundry service, you can easily connect to it by adding a connection string to your AppHost:
```csharp
var builder = DistributedApplication.CreateBuilder(args);

var aiFoundry = builder.AddConnectionString("ai-foundry");

builder.AddProject<Projects.ExampleProject>()
       .WithReference(aiFoundry);

// After adding all resources, run the app...
builder.Build().Run();
```

The connection string is configured in the AppHost's configuration, typically under User Secrets, in the ConnectionStrings section:
```json
{
  "ConnectionStrings": {
    "ai-foundry": "Endpoint=https://{endpoint}/;DeploymentId={deploymentName}"
  }
}
```
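During development, one way to set this configuration is with the `dotnet user-secrets` CLI, run from the AppHost project directory. This is a sketch; the endpoint and deployment name placeholders must be replaced with your own values:

```shell
# Store the connection string in User Secrets for the AppHost project.
# {endpoint} and {deploymentName} are placeholders for your Azure AI Foundry values.
dotnet user-secrets set "ConnectionStrings:ai-foundry" "Endpoint=https://{endpoint}/;DeploymentId={deploymentName}"
```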
Set up client integration

To use Azure AI Inference from your client applications, install the 📦 Aspire.Azure.AI.Inference NuGet package in the client-consuming project:
```shell
dotnet add package Aspire.Azure.AI.Inference
```

Or, in a C# file-based app, use a package directive:

```csharp
#:package Aspire.Azure.AI.Inference@*
```

Or add a package reference to the project file:

```xml
<PackageReference Include="Aspire.Azure.AI.Inference" Version="*" />
```

Then, in the Program.cs file of your client-consuming project, add the Azure AI Inference chat completions client:
```csharp
builder.AddAzureChatCompletionsClient(connectionName: "ai-foundry")
       .AddChatClient("deploymentName");
```

After adding the IChatClient, you can retrieve the client instance using dependency injection:
```csharp
public class ExampleService(IChatClient chatClient)
{
    public async Task<string> GetResponseAsync(string userMessage)
    {
        var response = await chatClient.CompleteAsync(userMessage);

        return response.Message.Text ?? string.Empty;
    }
}
```

For more information on using the client integration, see Azure AI Inference Client integration.
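The IChatClient abstraction also exposes a streaming variant, which is useful when you want to render the model's output incrementally. A minimal sketch, assuming the same IChatClient registration as above and the Microsoft.Extensions.AI version that pairs with CompleteAsync (the StreamResponseAsync helper name is illustrative, not part of the library):

```csharp
// Sketch: stream the chat response as it is generated, instead of
// awaiting the full message. Assumes an IChatClient resolved from DI,
// as in ExampleService above.
public static async Task StreamResponseAsync(IChatClient chatClient, string userMessage)
{
    await foreach (var update in chatClient.CompleteStreamingAsync(userMessage))
    {
        Console.Write(update.Text);
    }
}
```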