Get started with the GitHub Models integration
GitHub Models provides access to various AI models including OpenAI’s GPT models, DeepSeek, Microsoft’s Phi models, and other leading AI models, all accessible through GitHub’s infrastructure. The Aspire GitHub Models integration enables you to connect to GitHub Models from your applications for prototyping and production scenarios.
In this introduction, you'll see how to install and use the Aspire GitHub Models integration in a simple configuration. If you already have this knowledge, see GitHub Models hosting integration for full reference details.
Set up hosting integration
To begin, install the Aspire GitHub Models hosting integration in your Aspire AppHost project. This integration allows you to create and manage GitHub Models resources:
```shell
aspire add github-models
```

The Aspire CLI is interactive; select the matching search result:
```
Select an integration to add:

> github-models (Aspire.Hosting.GitHub.Models)
> Other results listed as selectable options...
```

Alternatively, in a single-file app, add the package with a `#:package` directive:

```csharp
#:package Aspire.Hosting.GitHub.Models@*
```

Or reference the package directly in the AppHost project file:

```xml
<PackageReference Include="Aspire.Hosting.GitHub.Models" Version="*" />
```

Next, in the AppHost project, create instances of GitHub Models resources and pass them to the consuming client projects:
```csharp
var builder = DistributedApplication.CreateBuilder(args);

var chat = builder.AddGitHubModel("chat", "openai/gpt-4o-mini");

builder.AddProject<Projects.ExampleProject>()
       .WithReference(chat);

builder.Build().Run();
```

The preceding code adds a GitHub Models resource named `chat` using the identifier string for OpenAI's GPT-4o-mini model. The `WithReference` method passes the connection information to the `ExampleProject` project.
Configure authentication
The GitHub Models integration requires a GitHub personal access token with the `models: read` permission.
Environment variables in Codespaces and GitHub Actions
When running an app in GitHub Codespaces or GitHub Actions, the GITHUB_TOKEN environment variable is automatically available and can be used without additional configuration.
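As a quick sanity check before starting the AppHost, you can verify from a shell whether the token is already present in the environment. This is a minimal sketch, not part of the integration itself; note also that in GitHub Actions the workflow must grant the `models: read` permission for the automatic GITHUB_TOKEN to call GitHub Models.

```shell
# Codespaces and GitHub Actions inject GITHUB_TOKEN automatically;
# on a local machine it is usually absent, so a personal access
# token must be configured instead (see the next section).
if [ -n "${GITHUB_TOKEN:-}" ]; then
  echo "GITHUB_TOKEN is available"
else
  echo "GITHUB_TOKEN is not set; configure a personal access token instead"
fi
```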
Personal access tokens for local development
For local development, create a fine-grained personal access token with the `models: read` scope and configure it in user secrets:
```json
{
  "Parameters": {
    "chat-gh-apikey": "github_pat_YOUR_TOKEN_HERE"
  }
}
```

Set up client integration
To get started with the Aspire GitHub Models client integration, you can use either the Azure AI Inference client or the OpenAI client, depending on your needs and model compatibility.
Using Azure AI Inference client
Install the Azure AI Inference client package in the client-consuming project:
```shell
dotnet add package Aspire.Azure.AI.Inference
```

Alternatively, in a single-file app:

```csharp
#:package Aspire.Azure.AI.Inference@*
```

Or in the project file:

```xml
<PackageReference Include="Aspire.Azure.AI.Inference" Version="*" />
```

In the Program.cs file of your client-consuming project, use the `AddAzureChatCompletionsClient` method to register a `ChatCompletionsClient` for dependency injection:
```csharp
builder.AddAzureChatCompletionsClient("chat");
```

You can then retrieve the `ChatCompletionsClient` instance using dependency injection:
```csharp
public class ExampleService(ChatCompletionsClient client)
{
    public async Task<string> GetResponseAsync(string prompt)
    {
        var response = await client.GetChatCompletionsAsync(
            new[] { new ChatMessage(ChatRole.User, prompt) });

        return response.Value.Choices[0].Message.Content;
    }
}
```

Using OpenAI client
For models compatible with the OpenAI API (such as `openai/gpt-4o-mini`), you can use the OpenAI client:
```shell
dotnet add package Aspire.OpenAI
```

Alternatively, in a single-file app:

```csharp
#:package Aspire.OpenAI@*
```

Or in the project file:

```xml
<PackageReference Include="Aspire.OpenAI" Version="*" />
```

Then register the client for dependency injection:

```csharp
builder.AddOpenAIClient("chat");
```

You can then use the OpenAI client:
```csharp
public class ChatService(OpenAIClient client)
{
    public async Task<string> GetChatResponseAsync(string prompt)
    {
        var chatClient = client.GetChatClient(GitHubModel.OpenAI.OpenAIGpt4oMini);
        var response = await chatClient.CompleteChatAsync(
            new[] { new UserChatMessage(prompt) });

        return response.Value.Content[0].Text;
    }
}
```