Disclaimer: This post was created with the help of AI tools to speed up drafting and formatting.
Hi!
Just in case you want to skip the reading, here is an 8-minute video overview.
If you are still here, I’ll (try to) walk you through how I updated the eShopLite scenario to take advantage of .NET Aspire 9.4 and the latest GPT-5 models. This is a great example of how a simple model upgrade can be the first step — and then how new Aspire capabilities like Azure AI Foundry integration can make your code cleaner, easier to maintain, and more deployment-friendly.
Details
Step 1 — Updating the Model
The first thing I did was replace the existing model definition in the eShopLite solution:
From: gpt-4.1-mini
To: gpt-5-mini
This is just a small change in the model deployment definition, but it gives you access to the latest and fastest GPT-5 models.
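For context, the change in the AppHost is roughly the following. This is a minimal sketch assuming the Aspire Azure OpenAI hosting integration (AddAzureOpenAI and AddDeployment); the resource name and the gpt-4.1-mini version string are illustrative rather than the exact eShopLite values:
// Before: the chat deployment pointed at gpt-4.1-mini
// openai.AddDeployment(new AzureOpenAIDeployment("chat", "gpt-4.1-mini", "2025-04-14"));
// After: only the model name and version change
var openai = builder.AddAzureOpenAI("openai");
openai.AddDeployment(new AzureOpenAIDeployment("chat", "gpt-5-mini", "2025-08-07"));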
📦 Locally, this worked perfectly. ☁️ In Azure, after generating the infrastructure files using:
azd infra gen
…the deployment went smoothly.
Step 2 — Leveraging Aspire 9.4 Azure AI Foundry Integration
Aspire 9.4 introduces built-in support for Azure AI Foundry resources. Instead of manually managing your model references and endpoints, you can now:
Automatically bind to Azure AI Foundry resources in your Aspire app
Use cleaner, more readable code
Simplify deployment to Azure with Foundry-focused Bicep files
After refactoring the code to use this new integration, I ran it locally — and everything worked as expected. Even better, the Azure deployment was more transparent and maintainable.
Here is a sample of the AppHost code:
var chatDeploymentName = "gpt-5-mini";
var embeddingsDeploymentName = "text-embedding-ada-002";
// Add Azure AI Foundry project
var foundry = builder.AddAzureAIFoundry("foundry");
// Add specific model deployments (deployment name, model name, model version, format)
var gpt5mini = foundry.AddDeployment(chatDeploymentName, chatDeploymentName, "2025-08-07", "OpenAI");
var embeddingsDeployment = foundry.AddDeployment(embeddingsDeploymentName, embeddingsDeploymentName, "2", "OpenAI");
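The deployments are then referenced by the services that consume them, so the connection details flow to each project at run time instead of being hard-coded. Here is a minimal sketch of that wiring, assuming a Products project resource named "products" (the actual eShopLite registration may carry additional references):
// Pass the Foundry deployment connection info to the Products service
var products = builder.AddProject<Projects.Products>("products")
    .WithReference(gpt5mini)
    .WithReference(embeddingsDeployment);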
And this is the client consumption in the Products project.
// Add Foundry elements
var azureOpenAiClientName = "foundry";
var chatDeploymentName = builder.Configuration["AI_ChatDeploymentName"] ?? "gpt-5-mini";
var embeddingsDeploymentName = builder.Configuration["AI_embeddingsDeploymentName"] ?? "text-embedding-ada-002";
// Register the OpenAI client against the Foundry connection, then expose
// IChatClient and IEmbeddingGenerator implementations for dependency injection
var client = builder.AddOpenAIClient(azureOpenAiClientName);
client.AddChatClient(chatDeploymentName);
client.AddEmbeddingGenerator(embeddingsDeploymentName);
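With those registrations in place, application code can take the Microsoft.Extensions.AI abstractions straight from dependency injection; the embedding generator is consumed the same way through IEmbeddingGenerator<string, Embedding<float>>. Here is a minimal sketch; ProductAiService and AskAsync are hypothetical names for illustration, not the actual eShopLite classes:
using Microsoft.Extensions.AI;
// Hypothetical service showing how the registered IChatClient is consumed
public class ProductAiService(IChatClient chatClient)
{
    public async Task<string> AskAsync(string question)
    {
        // Sends the prompt to the gpt-5-mini deployment configured in the AppHost
        var response = await chatClient.GetResponseAsync(question);
        return response.Text;
    }
}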
Step 3 — Deployment with Bicep Files
With the updated approach, the generated Bicep files are clearer:
Explicitly define Azure AI Foundry resources
Clearly map Aspire services to those resources
Reduce deployment complexity
Once deployed, the application was running on GPT-5 with full Aspire 9.4 + Foundry support.
Conclusion
This upgrade journey shows how Aspire 9.4 and Azure AI Foundry integration can significantly improve both the development and deployment experience. With just a few steps, we moved from a basic GPT-4.1-mini setup to a cleaner, faster, and smarter GPT-5 architecture, ready to run locally or in Azure.
If you’re already using Aspire, upgrading to 9.4 and leveraging Foundry is a no-brainer.