Hi!
Here is a small intro to the .NET Aspire + Ollama scenario in this repo: https://aka.ms/netaiaspireollamachat. You can also learn more in this 5-minute overview video:
Introduction to .NET Aspire and Ollama
Have you ever thought, “Why can’t my .NET app have an AI chatbot that’s ridiculously easy to integrate?” Well, meet .NET Aspire and Ollama, two powerful pieces of tech that make magic together. .NET Aspire gives .NET devs powerful orchestration tooling, and with extensions like the .NET Aspire Community Toolkit Ollama integration it’s easy to inject AI into applications. Ollama, for its part, serves pre-trained language models that make chat capabilities feel effortless. Today, we’ll walk through how to set up this dynamic duo to add a natural language interface to your application.
.NET Aspire Community Toolkit Ollama integration
The .NET Aspire Community Toolkit Ollama integration, part of the .NET Aspire Community Toolkit, is like a starter pack for integrating Ollama’s AI models into your app.
It includes hosting and client integrations, allowing you to host the model locally (say, via Docker) and access it later through the Ollama API. With just a few lines of code, you can be up and running with an AI that understands context, generates answers, and feels like a natural extension of your app. 😎
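To picture how the hosting side fits together, here is a minimal AppHost sketch. It assumes the usual Aspire project layout: the "webfrontend" name and the generated Projects.WebFrontend class are placeholders, and passing the model resource to another project via WithReference is shown as the standard Aspire pattern, not something specific to this repo.
var builder = DistributedApplication.CreateBuilder(args);
// Host Ollama in a container and pull a model into it.
var ollama = builder.AddOllama("ollama");
var llama = ollama.AddModel("llama3");
// Hand the model's connection details to the project that will call it.
builder.AddProject<Projects.WebFrontend>("webfrontend")
       .WithReference(llama);
builder.Build().Run();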
Deploy Steps
- Install the Packages: Use NuGet to add CommunityToolkit.Aspire.Hosting.Ollama to your app host project.
dotnet add package CommunityToolkit.Aspire.Hosting.Ollama
- Add the Ollama Model: Use the AddOllama method in your code to register the Ollama instance and add a model.
var ollama = builder.AddOllama("ollama")
.AddModel("llama3");
- Optional – GPU Acceleration: Ollama runs on the CPU by default; you can speed things up by enabling GPU support.
var ollama = builder.AddOllama("ollama")
.AddModel("llama3")
.WithContainerRuntimeArgs("--gpus=all");
- Run the Ollama Server: Fire up the app and watch as it downloads and spins up Ollama. Keep the orchestration app open to track download progress.
- Add Client API: If your app needs to talk to Ollama, install CommunityToolkit.Aspire.OllamaSharp in your client project, then call AddOllamaClientApi for easy access to the API.
builder.AddOllamaClientApi("ollama");
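To make that concrete, here is a rough sketch of what consuming the registered client could look like in a minimal API project. It assumes AddOllamaClientApi wires up OllamaSharp’s IOllamaApiClient in dependency injection with a default model selected, and that a recent OllamaSharp version is in use where the Chat helper streams responses; the /chat endpoint and prompt handling are purely illustrative.
using System.Text;
using OllamaSharp;

var builder = WebApplication.CreateBuilder(args);
builder.AddOllamaClientApi("ollama");   // name matches the resource registered in the app host

var app = builder.Build();

// Illustrative endpoint: stream a chat reply for the given prompt.
app.MapGet("/chat", async (IOllamaApiClient client, string prompt) =>
{
    var chat = new Chat(client);        // OllamaSharp chat helper; uses the client's selected model
    var reply = new StringBuilder();
    await foreach (var token in chat.SendAsync(prompt))
    {
        reply.Append(token);
    }
    return reply.ToString();
});

app.Run();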
Conclusion
With .NET Aspire and Ollama, turning your .NET app into a chatty assistant is fast, fun, and full of possibilities. With just a few steps, you’ve got AI handling conversations like a pro. So, get your apps talking and let them do the heavy lifting for your users! 👏