Code Sample: Integrating Azure OpenAI Search with #SemanticKernel in .NET

Hi!

Today I’ll expand a little on the scenario described in the Semantic Kernel blog post “Azure OpenAI On Your Data with Semantic Kernel”.

The code below uses a GPT-4o model to power the chat and is also connected to Azure AI Search using SK. While running this demo, you will notice references to [doc1], [doc2], and so on. Extending the original SK blog post, this sample also prints the details of each referenced document at the bottom of the output.

A similar question in Azure AI Studio will also show the references to the source documents.

Semantic Kernel Blog Post

The SK team explored how to leverage Azure OpenAI Service together with Semantic Kernel to enhance AI solutions. By combining these tools, you can harness the capabilities of large language models to work effectively with your own data through Azure AI Search. The post covered the integration process, highlighted the benefits, and provided a high-level overview of the architecture.

The post showcases the importance of context-aware responses and how the Semantic Kernel can manage state and memory to deliver more accurate and relevant results. This integration between SK and Azure AI Search empowers developers to build applications that understand and respond to user queries in a more human-like manner.

The blog post provides a code sample showcasing the integration steps. To run the scenario, you’ll need to:

  • Upload your data files to Azure Blob Storage (a quick sketch of this step follows below).
  • Vectorize and index data in Azure AI Search.
  • Connect Azure OpenAI service with Azure AI Search.

For more in-depth guidance, be sure to check out the full post here.
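
As a reference, that first step (uploading the data files) can also be done from code instead of the Azure portal. This is just a minimal sketch using the Azure.Storage.Blobs package; the connection string, container name, and local folder are placeholder values, not something from the original post.

// Sketch: upload local data files to an Azure Blob Storage container
// NOTE: the connection string, container name and local folder are placeholders
using Azure.Storage.Blobs;

var storageConnectionString = "<your-storage-connection-string>";
var containerClient = new BlobContainerClient(storageConnectionString, "contoso-products-data");
await containerClient.CreateIfNotExistsAsync();

foreach (var filePath in Directory.GetFiles("./data"))
{
    var blobName = Path.GetFileName(filePath);
    await using var fileStream = File.OpenRead(filePath);
    // UploadBlobAsync fails if a blob with the same name already exists
    await containerClient.UploadBlobAsync(blobName, fileStream);
    Console.WriteLine($"Uploaded {blobName}");
}

Once the files are in the container, you can vectorize and index them with Azure AI Search and connect the index to your Azure OpenAI resource, as described in the original post.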

Code Sample

And now it’s time to show how we can access the details of the response from an SK call when the response includes information from Azure AI Search.

Let’s take a look at the following Program.cs to understand its structure and functionality.

  • The sample program showcases how to use Azure OpenAI and Semantic Kernel to create a chat application that generates suggestions based on user queries.
  • The program starts by importing necessary namespaces, ensuring access to Azure OpenAI, configuration management, and Semantic Kernel functionalities.
  • Next, the program uses a configuration builder to securely load Azure OpenAI keys from user secrets.
  • The core of the program lies in setting up a chat completion service with Semantic Kernel. This service is configured to use Azure OpenAI for generating chat responses, utilizing the previously loaded API keys and endpoints.
  • To handle the conversation, the program creates a sample chat history. This history includes both system and user messages, forming the basis for the chat completion service to generate responses.
  • An Azure Search extension is configured to enrich the chat responses with relevant information. This extension uses an Azure Search index to pull in data, enhancing the chat service’s ability to provide informative and contextually relevant responses.
  • Finally, the program runs the chat prompt, using the chat history and the Azure Search extension configuration to generate a response.
  • This response is then printed to the console. Additionally, if the response includes citations from the Azure Search extension, these are also processed and printed, showcasing the integration’s ability to provide detailed and informative answers.
// Copyright (c) 2024
// Author : Bruno Capuano
// Change Log :
// – Sample console application to use Azure OpenAI Search and Semantic Kernel
//
// The MIT License (MIT)
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in
// all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
// THE SOFTWARE.
using Azure.AI.OpenAI;
using Microsoft.Extensions.Configuration;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;
// Azure OpenAI keys
var config = new ConfigurationBuilder().AddUserSecrets<Program>().Build();
var deploymentName = config["AZURE_OPENAI_MODEL"];
var endpoint = config["AZURE_OPENAI_ENDPOINT"];
var apiKey = config["AZURE_OPENAI_APIKEY"];
var aiSearchEndpoint = config["AZURE_AISEARCH_ENDPOINT"];
var aiSearchApiKey = config["AZURE_AISEARCH_APIKEY"];
// Create a chat completion service
var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(deploymentName, endpoint, apiKey);
// Get the chat completion service
Kernel kernel = builder.Build();
var chat = kernel.GetRequiredService<IChatCompletionService>();
// Create a sample chat history
var history = new ChatHistory();
history.AddSystemMessage("You are a helpful assistant.");
history.AddUserMessage("What do you suggest to go hiking in a rainy day?");
var azureSearchExtensionConfiguration = new AzureSearchChatExtensionConfiguration
{
    SearchEndpoint = new Uri(aiSearchEndpoint),
    Authentication = new OnYourDataApiKeyAuthenticationOptions(aiSearchApiKey),
    IndexName = "contoso-products-with-weather-index"
};
var chatExtensionsOptions = new AzureChatExtensionsOptions { Extensions = { azureSearchExtensionConfiguration } };
var executionSettings = new OpenAIPromptExecutionSettings { AzureChatExtensionsOptions = chatExtensionsOptions };
// run the prompt
var result = await chat.GetChatMessageContentsAsync(history, executionSettings);
var content = result[^1].Content;
Console.WriteLine(content);
// if the response includes citations from Azure AI Search, print their details
if (result.FirstOrDefault()?.InnerContent is ChatResponseMessage ic)
{
    var aec = ic.AzureExtensionsContext;
    if (aec?.Citations is not null)
    {
        foreach (var citation in aec.Citations)
        {
            Console.WriteLine($"Title: {citation.Title}");
            Console.WriteLine($"URL: {citation.Url}");
            Console.WriteLine($"Filepath: {citation.Filepath}");
            // print only the first 50 characters of the citation content
            Console.WriteLine($"Content: {citation.Content[..Math.Min(50, citation.Content.Length)]}");
        }
    }
}
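
As a possible next step, the same objects can be reused to keep the conversation going. The following loop is just a sketch: it assumes the chat, history and executionSettings variables created in the sample above, and it appends each answer to the history so the next question has the full context (including the grounded citations behind it).

// Sketch: simple interactive loop reusing chat, history and executionSettings from above
while (true)
{
    Console.Write("Question (empty to exit): ");
    var question = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(question))
        break;

    history.AddUserMessage(question);
    var answers = await chat.GetChatMessageContentsAsync(history, executionSettings);
    var answer = answers[^1];

    Console.WriteLine(answer.Content);

    // keep the assistant answer in the history for the next turn
    history.AddAssistantMessage(answer.Content ?? string.Empty);
}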

Happy coding!

Greetings

El Bruno

More posts on my blog, ElBruno.com.

More info at https://beacons.ai/elbruno

