This sample contains an Azure Function that uses the OpenAI bindings extension to demonstrate OpenAI retrieval augmented generation (RAG) with Azure AI Search.
You can learn more about the OpenAI trigger and bindings extension in the GitHub repository documentation and in the official OpenAI extension documentation.
Once you have your Azure subscription, run the following in a new terminal window to create the Azure OpenAI, Azure AI Search, and other resources needed.

You will be asked whether to enable a virtual network that locks down your OpenAI and AI Search services so they are reachable only from the deployed function app over private endpoints. To skip virtual network integration, select true. If you enable virtual networking, your local IP address is added to the OpenAI and AI Search services so you can debug locally.
```
azd init --template https://github.com/Azure-Samples/azure-functions-openai-aisearch-dotnet
```

Mac/Linux:

```bash
chmod +x ./infra/scripts/*.sh
```

Windows:

```powershell
Set-ExecutionPolicy RemoteSigned
```
Run the `azd provision` command to provision the resources in Azure.
If you don't run `azd provision`, you can instead create an Azure OpenAI resource and an AI Search resource in the Azure portal to get your endpoints. After each resource deploys, select Go to resource and view the Endpoint value. You also need to create two model deployments in Azure OpenAI: one named `chat` that uses the `gpt-35-turbo` model, and one named `embeddings` that uses the `text-embedding-3-small` model.
Add these values to your `local.settings.json` file:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "AZURE_OPENAI_ENDPOINT": "<paste from above>",
    "CHAT_MODEL_DEPLOYMENT_NAME": "chat",
    "AZURE_AISEARCH_ENDPOINT": "<paste from above>",
    "EMBEDDING_MODEL_DEPLOYMENT_NAME": "embeddings",
    "SYSTEM_PROMPT": "You must only use the provided documents to answer the question"
  }
}
```

When testing locally, add your account (contoso.microsoft.com) with the required role assignments to the Azure OpenAI and AI Search resources.
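A missing or empty setting is a common cause of startup failures when running locally. The following Python sketch checks that every setting the sample reads is present; the validation helper itself is illustrative and not part of the sample — only the setting names come from the file above.

```python
import json

# Setting names taken from this sample's local.settings.json.
REQUIRED_SETTINGS = [
    "AzureWebJobsStorage",
    "FUNCTIONS_WORKER_RUNTIME",
    "AZURE_OPENAI_ENDPOINT",
    "CHAT_MODEL_DEPLOYMENT_NAME",
    "AZURE_AISEARCH_ENDPOINT",
    "EMBEDDING_MODEL_DEPLOYMENT_NAME",
    "SYSTEM_PROMPT",
]

def missing_settings(raw_json: str) -> list[str]:
    """Return the names of required settings absent or empty in local.settings.json."""
    values = json.loads(raw_json).get("Values", {})
    return [name for name in REQUIRED_SETTINGS if not values.get(name)]

example = """{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "AZURE_OPENAI_ENDPOINT": "https://example.openai.azure.com/",
    "CHAT_MODEL_DEPLOYMENT_NAME": "chat",
    "AZURE_AISEARCH_ENDPOINT": "https://example.search.windows.net",
    "EMBEDDING_MODEL_DEPLOYMENT_NAME": "embeddings",
    "SYSTEM_PROMPT": "You must only use the provided documents to answer the question"
  }
}"""

print(missing_settings(example))  # []
```

Run the check against your own `local.settings.json` before starting the function host; an empty list means all expected settings are present.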
If you used `azd provision`, this step is already done: your logged-in user and your function app's managed identity already have the required permissions granted.
If you selected virtual network integration, access to Azure OpenAI and Azure AI Search is limited to the function app through private endpoints and cannot be reached from the internet. To allow testing from your local machine, go to the networking tab in Azure OpenAI and Azure AI Search and add your client IP to the allowed list. If you used `azd provision`, this step is already done.
Run the `code .` command to open the project in Visual Studio Code. From the command palette, run `Azurite: Start`, which enables debugging without warnings. Then call the `ingest` and `ask` endpoints using your HTTP test tool. If you have the RestClient extension installed, you can execute requests directly from the `test.http` project file.

Alternatively, open the `AISearchSample.sln` solution file in Visual Studio and run the project. Note the `localhost` URL endpoints, including the port, which might not be `7071`. Open the `test.http` project file, update the port on the `localhost` URL (if needed), and then use the built-in HTTP client to call the `ingest` and `ask` endpoints.

Run the `azd up` command to provision the function app, with any required Azure resources, and deploy your code.
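If you prefer scripting over an HTTP test tool, the requests can also be built programmatically. The sketch below uses Python's standard library; the `ingest` and `ask` route names come from the sample, but the payload shapes and field names here are assumptions — check `test.http` for the exact request bodies the sample expects.

```python
import json
import urllib.request

BASE_URL = "http://localhost:7071/api"  # match the port your local host reports

def build_request(route: str, payload: dict) -> urllib.request.Request:
    """Build a JSON POST request for one of the sample's HTTP endpoints."""
    return urllib.request.Request(
        url=f"{BASE_URL}/{route}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical payload shapes -- see test.http for the real ones.
ingest_req = build_request("ingest", {"url": "https://example.com/doc.txt"})
ask_req = build_request("ask", {"question": "What does the document say?"})

# To actually send a request while the local function host is running:
#   with urllib.request.urlopen(ask_req) as resp:
#       print(resp.read().decode())
```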
You're prompted to supply these required deployment parameters:

| Parameter | Description |
| --- | --- |
| Environment name | An environment that's used to maintain a unique deployment context for your app. You won't be prompted if you created the local project using `azd init`. |
| Azure subscription | Subscription in which your resources are created. |
| Azure location | Azure region in which to create the resource group that contains the new Azure resources. Only regions that currently support the Flex Consumption plan are shown. |
After publishing completes successfully, `azd` provides you with the URL endpoints of your new functions, but without the function key values required to access the endpoints. To learn how to obtain these endpoints along with the required function keys, see Invoke the function on Azure in the companion article Quickstart: Create and deploy functions to Azure Functions using the Azure Developer CLI.
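Once you have a function key, the standard way to pass it is the `code` query string parameter (or the `x-functions-key` header). This small sketch shows the URL shape; the helper function and the example host name are illustrative, not part of the sample.

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def with_function_key(endpoint: str, key: str) -> str:
    """Append a function key as the standard `code` query parameter."""
    parts = urlsplit(endpoint)
    query = parts.query + ("&" if parts.query else "") + urlencode({"code": key})
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

url = with_function_key("https://myapp.azurewebsites.net/api/ask", "abc123")
print(url)  # https://myapp.azurewebsites.net/api/ask?code=abc123
```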
You can run the `azd up` command as many times as you need to both provision your Azure resources and deploy code updates to your function app.
Note
Deployed code files are always overwritten by the latest deployment package.
When you're done working with your function app and related resources, you can use the `azd down --purge` command to delete the function app and its related resources from Azure and avoid incurring any further costs. The `--purge` flag ensures AI resources aren't left in a soft-deleted state and recovers your quota.