This sample highlights how to use the OpenAI trigger and bindings extension in an Azure Function to call custom functions from OpenAI by using the Assistants and Assistant Skills features.
You can learn more about the OpenAI trigger and bindings extension in the GitHub documentation and in the official OpenAI extension documentation.
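To give a sense of how a custom function becomes an assistant skill, here is a minimal sketch of a skill function. The class name TodoSkills is illustrative, and the namespace and attribute details come from the Microsoft.Azure.Functions.Worker.Extensions.OpenAI package and may differ slightly from the code in this sample:

using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Extensions.OpenAI.Assistants;
using Microsoft.Extensions.Logging;

public class TodoSkills
{
    private readonly ILogger<TodoSkills> _logger;

    public TodoSkills(ILogger<TodoSkills> logger) => _logger = logger;

    // The description on the trigger attribute tells the assistant when to invoke this skill.
    [Function(nameof(AddTodo))]
    public void AddTodo([AssistantSkillTrigger("Create a new todo task")] string taskDescription)
    {
        // The real sample persists the task; this sketch only logs it.
        _logger.LogInformation("Adding todo: {task}", taskDescription);
    }
}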
Run the following command to download the project code:
azd init -t https://github.com/Azure-Samples/azure-functions-assistants-openai-dotnet
Once you have your Azure subscription, run the following in a new terminal window to create Azure OpenAI and other resources needed:
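azd provision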
Take note of the value of AZURE_OPENAI_ENDPOINT, which can be found in ./.azure/<env name from azd provision>/.env. It will look something like:

AZURE_OPENAI_ENDPOINT="https://cog-<unique string>.openai.azure.com/"
Alternatively, you can create an Azure OpenAI resource in the Azure portal to get your key and endpoint. After it deploys, select Go to resource and view the Endpoint value. You will also need to deploy a model, for example with the deployment name chat and the model gpt-35-turbo.
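If you create the resource in the portal, you can also create the model deployment from the command line. The resource names below are placeholders and the model version is an assumption, so substitute values available in your region and CLI version:

az cognitiveservices account deployment create \
  --resource-group <resource-group> \
  --name <openai-account-name> \
  --deployment-name chat \
  --model-name gpt-35-turbo \
  --model-version "0125" \
  --model-format OpenAI \
  --sku-name Standard \
  --sku-capacity 1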
{ "IsEncrypted": false, "Values": { "AzureWebJobsStorage": "UseDevelopmentStorage=true", "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated", "AZURE_OPENAI_ENDPOINT": "<paste from above>", "CHAT_MODEL_DEPLOYMENT_NAME": "chat" } }Add the following permissions to the Azure OpenAI resource:
Cognitive Services OpenAI User - Add your account (contoso.microsoft.com) to the OpenAI resource to test locally if you did not create the OpenAI resource yourself, and add the Azure Function App's managed identity when running in Azure. If you used azd provision, this step is already done: your logged-in user and your function's managed identity already have the permissions granted.
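For reference, the same role can be assigned from the command line. The IDs below are placeholders for your user (or the function app's managed identity) and for your Azure OpenAI resource:

az role assignment create \
  --role "Cognitive Services OpenAI User" \
  --assignee <user-or-managed-identity-object-id> \
  --scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.CognitiveServices/accounts/<openai-account-name>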
Run the code . command to open the project in Visual Studio Code:

code .

From the command palette, run Azurite: Start, which enables debugging without warnings. Start the function app in the debugger, and then send GET and POST requests to the httpget and httppost endpoints, respectively, using your HTTP test tool (or browser for httpget). If you have the RestClient extension installed, you can execute requests directly from the test.http project file.
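If you prefer to craft the requests by hand, they might look like the following. The /api route prefix, port, and request body shown here are assumptions, so check test.http for the exact routes and payloads this sample expects:

GET http://localhost:7071/api/httpget

###

POST http://localhost:7071/api/httppost
Content-Type: application/json

{
  "name": "Azure"
}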
To use Visual Studio instead, open the AssistantSample.sln solution file in Visual Studio and run the project. Note the localhost URL endpoints in the output, including the port, which might not be 7071. Open the test.http project file, update the port on the localhost URL (if needed), and then use the built-in HTTP client to call the httpget and httppost endpoints.

Run this command to provision the function app, with any required Azure resources, and deploy your code:
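azd up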
You're prompted to supply these required deployment parameters:
Environment name: An environment that's used to maintain a unique deployment context for your app. You won't be prompted if you created the local project using azd init.
Azure subscription: Subscription in which your resources are created.
Azure location: Azure region in which to create the resource group that contains the new Azure resources. Only regions that currently support the Flex Consumption plan are shown.
After publish completes successfully, azd provides you with the URL endpoints of your new functions, but without the function key values required to access the endpoints. To learn how to obtain these same endpoints along with the required function keys, see Invoke the function on Azure in the companion article Quickstart: Create and deploy functions to Azure Functions using the Azure Developer CLI.
You can run the azd up command as many times as you need to both provision your Azure resources and deploy code updates to your function app.
Note
Deployed code files are always overwritten by the latest deployment package.
When you're done working with your function app and related resources, you can use this command to delete the function app and its related resources from Azure and avoid incurring any further costs (the --purge flag does not leave the AI resource in a soft-deleted state, which recovers your quota):
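azd down --purge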