Use AI tools and models in Azure Functions

Azure Functions provides serverless compute resources that integrate with AI and Azure services to streamline the process of building cloud-hosted intelligent applications. This article provides a survey of the breadth of AI-related scenarios, integrations, and other AI resources that you can use in your function apps.

Azure Functions offers inherent benefits as a compute resource for AI-integrated tasks, including rapid event-driven scaling and built-in bindings that reduce the code needed to connect to Azure and AI services.

This article is language-specific, so make sure you choose your programming language at the top of the page.

Core AI integration scenarios

The combination of built-in bindings and broad support for external libraries provides you with a wide range of potential scenarios for augmenting your apps and solutions with the power of AI. These are some key AI integration scenarios supported by Functions.

Retrieval-augmented generation

Because Functions can handle multiple events from various data sources simultaneously, it's an effective solution for real-time AI scenarios, such as RAG systems that require fast data retrieval and processing. Rapid event-driven scaling reduces the latency experienced by your customers, even in high-demand situations.

Here are some reference samples for RAG-based scenarios:

RAG with Azure AI Search

For RAG, you can use SDKs, including but not limited to the Azure OpenAI and Azure SDKs, to build out your scenarios. This reference sample uses the OpenAI binding extension to highlight OpenAI-based RAG with Azure AI Search.
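
The following Python sketch shows the overall shape of a RAG function: an HTTP trigger retrieves matching documents from Azure AI Search and passes them as grounding context to an Azure OpenAI chat completion. It works with the SDKs directly rather than the OpenAI binding extension used in the reference sample, and the index name, field name, and app setting names are placeholders.

```python
import os
import azure.functions as func
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

app = func.FunctionApp()

@app.route(route="ask", auth_level=func.AuthLevel.FUNCTION)
def ask(req: func.HttpRequest) -> func.HttpResponse:
    question = req.params.get("question", "")

    # Retrieve candidate documents from Azure AI Search.
    # The index name and "content" field are placeholders for your own schema.
    search_client = SearchClient(
        endpoint=os.environ["SEARCH_ENDPOINT"],
        index_name="docs-index",
        credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
    )
    hits = search_client.search(search_text=question, top=3)
    context = "\n".join(doc["content"] for doc in hits)

    # Ground the chat completion in the retrieved context.
    openai_client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_KEY"],
        api_version="2024-06-01",  # use a version your resource supports
    )
    completion = openai_client.chat.completions.create(
        model=os.environ["CHAT_DEPLOYMENT"],  # your chat model deployment name
        messages=[
            {"role": "system", "content": f"Answer using only this context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return func.HttpResponse(completion.choices[0].message.content)
```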

Custom chat bot

Shows you how to create a friendly chat bot that issues simple prompts, receives text completions, and sends messages, all in a stateful session using the OpenAI binding extension.
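
As a rough illustration of a single chat turn, the following Python sketch accepts the conversation history in the request body and returns the model's reply. Unlike the reference sample, it calls the Azure OpenAI SDK directly and leaves state management to the caller, whereas the OpenAI binding extension's assistant bindings can persist the session for you. The setting names and API version are placeholders.

```python
import os
import json
import azure.functions as func
from openai import AzureOpenAI

app = func.FunctionApp()

@app.route(route="chat", methods=["POST"], auth_level=func.AuthLevel.FUNCTION)
def chat(req: func.HttpRequest) -> func.HttpResponse:
    # Expect {"messages": [{"role": "user", "content": "..."}, ...]} in the body.
    # Here the caller supplies the conversation history; the OpenAI binding
    # extension's assistant bindings can persist this state for you instead.
    body = req.get_json()
    messages = body.get("messages", [])

    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_KEY"],
        api_version="2024-06-01",  # use a version your resource supports
    )
    completion = client.chat.completions.create(
        model=os.environ["CHAT_DEPLOYMENT"],  # your chat model deployment name
        messages=[{"role": "system", "content": "You are a friendly assistant."}] + messages,
    )
    reply = completion.choices[0].message.content
    return func.HttpResponse(json.dumps({"reply": reply}), mimetype="application/json")
```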

Assistant function calling

Assistant function calling gives your AI assistant or agent the ability to invoke specific functions or APIs dynamically based on the context of a conversation or task. These behaviors enable assistants to interact with external systems, retrieve data, and perform other actions.

Functions is ideal for implementing assistant function calling in agentic workflows. In addition to scaling efficiently to handle demand, binding extensions simplify the process of using Functions to connect assistants with remote Azure services. If there's no binding for your data source or you need full control over SDK behaviors, you can always manage your own client SDK connections in your app.

Here are some reference samples for assistant function calling scenarios:

Assistants function calling (OpenAI bindings)

Agents function calling (Azure AI SDKs)

Uses function calling features for agents in Azure AI SDKs to implement custom function calling.
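
To illustrate the underlying pattern, this Python sketch uses the OpenAI SDK's chat-completions tool calling rather than the bindings or Azure AI SDKs from the samples: the model is offered a tool schema, requests the call it needs, and the app invokes the matching local function and returns the result. The tool, deployment, and setting names are hypothetical, and the same pattern can run inside any function body.

```python
import os
import json
from openai import AzureOpenAI

# A local function the model can ask us to call (hypothetical example).
def get_order_status(order_id: str) -> str:
    return json.dumps({"order_id": order_id, "status": "shipped"})

tools = [{
    "type": "function",
    "function": {
        "name": "get_order_status",
        "description": "Look up the shipping status of an order.",
        "parameters": {
            "type": "object",
            "properties": {"order_id": {"type": "string"}},
            "required": ["order_id"],
        },
    },
}]

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_KEY"],
    api_version="2024-06-01",
)
messages = [{"role": "user", "content": "Where is order 1234?"}]

# First call: the model decides whether it needs the tool.
response = client.chat.completions.create(
    model=os.environ["CHAT_DEPLOYMENT"], messages=messages, tools=tools
)
msg = response.choices[0].message

if msg.tool_calls:
    messages.append(msg)
    for call in msg.tool_calls:
        args = json.loads(call.function.arguments)
        result = get_order_status(**args)  # invoke the matching local function
        messages.append({"role": "tool", "tool_call_id": call.id, "content": result})

    # Second call: the model composes an answer from the tool result.
    final = client.chat.completions.create(
        model=os.environ["CHAT_DEPLOYMENT"], messages=messages, tools=tools
    )
    print(final.choices[0].message.content)
```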

Remote MCP servers

The Model Context Protocol (MCP) provides a standardized way for AI models to communicate with external systems to determine their capabilities and how they can best be used by AI assistants and agents. An MCP server enables an AI model (client) to more efficiently make these determinations.

Functions provides an MCP binding extension that simplifies the process of creating custom MCP servers in Azure.

Here's an example of such a custom MCP server project:

Remote MCP servers

Provides an MCP server template along with several function tool endpoints, which can be run locally and also deployed to Azure.
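
A minimal Python sketch of one such tool endpoint follows. It assumes the MCP extension's generic trigger shape (binding type mcpToolTrigger with toolName, description, and toolProperties settings); verify the exact names and payload layout against the template above before relying on them.

```python
import json
import azure.functions as func

app = func.FunctionApp()

# Binding type and property names ("mcpToolTrigger", "toolName", "description",
# "toolProperties") are assumptions based on the MCP extension's generic-binding
# shape; check them against the Remote MCP server template referenced above.
@app.generic_trigger(
    arg_name="context",
    type="mcpToolTrigger",
    toolName="get_snippet",  # hypothetical tool name
    description="Returns a saved code snippet by name.",
    toolProperties=json.dumps([
        {"propertyName": "name", "propertyType": "string",
         "description": "Name of the snippet to return."}
    ]),
)
def get_snippet(context: str) -> str:
    # The trigger payload arrives as JSON; the argument layout may vary by extension version.
    args = json.loads(context).get("arguments", {})
    name = args.get("name", "unknown")
    return f"// snippet '{name}' would be returned here"
```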

Agentic workflows

While it's common for AI-driven processes to autonomously determine how to interact with models and other AI assets, there are many cases that require a higher level of predictability or where the required steps are well defined. These directed agentic workflows consist of an orchestration of separate tasks or interactions that agents are required to follow.

The Durable Functions extension helps you take advantage of the strengths of Functions to create multi-step, long-running operations with built-in fault tolerance, which makes it well suited to directed agentic workflows. For example, a trip planning solution might first gather requirements from the user, search for plan options, obtain user approval, and finally make the required bookings. In this scenario, you can build an agent for each step and then coordinate their actions as a workflow using Durable Functions, as in the sketch below.
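
A minimal Durable Functions sketch of that trip-planning orchestration might look like the following Python; the activity bodies are placeholders for the individual agent or model calls.

```python
import azure.functions as func
import azure.durable_functions as df

app = df.DFApp(http_auth_level=func.AuthLevel.FUNCTION)

# Orchestrator: directs the agent steps in a fixed order with built-in replay and fault tolerance.
@app.orchestration_trigger(context_name="context")
def trip_planner(context: df.DurableOrchestrationContext):
    requirements = yield context.call_activity("gather_requirements", context.get_input())
    options = yield context.call_activity("search_plan_options", requirements)
    approved = yield context.call_activity("get_user_approval", options)
    if approved:
        booking = yield context.call_activity("make_bookings", options)
        return booking
    return "No booking made."

# One activity per agent step; each could call a model, an API, or another agent.
@app.activity_trigger(input_name="trip")
def gather_requirements(trip: str) -> str:
    return f"requirements for {trip}"  # placeholder for an agent/model call

@app.activity_trigger(input_name="requirements")
def search_plan_options(requirements: str) -> str:
    return f"options based on {requirements}"

@app.activity_trigger(input_name="options")
def get_user_approval(options: str) -> bool:
    return True  # placeholder; a real app might wait for an external approval event here

@app.activity_trigger(input_name="options")
def make_bookings(options: str) -> str:
    return f"booked: {options}"
```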

For more workflow scenario ideas, see Application patterns in Durable Functions.

AI frameworks and libraries

Because Functions lets you build apps in your preferred language with your favorite libraries, you have broad flexibility in which AI libraries and frameworks you can use in your AI-enabled function apps.

Here are some key Microsoft AI frameworks and libraries to be aware of:

Azure AI Services SDKs: By working directly with client SDKs, you can use the full breadth of Azure AI services functionality directly in your function code.

OpenAI binding extension: Easily integrate the power of Azure OpenAI in your functions and let Functions manage the service integration.

Semantic Kernel: Enables you to easily build AI agents and models.
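
For example, a function can call an Azure AI service directly through its client SDK. The following Python sketch analyzes sentiment with the Azure AI Language (Text Analytics) SDK inside an HTTP-triggered function; the endpoint and key setting names are placeholders.

```python
import os
import azure.functions as func
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

app = func.FunctionApp()

@app.route(route="sentiment", methods=["POST"], auth_level=func.AuthLevel.FUNCTION)
def sentiment(req: func.HttpRequest) -> func.HttpResponse:
    text = req.get_body().decode("utf-8")

    # Call the Azure AI Language service directly through its client SDK.
    client = TextAnalyticsClient(
        endpoint=os.environ["LANGUAGE_ENDPOINT"],
        credential=AzureKeyCredential(os.environ["LANGUAGE_KEY"]),
    )
    result = client.analyze_sentiment([text])[0]
    return func.HttpResponse(result.sentiment)
```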

Functions also lets your apps reference third-party libraries and frameworks, so you can use your favorite AI tools and libraries in your AI-enabled functions.

