Showing content from https://docs.databricks.com/aws/en/machine-learning/model-inference/ below:

Deploy models for batch inference and prediction

This article describes what Databricks recommends for batch inference.

For real-time model serving on Databricks, see Deploy models using Mosaic AI Model Serving.

AI Functions for batch inference​

Preview

This feature is in Public Preview.

AI Functions are built-in SQL functions that let you apply AI directly to data stored on Databricks. You can run batch inference using task-specific AI functions or the general-purpose function ai_query.

The following is an example of batch inference using the task-specific AI function ai_translate. To perform batch inference on an entire table, remove the LIMIT 500 clause from your query.

SQL


SELECT
  writer_summary,
  ai_translate(writer_summary, "cn") AS cn_translation
FROM user.batch.news_summaries
LIMIT 500;

Alternatively, you can use the general-purpose function ai_query to perform batch inference.
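As a sketch, the same translation task could be expressed with ai_query, which sends each row to a model serving endpoint. The endpoint name and prompt wording below are hypothetical placeholders, not values from this article:

SQL


SELECT
  writer_summary,
  ai_query(
    'my-translation-endpoint',  -- hypothetical model serving endpoint name
    CONCAT('Translate the following text to Chinese: ', writer_summary)
  ) AS cn_translation
FROM user.batch.news_summaries
LIMIT 500;

Unlike task-specific functions such as ai_translate, ai_query lets you target any endpoint you have provisioned and shape the prompt yourself.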

Batch inference using a Spark DataFrame​

See Perform batch inference using a Spark DataFrame for a step-by-step guide through the model inference workflow using Spark.

For deep learning model inference examples, see the following articles:

