
The ML.GLOBAL_EXPLAIN function | BigQuery



This document describes the ML.GLOBAL_EXPLAIN function, which provides explanations for an entire model by aggregating the local explanations of the evaluation data. You can only use ML.GLOBAL_EXPLAIN with models that are trained with the ENABLE_GLOBAL_EXPLAIN option set to TRUE.
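Because ML.GLOBAL_EXPLAIN only works on models trained with ENABLE_GLOBAL_EXPLAIN, a model must opt in at training time. The following is a minimal sketch of such a training statement; the training table `mydataset.training_data` and its `label` column are hypothetical placeholders:

```sql
-- Sketch: train a boosted tree regression model with global
-- explanations enabled. The table and column names are hypothetical.
CREATE OR REPLACE MODEL `mydataset.mymodel`
  OPTIONS (
    MODEL_TYPE = 'BOOSTED_TREE_REGRESSOR',
    INPUT_LABEL_COLS = ['label'],
    ENABLE_GLOBAL_EXPLAIN = TRUE)
AS
SELECT * FROM `mydataset.training_data`;
```

Once this model finishes training, it can be passed to ML.GLOBAL_EXPLAIN as shown in the examples in this document.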

Syntax
ML.GLOBAL_EXPLAIN(
  MODEL `PROJECT_ID.DATASET.MODEL`,
  STRUCT(
    [CLASS_LEVEL_EXPLAIN AS class_level_explain]))
Arguments

ML.GLOBAL_EXPLAIN takes the following arguments:

PROJECT_ID: your project ID.

DATASET: the BigQuery dataset that contains the model.

MODEL: the name of the model.

class_level_explain: a BOOL value that specifies whether global feature importance values are returned for each class. This argument only applies to classification models that aren't AutoML models. The default value is FALSE, in which case the global feature importance of the entire model is returned.

Output

The output of ML.GLOBAL_EXPLAIN has two formats:

For classification models with class_level_explain set to TRUE, the output contains global feature importance values for each class.

For regression models, and for classification models with class_level_explain set to FALSE, the output contains the global feature importance of the entire model.

In both formats, the output provides feature names as STRING values and their importance or attribution as FLOAT64 values.

Examples

The following examples assume your model is in your default project.

Regression model

This example gets global feature importance for the boosted tree regression model mymodel in mydataset. The dataset is in your default project.

SELECT
  *
FROM
  ML.GLOBAL_EXPLAIN(MODEL `mydataset.mymodel`)
Classifier model

This example gets global feature importance for the boosted tree classifier model mymodel in mydataset. The dataset is in your default project.

SELECT
  *
FROM
  ML.GLOBAL_EXPLAIN(MODEL `mydataset.mymodel`, STRUCT(TRUE AS class_level_explain))

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-08-07 UTC.


