The ML.GLOBAL_EXPLAIN function

This document describes the ML.GLOBAL_EXPLAIN function, which provides explanations for the entire model by aggregating the local explanations of the evaluation data. You can only use ML.GLOBAL_EXPLAIN with models that are trained with the ENABLE_GLOBAL_EXPLAIN option set to TRUE.
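As an illustrative sketch of that prerequisite, a model eligible for ML.GLOBAL_EXPLAIN might be trained as follows. The dataset, model, and table names here are hypothetical placeholders, not objects that exist in your project:

```sql
-- Hypothetical names: mydataset.mymodel and mydataset.training_data
-- are placeholders for your own dataset, model, and training table.
CREATE OR REPLACE MODEL `mydataset.mymodel`
  OPTIONS (
    model_type = 'BOOSTED_TREE_REGRESSOR',
    input_label_cols = ['label'],
    enable_global_explain = TRUE)  -- required for ML.GLOBAL_EXPLAIN
AS
SELECT * FROM `mydataset.training_data`;
```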
Syntax

```sql
ML.GLOBAL_EXPLAIN(
  MODEL `PROJECT_ID.DATASET.MODEL`,
  STRUCT([CLASS_LEVEL_EXPLAIN AS class_level_explain]))
```

Arguments

ML.GLOBAL_EXPLAIN takes the following arguments:

- PROJECT_ID: your project ID.
- DATASET: the BigQuery dataset that contains the model.
- MODEL: the name of the model.
- CLASS_LEVEL_EXPLAIN: a BOOL value that specifies whether global feature importances are returned for each class. Applies only to non-AutoML Tables classification models. When set to FALSE, the global feature importance of the entire model is returned rather than that of each class. The default value is FALSE.
Regression models and AutoML Tables classification models only have model-level global feature importance.

Output

The output of ML.GLOBAL_EXPLAIN has two formats, depending on the model type and the class_level_explain value:

For classification models with class_level_explain set to FALSE, and for regression models, the following columns are returned:

- feature: a STRING value that contains the feature name.
- attribution: a FLOAT64 value that contains the feature importance to the model overall.

For classification models with class_level_explain set to TRUE, the following columns are returned:

- <class_name>: a STRING value that contains the name of the class in the label column.
- feature: a STRING value that contains the feature name.
- attribution: a FLOAT64 value that contains the feature importance to this class.

For each class, only the top 10 most important features are returned.
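Because attribution is an ordinary FLOAT64 column, the function's output can be post-processed with standard SQL. The following sketch (the model name is hypothetical, and the model is assumed to have been trained with ENABLE_GLOBAL_EXPLAIN set to TRUE) keeps only the five most important features of a model-level explanation:

```sql
-- Hypothetical model; assumes it was trained with
-- ENABLE_GLOBAL_EXPLAIN set to TRUE.
SELECT feature, attribution
FROM ML.GLOBAL_EXPLAIN(MODEL `mydataset.mymodel`)
ORDER BY attribution DESC
LIMIT 5;
```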
Examples

The following examples assume that your model is in your default project.

Regression model

This example gets global feature importance for the boosted tree regression model mymodel in mydataset. The dataset is in your default project.

```sql
SELECT *
FROM ML.GLOBAL_EXPLAIN(MODEL `mydataset.mymodel`)
```

Classifier model

This example gets class-level global feature importance for the boosted tree classifier model mymodel in mydataset. The dataset is in your default project.

```sql
SELECT *
FROM ML.GLOBAL_EXPLAIN(
  MODEL `mydataset.mymodel`,
  STRUCT(TRUE AS class_level_explain))
```
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-08-07 UTC.