note
Access modes have been renamed. Shared access mode is now Standard. Single user access mode is now Dedicated and can be assigned to a single user or group. Group access is in Public Preview.
Databricks recommends using standard access mode (formerly shared access mode) for most workloads. This article outlines limitations and requirements for each access mode with Unity Catalog. For details on access modes, see Access modes.
Databricks recommends using compute policies to simplify configuration options for most users. See Create and manage compute policies.
For demos of updating compute for Unity Catalog, see:
note
No-isolation shared and credential passthrough are legacy access modes that do not support Unity Catalog.
Dedicated access mode limitations on Unity Catalog

Dedicated access mode on Unity Catalog has the following limitations. These are in addition to the general limitations for all Unity Catalog access modes. See General limitations for Unity Catalog.
Fine-grained access control support with dedicated access mode

note
To take advantage of the data filtering available on Databricks Runtime 15.4 LTS and above, your workspace must be enabled for serverless compute.
Databricks Runtime 15.4 LTS and above supports fine-grained access control for read operations.
Databricks Runtime 16.3 and above supports writes into tables with row and column filters using MERGE INTO and the DataFrame.write.mode("append") API. See Support for write operations.
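A minimal sketch of this write path (hypothetical, not taken from the Databricks docs; the table, view, and key names are assumptions):

```python
# Sketch: upserting into a Unity Catalog table that has row/column filters.
# Supported on Databricks Runtime 16.3 and above. All object names below
# (main.sales.orders, "updates", order_id) are illustrative assumptions.

def build_merge_sql(target: str, source_view: str, key: str) -> str:
    """Build a MERGE INTO statement targeting a filtered table."""
    return (
        f"MERGE INTO {target} AS t "
        f"USING {source_view} AS s ON t.{key} = s.{key} "
        "WHEN MATCHED THEN UPDATE SET * "
        "WHEN NOT MATCHED THEN INSERT *"
    )

# On a cluster (requires an active SparkSession named `spark`):
# updates_df.createOrReplaceTempView("updates")
# spark.sql(build_merge_sql("main.sales.orders", "updates", "order_id"))
#
# Plain appends into filtered tables are also supported on DBR 16.3+:
# updates_df.write.mode("append").saveAsTable("main.sales.orders")
```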
On Databricks Runtime 15.3 and below, fine-grained access control on dedicated compute is not supported. Specifically, to read from a view you must have SELECT on all tables and views that are referenced by the view.

On Databricks Runtime 15.3 and below, you cannot use dedicated compute to query tables that were created using Lakeflow Declarative Pipelines, including streaming tables and materialized views, if those tables are owned by other users. The user who creates a table is the owner.

To query streaming tables and materialized views created by Lakeflow Declarative Pipelines and owned by other users, use a SQL warehouse, compute with standard access mode, or compute with dedicated access mode on Databricks Runtime 15.4 LTS and above. Your workspace must also be enabled for serverless compute. For more information, see Fine-grained access control on dedicated compute.
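For the view requirement above, the grants involved might look like the following sketch (hypothetical; catalog, schema, view, base table, and group names are all assumptions):

```python
# Sketch: on Databricks Runtime 15.3 and below with dedicated compute, reading
# a view requires SELECT on the view and on everything it references.
# All object and principal names here are assumptions for illustration.
grants = [
    "GRANT SELECT ON VIEW main.analytics.daily_summary TO `data-readers`",
    # daily_summary reads from this base table, so SELECT is needed there too:
    "GRANT SELECT ON TABLE main.analytics.raw_events TO `data-readers`",
]

# On a cluster: for stmt in grants: spark.sql(stmt)
```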
Streaming limitations for Unity Catalog dedicated access mode

StreamingQueryListener requires Databricks Runtime 15.1 or above to use credentials or interact with objects managed by Unity Catalog on dedicated compute.

Standard access mode limitations on Unity Catalog

Standard access mode in Unity Catalog has the following limitations. These are in addition to the general limitations for all Unity Catalog access modes. See General limitations for Unity Catalog.
Spark API limitations and requirements for Unity Catalog standard access mode

- To make the full classpath available to the Scala kernel, set spark.databricks.scala.kernel.fullClasspath.enabled to true.
- Spark Context (sc), spark.sparkContext, and sqlContext are not supported for Scala in any Databricks Runtime and are not supported for Python in Databricks Runtime 14.0 and above. Databricks recommends using the spark variable to interact with the SparkSession instance.
- The following sc functions are also not supported: emptyRDD, range, init_batched_serializer, parallelize, pickleFile, textFile, wholeTextFiles, binaryFiles, binaryRecords, sequenceFile, newAPIHadoopFile, newAPIHadoopRDD, hadoopFile, hadoopRDD, union, runJob, setSystemProperty, uiWebUrl, stop, setJobGroup, setLocalProperty, getConf.
- When using Scala, the Dataset API methods map, mapPartitions, foreachPartition, flatMap, reduce, and filter require Databricks Runtime 15.4 LTS or above.
- Setting spark.executor.extraJavaOptions is not supported.

UDF limitations and requirements for Unity Catalog standard access mode

User-defined functions (UDFs) have the following limitations with standard access mode:
- applyInPandas and mapInPandas require Databricks Runtime 14.3 or above.
- Using grpc, pyarrow, or protobuf in a PySpark UDF through notebook-scoped or cluster-scoped libraries is not supported, because the installed version is always preferred. To find the version of installed libraries, see the System Environment section of the specific Databricks Runtime version release notes.

See User-defined functions (UDFs) in Unity Catalog.
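The limitations above concern runtimes and pinned libraries; the body of a Python UDF itself is plain Python, so it can be written and unit-tested off-cluster. A hypothetical sketch (the mapping and names are assumptions):

```python
# Sketch: logic destined for a Python scalar UDF on standard access mode.
# The mapping below is a made-up example, not a Databricks API.

def normalize_region(region: str) -> str:
    """Map free-form region strings to canonical codes."""
    return {"us": "US", "usa": "US", "emea": "EMEA"}.get(
        region.strip().lower(), "OTHER"
    )

# On a cluster (Databricks Runtime 14.3+ also unlocks applyInPandas/mapInPandas):
# from pyspark.sql.functions import udf
# from pyspark.sql.types import StringType
# df.withColumn("region_code", udf(normalize_region, StringType())("region"))
```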
Streaming limitations and requirements for Unity Catalog standard access mode

- You cannot use the formats statestore and state-metadata to query state information for stateful streaming queries.
- transformWithState and associated APIs are not supported.
- transformWithStateInPandas requires Databricks Runtime 16.3 and above.
- For Scala, foreach requires Databricks Runtime 16.1 or above. foreachBatch and flatMapGroupsWithState require Databricks Runtime 16.2 or above.
- foreachBatch has the following behavior changes in Databricks Runtime 14.0 and above:
  - print() commands write output to the driver logs.
  - You cannot access the dbutils.widgets submodule inside the function.
- from_avro requires Databricks Runtime 14.2 or above.
- applyInPandasWithState requires Databricks Runtime 14.3 LTS or above.
- sourceArchiveDir must be in the same external location as the source when you use option("cleanSource", "archive") with a data source managed by Unity Catalog.
- For Kafka sources and sinks, the following options are unsupported: kafka.sasl.client.callback.handler.class, kafka.sasl.login.callback.handler.class, kafka.sasl.login.class, kafka.partition.assignment.strategy.
- For the Kafka options kafka.ssl.truststore.location and kafka.ssl.keystore.location, you can only use paths in external locations managed by Unity Catalog.
- For Scala, StreamingQueryListener requires Databricks Runtime 16.1 and above.
- For Python, StreamingQueryListener requires Databricks Runtime 14.3 LTS or above to use credentials or interact with objects managed by Unity Catalog on compute with standard access mode.

The following limitations apply when using the Scala kernel on standard access mode compute.
- You cannot use names that conflict with almond's defined imports, such as Input. For a list of almond's defined imports, see almond imports.
- Because //connector/sql-aws-connectors:sql-aws-connectors is not in the Scala REPL's bazel target, using it results in a ClassNotFoundException.
- Relative paths (./) for DBFS are not supported.

General limitations for Unity Catalog

The following limitations apply to all Unity Catalog-enabled access modes.
UDFs

Graviton instance support for UDFs on Unity Catalog-enabled clusters is available in Databricks Runtime 15.2 and above. Additional limitations exist for standard access mode. See UDF limitations and requirements for Unity Catalog standard access mode.
Streaming limitations for Unity Catalog

See also Streaming limitations for Unity Catalog dedicated access mode and Streaming limitations and requirements for Unity Catalog standard access mode.
For more on streaming with Unity Catalog, see Using Unity Catalog with Structured Streaming.
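As one sketch of streaming under these constraints: a foreachBatch callback (per the standard access mode section, print() output goes to the driver logs and dbutils.widgets is unavailable inside the function) is a plain function of a micro-batch DataFrame and an epoch ID, so its logic can be exercised with a stub before deploying. The sink table name below is an assumption:

```python
# Sketch: a foreachBatch callback for Structured Streaming.
# Inside this function on Databricks Runtime 14.0+ with standard access mode,
# print() writes to the driver logs and dbutils.widgets cannot be used.

def process_batch(batch_df, epoch_id: int) -> int:
    """Count and (on a real cluster) persist one micro-batch."""
    n = batch_df.count()
    print(f"epoch {epoch_id}: {n} rows")  # appears in the driver logs
    # batch_df.write.mode("append").saveAsTable("main.bronze.events")  # assumed sink
    return n

# On a cluster:
# stream_df.writeStream.foreachBatch(process_batch).start()
```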
Spark API limitations for Unity Catalog

RDD APIs are not supported.
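Because RDD APIs are unavailable, per-record logic that would have used sc.parallelize(...).map(...) moves into DataFrame expressions or UDFs. A hypothetical before/after sketch (the line format and column names are assumptions):

```python
# RDD style -- NOT supported on Unity Catalog-enabled compute:
#   sc.parallelize(raw_lines).map(parse_line).filter(lambda r: r is not None)
#
# DataFrame style: keep per-record parsing in a plain function and apply it
# through a UDF, or use built-in column expressions where possible.

def parse_line(line: str):
    """Parse hypothetical 'id,amount' lines; return None for malformed input."""
    parts = line.split(",")
    if len(parts) != 2:
        return None
    try:
        return int(parts[0]), float(parts[1])
    except ValueError:
        return None

# On a cluster:
# df = spark.createDataFrame([(l,) for l in raw_lines], "line STRING")
# then apply parse_line via a UDF, or use split()/cast() column expressions.
```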