This article covers the basic approaches for making secure Databricks CLI or REST API calls using Databricks account credentials, such as user accounts or service principals.
To access a Databricks resource with the Databricks CLI or REST APIs, clients must authorize using a Databricks account. This account must have permission to access the resource, which your Databricks administrator or a user account with administrator privileges can configure.
There are two types of accounts that you can use, depending on how you intend to access your Databricks resources:
Once you have decided on the Databricks account type, you must acquire an access token that represents the account's credentials. You provide this access token, directly or indirectly, to the CLI command or REST API call when accessing the account's resources from scripts, code, or interactive sessions.
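As a minimal sketch of how an access token is supplied to a REST API call, the snippet below attaches a token as a Bearer credential on an HTTP request. The workspace URL, endpoint, and token string are placeholders, not real credentials or a guaranteed API path:

```python
# Hypothetical sketch: supplying an access token to a Databricks REST API call.
# The workspace URL, endpoint, and token below are placeholders.
import urllib.request


def build_request(workspace_url: str, endpoint: str, token: str) -> urllib.request.Request:
    """Attach the access token as a Bearer credential in the Authorization header."""
    return urllib.request.Request(
        url=f"{workspace_url}{endpoint}",
        headers={"Authorization": f"Bearer {token}"},
    )


req = build_request(
    "https://example.cloud.databricks.com",  # placeholder workspace URL
    "/api/2.0/clusters/list",                # example endpoint path
    "dapi-example-token",                    # placeholder token string
)
print(req.get_header("Authorization"))
```

Whether the token comes from a PAT or from an OAuth flow, the call pattern is the same: the token travels in the `Authorization` header of each request.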
The following table shows the authorization methods available to your Databricks account.
Account-level APIs and workspace-level APIs

To authenticate with Databricks REST APIs, you must understand the difference between account-level and workspace-level APIs.
A Databricks account can host multiple workspaces and is managed through the account console. A workspace contains resources like jobs and notebooks and is managed by workspace admins.
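This distinction shows up in the URLs the two API families use: account-level APIs are served from the account-console host and embed the account ID in the path, while workspace-level APIs are served from the workspace's own host. A minimal sketch, assuming the AWS account-console host and hypothetical helper names:

```python
# Sketch: base URLs for account-level vs. workspace-level REST API calls.
# The AWS account-console host is shown; other clouds use different hosts.
ACCOUNT_HOST = "https://accounts.cloud.databricks.com"


def account_api_url(account_id: str, path: str) -> str:
    # Account-level APIs embed the account ID in the URL path.
    return f"{ACCOUNT_HOST}/api/2.0/accounts/{account_id}{path}"


def workspace_api_url(workspace_host: str, path: str) -> str:
    # Workspace-level APIs are served directly from the workspace host.
    return f"{workspace_host}/api/2.0{path}"


print(account_api_url("11111111-2222-3333-4444-555555555555", "/workspaces"))
print(workspace_api_url("https://example.cloud.databricks.com", "/clusters/list"))
```

The account ID and workspace host above are placeholders; your administrator can provide the real values for your account.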
Because Databricks tools and SDKs work with one or more supported Databricks authorization methods, you can select the best authorization method for your use case. For details, see the tool or SDK documentation in Local development tools.
What authorization option should I choose?

Databricks provides two options for authorization or authentication with an access token:

- OAuth tokens
- Personal access tokens (PATs)
Important

Databricks strongly recommends that you use OAuth tokens over PATs for authorization. OAuth tokens are refreshed automatically by default and do not require direct management of the access token, which improves your security against token hijacking and unwanted access.
Because OAuth creates and manages the access token for you, you provide an OAuth token endpoint URL, a client ID, and a secret you generate from your Databricks workspace, instead of directly providing a token string. Choose PATs only when you are integrating a third-party tool or service that Databricks unified client authentication does not support, or that has no OAuth support.
How do I use OAuth to authorize access to Databricks resources?

Databricks provides unified client authentication to assist you with authorization by using a default set of environment variables that you can set to specific credential values. This helps you work more easily and securely, because these environment variables are specific to the environment that will be running the Databricks CLI commands or calling Databricks APIs.
These environment variables include:

- DATABRICKS_HOST
- DATABRICKS_CLIENT_ID
- DATABRICKS_CLIENT_SECRET
- DATABRICKS_ACCOUNT_ID (for account-level operations)

You can set these directly, or through a Databricks configuration profile (.databrickscfg) on your client machine.
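For example, a shell session for OAuth machine-to-machine (service principal) access might export these variables directly. The values below are placeholders, not real credentials:

```shell
# Example: unified client authentication environment variables for
# OAuth machine-to-machine (service principal) access.
# All values below are placeholders; substitute your own.
export DATABRICKS_HOST="https://example.cloud.databricks.com"
export DATABRICKS_CLIENT_ID="service-principal-client-id"
export DATABRICKS_CLIENT_SECRET="oauth-secret"
```

Tools and SDKs that support unified client authentication pick these variables up automatically, so no token string appears in your scripts.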
To use an OAuth access token, your Databricks workspace or account administrator must have granted your user account or service principal the CAN USE privilege for the account and workspace features your code will access.
For more details on configuring OAuth authorization for your client and to review cloud provider-specific authorization options, see Unified client authentication.
If you are writing code that accesses third-party services, tools, or SDKs, you must use the authentication and authorization mechanisms provided by the third party. However, if you must grant a third-party tool, SDK, or service access to your Databricks account or workspace resources, Databricks provides the following support:
A Databricks configuration profile contains settings and other information that Databricks needs to authorize access. Databricks configuration profiles are stored in local client files for your tools, SDKs, scripts, and apps to use. The standard configuration profile file is named .databrickscfg.
For more information, see Databricks configuration profiles.
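As a minimal sketch, a .databrickscfg file with an OAuth service-principal profile might look like the fragment below. The profile name, host, and credential values are placeholders, not real settings:

```ini
; ~/.databrickscfg -- all values below are placeholders
[DEFAULT]
host = https://example.cloud.databricks.com

[my-oauth-profile]
host          = https://example.cloud.databricks.com
client_id     = service-principal-client-id
client_secret = oauth-secret
```

Tools and SDKs can then select a profile by name instead of requiring credentials to be passed on every command or call.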