
Enable unified host support without flag#1358

Open
hectorcast-db wants to merge 3 commits into main from hectorcast-db/stack/unified-host-ga

Conversation

Contributor

@hectorcast-db commented Mar 25, 2026

🥞 Stacked PR

Use this link to review incremental changes.


Summary

Remove the `experimental_is_unified_host` flag, which was no longer used. Update the README to document the new support, the changes to the default authentication flow, and auto-detection.

This PR includes no behavioral changes.

Co-authored-by: Isaac

How is this tested?

N/A

Remove the `experimental_is_unified_host` flag and enable host metadata
resolution for all hosts automatically. Update README to document the
default authentication flow including cloud auto-detection and auth_type
forcing. Add NEXT_CHANGELOG entry for the new feature.

Co-authored-by: Isaac
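
The auth_type forcing mentioned in the commit message can be pinned per profile in `.databrickscfg`. A hypothetical sketch (the profile name, host, and token placeholder are illustrative, not from this PR):

```ini
# Hypothetical profile: setting auth_type pins the SDK to a single
# method instead of letting the default flow try each credential
# provider in order.
[workspace-pat]
host      = https://my-workspace.cloud.databricks.com
token     = <personal-access-token>
auth_type = pat
```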
@hectorcast-db changed the title from "Graduate unified/SPOG host support from experimental" to "Enable unified/SPOG host support for all hosts" on Mar 25, 2026

1. [Databricks native authentication](#databricks-native-authentication)
2. [Azure native authentication](#azure-native-authentication)
3. [GCP native authentication](#google-cloud-platform-native-authentication)
Contributor Author


Unrelated, but was missing

Add a new "Unified host support" section to the README explaining that
a single configuration profile can serve both WorkspaceClient and
AccountClient when the host supports it and both account_id and
workspace_id are available. Update changelog entry accordingly.

Co-authored-by: Isaac
@hectorcast-db
Contributor Author

Range-diff: main (6b2f59c -> c7938a9)
NEXT_CHANGELOG.md
@@ -4,7 +4,7 @@
  ## Release v0.103.0
  
  ### New Features and Improvements
-+* Add support for host-agnostic and SPOG (Single Pane of Glass) URLs. The SDK now automatically supports for all hosts without requiring the `experimental_is_unified_host` flag, which has been removed.
++* Add support for unified hosts, including SPOG (Single Pane of Glass) URLs. A single configuration profile can now be used for both account-level and workspace-level operations when the host supports it and both `account_id` and `workspace_id` are available. The `experimental_is_unified_host` flag has been removed; unified host detection is now automatic.
  * Accept `DATABRICKS_OIDC_TOKEN_FILEPATH` environment variable for consistency with other Databricks SDKs (Go, CLI, Terraform). The previous `DATABRICKS_OIDC_TOKEN_FILE` is still supported as an alias.
  
  ### Security
README.md
@@ -1,6 +1,13 @@
 diff --git a/README.md b/README.md
 --- a/README.md
 +++ b/README.md
+ ### In this section
+ 
+ - [Default authentication flow](#default-authentication-flow)
++- [Unified host support](#unified-host-support)
+ - [Databricks native authentication](#databricks-native-authentication)
+ - [Azure native authentication](#azure-native-authentication)
+ - [Overriding .databrickscfg](#overriding-databrickscfg)
  
  1. [Databricks native authentication](#databricks-native-authentication)
  2. [Azure native authentication](#azure-native-authentication)
@@ -22,20 +29,58 @@
  
  For each authentication method, the SDK searches for compatible authentication credentials in the following locations,
  in the following order. Once the SDK finds a compatible set of credentials that it can use, it stops searching:
+ 
+ Depending on the Databricks authentication method, the SDK uses the following information. Presented are the `WorkspaceClient` and `AccountClient` arguments (which have corresponding `.databrickscfg` file fields), their descriptions, and any corresponding environment variables.
+ 
++### Unified host support
++
++Certain Databricks host types, such as SPOG (Single Pane of Glass) URLs, support both account-level and workspace-level API operations from a single endpoint. When using a unified host, a single configuration profile can be used to create both `WorkspaceClient` and `AccountClient` instances without changing the `host`.
++
++For this to work, the following conditions must be met:
++
++1. The host must support unified operations (e.g., a SPOG URL).
++2. Both `account_id` and `workspace_id` must be available — either set explicitly in the configuration or auto-discovered.
++
++When both values are present, the SDK uses `workspace_id` to route workspace-level requests and `account_id` to route account-level requests, all through the same host.
++
++```ini
++# .databrickscfg
++[unified]
++host         = https://mycompany.databricks.com
++account_id   = 00000000-0000-0000-0000-000000000000
++workspace_id = 1234567890
++```
++
++```python
++from databricks.sdk import WorkspaceClient, AccountClient
++
++# Both clients share the same host and profile
++w = WorkspaceClient(profile='unified')
++a = AccountClient(profile='unified')
++
++# A WorkspaceClient for a different workspace under the same host and account
++w = WorkspaceClient(profile='unified', workspace_id='2345678901')
++```
++
++If the host supports it, `account_id` and `workspace_id` may be auto-discovered, reducing the required explicit configuration.
++
+ ### Databricks native authentication
+ 
+ By default, the Databricks SDK for Python initially tries [Databricks token authentication](https://docs.databricks.com/dev-tools/api/latest/authentication.html) (`auth_type='pat'` argument). If the SDK is unsuccessful, it then tries Workload Identity Federation (WIF). See [Supported WIF](https://docs.databricks.com/aws/en/dev-tools/auth/oauth-federation-provider) for the supported JWT token providers.
  - For Databricks OIDC authentication, you must provide the `host`, `client_id` and `token_audience` _(optional)_ either directly, through the corresponding environment variables, or in your `.databrickscfg` configuration file.
  - For Azure DevOps OIDC authentication, the `token_audience` is irrelevant as the audience is always set to `api://AzureADTokenExchange`. Also, the `System.AccessToken` pipeline variable required for OIDC request must be exposed as the `SYSTEM_ACCESSTOKEN` environment variable, following [Pipeline variables](https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml#systemaccesstoken)
  
-+During initialization, the SDK automatically resolves missing configuration fields (`account_id`, `workspace_id`, `cloud`, and `discovery_url`). Any explicitly provided values take precedence and are never overwritten. If the metadata endpoint is unavailable, the SDK falls back to the explicit configuration. It is recommended to always set explicit configuration.
++During initialization, the SDK automatically resolves missing configuration fields (`account_id`, `workspace_id`, `cloud`, and `discovery_url`). Any explicitly provided values take precedence and are never overwritten. If the auto discovery fails, the SDK falls back to the explicit configuration. It is recommended to always set explicit configuration.
 +
  | Argument         | Description                                                                                                                                                                                                                                                               | Environment variable    |
  |------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------------------|
 -| `host`           | _(String)_ The Databricks host URL for either the Databricks workspace endpoint or the Databricks accounts endpoint.                                                                                                                                                      | `DATABRICKS_HOST`       |     
 -| `account_id`     | _(String)_ The Databricks account ID for the Databricks accounts endpoint. Only has effect when `Host` is either `https://accounts.cloud.databricks.com/` _(AWS)_, `https://accounts.azuredatabricks.net/` _(Azure)_, or `https://accounts.gcp.databricks.com/` _(GCP)_.  | `DATABRICKS_ACCOUNT_ID` |
 +| `host`           | _(String)_ The Databricks host URL for either the Databricks workspace endpoint or the Databricks accounts endpoint.                                                                                                                                                      | `DATABRICKS_HOST`       |
-+| `account_id`     | _(String)_ The Databricks account ID for the Databricks accounts endpoint. Auto-detected from host metadata if not provided.  | `DATABRICKS_ACCOUNT_ID` |
-+| `workspace_id`   | _(String)_ The Databricks workspace ID for the Databricks workspace endpoint. Auto-detected from host metadata if not provided. | `DATABRICKS_WORKSPACE_ID` |
-+| `cloud`          | _(String)_ The cloud provider for the Databricks workspace (`AWS`, `AZURE`, or `GCP`). Auto-detected from host metadata if not provided. When set, `is_aws`, `is_azure`, and `is_gcp` use this value directly instead of inferring from hostname. | `DATABRICKS_CLOUD` |
-+| `discovery_url`  | _(String)_ The OpenID Connect discovery URL. Auto-detected from host metadata if not provided. When set, OIDC endpoints are fetched directly from this URL instead of using the default host-based well-known endpoint logic. | `DATABRICKS_DISCOVERY_URL` |
++| `account_id`     | _(String)_ The Databricks account ID for the Databricks accounts endpoint. Auto-discovered if not provided.  | `DATABRICKS_ACCOUNT_ID` |
++| `workspace_id`   | _(String)_ The Databricks workspace ID for the Databricks workspace endpoint. Auto-discovered if not provided. | `DATABRICKS_WORKSPACE_ID` |
++| `cloud`          | _(String)_ The cloud provider for the Databricks workspace (`AWS`, `AZURE`, or `GCP`). Auto-discovered if not provided. When set, `is_aws`, `is_azure`, and `is_gcp` use this value directly instead of inferring from hostname. | `DATABRICKS_CLOUD` |
++| `discovery_url`  | _(String)_ The OpenID Connect discovery URL. Auto-discovered if not provided. When set, OIDC endpoints are fetched directly from this URL instead of using the default host-based well-known endpoint logic. | `DATABRICKS_DISCOVERY_URL` |
  | `token`          | _(String)_ The Databricks personal access token (PAT) _(AWS, Azure, and GCP)_ or Azure Active Directory (Azure AD) token _(Azure)_.                                                                                                                                       | `DATABRICKS_TOKEN`      |
  | `client_id`      | _(String)_ The Databricks Service Principal Application ID.                                                                                                                                                                                                               | `DATABRICKS_CLIENT_ID`  |
  | `token_audience` | _(String)_ When using Workload Identity Federation, the audience to specify when fetching an ID token from the ID token supplier.                                                                                                                                         | `TOKEN_AUDIENCE`        |

Reproduce locally: git range-diff 31ef698..6b2f59c 31ef698..c7938a9 | Disable: git config gitstack.push-range-diff false
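
The resolution order the README diff above describes (explicit values win, auto-discovery only fills gaps, failed discovery falls back to explicit config) can be sketched as follows. This is a hypothetical illustration, not the SDK's actual implementation; `resolve_config` and `discover` are made-up names standing in for the host metadata lookup:

```python
# Hypothetical sketch of the documented resolution order: explicit
# configuration always takes precedence, auto-discovery only fills
# missing fields, and a failed discovery call leaves the explicit
# configuration untouched.

DISCOVERABLE_FIELDS = ("account_id", "workspace_id", "cloud", "discovery_url")


def resolve_config(explicit, discover):
    """Fill missing fields from discover(); never overwrite explicit values."""
    resolved = dict(explicit)
    try:
        metadata = discover()  # stand-in for the host metadata lookup
    except Exception:
        # Auto-discovery failed: fall back to the explicit configuration.
        return resolved
    for field in DISCOVERABLE_FIELDS:
        if resolved.get(field) is None and field in metadata:
            resolved[field] = metadata[field]
    return resolved


cfg = resolve_config(
    {"host": "https://mycompany.databricks.com", "cloud": "AWS"},
    lambda: {"cloud": "AZURE", "workspace_id": "1234567890"},
)
assert cfg["cloud"] == "AWS"                # explicit value is kept
assert cfg["workspace_id"] == "1234567890"  # gap filled by discovery
```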

@hectorcast-db changed the title from "Enable unified/SPOG host support for all hosts" to "Enable unified host support without flag" on Mar 25, 2026
@hectorcast-db
Contributor Author

Range-diff: main (c7938a9 -> 5357947)
NEXT_CHANGELOG.md
@@ -4,7 +4,7 @@
  ## Release v0.103.0
  
  ### New Features and Improvements
-+* Add support for unified hosts, including SPOG (Single Pane of Glass) URLs. A single configuration profile can now be used for both account-level and workspace-level operations when the host supports it and both `account_id` and `workspace_id` are available. The `experimental_is_unified_host` flag has been removed; unified host detection is now automatic.
++* Add support for unified hosts. A single configuration profile can now be used for both account-level and workspace-level operations when the host supports it and both `account_id` and `workspace_id` are available. The `experimental_is_unified_host` flag has been removed; unified host detection is now automatic.
  * Accept `DATABRICKS_OIDC_TOKEN_FILEPATH` environment variable for consistency with other Databricks SDKs (Go, CLI, Terraform). The previous `DATABRICKS_OIDC_TOKEN_FILE` is still supported as an alias.
  
  ### Security
README.md
@@ -34,11 +34,11 @@
  
 +### Unified host support
 +
-+Certain Databricks host types, such as SPOG (Single Pane of Glass) URLs, support both account-level and workspace-level API operations from a single endpoint. When using a unified host, a single configuration profile can be used to create both `WorkspaceClient` and `AccountClient` instances without changing the `host`.
++Certain Databricks host types support both account-level and workspace-level API operations from a single endpoint. When using such a unified host, a single configuration profile can be used to create both `WorkspaceClient` and `AccountClient` instances without changing the `host`.
 +
 +For this to work, the following conditions must be met:
 +
-+1. The host must support unified operations (e.g., a SPOG URL).
++1. The host must support unified operations.
 +2. Both `account_id` and `workspace_id` must be available — either set explicitly in the configuration or auto-discovered.
 +
 +When both values are present, the SDK uses `workspace_id` to route workspace-level requests and `account_id` to route account-level requests, all through the same host.

Reproduce locally: git range-diff 31ef698..c7938a9 31ef698..5357947 | Disable: git config gitstack.push-range-diff false

@hectorcast-db hectorcast-db requested a review from tanmay-db March 26, 2026 13:30