
Azure Databricks OAuth Connection

This guide outlines how to set up a JDBC connection to Azure Databricks using OAuth with automatic token refresh.

What AuthMech=11 is (and how it differs from AuthMech=3)

  • AuthMech=3: Uses a Databricks Personal Access Token (PAT). You pass UID=token; PWD=<pat>. No automatic refresh; user-scoped; requires manual token rotation.
  • AuthMech=11: Uses OAuth. Multiple flows:
    • Auth_Flow=0: You supply a pre-generated OAuth access token (expires in ~1 hour).
    • Auth_Flow=1: Driver performs the OAuth client-credentials flow as a service principal and refreshes tokens automatically. Recommended for applications.
    • Auth_Flow=2: Interactive (browser) login.
    • Auth_Flow=3: Azure Managed Identity.

For production applications, use Auth_Flow=1 as it provides automatic token refresh without manual intervention.
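For comparison, the two mechanisms produce connection URLs like the following (hostnames, paths, and credentials are placeholders):

```shell
# AuthMech=3 — static PAT, user-scoped, no automatic refresh:
jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;SSL=1;AuthMech=3;UID=token;PWD=<personal-access-token>

# AuthMech=11 + Auth_Flow=1 — OAuth M2M, driver refreshes the token automatically:
jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;SSL=1;AuthMech=11;Auth_Flow=1;OAuth2ClientId=<client-id>;OAuth2Secret=<oauth-secret>
```

Note that swapping mechanisms is only a matter of changing the auth-related URL parameters; the host, port, and httpPath stay the same.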

References: OAuth M2M for SPNs, Auth mechanisms/flows, JDBC auth reference

Prerequisites and objects you need

  • Databricks workspace in Azure
  • A SQL Warehouse (recommended) or cluster
  • Service principal inside the Databricks workspace (Databricks-managed is fine):
    • App/Client ID (UUID)
    • OAuth secret (generated from the service principal)
    • Entitlements enabled: "Workspace access" and "Databricks SQL access"
  • Permission "Can Use" for this service principal on the specific warehouse
  • JDBC driver version 2.6.36 or later

Step 1 — Create SQL Warehouse and capture connection values

  • Workspace → SQL → SQL Warehouses → Create or open warehouse
  • Connection details tab:
    • Copy Server hostname (e.g., adb-401997777489262.2.azuredatabricks.net)
    • Copy HTTP path (e.g., /sql/1.0/warehouses/277f3f16faa85d52)

Step 2 — Create the service principal and entitlements

  • Workspace settings → Identity and access → Service principals → Add → Databricks managed
  • Open the SP → Configurations:
    • Check "Workspace access" and "Databricks SQL access"
    • Click "Update"
  • Generate OAuth secret: Service principal → "Credentials & secrets" → "Generate secret"
  • Save both the Client ID (UUID) and Secret — the secret is only shown once

Docs: OAuth M2M SP setup
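If you prefer to script Step 2 rather than use the UI, a Databricks-managed service principal can be created through the workspace SCIM API. This is a sketch only: the display name is an assumption, and $DATABRICKS_HOST/$USER_TOKEN are the same variables defined in Step 3.

```shell
# Create a service principal with both entitlements in one call.
# Requires a human user PAT with workspace admin rights.
curl -s -X POST "$DATABRICKS_HOST/api/2.0/preview/scim/v2/ServicePrincipals" \
  -H "Authorization: Bearer $USER_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "displayName": "maxai-jdbc-sp",
        "entitlements": [
          {"value": "workspace-access"},
          {"value": "databricks-sql-access"}
        ]
      }'
# The response includes "applicationId" — that UUID is the Client ID
# (OAuth2ClientId) referenced in the later steps.
```

The OAuth secret still has to be generated separately, as described above, since it is only shown once at creation time.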

Step 3 — Grant the SP "Can Use" on the warehouse

If the "Permissions/Manage access" menu isn't visible in the new UI, use the Permissions API with a human user PAT:

export DATABRICKS_HOST='https://<your-databricks-hostname>'
export USER_TOKEN='<your-user-PAT>'
export WAREHOUSE_ID='<your-warehouse-id>'
export SP_APP_ID='<your-service-principal-client-id>'

curl -s -X PATCH "$DATABRICKS_HOST/api/2.0/permissions/warehouses/$WAREHOUSE_ID" \
  -H "Authorization: Bearer $USER_TOKEN" \
  -H "Content-Type: application/json" \
  -d "{\"access_control_list\":[{\"service_principal_name\":\"$SP_APP_ID\",\"permission_level\":\"CAN_USE\"}]}"

Verify the permission was granted:

curl -s -H "Authorization: Bearer $USER_TOKEN" \
  "$DATABRICKS_HOST/api/2.0/permissions/warehouses/$WAREHOUSE_ID"

Docs: Permissions API — warehouses

Step 4 — Build the production JDBC URL (Auth_Flow=1 — Recommended)

For applications requiring consistent database access, use Auth_Flow=1, which provides automatic token management:

jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;SSL=1;AuthMech=11;Auth_Flow=1;OAuth2ClientId=<service-principal-client-id>

Example:

jdbc:databricks://adb-401234234389262.2.azuredatabricks.net:443;httpPath=/sql/1.0/warehouses/324f3f16fvd55d52;SSL=1;AuthMech=11;Auth_Flow=1;OAuth2ClientId=d1982c30-243d-7354-6240-a1eet322c8a1

  • Depending on your setup, you may also need to include the secret directly in the JDBC URL: OAuth2Secret=<databricks-oauth-secret>
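Before wiring the URL into an application, you can sanity-check the service principal end to end from a shell. This is a sketch, assuming jq is installed and reusing the CLIENT_ID, CLIENT_SECRET, and WAREHOUSE_ID variables from the other steps:

```shell
# 1) Mint an OAuth token as the service principal — the same client-credentials
#    flow the JDBC driver runs internally with Auth_Flow=1
SP_TOKEN=$(curl -s -X POST "https://<server-hostname>/oidc/v1/token" \
  -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d 'grant_type=client_credentials&scope=all-apis' | jq -r .access_token)

# 2) Run a trivial query against the warehouse via the SQL Statement Execution
#    API; a successful result confirms both the credentials and the
#    "Can Use" grant from Step 3
curl -s -X POST "https://<server-hostname>/api/2.0/sql/statements" \
  -H "Authorization: Bearer $SP_TOKEN" \
  -H "Content-Type: application/json" \
  -d "{\"warehouse_id\": \"$WAREHOUSE_ID\", \"statement\": \"SELECT 1\"}"
```

If this check fails with 403, revisit Step 3; if it fails with 401, revisit the OAuth secret from Step 2.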

Step 5 — Configuration in MaxAI

  1. Go to MaxAI Dashboard > Skill Studio > Database Connections > + button at the top > Add Connection
  2. Configure the following Database Connection:
    • Connection Type: Databricks
    • Connection URL: The JDBC URL created above
    • Database Name: Your DB Name
    • Database Schema: Your DB Schema
    • Password: Use the OAuth2Secret value if it’s not in the JDBC URL; otherwise, any non-empty value works (still required but not used).
  3. Test Connection
  4. Save

Troubleshooting

  • 403 PERMISSION_DENIED on warehouse: Grant "Can Use" to the SP on that warehouse (Step 3)
  • 401 invalid_client during connection: Wrong OAuth2Secret or using Microsoft Entra client secret instead of the Databricks OAuth secret
  • 401 on Permissions API: Use a human user PAT/OAuth token (not the SP token) to grant permissions
  • 404 or 403 when connecting: Verify exact httpPath and that the warehouse is Running
  • Token expired (Auth_Flow=0 only): Switch to Auth_Flow=1 for automatic refresh
  • Driver issues: Update to Databricks JDBC driver 2.6.36 or later

Security Best Practices

  • Store the OAuth2Secret in a secure secret store (e.g., Azure Key Vault)
  • Use Auth_Flow=1 for production applications to avoid token expiration issues
  • Rotate service principal secrets periodically
  • Limit service principal permissions to only required warehouses/clusters
  • Monitor service principal usage through Databricks audit logs
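As a sketch of the first recommendation, the OAuth secret can be kept in Azure Key Vault and pulled at startup or deploy time with the Azure CLI (the vault and secret names here are placeholders):

```shell
# Store the Databricks OAuth secret once — never commit it to config files
az keyvault secret set \
  --vault-name my-keyvault \
  --name databricks-oauth-secret \
  --value "$CLIENT_SECRET"

# Fetch it when assembling the JDBC URL (e.g., in a deploy script)
OAUTH2_SECRET=$(az keyvault secret show \
  --vault-name my-keyvault \
  --name databricks-oauth-secret \
  --query value -o tsv)
```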

For Testing Purposes: Manual token management (Auth_Flow=0)

If you need to manage tokens manually (not recommended for applications):

  1. Generate a workspace-level OAuth token:
export CLIENT_ID='<service-principal-client-id>'
export CLIENT_SECRET='<service-principal-oauth-secret>'
export TOKEN_ENDPOINT='https://<your-databricks-hostname>/oidc/v1/token'

ACCESS_TOKEN=$(curl -s --request POST \
  --url "$TOKEN_ENDPOINT" \
  --user "$CLIENT_ID:$CLIENT_SECRET" \
  --data 'grant_type=client_credentials&scope=all-apis' | jq -r .access_token)
  2. Use the token in your JDBC URL:
jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;SSL=1;AuthMech=11;Auth_Flow=0;Auth_AccessToken=<ACCESS_TOKEN>

Note: Tokens expire in ~1 hour and require manual refresh.

Docs: Manually generate a workspace-level OAuth token