
Prerequisites for creating Azure SQL Warehouse Connection

Before creating an Azure SQL Warehouse connection without Spark, complete the following prerequisites.

  1. Unity Catalog must be enabled on your Databricks cluster.

  2. In Unity Catalog, create storage credentials and an external location with appropriate access permissions for both the source and destination locations. You must have permission to create storage credentials and external locations.

  3. To create storage credentials, first create an access connector for Azure Databricks, then assign that access connector when creating the storage credentials.

  4. Grant the managed identity access to the storage account. You must have the Owner or User Access Administrator Azure RBAC role to grant access to your storage account.

    1. Log in to your Azure Data Lake Storage Gen2 account.

    2. Go to Access Control (IAM), click + Add, and select Add role assignment.

    3. Select the Storage Blob Data Contributor role and click Next.

    4. Under Assign access to, select Managed identity.

    5. Click + Select members, and select Access connector for Azure Databricks or User-assigned managed identity.

    6. Search for your connector name or user-assigned identity, select it, and click Review and Assign.

  5. Fetch the ABFSS path up to the parent directory of your storage account.

  6. On the Unity Catalog page, go to External Locations and create a new external location by providing an external location name, the newly created storage credential, and the fetched ABFSS URL. The URL format is abfss://my-container-name@my-storage-account.dfs.core.windows.net/<path>

  7. Test the newly created external location by clicking the Test connection button. This validates the connection with the external location path. On the Permissions tab, grant your user the CREATE EXTERNAL TABLE and WRITE FILES privileges.

  8. Go to the semantic model's Advanced Properties and add the following properties:

    1. kyvos.sqlwarehouse.catalog = <temp catalog name for creating parquet tables>

    2. kyvos.sqlwarehouse.tempdb = <temp database name for creating parquet tables>

  9. You must have the CREATE TABLE and USE SCHEMA permissions on the temp catalog; a JDBC sketch covering these grants follows this list.
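The Unity Catalog setup from steps 6, 7, and 9 can also be scripted over the Databricks JDBC driver. The following is a minimal, hypothetical sketch, not Kyvos functionality: the names my_location, my_credential, kyvos_tmp, and user@example.com, and every angle-bracket placeholder, are illustrative assumptions to replace with your own values.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Minimal sketch: run the Unity Catalog statements from steps 6, 7, and 9 over JDBC.
// Requires the Databricks JDBC driver (com.databricks.client.jdbc.Driver) on the classpath.
public class UnityCatalogSetup {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:databricks://<server-hostname>:443/default;"
                + "transportMode=http;ssl=1;httpPath=<http-path>;"
                + "AuthMech=3;UID=token;PWD=<personal-access-token>";
        try (Connection con = DriverManager.getConnection(url);
             Statement st = con.createStatement()) {
            // Step 6: external location over the ABFSS path fetched in step 5.
            st.execute("CREATE EXTERNAL LOCATION IF NOT EXISTS my_location "
                    + "URL 'abfss://my-container-name@my-storage-account.dfs.core.windows.net/mypath' "
                    + "WITH (STORAGE CREDENTIAL my_credential)");
            // Step 7: privileges on the external location for your user.
            st.execute("GRANT CREATE EXTERNAL TABLE, WRITE FILES "
                    + "ON EXTERNAL LOCATION my_location TO `user@example.com`");
            // Step 9: privileges on the temp catalog (kyvos_tmp is a placeholder).
            st.execute("GRANT USE SCHEMA, CREATE TABLE "
                    + "ON CATALOG kyvos_tmp TO `user@example.com`");
        }
    }
}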

Prerequisites for OAuth Connection

To create a Databricks SQL Warehouse connection with OAuth, complete the following prerequisites.

...

For more details about authentication settings for the Databricks JDBC Driver, refer to the AWS documentation.

Creating Azure SQL Warehouse connection without Spark

To create an Azure SQL Warehouse connection for processing semantic models without Spark, perform the following steps.

...

  1. After you finish configuring the settings using the table shown below the screenshot, click the Test button at the top left to validate the connection settings.

  2. If the connection is valid, click the Save button. 

...

Connection Details in Kyvos for Sanity Suite

To create an Azure SQL Warehouse connection for the sanity suite, configure the following parameters. A connection-check sketch follows the table.

Name: Enter SanityConnection as the name.

Category: Select the Warehouse option.

Provider: Select the Generic option.

Driver: Enter the driver class as com.databricks.client.jdbc.Driver.

URL: Enter the Databricks JDBC URL.

Username: Enter token as the username.

Password: Enter the Databricks personal access token.

Authentication Type: Select Personal Access Token.

Use as Source: This checkbox is selected automatically. Set the Spark Read Method to JDBC.

Is Default SQL Engine: To enable the connection for raw data, select this checkbox to make the connection the default SQL engine.

Properties: Click Properties to view or set properties.

Catalog Enabled: Select this checkbox to list the catalogs created in the workspace.
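Before saving SanityConnection, you can verify the same driver class, URL, and token outside Kyvos. This is a minimal sketch under the table's settings; the angle-bracket values are assumptions taken from your Databricks workspace.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch: validate the sanity-suite connection settings outside Kyvos.
// AuthMech=3 selects Personal Access Token authentication; the username is the literal "token".
public class SanityConnectionCheck {
    public static void main(String[] args) throws Exception {
        Class.forName("com.databricks.client.jdbc.Driver"); // driver class from the table
        String url = "jdbc:databricks://<server-hostname>:443/default;"
                + "transportMode=http;ssl=1;httpPath=<http-path>;"
                + "AuthMech=3;UID=token;PWD=<personal-access-token>";
        try (Connection con = DriverManager.getConnection(url);
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT 1")) {
            rs.next();
            System.out.println("Sanity connection OK: " + rs.getInt(1));
        }
    }
}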

Working with Databricks SQL warehouse with Personal Token and OAuth

Kyvos supports Databricks SQL warehouse to enable the execution of ROLAP and MOLAP queries.

...

Creating Databricks SQL Warehouse connection with Personal Token

  1. From the Toolbox, click Setup, then choose Connections.

  2. From the Actions menu ( ⋮ ), click Add Connection.

  3. Enter a Name for the connection.

  4. From the Category drop-down list, select the Warehouse option.

  5. From the Provider list, select the DatabricksSQL option.

  6. The Driver Class field is prepopulated.

  7. Enter the Databricks SQL Warehouse JDBC URL. For more information, see the Microsoft documentation.

  8. From the Authentication Type list, select Personal Access Token.

  9. Enter token as the Username.

  10. In the Password field, enter the Databricks SQL Personal Access Token for the Databricks SQL workspace.

  11. The Use as Source checkbox is disabled as this is a source connection.

  12. To use this connection as the default SQL engine, select the Is Default SQL Engine checkbox.

  13. Select the Catalog Enabled checkbox. Enabling it is mandatory for Databricks SQL.

  14. Click the Properties link to view or set properties.

  15. After configuring the settings, click the Save button. 

  16. To refresh connections, click the Actions menu ( ⋮ ) at the top of the Connections column and select Refresh.

...

Creating Databricks SQL Warehouse connection with OAuth

You can create a Databricks SQL Warehouse connection with the OAuth machine-to-machine (M2M) authentication type. See Prerequisites for more details. A connection-check sketch follows the steps below.


  1. From the Toolbox, click Setup, then choose Connections.

  2. From the Actions menu ( ⋮ ), click Add Connection.

  3. Enter a Name for the connection.

  4. From the Category drop-down list, select the Warehouse option.

  5. From the Provider list, select the DatabricksSQL option.

  6. The Driver Class field is prepopulated.

  7. Enter the Databricks SQL Warehouse JDBC URL. For more information, see the Microsoft documentation.

  8. Select OAuth from the Authentication Type list. Upon selecting this option, the Client ID and Client Secret fields are displayed.

  9. In the Client ID field, enter the service principal's Application ID (also referred to as the Client ID or OAuth2ClientId).

  10. In the Client Secret field, enter the secret created for the above service principal (the OAuth2Secret).

  11. The Use as Source checkbox is disabled as this is a source connection.

  12. To use this connection as the default SQL engine, select the Is Default SQL Engine checkbox.

  13. Select the Catalog Enabled checkbox. Enabling it is mandatory for Databricks SQL.

  14. Click the Properties link to view or set properties.

  15. After configuring the settings, click the Save button. 

  16. To refresh connections, click the Actions menu ( ⋮ ) at the top of the Connections column and select Refresh.
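To confirm the Client ID and Client Secret before saving, the sketch below opens the same OAuth M2M session directly over JDBC. AuthMech=11 with Auth_Flow=1 selects the Databricks JDBC driver's client-credentials flow; all angle-bracket values are assumptions to replace with your own.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch: validate OAuth machine-to-machine (M2M) credentials outside Kyvos.
public class OAuthConnectionCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:databricks://<server-hostname>:443/default;"
                + "transportMode=http;ssl=1;httpPath=<http-path>;"
                + "AuthMech=11;Auth_Flow=1;"
                + "OAuth2ClientId=<service-principal-application-id>;"
                + "OAuth2Secret=<service-principal-oauth-secret>";
        try (Connection con = DriverManager.getConnection(url);
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT 1")) {
            rs.next();
            System.out.println("OAuth connection OK: " + rs.getInt(1));
        }
    }
}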

...