Prerequisites for creating AWS SQL Warehouse Connection
Anchor: AzureSQLconnection

...

  1. Unity Catalog must be enabled on your Databricks cluster.

  2. In Unity Catalog, create a storage credential and an external location with appropriate access permissions for both the source and destination locations. You must have permission to create storage credentials and external locations. On the Unity Catalog page, go to External Locations and create a new external location by providing an external location name, the newly created storage credential, and the fetched ABFSS URL. The URL format should be in <>. To create a storage credential for connecting to AWS S3, refer to the Databricks documentation.

  3. Test the newly created external location by clicking the Test connection button. This validates the connection with the external location path. Then, through the Permissions tab, assign your user the CREATE EXTERNAL TABLE role.

  4. Go to the semantic model's Advanced Properties and add the following properties:

    1. kyvos.sqlwarehouse.catalog = <temporary catalog name for creating parquet tables>

    2. kyvos.sqlwarehouse.tempdb = <temporary database name for creating parquet tables>

  5. You must have CREATE TABLE and USE SCHEMA permissions on the temp catalog (see the sketch below).
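For reference, the required grants can also be applied in SQL; below is a minimal sketch using the databricks-sql-connector Python package, where the hostname, HTTP path, token, catalog, and principal are placeholders for your environment:

Code Block
# Sketch: grant the temp-catalog privileges from step 5 via databricks-sql-connector.
# All angle-bracket values are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="<workspace-hostname>",
    http_path="<sql-warehouse-http-path>",
    access_token="<personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        for privilege in ("USE CATALOG", "USE SCHEMA", "CREATE TABLE"):
            # Privileges granted on a catalog are inherited by its schemas.
            cursor.execute(
                f"GRANT {privilege} ON CATALOG `<temp-catalog>` TO `<user-or-principal>`"
            )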

...

To create a Databricks SQL Warehouse connection with OAuth, complete the following prerequisites.

...

Create a Service Principal

  1. Go to your user profile, click Settings, then Identity and Access. Under Service Principals [Manage], click Add Service Principal, and note the generated Application ID (Client ID); this will be the OAuth2ClientId.

  2. Generate a secret for this Service Principal and note it down; this will be the OAuth2Secret. You can verify these credentials as shown below.
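A minimal sketch of that verification, assuming the workspace's standard OAuth token endpoint (/oidc/v1/token); the hostname and credentials are placeholders:

Code Block
# Sketch: exchange the OAuth2ClientId/OAuth2Secret for a short-lived access token.
import requests

resp = requests.post(
    "https://<workspace-hostname>/oidc/v1/token",
    auth=("<OAuth2ClientId>", "<OAuth2Secret>"),  # service principal credentials
    data={"grant_type": "client_credentials", "scope": "all-apis"},
    timeout=30,
)
resp.raise_for_status()
access_token = resp.json()["access_token"]  # use before it expires (typically one hour)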

...

Prepare the JDBC URL:
Get the JDBC URL from the cluster's JDBC tab. For example, if the JDBC URL is:

Code Block
jdbc:databricks://adb-650081446221384.4.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/650081446221384/0809-080132-671a9aak;AuthMech=3;UID=token;PWD=<personal-access-token>

Change the URL as follows: replace AuthMech=3 with AuthMech=11, add Auth_Flow=1 to the parameters, and remove the UID and PWD parameters:

Code Block
jdbc:databricks://adb-650081446221384.4.azuredatabricks.net:443/default;transportMode=http;ssl=1;AuthMech=11;httpPath=sql/protocolv1/o/650081446221384/0809-080132-671a9aak;Auth_Flow=1
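If you script this conversion, a small sketch is shown below; it simply applies the edits above (drop UID and PWD, replace AuthMech=3 with AuthMech=11, append Auth_Flow=1). Parameter order in the JDBC URL is not significant:

Code Block
# Sketch: convert a PAT-style Databricks JDBC URL to the OAuth form described above.
def to_oauth_jdbc_url(pat_url: str) -> str:
    prefix, _, params = pat_url.partition(";")
    kept = [p for p in params.split(";")
            if not p.startswith(("UID=", "PWD=", "AuthMech="))]
    return ";".join([prefix, *kept, "AuthMech=11", "Auth_Flow=1"])

# Example (PAT value elided):
# to_oauth_jdbc_url("jdbc:databricks://host:443/default;ssl=1;AuthMech=3;UID=token;PWD=<pat>")
# -> "jdbc:databricks://host:443/default;ssl=1;AuthMech=11;Auth_Flow=1"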

...

Grant permission to process semantic models using the above connection: grant permission on the external location (create an external location if one does not already exist for the storage account being used in your environment, such as "dev_testing") for this service principal:

  • Click Catalog.

  • Click the Settings icon.

  • Click the Storage Location Name.

  • Grant Create external table and Write files access (see the sketch below).
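A minimal sketch of the equivalent SQL grant, reusing the connector pattern from the prerequisites ("dev_testing" is the example external location from the text; the other values are placeholders):

Code Block
# Sketch: grant the service principal the external-location access listed above.
from databricks import sql

with sql.connect(
    server_hostname="<workspace-hostname>",
    http_path="<sql-warehouse-http-path>",
    access_token="<admin-access-token>",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute(
            "GRANT CREATE EXTERNAL TABLE, WRITE FILES "
            "ON EXTERNAL LOCATION `dev_testing` "
            "TO `<service-principal-application-id>`"
        )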

...

Assign a role for the Databricks SQL Warehouse connection:
Add the "Storage Blob Data Contributor" role assignment for the above service principal on the storage account:

  • Go to the Azure portal.

  • Go to the storage account and click Access Control (IAM).

  • Add a role assignment and select Storage Blob Data Contributor.

  • Select Managed Identity, choose the appropriate subscription [Dev-Enterprise or QA Enterprise], select the required managed identity, enter the Databricks connector name, and click Save.

Add the following properties to the semantic model's Advanced Properties:

...

kyvos.sqlwarehouse.catalog = <temporary catalog for temporary tables created while building semantic models>

...

kyvos.sqlwarehouse.tempdb = <temporary database for temporary tables created while building semantic models>

...


Creating AWS SQL Warehouse connection with no-Spark
Anchor: withnospark

...

Parameter/Field

Comments/Description

Name

Enter a unique name for the connection.

Category

Select the Warehouse option.  There may be more than one warehouse connection.

Provider

Select DATABRICKSSQL from the list.

Driver class

This field is auto-selected. The Driver class (com.databricks.client.jdbc.Driver) is required to connect to Databricks SQL.

URL

In the URL field, enter the Databricks SQL Warehouse JDBC URL.

For more information, see Microsoft documentation.

Authentication Type

Select Personal Access Token or OAuth.

If the OAuth option is selected, the Client ID and Client Secret fields are displayed.

  • In the Client ID field, enter the value of the service principal’s Application ID, Client ID, or OAuth2ClientId.

  • In the Client Secret field, enter the value of the secret created for the above service principal (the OAuth2Secret).

User Name

Enter the text token as the username.

Password

Enter the Databricks SQL Personal Access Token's value for the Databricks SQL workspace.

Use as Source

This checkbox is auto-selected. Enter Spark Read Method as JDBC.

Is default SQL Engine

By default, the checkbox is selected, as this connection can only be used to read data (creating datasets) on which the semantic model will be created.

Catalog Enabled

Select this checkbox to list the different catalogs created in the workspace.

Properties

Click Properties to view or set properties.

...

Parameter

Description

Name

Enter SanityConnection as the name.

Category

Select the Warehouse option.

Provider

Select the Generic option.

Driver

Enter the Driver class as com.databricks.client.jdbc.Driver.

URL

Enter the Databricks JDBC URL.

Username

Enter the text token as the username.

Password

Enter the Databricks personal access token.

Authentication Type

Select Personal Access Token.

Use as Source

This checkbox is auto-selected. Enter Spark Read Method as JDBC.

Is Default SQL Engine

To enable the connection for raw data, click the Is Default SQL Engine checkbox to set this connection to run the default SQL engine.

Properties

Click Properties to view or set properties.

Catalog Enabled

Select this checkbox to list the different catalogs created in the workspace.

Working with Databricks SQL warehouse with Personal Token

...

Anchor: personaltokenandoauth

Kyvos supports Databricks SQL warehouse to enable the execution of ROLAP and MOLAP queries.

Note

  • Supported for AWS and Azure.

  • Supported only with premium workspaces.

  • Supported with authentication types such as Personal Access Token and OAuth.

  • Only the serverless type of Databricks SQL Warehouse cluster is supported.

  • Unity Catalog must be enabled.

...

Creating Databricks SQL Warehouse connection with Personal Token

Anchor: PT

1. From the Toolbox, click Setup, then choose Connections.

2. From the Actions menu ( ⋮ ), click Add Connection.

3. Enter a Name for the connection.

4. From the Category drop-down list, select the Warehouse option.

5. From the Provider list, select the DatabricksSQL option.

6. The Driver Class field is prepopulated.

7. Enter the Databricks SQL Warehouse JDBC URL. For more information, see Microsoft documentation.

8. Select Personal Access Token from the Authentication Type list.

9. Enter token as the Username.

10. Enter the Databricks SQL Personal Access Token for the Databricks SQL workspace in the Password field.

11. The Use as Source checkbox is disabled, as this is a source connection.

12. To use this connection as the default SQL engine, select the Is Default SQL Engine checkbox.

13. Select the Catalog Enabled checkbox. It is mandatory to enable it for Databricks SQL.

14. Click the Properties link to view or set properties.

15. After configuring the settings, click the Save button. A quick connectivity check is sketched after these steps.

16. To refresh connections, click the Actions menu ( ⋮ ) at the top of the Connections column and select Refresh.
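A quick connectivity check for the saved connection's credentials, sketched with the databricks-sql-connector Python package (hostname, HTTP path, and token are placeholders):

Code Block
# Sketch: verify the Personal Access Token can reach the SQL warehouse.
from databricks import sql

with sql.connect(
    server_hostname="<workspace-hostname>",
    http_path="<sql-warehouse-http-path>",
    access_token="<personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_user()")
        print(cursor.fetchone())  # expect your catalog and user if authentication works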

...

You can create a Databricks SQL Warehouse connection with the OAuth machine-to-machine (M2M) authentication type. See Prerequisites for more details.

...

1. From the Toolbox, click Setup, then choose Connections.

2. From the Actions menu ( ⋮ ), click Add Connection.

3. Enter a Name for the connection.

4. From the Category drop-down list, select the Warehouse option.

5. From the Provider list, select the DatabricksSQL option.

6. The Driver Class field is prepopulated.

7. Enter the Databricks SQL Warehouse JDBC URL. For more information, see Microsoft documentation.

8. Select OAuth from the Authentication Type list. Upon selecting this option, the Client ID and Client Secret fields are displayed.

9. In the Client ID field, enter the value of the service principal's Application ID, Client ID, or OAuth2ClientId.

10. In the Client Secret field, enter the value of the secret created for the above service principal (the OAuth2Secret).

11. The Use as Source checkbox is disabled, as this is a source connection.

12. To use this connection as the default SQL engine, select the Is Default SQL Engine checkbox.

13. Select the Catalog Enabled checkbox. It is mandatory to enable it for Databricks SQL.

14. Click the Properties link to view or set properties.

15. After configuring the settings, click the Save button. A quick connectivity check is sketched after these steps.

16. To refresh connections, click the Actions menu ( ⋮ ) at the top of the Connections column and select Refresh.
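A quick connectivity check for the OAuth M2M settings, sketched with databricks-sql-connector's service-principal credentials provider (requires the databricks-sdk package; all values are placeholders):

Code Block
# Sketch: verify the OAuth M2M credentials against the SQL warehouse.
from databricks import sql
from databricks.sdk.core import Config, oauth_service_principal

def credential_provider():
    # OAuth2ClientId / OAuth2Secret from the prerequisites above.
    config = Config(
        host="https://<workspace-hostname>",
        client_id="<OAuth2ClientId>",
        client_secret="<OAuth2Secret>",
    )
    return oauth_service_principal(config)

with sql.connect(
    server_hostname="<workspace-hostname>",
    http_path="<sql-warehouse-http-path>",
    credentials_provider=credential_provider,
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_user()")  # should return the service principal
        print(cursor.fetchone())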

...